A Global Universality of Two-Layer Neural Networks with ReLU Activations

Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

Research output: Contribution to journal › Article › peer-review

Abstract

In the present study, we investigate the universality of neural networks, that is, the density of the set of two-layer neural networks in function spaces. Many existing works handle convergence over compact sets. In the present paper, we consider global convergence by introducing a suitable norm, so that our results are uniform over any compact set.
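To make the function class concrete, the following is a minimal sketch of a two-layer (one hidden layer) ReLU network of the kind whose density is studied: a finite sum of the form f(x) = Σᵢ aᵢ ReLU(wᵢ x + bᵢ). The parameter names `a`, `w`, `b` and the helper `two_layer_relu` are illustrative choices, not notation from the paper.

```python
import numpy as np

def relu(x):
    """The ReLU activation, max(x, 0), applied elementwise."""
    return np.maximum(x, 0.0)

def two_layer_relu(x, a, w, b):
    """Evaluate f(x) = sum_i a_i * ReLU(w_i * x + b_i) at the inputs x.

    x: scalar or 1-D array of inputs; a, w, b: 1-D arrays of equal
    length giving outer weights, inner weights, and biases.
    (Illustrative parametrization, not the paper's notation.)
    """
    x = np.asarray(x, dtype=float)
    # np.outer(x, w) + b has shape (len(x), len(w)); summing the
    # activated columns against a gives the network output at each x.
    return relu(np.outer(x, w) + b) @ a

# Example: two-layer ReLU networks represent piecewise-linear
# functions exactly, e.g. |x| = ReLU(x) + ReLU(-x).
a = np.array([1.0, 1.0])
w = np.array([1.0, -1.0])
b = np.array([0.0, 0.0])
xs = np.linspace(-2.0, 2.0, 5)
print(two_layer_relu(xs, a, w, b))  # → [2. 1. 0. 1. 2.]
```

Density results of the kind the abstract describes assert that such finite sums can approximate a target function arbitrarily well in a chosen norm; the paper's contribution is to choose the norm so that the approximation is global rather than only over compact sets.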

Original language: English
Article number: 6637220
Journal: Journal of Function Spaces
Volume: 2021
DOIs
Publication status: Published - 2021
Externally published: Yes

ASJC Scopus subject areas

  • Analysis

