Abstract
We introduce so-called functional input neural networks defined on a possibly infinite-dimensional weighted space, with values also in a possibly infinite-dimensional output space. To this end, we use an additive family as hidden layer maps and a non-linear activation function applied to each hidden layer. Relying on Stone-Weierstrass theorems on weighted spaces, we prove a global universal approximation result for generalizations of continuous functions, going beyond the usual approximation on compact sets. This applies in particular to the approximation of (non-anticipative) path space functionals via functional input neural networks. As a further application of the weighted Stone-Weierstrass theorem, we prove a global universal approximation result for linear functions of the signature. We also introduce the viewpoint of Gaussian process regression in this setting and show that the reproducing kernel Hilbert spaces of the signature kernels are Cameron-Martin spaces of certain Gaussian processes. This paves the way towards uncertainty quantification for signature kernel regression.
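The architecture summarized above can be illustrated numerically. The following is a minimal sketch, not the paper's exact construction: the input path is discretized on a time grid, each hidden unit applies a discretized linear functional of the path (an integral against a weight function) followed by a non-linear activation, and the hidden units are combined additively. All names, shapes, and the choice of `tanh` are illustrative assumptions.

```python
import numpy as np

def functional_input_nn(path, W, b, A, c):
    """Toy functional input neural network on a discretized path.

    path : (T,) input path sampled on a time grid t_1, ..., t_T
    W    : (H, T) discretized weight functions; W[h] @ path approximates
           the continuous linear functional <w_h, x>
    b    : (H,) hidden biases
    A    : (H,) additive output weights
    c    : scalar output bias
    """
    hidden = np.tanh(W @ path + b)  # non-linear activation on each hidden unit
    return A @ hidden + c           # additive combination of hidden units

rng = np.random.default_rng(0)
T, H = 100, 16
t = np.linspace(0.0, 1.0, T)
path = np.sin(2 * np.pi * t)        # example input path
W = rng.normal(size=(H, T)) / T     # 1/T factor mimics an integral discretization
b = rng.normal(size=H)
A = rng.normal(size=H)
c = 0.0
y = functional_input_nn(path, W, b, A, c)
print(float(y))
```

In this toy version the "weighted space" structure is invisible; it enters in the paper through the class of functions being approximated, not through the forward pass itself.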
| Original language | English |
|---|---|
| Number of pages | 57 |
| Publication status | Submitted - 5 June 2023 |
ÖFOS 2012
- 102019 Machine Learning
- 101024 Probability Theory
- 101007 Financial Mathematics
Activities
- 4 Talks
- Global universal approximation of functional input maps on weighted spaces
  Cuchiero, C. (Invited speaker)
  28 Nov 2023 · Activity: Talks › Talk › Science to Science
- Global universal approximation of functional input maps on weighted spaces
  Cuchiero, C. (Invited speaker)
  17 Oct 2023 · Activity: Talks › Talk › Science to Science
- Global universal approximation of functional input maps on weighted spaces
  Cuchiero, C. (Invited speaker)
  22 Sept 2023 → … · Activity: Talks › Talk › Science to Science