Abstract: |
Neural networks are well known for their versatility in approximating continuous functions, but their capabilities extend well beyond this classical setting. In this talk, we delve into functional neural networks, which offer a promising approach to approximating nonlinear smooth functionals. By investigating the convergence rates of the approximation and generalization errors under different regularity conditions, we gain insight into the theoretical properties of these networks within the empirical risk minimization framework. This analysis contributes to a deeper understanding of functional neural networks and opens up new possibilities for their effective application in domains such as functional data analysis and scientific machine learning.