Abstract: |
In Optimal Recovery, the task of learning a function from observational data is tackled deterministically by adopting a worst-case perspective tied to an explicit model assumption made on the functions to be learned. Working in the framework of Hilbert spaces, we consider a model assumption based on approximability, with the observational inaccuracies modeled as additive errors bounded in either $\ell_2$ or $\ell_1$. This talk shows how to construct the recovery procedures, which can be chosen to be linear maps under our problem setting.
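As a rough sketch of the setting (the notation below is illustrative, not taken from the talk): with $V$ an approximation space, $\varepsilon > 0$, and linear observation functionals $\lambda_1, \dots, \lambda_m$, the approximability model and worst-case error of a recovery map $\Delta$ can be written as
$$\mathcal{K} = \{ f \in H : \operatorname{dist}(f, V) \le \varepsilon \}, \qquad \operatorname{err}(\Delta) = \sup_{f \in \mathcal{K},\ \|e\| \le \eta} \big\| f - \Delta\big( [\lambda_1(f), \dots, \lambda_m(f)] + e \big) \big\|_H,$$
where the norm on the error vector $e$ is $\ell_2$ or $\ell_1$ depending on the observation model.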
|