
Data-Driven Extrapolation Via Feature Augmentation Based on Variably Scaled Thin Plate Splines

Campagna R.
2021

Abstract

Data-driven extrapolation requires the definition of a functional model depending on the available data, with the aim of providing reliable predictions of the unknown dynamics. Since data might be scattered, we turn our attention towards kernel models, which have the advantage of being meshfree. Specifically, the proposed numerical method makes use of the so-called Variably Scaled Kernels (VSKs), which are introduced to implement a feature augmentation-like strategy based on discrete data. Due to the possible uncertainty in the data, and since we are interested in modelling the behaviour of the target functions, we seek a regularized solution via ridge regression. Focusing on polyharmonic splines, we investigate their implementation in the VSK setting and provide error bounds in Beppo–Levi spaces. The performance of the method is then tested on functions showing exponential or rational decay. Comparisons with Support Vector Regression (SVR) are also carried out and highlight that the proposed approach is effective, particularly since it does not require training complex architectures.
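To illustrate the idea described in the abstract, here is a minimal sketch of a variably scaled thin plate spline fit with ridge regularization. It is not the authors' implementation: the scale function `psi`, the regularization parameter `lam`, and the helper names `vsk_tps_ridge_fit` / `vsk_tps_eval` are illustrative assumptions; in particular, the paper constructs the scaling from the discrete data, whereas here a known exponential decay is assumed for simplicity.

```python
# Minimal sketch (not the authors' code): feature augmentation via a Variably
# Scaled Kernel, using the thin plate spline phi(r) = r^2 log r with a ridge term.
import numpy as np

def tps(r):
    # Thin plate spline radial function; phi(0) = 0 by convention.
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(r > 0, r**2 * np.log(r), 0.0)

def vsk_tps_ridge_fit(x, y, psi, lam=1e-6):
    """Ridge-regularized TPS fit on VSK-augmented points (x, psi(x))."""
    z = np.column_stack([x, psi(x)])              # feature augmentation via the scale function
    r = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
    A = tps(r) + lam * np.eye(len(x))             # kernel block with ridge regularization
    P = np.column_stack([np.ones_like(x), z])     # linear tail (TPS is conditionally positive definite of order 2)
    n, m = len(x), P.shape[1]
    K = np.block([[A, P], [P.T, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([y, np.zeros(m)]))
    return z, sol[:n], sol[n:]

def vsk_tps_eval(x_new, psi, z, c, b):
    z_new = np.column_stack([x_new, psi(x_new)])
    r = np.linalg.norm(z_new[:, None, :] - z[None, :, :], axis=-1)
    return tps(r) @ c + np.column_stack([np.ones(len(x_new)), z_new]) @ b

# Usage: extrapolate a noisy exponentially decaying sample beyond its range.
x = np.linspace(0.0, 2.0, 40)
y = np.exp(-2.0 * x) + 1e-3 * np.random.default_rng(0).standard_normal(x.size)
psi = lambda t: np.exp(-2.0 * t)                  # hypothetical scale function encoding the decay
z, c, b = vsk_tps_ridge_fit(x, y, psi, lam=1e-4)
pred = vsk_tps_eval(np.linspace(2.0, 3.0, 10), psi, z, c, b)
```

In this sketch the augmentation simply appends `psi(x)` as an extra coordinate before evaluating the thin plate spline, which is the basic VSK mechanism; how the scale function is chosen from the data is the core of the paper and is not reproduced here.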


Use this identifier to cite or link to this document: https://hdl.handle.net/11591/459096
Citations
  • Scopus: 4
  • Web of Science (ISI): 4