Convergence for a family of neural network operators in Orlicz spaces
Costarelli, Danilo; Vinti, Gianluca
2017
Abstract
In this paper, we develop the theory for a family of neural network (NN) operators of the Kantorovich type, in the general setting of Orlicz spaces. In particular, a modular convergence theorem is established. In this way, we study the above family of operators in many instances of useful spaces by a single general approach. The above NN operators provide a constructive approximation process, in which the coefficients, the weights, and the thresholds of the networks needed in order to approximate a given function f are known. At the end of the paper, several examples of Orlicz spaces, and of sigmoidal activation functions for which the present theory can be applied, are studied in detail.
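The abstract notes that the operators are constructive: the coefficients, weights, and thresholds are explicit. A minimal numerical sketch of such a Kantorovich-type NN operator on an interval is given below, assuming the classical construction from this line of work, in which a density function φ is built from a sigmoidal activation σ via φ(u) = (σ(u+1) − σ(u−1))/2 and the coefficients are mean values of f on the subintervals [k/n, (k+1)/n]. The logistic activation, the midpoint approximation of the mean values, and all function names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigma(u):
    # Logistic sigmoidal activation (one admissible choice of sigma).
    return 1.0 / (1.0 + np.exp(-u))

def phi(u):
    # Density function generated by the sigmoidal activation:
    # phi(u) = (sigma(u + 1) - sigma(u - 1)) / 2.
    return 0.5 * (sigma(u + 1.0) - sigma(u - 1.0))

def kantorovich_nn(f, x, n, a=0.0, b=1.0):
    """Evaluate a Kantorovich-type NN operator F_n(f) at the points x.

    Coefficients are the mean values of f on [k/n, (k+1)/n]
    (approximated here by the midpoint rule), the weights are n,
    and the thresholds are the integers k = ceil(na), ..., floor(nb) - 1.
    """
    ks = np.arange(np.ceil(n * a), np.floor(n * b))
    x = np.atleast_1d(np.asarray(x, dtype=float))
    # Midpoint approximation of n * integral_{k/n}^{(k+1)/n} f(u) du.
    means = np.array([f((k + 0.5) / n) for k in ks])
    # phi(n x - k) for every evaluation point and every threshold k.
    w = phi(n * x[:, None] - ks[None, :])
    # Normalized sum, as in the classical NN operator definition.
    return (w * means).sum(axis=1) / w.sum(axis=1)

# Example: approximate f(u) = u^2 on [0, 1] at a few interior points.
approx = kantorovich_nn(lambda u: u * u, [0.25, 0.5, 0.75], n=50)
```

For smooth f and moderate n, the values returned at interior points are already close to f(x); the convergence theory developed in the paper makes this precise in the modular sense for Orlicz spaces.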