Neural network operators of generalized fractional integrals equipped with a vector-valued function

Costarelli D.;
2023

Abstract

In this paper, motivated by recent developments in the approximation capabilities of neural network operators, we aim to construct neural network (NN) operators for multivariate fractional integrals of order α, equipped with a vector-valued function ψ and its Jacobian matrix. In this system, the connection strengths between output neurons are represented by fractional mean values of the approximated multivariate function, which depend on the vector-valued function ψ. To estimate the rate of convergence (learning rate), the connection weights of the NN are equipped with a decreasing sequence (λ_n). The activation function of the system is constructed using a linear collection G_σ, which consists of density functions generated by multivariate sigmoidal functions. Our goal is to construct a flexible and productive hybrid system by leveraging a diverse selection of the function ψ, the parameter α, the sequence (λ_n) and the activation function G_σ. The quantitative estimates of the operators are examined by means of the multivariate modulus of continuity. Moreover, we provide some illustrative examples with graphs to demonstrate the approximation performance of the operators through the selected activation functions. Finally, we present numerical results consisting of the maximum absolute errors of approximation for the proposed operators based on various selections of the vector-valued function ψ.
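The fractional, vector-valued construction described above is more general than can be reproduced in a few lines. As a rough, hedged illustration of the underlying family of operators, the following Python sketch implements a classical univariate NN operator driven by a density function generated by the logistic sigmoidal function and reports maximum absolute approximation errors on a test function. The helper names (sigmoid, density, nn_operator), the test function and the grid are illustrative assumptions, not the operators proposed in the paper.

```python
# Minimal sketch (not the paper's exact construction): a classical univariate
# neural-network operator built from a density function generated by the
# logistic sigmoidal, used to report maximum absolute approximation errors.
# The fractional mean values, the vector-valued function psi and the sequence
# (lambda_n) introduced in the paper are not modeled here.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def density(x):
    # Density generated by the sigmoidal: phi(x) = [sigma(x+1) - sigma(x-1)] / 2
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def nn_operator(f, x, n):
    # F_n(f, x) = sum_{k=-n..n} f(k/n) phi(n x - k) / sum_{k} phi(n x - k)
    k = np.arange(-n, n + 1)
    phi = density(n * x[:, None] - k[None, :])        # shape (len(x), 2n+1)
    weights = phi / phi.sum(axis=1, keepdims=True)    # normalized kernel
    return weights @ f(k / n)

if __name__ == "__main__":
    f = lambda t: np.sin(np.pi * t)                   # test function on [-1, 1]
    x = np.linspace(-1.0, 1.0, 401)
    for n in (10, 50, 200):
        err = np.max(np.abs(nn_operator(f, x, n) - f(x)))
        print(f"n = {n:4d}   max |F_n f - f| = {err:.3e}")
```

Increasing n shrinks the maximum absolute error, which mirrors, in the simplest setting, the kind of numerical error tables reported in the paper for various choices of ψ.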
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11391/1565197
Citations
  • PubMed Central: ND
  • Scopus: 0
  • Web of Science: 0