Asymptotic Analysis of Neural Network Operators Employing the Hardy-Littlewood Maximal Inequality
Costarelli, Danilo; Piconi, Michele
2024
Abstract
The main aim of the present paper is to provide a full asymptotic analysis of a family of neural network (NN) operators based on suitable density functions, both in the $L^p$-setting and in the space of continuous functions. Two approaches are pursued: the first employs the celebrated Hardy-Littlewood (HL) maximal inequality, while the second adopts a constructive, fully moment-based method. A crucial step in the proofs of these results is the derivation of asymptotic estimates for the NN operators in the case of functions belonging to Sobolev spaces. By means of the first approach, we are able to establish sharp estimates that cannot be applied when $p=1$, since in that case the HL maximal inequality fails. This justifies resorting to the second, complementary approach, which proves very useful for covering the remaining case. The asymptotic analysis is finally completed by deducing the corresponding qualitative order of approximation for functions within suitable Lipschitz classes. At the end of the paper, several examples of density functions are also presented and discussed in relation to the previous results. Finally, we recall that NN operators based on the well-known ReLU or RePU functions are also included in the present theory.
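For context, the classical statement of the Hardy-Littlewood maximal inequality recalled below (in its standard one-dimensional form; the precise formulation used in the paper may differ) illustrates why the first approach breaks down at $p=1$:

\[
  (Mf)(x) \;:=\; \sup_{r>0} \frac{1}{2r}\int_{x-r}^{x+r} |f(t)|\,dt, \qquad x\in\mathbb{R}.
\]

For every $1<p\le\infty$ there exists a constant $C_p>0$ such that
\[
  \| Mf \|_{L^p(\mathbb{R})} \;\le\; C_p\, \| f \|_{L^p(\mathbb{R})}, \qquad f\in L^p(\mathbb{R}),
\]
whereas for $p=1$ only the weak-type $(1,1)$ estimate
\[
  \bigl|\{\, x\in\mathbb{R} : (Mf)(x)>\lambda \,\}\bigr| \;\le\; \frac{C}{\lambda}\, \| f \|_{L^1(\mathbb{R})}, \qquad \lambda>0,
\]
holds: the strong $L^1$ bound fails (indeed, $Mf\notin L^1(\mathbb{R})$ whenever $f\not\equiv 0$), which is why a complementary, moment-based method is needed to cover the case $p=1$.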