Approximation by multivariate neural network operators in mixed norm space: theory and application

Priyanka Majethiya, Shivam Bajpeyi

Published: 2025/9/23

Abstract

On the one hand, the framework of mixed norm spaces has potential applications across several areas of mathematics. On the other hand, neural network (NN) operators are well established as approximators and have attracted significant attention in approximation theory and signal analysis. In this article, we bring these two concepts together. We analyze multivariate Kantorovich-type NN operators, built on sigmoidal activation functions, as approximators within various mixed norm structures, namely mixed norm Lebesgue spaces and mixed norm Orlicz spaces. Orlicz spaces encompass several important function spaces, such as the classical Lebesgue, Zygmund, and exponential spaces. We establish the boundedness and convergence of multivariate Kantorovich-type NN operators in these mixed norm function spaces. The approximation abilities of the operators are then illustrated through graphical representations and error estimates, along with applications in image processing, including image reconstruction, image scaling, image inpainting, and image denoising. Finally, we address the beneficial impact of the mixed norm structure on the aforementioned image processing tasks.
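For orientation, multivariate Kantorovich-type NN operators of the kind discussed here are typically written in the following form; the notation below is an illustrative sketch based on the standard construction in the literature, not the paper's exact definition. For a locally integrable function $f$ on $[0,1]^d$ and $n \in \mathbb{N}$, one may consider

\[
K_n f(\mathbf{x}) \;=\;
\frac{\displaystyle\sum_{\mathbf{k}} \Bigl( n^{d} \int_{\left[\frac{k_1}{n},\frac{k_1+1}{n}\right] \times \cdots \times \left[\frac{k_d}{n},\frac{k_d+1}{n}\right]} f(\mathbf{u})\, d\mathbf{u} \Bigr)\, \Psi_{\sigma}(n\mathbf{x} - \mathbf{k})}
{\displaystyle\sum_{\mathbf{k}} \Psi_{\sigma}(n\mathbf{x} - \mathbf{k})},
\]

where the sum ranges over the admissible multi-indices $\mathbf{k} = (k_1,\dots,k_d)$, and $\Psi_{\sigma}$ is a multivariate density function generated by a sigmoidal activation function $\sigma$ (e.g., a product of univariate combinations of $\sigma$). The Kantorovich feature is the replacement of point samples of $f$ by local mean values over cells of side $1/n$, which is what makes the operators suitable for approximation in integral-norm settings such as the mixed norm Lebesgue and Orlicz spaces studied in the paper.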