A Novel Approach to Entropy and Divergence Estimation Using Giaccardi Inequality and Hermite Interpolating Polynomial
https://doi.org/10.5281/zenodo.18811214
Abstract
This paper develops a generalized analytical framework for divergence and entropy inequalities by introducing a Giaccardi-type structural refinement of Jensen’s inequality and applying it to Csiszár divergence, Kullback–Leibler divergence, Rényi divergence, Shannon entropy, and Rényi entropy. By constructing ordered intermediate functionals derived from convexity principles, we obtain sharper hierarchical bounds that preserve the intrinsic structure of divergence measures. The framework is further extended through generalized Montgomery identities to incorporate higher-order convexity, leading to new identities and inequality representations for the associated nonnegative functionals. Applications to the Zipf–Mandelbrot law and its entropy-maximizing hybrid generalization demonstrate the effectiveness of the proposed approach in probabilistic models governed by power-law distributions. The results unify convex analysis, information theory, and entropy maximization within a single Giaccardi inequality structure and provide systematic tools for the refined estimation of divergence and entropy quantities.
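For orientation, the two central objects named in the abstract admit compact standard definitions; the following sketch uses conventional notation chosen for illustration and may differ from the notation in the body of the paper. For probability distributions $\mathbf{p}=(p_1,\dots,p_n)$ and $\mathbf{q}=(q_1,\dots,q_n)$ and a convex function $f$ with $f(1)=0$, the Csiszár $f$-divergence is
\[
  D_f(\mathbf{p}\,\|\,\mathbf{q}) \;=\; \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right),
\]
with the Kullback–Leibler divergence recovered for $f(t)=t\log t$. The Zipf–Mandelbrot law with parameters $N\in\mathbb{N}$, $q\ge 0$, and $s>0$ assigns to rank $k\in\{1,\dots,N\}$ the probability
\[
  f(k;N,q,s) \;=\; \frac{(k+q)^{-s}}{\sum_{j=1}^{N}(j+q)^{-s}},
\]
a power-law mass function of the kind to which the applications in this paper pertain.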