
Exact mutual information for lognormal random variables

Authors:
Maurycy Chwiłka, Jan Karbowski
Keywords:
Condensed Matter > Disordered Systems and Neural Networks (cond-mat.dis-nn), Statistics Theory (math.ST)
Journal:
--
Date:
2023-06-22 16:00:00
Abstract
Stochastic correlated observables with lognormal distribution are ubiquitous in nature, and hence they deserve an exact information-theoretic characterization. We derive a general analytical formula for mutual information between vectors of lognormally distributed random variables, and provide lower and upper bounds on its value. That formula and its bounds involve determinants and traces of high dimensional covariance matrices of these variables. Exact explicit forms of mutual information are calculated for some special cases and types of correlations. As an example, we provide an analytic formula for mutual information between neurons, relevant for neural networks in the brain.
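The paper's exact formulas are not reproduced in this listing. As an illustrative sketch only: mutual information is invariant under componentwise invertible transformations, so for lognormal vectors X = exp(G) it coincides with the MI of the underlying Gaussian vector G, which is a ratio of log-determinants of covariance (sub)matrices. The function below is a hypothetical helper, not code from the paper.

```python
import numpy as np

def gaussian_mi(cov, dx):
    """MI (in nats) between the first dx components and the rest of a
    jointly Gaussian vector with covariance matrix `cov`.

    I(X;Y) = 0.5 * [ln det(Sx) + ln det(Sy) - ln det(S)]
    The same value applies to lognormal vectors exp(G), since MI is
    invariant under invertible marginal maps.
    """
    cov = np.asarray(cov, dtype=float)
    sx = cov[:dx, :dx]          # covariance block of X
    sy = cov[dx:, dx:]          # covariance block of Y
    # slogdet is numerically safer than log(det(...)) for large matrices
    _, logdet_x = np.linalg.slogdet(sx)
    _, logdet_y = np.linalg.slogdet(sy)
    _, logdet_xy = np.linalg.slogdet(cov)
    return 0.5 * (logdet_x + logdet_y - logdet_xy)

# Bivariate sanity check: for correlation rho, I = -0.5 * ln(1 - rho^2)
rho = 0.8
cov = np.array([[1.0, rho],
                [rho, 1.0]])
mi = gaussian_mi(cov, 1)
```

For the bivariate case above, `mi` agrees with the closed form `-0.5 * np.log(1 - rho**2)`, which is the standard Gaussian-channel expression the lognormal MI reduces to in two dimensions.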
PDF: Exact mutual information for lognormal random variables.pdf