Measuring Dependence via Mutual Information

Authors

Lu, Shan

Abstract

Considerable research has been done on measuring dependence between random variables. The correlation coefficient is the most widely studied measure of dependence, but it captures only linear dependence, which limits its application. The informational coefficient of correlation, defined in terms of mutual information, addresses this limitation, but it has deficiencies of its own: in particular, it is normalized only for continuous random variables.

Building on the concept of the informational coefficient of correlation, this work proposes a new dependence measure, which we call the L-measure, that generalizes Linfoot's measure to both continuous and discrete random variables. Simulated models are used to elucidate its properties, and estimation algorithms are also discussed. Furthermore, a related measure based on the L-measure, which we call the intrinsic L-measure, is studied for the purpose of quantifying nonlinear dependence. Based on the criteria for a dependence measure presented by Renyi and on the simulation results in this thesis, we believe that the L-measure is satisfactory as a dependence measure.
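The informational coefficient of correlation referenced above is Linfoot's r = sqrt(1 - exp(-2 I(X;Y))), which maps mutual information onto [0, 1]. As a minimal sketch (not the thesis's L-measure itself), the snippet below computes this coefficient and checks the standard sanity case: for a bivariate Gaussian with correlation rho, I(X;Y) = -0.5 ln(1 - rho^2), so Linfoot's coefficient recovers |rho| exactly.

```python
import math

def linfoot_coefficient(mutual_info):
    """Linfoot's informational coefficient of correlation:
    r = sqrt(1 - exp(-2 * I(X;Y))), with I in nats.
    Maps I = 0 (independence) to 0 and I -> infinity to 1."""
    return math.sqrt(1.0 - math.exp(-2.0 * mutual_info))

# Sanity check: bivariate Gaussian with correlation rho has
# I(X;Y) = -0.5 * ln(1 - rho^2), so the coefficient equals |rho|.
rho = 0.8
mi = -0.5 * math.log(1.0 - rho ** 2)
print(linfoot_coefficient(mi))  # recovers 0.8 (up to float rounding)
```

In practice the mutual information itself must be estimated from data (e.g. by binning or nearest-neighbor estimators), which is where the estimation algorithms discussed in the thesis come in.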

Description

Thesis (Master, Mathematics & Statistics) -- Queen's University, 2011-09-30 14:29:35.153

Keywords

Mutual Information, Dependence Measure
