Title: COMPUTATION OF INFORMATION MEASURES
Abstract: For well-behaved distributions, mutual information can be computed using a simple identity involving the two distributions' marginal and conditional entropies. However, when these entropies are ill-defined, more powerful methods are required. This thesis aims to calculate the mutual information of one such distribution, given by p(x) = 1/(x log2(x)). This is the first known attempt to approximate the mutual information of distributions such as these. While I was able to numerically approximate the mutual information of this distribution and to find meaningful lower bounds, proving the existence of an upper bound remains an open problem.
Type of Material: Princeton University Senior Theses
Appears in Collections: Mathematics, 1934-2020
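The "simple identity" the abstract refers to is the standard relation I(X;Y) = H(X) + H(Y) - H(X,Y), which holds whenever the entropies on the right are finite. As a minimal sketch of that identity (the joint distribution below is purely illustrative and is not taken from the thesis):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution p(x, y): X and Y are perfectly correlated bits.
joint = {(0, 0): 0.5, (1, 1): 0.5}

# Marginal distributions of X and Y, obtained by summing out the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Mutual information via the entropy identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(px.values()) + entropy(py.values()) - entropy(joint.values())
print(mi)  # perfectly correlated fair bits share exactly 1 bit of information
```

For the heavy-tailed distribution studied in the thesis, the entropies in this identity are ill-defined, which is precisely why the direct computation above breaks down and numerical approximation with explicit bounds is needed.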
Files in This Item:
PUTheses2015-Zhan_Shuxin.pdf (689.96 kB, Adobe PDF; request a copy)
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.