Title: Information Theory from a Functional Viewpoint
Authors: Liu, Jingbo
Advisors: Cuff, Paul
Verdu, Sergio
Contributors: Electrical Engineering Department
Keywords: common randomness
functional inequality
high dimensional
information theory
Subjects: Electrical engineering
Issue Date: 2018
Publisher: Princeton, NJ : Princeton University
Abstract: A perennial theme of information theory is to find new methods to determine the fundamental limits of various communication systems, which can help engineers find better designs by eliminating deficient ones. Traditional methods have focused on the notion of "sets": the method of types concerns the cardinality of subsets of the typical sets; the blowing-up lemma bounds the probability of the neighborhood of decoding sets; the single-shot (information-spectrum) approach uses the likelihood threshold to define sets. This thesis promotes the idea of deriving the fundamental limits using functional inequalities, where the central notion is "functions" instead of "sets". A functional inequality follows from the entropic definition of an information measure by convex duality. For example, the Gibbs variational formula follows from the Legendre transform of the relative entropy. As a first example, we propose a new methodology for deriving converse (i.e. impossibility) bounds based on convex duality and the reverse hypercontractivity of Markov semigroups. This methodology is broadly applicable to network information theory, and in particular resolves the optimal scaling of the second-order rate for the previously open "side-information problems". As a second example, we use the functional inequality for the so-called Eγ metric to prove non-asymptotic achievability (i.e. existence) bounds for several problems including source coding, wiretap channels and mutual covering. Along the way, we derive general convex duality results leading to a unified treatment of many inequalities and information measures, such as the Brascamp-Lieb inequality and its reverse, the strong data processing inequality, hypercontractivity and its reverse, transportation-cost inequalities, and Rényi divergences.
Capitalizing on such dualities, we demonstrate information-theoretic approaches to certain properties of functional inequalities, such as the Gaussian optimality. This is the antithesis of the main thesis (functional approaches to information theory).
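The abstract's example of convex duality — the Gibbs variational formula as the Legendre transform of relative entropy — can be checked numerically. The sketch below (an illustration of the standard Donsker–Varadhan identity, not code from the thesis) verifies on a small finite alphabet that D(Q‖P) = sup_f { E_Q[f] − log E_P[e^f] }, with the supremum attained at f* = log(q/p); the distributions chosen here are arbitrary examples.

```python
import math
import random

# Two example distributions on a 3-letter alphabet (arbitrary choices).
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

# Relative entropy D(Q||P) = sum q_i * log(q_i / p_i).
kl = sum(qi * math.log(qi / pi) for qi, pi in zip(q, p))

# The optimizer of the Gibbs/Donsker-Varadhan formula is f* = log(q/p);
# at f*, E_P[e^{f*}] = sum p_i * (q_i/p_i) = 1, so the log term vanishes.
f_star = [math.log(qi / pi) for qi, pi in zip(q, p)]
lhs = (sum(qi * fi for qi, fi in zip(q, f_star))
       - math.log(sum(pi * math.exp(fi) for pi, fi in zip(p, f_star))))
assert abs(lhs - kl) < 1e-12

# Any other f gives a value no larger than D(Q||P) (the duality inequality).
rng = random.Random(0)
for _ in range(200):
    f = [rng.gauss(0.0, 2.0) for _ in range(3)]
    val = (sum(qi * fi for qi, fi in zip(q, f))
           - math.log(sum(pi * math.exp(fi) for pi, fi in zip(p, f))))
    assert val <= kl + 1e-12
```

The thesis builds on exactly this kind of duality: the right-hand side is a statement about functions f, while the left-hand side is the entropic information measure.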
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog:
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Electrical Engineering

Files in This Item:
File: Liu_princeton_0181D_12396.pdf
Size: 1.99 MB
Format: Adobe PDF

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.