Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01gb19f8906
Title: A Direct Sum Theorem for Deterministic Compression under Uncertain Priors
Authors: Ye, Christopher
Advisors: Kol, Gillat
Department: Mathematics
Class Year: 2021
Abstract: We revisit the problem of Compression under Uncertain Priors: the sender knows a distribution P over [N] and the receiver knows a distribution Q over [N] such that P and Q are Δ-close: |log(1/P(x)) − log(1/Q(x))| ≤ Δ for all x ∈ [N]. A message m is sampled from [N] according to P, and the sender wants to communicate m to the receiver with error at most some small constant. When the parties have shared randomness, Juba et al. and Braverman and Juba give a tight bound of Θ(H(P) + Δ) on the expected number of bits required for communication. Haramaty and Sudan posed the question of the minimum number of bits required by a deterministic protocol and asked whether it must depend on N. In this work, we study the deterministic problem in the multi-copy setting, where there are k pairs of distributions {P_i, Q_i} such that each pair P_i, Q_i is Δ-close. The messages m_i are sampled from P_i for all i, and the sender hopes to communicate all k messages, each with some constant error. In this setting, we prove matching upper and lower bounds of Θ(H(P) + Δ) on the expected number of bits required per copy, where H(P) = (1/k) ∑_{i=1}^{k} H(P_i), matching the performance of the randomized setting and showing that a dependence on N is not necessary in the multi-copy setting. In addition, we provide possible directions for tackling the single-copy setting and Haramaty and Sudan's toy problem, where closing the gap between the upper and lower bounds remains open. These include a reduction to simultaneous communication complexity, a lower bound for isolating hash families, and a computation of the internal information of any successful communication protocol. We also briefly discuss the motivation behind the notion of Δ-similarity of distributions. The above concerns compression over noiseless channels; finally, we consider the analogous problem of coding over a noisy channel, briefly describing an algorithm due to Shulman for channel coding when the receiver is ignorant of the channel law.
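To make the Δ-closeness condition and the per-copy benchmark H(P) concrete, the following is a minimal Python sketch (not part of the thesis; the function names, base-2 logarithms, and list-based representation of distributions are illustrative assumptions):

import math

def are_delta_close(P, Q, delta):
    """Check the abstract's condition: |log(1/P(x)) - log(1/Q(x))| <= delta
    for all x in [N]. P and Q are probability vectors over the same [N]."""
    for p, q in zip(P, Q):
        if p == 0 and q == 0:
            continue  # x lies outside both supports; nothing to compare
        if p == 0 or q == 0:
            return False  # one log-probability is infinite, so the gap is unbounded
        if abs(math.log2(q / p)) > delta:  # log2(q/p) = log2(1/p) - log2(1/q)
            return False
    return True

def avg_entropy(dists):
    """H(P) = (1/k) * sum_i H(P_i): the average Shannon entropy (in bits)
    over the k copies, the benchmark in the multi-copy bound."""
    h = lambda P: -sum(p * math.log2(p) for p in P if p > 0)
    return sum(h(P) for P in dists) / len(dists)

# Example: two distributions over [4] whose log-probabilities differ by at most 1 bit.
P = [0.50, 0.25, 0.125, 0.125]
Q = [0.25, 0.50, 0.125, 0.125]
print(are_delta_close(P, Q, delta=1.0))  # True: |log2(Q(x)/P(x))| <= 1 everywhere
print(avg_entropy([P]))                  # H(P) = 1.75 bits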
URI: http://arks.princeton.edu/ark:/88435/dsp01gb19f8906
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Mathematics, 1934-2023

Files in This Item:
File: YE-CHRISTOPHER-THESIS.pdf
Size: 487.32 kB
Format: Adobe PDF


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.