Information-Theoretic Limits of Randomness Generation
Randomness generation has numerous applications, for example in computer simulations, Monte Carlo methods, and cryptography. The seminal work of Knuth and Yao showed that any discrete random variable can be generated using fair coin flips, with the expected number of flips bounded by the entropy of the random variable plus 2 bits, establishing a connection between randomness generation and information theory.
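As an illustration (not part of the talk itself), here is a minimal Python sketch of the Knuth-Yao procedure; the function name knuth_yao_sample and the max_depth cap are our own, and the use of floating-point binary expansions limits accuracy to roughly 53 coin flips.

    import random

    def knuth_yao_sample(probs, max_depth=60):
        # Walk the Knuth-Yao DDG tree: at depth d, outcome i owns one leaf
        # iff the d-th bit of the binary expansion of probs[i] is 1.
        node = 0  # index of the current node among this depth's internal nodes
        for d in range(1, max_depth + 1):
            node = 2 * node + random.getrandbits(1)  # one fair coin flip
            for i, p in enumerate(probs):
                if (int(p * (1 << d)) & 1) == 0:
                    continue  # outcome i has no leaf at this depth
                if node == 0:
                    return i  # landed on a leaf labelled i
                node -= 1  # skip past this leaf
            # 'node' now indexes an internal node; descend one more level
        raise RuntimeError("ran out of depth (float expansions carry ~53 bits)")

For example, knuth_yao_sample([0.5, 0.25, 0.25]) returns 0, 1, or 2 with the stated probabilities using 1.5 fair coin flips on average, which here equals the entropy exactly since the distribution is dyadic.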
In this talk, we investigate several multi-user randomness generation settings from an information-theoretic point of view. First, we consider the distributed generation setting, in which Alice and Bob share a common sequence of coin flips, and each of them would like to generate a random variable such that the two random variables follow a prescribed joint distribution. We design a scheme that generates any log-concave distribution (including the multivariate Gaussian) using a small number of common coin flips. We then relate this setting to latent variable models, which are widely used in statistics and machine learning.
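To make the setting concrete, here is a hedged Python sketch of the classical latent-variable construction for a bivariate Gaussian with correlation rho: both parties map the shared coin flips to a common latent variable W and add independent private noise. This is the textbook construction, not the talk's scheme; the names latent_from_common_bits and generate_locally are ours.

    import random
    from math import sqrt
    from statistics import NormalDist

    def latent_from_common_bits(bits):
        # Interpret the shared flips as a uniform in (0, 1), then apply the
        # inverse Gaussian CDF to get a common latent W ~ N(0, 1) (approximately).
        u = (sum(b << i for i, b in enumerate(bits)) + 0.5) / (1 << len(bits))
        return NormalDist().inv_cdf(u)

    def generate_locally(common_bits, rho):
        # Each party runs this separately; only common_bits are shared.
        w = latent_from_common_bits(common_bits)
        return sqrt(rho) * w + sqrt(1 - rho) * random.gauss(0, 1)  # private noise

    common = [random.getrandbits(1) for _ in range(32)]  # shared coin flips
    x = generate_locally(common, rho=0.8)  # Alice's output
    y = generate_locally(common, rho=0.8)  # Bob's output
    # (x, y) is approximately bivariate normal with unit variances and correlation 0.8

Since Var(X) = rho + (1 - rho) = 1 and Cov(X, Y) = rho * Var(W) = rho, the pair has the prescribed correlation; the talk's contribution is achieving such joint distributions with guarantees on the number of common coin flips consumed.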
Next, we discuss the remote generation setting, in which Alice, who knows the chosen distribution, wishes to send a message to Bob allowing him to generate a random variable according to this distribution. We design a scheme that applies to any continuous distribution. This scheme has applications in lossy compression, simulation of quantum entanglement, and mixed strategies in games with communication constraints.
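As a point of contrast, the following naive baseline (our own, not the talk's scheme) shows the setting: Alice, the only party able to sample from the chosen distribution P, sends a k-bit quantization of one draw, and Bob outputs it without knowing P. The output is only approximately P-distributed and the message length grows with the precision k; achieving exact generation with bounded expected message length is what the talk's scheme provides instead.

    import random

    def alice_message(sample_from_p, k=16):
        # Only Alice can sample from the chosen distribution P; we assume
        # here that its support lies in [0, 1).
        x = sample_from_p()
        return int(x * (1 << k))  # k-bit description of the sample

    def bob_generate(message, k=16):
        # Bob knows nothing about P; he outputs the midpoint of the received cell.
        return (message + 0.5) / (1 << k)

    msg = alice_message(lambda: random.betavariate(2, 5))  # P known only to Alice
    x_hat = bob_generate(msg)  # approximately Beta(2, 5)-distributed for large k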
Cheuk Ting Li received the B.Sc. degree in Mathematics and the B.Eng. degree in Information Engineering from The Chinese University of Hong Kong in 2012, and the M.S. degree in Electrical Engineering from Stanford University in 2014. He is currently working toward the Ph.D. degree at Stanford University. His research interests include the generation of random variables, non-asymptotic schemes in information theory, wireless communications, and information-theoretic secrecy.