Deviation From Maximal Entanglement for Mid-Spectrum Eigenstates of Local Hamiltonians
In a spin chain governed by a local Hamiltonian, we consider a microcanonical ensemble in the middle of the energy spectrum and a contiguous subsystem whose length is a constant fraction of the system size. We prove that if the bandwidth of the ensemble is greater than a certain constant, then the average entanglement entropy (between the subsystem and the rest of the system) of eigenstates in the ensemble deviates from the maximum entropy by at least a positive constant.
Fisher Information Under Local Differential Privacy
We develop data processing inequalities that describe how Fisher information from statistical samples can scale with the privacy parameter $\varepsilon $ under local differential privacy constraints. These bounds are valid under general conditions on the distribution of the score of the statistical model, and they elucidate under which conditions the dependence on $\varepsilon $ is linear, quadratic, or exponential.
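The scaling regimes mentioned above can be illustrated with a toy model (this is an illustrative sketch, not the paper's general bounds): binary randomized response, where a Bernoulli$(\theta)$ bit is reported truthfully with probability $e^{\varepsilon}/(1+e^{\varepsilon})$, satisfies $\varepsilon$-local differential privacy, and the Fisher information about $\theta$ carried by one privatized report can be computed in closed form. For small $\varepsilon$ it scales quadratically in $\varepsilon$, matching one of the regimes the abstract describes.

```python
import math

def privatized_fisher_info(theta, eps):
    """Fisher information about theta carried by one randomized-response
    report of a Bernoulli(theta) bit under eps-local differential privacy.
    (Toy model for illustration only.)"""
    p_keep = math.exp(eps) / (1.0 + math.exp(eps))       # P(report truthfully)
    q = theta * p_keep + (1.0 - theta) * (1.0 - p_keep)  # P(report = 1)
    dq = 2.0 * p_keep - 1.0                              # dq/dtheta = tanh(eps/2)
    return dq * dq / (q * (1.0 - q))

# For small eps, dq ~ eps/2 and q(1-q) ~ 1/4, so the information ~ eps^2.
print(privatized_fisher_info(0.5, 0.01))  # close to 0.01**2 = 1e-4
```

More information is released as $\varepsilon$ grows, since the truthful-report probability, and hence the sensitivity $dq/d\theta = \tanh(\varepsilon/2)$, increases monotonically.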
On the All-or-Nothing Behavior of Bernoulli Group Testing
In this article, we study the problem of non-adaptive group testing, in which one seeks to identify which items are defective given a set of suitably designed tests whose outcomes indicate whether or not at least one defective item was included in the test. The most widespread recovery criterion seeks to exactly recover the entire defective set, and relaxed criteria such as approximate recovery and list decoding have also been considered.
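The Bernoulli test design and the OR-type outcome model can be sketched as follows (a minimal illustration; the pool probability, test count, and the simple COMP decoder are choices for this example, not the paper's analysis). Each item joins each test independently with some probability, a test is positive iff its pool contains a defective, and COMP clears any item that appears in a negative test.

```python
import random

def bernoulli_group_testing(n, defective, num_tests, p, rng):
    """Non-adaptive Bernoulli design: each of n items joins each test
    independently with probability p."""
    tests = [[i for i in range(n) if rng.random() < p] for _ in range(num_tests)]
    # A test is positive iff its pool contains at least one defective item.
    outcomes = [any(i in defective for i in pool) for pool in tests]
    return tests, outcomes

def comp_decode(n, tests, outcomes):
    """COMP decoder: any item appearing in a negative test is non-defective;
    everything else is declared defective."""
    cleared = set()
    for pool, positive in zip(tests, outcomes):
        if not positive:
            cleared.update(pool)
    return set(range(n)) - cleared

rng = random.Random(0)
n, k = 60, 3
defective = set(rng.sample(range(n), k))
tests, outcomes = bernoulli_group_testing(n, defective, num_tests=120, p=1.0 / k, rng=rng)
estimate = comp_decode(n, tests, outcomes)
print(defective <= estimate)  # True: COMP never misses a true defective
```

Note that COMP can only err with false positives: a defective item never appears in a negative test, so it is never cleared, which is why the containment check above always holds.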
Distributed Hypothesis Testing With Variable-Length Coding
The problem of distributed testing against independence with variable-length coding is considered, where the average communication load is constrained rather than the maximum load as in previous works. The paper characterizes the optimum type-II error exponent of a system with a single sensor and a single decision center, given a maximum type-I error probability, when communication is either over a noise-free rate-R link or over a noisy discrete memoryless channel (DMC) with stop-feedback. Specifically, let E denote the maximum allowed type-I error probability.
Lower Bounds and a Near-Optimal Shrinkage Estimator for Least Squares Using Random Projections
We consider optimization using random projections as a statistical estimation problem, where the squared distance between the predictions from the estimator and the true solution is the error metric. In approximately solving a large-scale least squares problem using Gaussian sketches, we show that the sketched solution has a conditional Gaussian distribution with the true solution as its mean.
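The sketch-and-solve scheme described above can be demonstrated in a few lines (a minimal sketch with toy dimensions; the sizes and noise level are illustrative assumptions). A Gaussian sketch $S$ compresses the tall problem $(A, b)$ to an $m \times d$ one, and the sketched solution scatters around the unsketched least-squares solution, cf. the conditional-Gaussian characterization in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 500, 3, 50          # rows, columns, sketch size (toy sizes)
A = rng.standard_normal((n, d))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.1 * rng.standard_normal(n)

# Exact least-squares solution of the full problem.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# Sketch-and-solve: compress (A, b) with a Gaussian sketch S, then solve
# the small m x d least-squares problem instead.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

print(np.linalg.norm(x_sketch - x_ls))  # small when m >> d
```

The sketched estimate is random even conditionally on the data; averaging over independent sketches, or shrinking toward zero as the paper's estimator does, reduces this excess variance.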
The Limiting Poisson Law of Massive MIMO Detection With Box Relaxation
Estimating a binary vector from noisy linear measurements is a prototypical problem for MIMO systems. A popular algorithm, called the box-relaxation decoder, estimates the target signal by solving a least squares problem with convex constraints. This article shows that the performance of the algorithm, measured by the number of incorrectly decoded bits, has a limiting Poisson law. This occurs when the sampling ratio and noise variance, two key parameters of the problem, follow certain scalings as the system dimension grows.
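The box-relaxation decoder itself is easy to sketch (an illustrative toy instance with fixed sizes and a benign noise level, not the scaling regime the article analyzes): solve least squares over the box $[-1,1]^n$, here via projected gradient descent (a simple solver choice; any convex solver would do), then round each coordinate to its sign.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 40, 120                 # signal length, number of measurements (toy sizes)
x_true = rng.choice([-1.0, 1.0], size=n)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.standard_normal(m)

# Box relaxation: minimize ||y - A x||^2 subject to -1 <= x_i <= 1,
# solved with projected gradient descent, then coordinate-wise sign rounding.
x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(500):
    x -= step * (A.T @ (A @ x - y))     # gradient step on the least-squares loss
    np.clip(x, -1.0, 1.0, out=x)        # project back onto the box
bits = np.sign(x)

errors = int(np.sum(bits != x_true))
print(errors)  # number of incorrectly decoded bits
```

At this high signal-to-noise ratio the decoder recovers every bit; the article's Poisson law emerges in the regime where the sampling ratio and noise variance scale so that a vanishing but non-negligible fraction of bits flip.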
Exact Asymptotics for Learning Tree-Structured Graphical Models With Side Information: Noiseless and Noisy Samples
Given side information that an Ising tree-structured graphical model is homogeneous and has no external field, we derive the exact asymptotics of learning its structure from independently drawn samples. Our results, which leverage the use of probabilistic tools from the theory of strong large deviations, refine the large deviation (error exponents) results of Tan et al. (2011) and strictly improve those of Bresler and Karzand (2020). In addition, we extend our results to the scenario in which the samples are observed in random noise.
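The classical structure learner analyzed in this line of work is the Chow-Liu algorithm, which for a zero-field Ising model reduces to a maximum-weight spanning tree under absolute empirical correlations. Below is a minimal sketch on a homogeneous Ising chain (the chain topology, edge strength, and sample size are illustrative assumptions): samples are drawn by propagating agreement along the path, and the tree is rebuilt from pairwise correlations.

```python
import math
import random

def sample_ising_chain(n, theta, num_samples, rng):
    """Draw samples from a homogeneous zero-field Ising model on the path
    0-1-...-(n-1): adjacent spins agree with probability (1+tanh(theta))/2."""
    p_agree = 0.5 * (1.0 + math.tanh(theta))
    samples = []
    for _ in range(num_samples):
        x = [1 if rng.random() < 0.5 else -1]
        for _ in range(n - 1):
            x.append(x[-1] if rng.random() < p_agree else -x[-1])
        samples.append(x)
    return samples

def chow_liu_tree(samples, n):
    """Chow-Liu for zero-field Ising models: mutual information is monotone
    in |correlation|, so a maximum-weight spanning tree on absolute
    empirical correlations (Kruskal + union-find) suffices."""
    m = len(samples)
    corr = lambda i, j: abs(sum(s[i] * s[j] for s in samples)) / m
    edges = sorted(((corr(i, j), i, j)
                    for i in range(n) for j in range(i + 1, n)), reverse=True)
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = set()
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:              # add the heaviest edge that joins two components
            parent[ri] = rj
            tree.add((i, j))
    return tree

rng = random.Random(0)
samples = sample_ising_chain(5, theta=1.1, num_samples=20000, rng=rng)
print(sorted(chow_liu_tree(samples, 5)))  # recovers the path edges
```

With this many samples the correlation gap between true edges and non-edges dwarfs the empirical fluctuations, so the path is recovered; the paper's contribution is the exact asymptotics of the error probability in the sample size, including under noisy observations.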