We study the problem of compressing a Gaussian vector for the purpose of similarity identification, where similarity is defined by the mean square Euclidean distance between vectors. While the asymptotic fundamental limits of the problem (the minimal compression rate and the error exponent) were found in a previous work, in this paper we focus on the nonasymptotic regime. We first present a finite-blocklength achievability bound based on shape-gain quantization: the gain (amplitude) of the vector is compressed via scalar quantization, and the shape (the projection onto the unit sphere) is quantized using a spherical code. The bound is evaluated numerically, and the results converge to the asymptotic values as predicted by the error exponent. For a practical implementation of such a scheme, we use the wrapped spherical codes studied by Hamkins and Zeger, with the Leech lattice as an example of an underlying lattice. As a side result, we obtain a bound on the covering angle of any wrapped spherical code as a function of the covering radius of the underlying lattice.
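To make the shape-gain decomposition concrete, the following Python sketch separately quantizes the gain (norm) and the shape (unit-sphere projection) of a vector. It is only illustrative: the uniform scalar gain quantizer and the random shape codebook are placeholder assumptions, standing in for the gain quantizer and the wrapped spherical code (built on the Leech lattice) used in the actual scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_shape_codebook(n, rate_bits, rng):
    # Illustrative spherical codebook: 2^rate_bits random points on the
    # unit sphere in R^n. A stand-in for a wrapped spherical code.
    C = rng.standard_normal((2 ** rate_bits, n))
    return C / np.linalg.norm(C, axis=1, keepdims=True)

def shape_gain_quantize(x, shape_codebook, gain_step=0.1):
    # Split x into gain ||x|| and shape x/||x||, and quantize each separately.
    gain = np.linalg.norm(x)
    shape = x / gain
    # Scalar (uniform) quantization of the gain -- quantizer choice is an assumption.
    gain_idx = int(round(gain / gain_step))
    # Spherical-code quantization of the shape: pick the codeword with the
    # largest inner product (i.e., smallest angle) to the shape vector.
    shape_idx = int(np.argmax(shape_codebook @ shape))
    return gain_idx, shape_idx

def shape_gain_reconstruct(gain_idx, shape_idx, shape_codebook, gain_step=0.1):
    # Recombine the quantized gain and the shape codeword.
    return (gain_idx * gain_step) * shape_codebook[shape_idx]

# Usage example with hypothetical parameters (n = 8, 10-bit shape codebook).
n, bits = 8, 10
codebook = build_shape_codebook(n, bits, rng)
x = rng.standard_normal(n)
gain_idx, shape_idx = shape_gain_quantize(x, codebook)
x_hat = shape_gain_reconstruct(gain_idx, shape_idx, codebook)
print("relative quantization error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```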