## 2018

Koch, Tobias; Vazquez-Vilar, Gonzalo: "A Rigorous Approach to High-Resolution Entropy-Constrained Vector Quantization," *IEEE Transactions on Information Theory*, 64 (4), pp. 2609–2625, 2018, ISSN: 0018-9448.

```bibtex
@article{koch-TIT2018a,
  title    = {A Rigorous Approach to High-Resolution Entropy-Constrained Vector Quantization},
  author   = {Tobias Koch and Gonzalo Vazquez-Vilar},
  doi      = {10.1109/TIT.2018.2803064},
  issn     = {0018-9448},
  year     = {2018},
  date     = {2018-04-01},
  journal  = {IEEE Transactions on Information Theory},
  volume   = {64},
  number   = {4},
  pages    = {2609--2625},
  keywords = {Distortion, Distortion measurement, Entropy, Entropy constrained, high resolution, Probability density function, quantization, Rate-distortion, Rate-distortion theory, Vector quantization},
  pubstate = {published},
  tppubtype = {article}
}
```

## 2016

Koch, Tobias: "The Shannon Lower Bound Is Asymptotically Tight," *IEEE Transactions on Information Theory*, 62 (11), pp. 6155–6161, 2016, ISSN: 0018-9448.

**Abstract:** The Shannon lower bound is one of the few lower bounds on the rate-distortion function that holds for a large class of sources. In this paper, which considers exclusively norm-based difference distortion measures, it is demonstrated that its gap to the rate-distortion function vanishes as the allowed distortion tends to zero for all sources having finite differential entropy and whose integer part has finite entropy. Conversely, it is demonstrated that if the integer part of the source has infinite entropy, then its rate-distortion function is infinite for every finite distortion level. Thus, the Shannon lower bound provides an asymptotically tight bound on the rate-distortion function if, and only if, the integer part of the source has finite entropy.

```bibtex
@article{Koch2016b,
  title    = {The Shannon Lower Bound Is Asymptotically Tight},
  author   = {Tobias Koch},
  url      = {http://ieeexplore.ieee.org/document/7556344/},
  doi      = {10.1109/TIT.2016.2604254},
  issn     = {0018-9448},
  year     = {2016},
  date     = {2016-11-01},
  journal  = {IEEE Transactions on Information Theory},
  volume   = {62},
  number   = {11},
  pages    = {6155--6161},
  keywords = {Journal, R{\'e}nyi information dimension, Rate-distortion theory, Shannon lower bound},
  pubstate = {published},
  tppubtype = {article}
}
```
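For context, the Shannon lower bound discussed in the abstract can be stated in its standard form for difference distortion measures $d(x,\hat{x}) = \rho(x - \hat{x})$; the notation below is generic textbook background, not taken from the paper itself:

```latex
% Shannon lower bound for a source X with differential entropy h(X)
% and difference distortion measure d(x, \hat{x}) = \rho(x - \hat{x}):
R(D) \;\ge\; h(X) \;-\; \max_{Z :\, \mathbb{E}[\rho(Z)] \le D} h(Z).

% For squared-error distortion, the maximizing Z is Gaussian, giving
R(D) \;\ge\; h(X) - \tfrac{1}{2}\log(2\pi e D).
```

The paper's result says that, for norm-based difference distortion measures, the gap between $R(D)$ and this bound vanishes as $D \to 0$ exactly when the integer part of the source has finite entropy.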