Abstract
Upper bounds are obtained for the amount of information in functional classes with initial uncertainty. These classes are similar to the classes of images with bounded variance. We use the probability model of the theory of information complexity. Unlike the case of the classes KH^α_0, it turns out that, in this case, the method of differential pulse-code modulation gives no advantage over the deterministic case.
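The abstract contrasts differential pulse-code modulation (DPCM) with deterministic coding. For orientation only, the following is a minimal sketch of generic DPCM with a previous-sample predictor and a uniform quantizer; it is not the construction analyzed in the paper, and the names dpcm_encode, dpcm_decode, and the parameter step are illustrative assumptions.

```python
import numpy as np

def dpcm_encode(signal, step):
    """Quantize the difference between each sample and the running
    decoder-side reconstruction (previous-sample prediction)."""
    codes = np.empty(len(signal), dtype=int)
    prediction = 0.0
    for i, x in enumerate(signal):
        error = x - prediction               # prediction error
        codes[i] = int(round(error / step))  # uniform quantization of the error
        prediction += codes[i] * step        # track what the decoder will reconstruct
    return codes

def dpcm_decode(codes, step):
    """Rebuild the signal by accumulating the dequantized differences."""
    recon = np.empty(len(codes), dtype=float)
    prediction = 0.0
    for i, c in enumerate(codes):
        prediction += c * step
        recon[i] = prediction
    return recon

# Usage: per-sample reconstruction error is bounded by step / 2.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=100))          # a slowly varying test signal
codes = dpcm_encode(x, step=0.1)
assert np.max(np.abs(dpcm_decode(codes, step=0.1) - x)) <= 0.05 + 1e-12
```

For the functional classes considered here, the paper's conclusion is that this kind of difference coding yields no gain over deterministic coding, in contrast to the situation for the classes KH^α_0.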
References
J. F. Traub, G. W. Wasilkowski, and H. Wozniakowski, Information-Based Complexity, Academic Press, Boston (1988).
I. Ya. Tyrygin, “Refined estimates for the ε-entropy of the classes KH^α_0,” Ukr. Mat. Zh., 46, No. 6, 760–764 (1994).
I. Ya. Tyrygin, Optimal Digital Coding of a Sequence of Complicated TV Images [in Russian], Preprint, Institute of Mathematics, Ukrainian Academy of Sciences, Kiev (1991).
A. G. Vitushkin, Estimation of Complexity for the Problem of Tabulation [in Russian], Fizmatgiz, Moscow (1959).
A. N. Kolmogorov and V. M. Tikhomirov, “ε-entropy and ε-capacity of sets in function spaces,” Usp. Mat. Nauk, 14, No. 2, 3–86 (1959).
I. Ya. Tyrygin, “The ε-entropy approach to the problem of compression of information,” Ukr. Mat. Zh., 44, No. 11, 1598–1604 (1992).
R. E. Krichevskii, Compression and Retrieval of Information [in Russian], Radio i Svyaz', Moscow (1989).
Additional information
Translated from Ukrainskii Matematicheskii Zhurnal, Vol. 47, No. 4, pp. 573–576, April, 1995.
Cite this article
Tyrygin, I.Y. Estimates of amount of information in the probability model of image coding. Ukr Math J 47, 665–669 (1995). https://doi.org/10.1007/BF01056057