Fano’s Inequality for Probability Based on Rényi’s Information


Abstract

An upper bound on the probability of error has previously been presented in terms of Shannon entropy [6]. In this paper, we obtain Fano's bound on the probability of error based on Rényi's entropy [5]. Further, a lower bound on the average probability of error is derived in terms of the channel capacity.
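The paper's Rényi-entropy version of the bound is not reproduced in this abstract, but the standard quantities it builds on are well known. The sketch below (an illustrative assumption, not the paper's derivation) computes the Shannon entropy, the Rényi entropy of order α, and the classical Fano upper bound H(X|Y) ≤ h(Pₑ) + Pₑ log₂(M − 1):

```python
import math

def shannon_entropy(p):
    """Shannon entropy of distribution p, in bits [6]."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), in bits [5]."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def fano_upper_bound(pe, m):
    """Classical Fano bound on the conditional entropy H(X|Y):
    H(X|Y) <= h(pe) + pe * log2(m - 1), h being the binary entropy."""
    h_binary = shannon_entropy([pe, 1 - pe]) if 0 < pe < 1 else 0.0
    return h_binary + pe * math.log2(m - 1)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))       # 1.75 bits
print(renyi_entropy(p, 1.001))  # approaches Shannon entropy as alpha -> 1
print(renyi_entropy(p, 2.0))    # collision entropy, never exceeds Shannon
print(fano_upper_bound(0.1, 4)) # cap on H(X|Y) when P_e = 0.1, M = 4
```

As α → 1 the Rényi entropy recovers the Shannon entropy, which is why Rényi-based bounds such as the one in this paper can be seen as one-parameter generalizations of the Shannon-entropy result.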

References

[1] Ash R., Information Theory, Interscience Publishers, New York, 1968.

[2] Erdogmus D. and Principe J.C., “Information Transfer Through Classifiers and its Relation to Probability of Error”, Intl. Joint Conf. on Neural Networks, pp. 50-54, July 2001.

[3] Hellman M.E. and Raviv J., “Probability of Error, Equivocation and the Chernoff Bound”, IEEE Trans. Inform. Theory, vol. IT-16, pp. 368-372, 1970.

[4] Lainiotis D.G., “A Class of Upper Bounds on Probability of Error for Multihypothesis Pattern Recognition”, IEEE Trans. Inform. Theory, vol. IT-15, pp. 730-731, 1969.

[5] Rényi A., “On Measures of Entropy and Information”, in Proc. 4th Berkeley Symp. Math. Statist. and Probability, vol. 1, pp. 547-561, 1960.

[6] Shannon C.E., “A Mathematical Theory of Communication”, Bell System Tech. Journal, vol. 27, pp. 379-423, 1948.
