Refinement of Two Fundamental Tools in Information Theory
Event details
| Date | 24.06.2011 |
| Time | 10:15 |
| Speaker | Prof. Raymond Yeung, The Chinese University of Hong Kong |
| Category | Conferences - Seminars |
In Shannon's original paper and in standard information theory textbooks, the entropy of a discrete random variable is assumed or shown to be a continuous function. However, all of Shannon's information measures, including entropy and mutual information, turn out to be discontinuous in the general case where the random variables take values in countably infinite alphabets. This fundamental property explains why strong typicality and Fano's inequality apply only to finite alphabets. Since both tools have wide applications in information theory, it is important to extend them in full generality. In this talk, details about the discontinuity of Shannon's information measures will be given. We will show how these results lead to a new definition of typicality and to an inequality tighter than Fano's inequality. Applications in network coding and information-theoretic security will also be discussed.
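The discontinuity on countably infinite alphabets can be seen in a standard textbook-style example (a sketch added here for illustration, not taken from the talk): a sequence of distributions that converges in total variation to a point mass, whose entropy nevertheless grows without bound. The point mass has zero entropy, so entropy is not continuous at that limit.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero masses ignored)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def q(n):
    """Illustrative distribution: mass 1 - 1/n at symbol 0, and the remaining
    1/n spread uniformly over 2**(n*n) extra symbols. As n grows, q(n)
    converges in total variation to the point mass at 0 (distance 1/n),
    yet its entropy diverges, since the tail contributes (1/n)*log2(n*2**(n*n)) > n bits."""
    m = 2 ** (n * n)                    # size of the uniform tail
    return [1 - 1 / n] + [1 / (n * m)] * m

for n in (2, 3, 4):
    p = q(n)
    tv = sum(p[1:])                     # total-variation distance to the point mass = 1/n
    print(f"n={n}  TV distance={tv:.3f}  H={entropy(p):.3f} bits")
```

Running the loop shows the total-variation distance shrinking toward 0 while the entropy keeps increasing, which is exactly the kind of behavior that blocks a naive extension of strong typicality and Fano's inequality beyond finite alphabets.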
Practical information
- General public
- Free