- Understanding the inequality *H*(*Y*) < *H*(*C*)
- Most functions are information lossy. This is because the entropy of a function's outcome is not necessarily the same as the entropy of the compound symbol.

- For instance, *x*_{i} + *η*_{j} = *y*_{k} = *f*(*x*_{i}, *η*_{j}).
- Confounding causes for information loss can be identified. For example, *y*_{2} receives inputs from both *x*_{0} and *x*_{1}, and confounds the input.
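A small sketch makes the confounding concrete (the alphabets here are hypothetical, assuming *x* and *η* each range over {0, 1, 2}): the output *y*_{2} = 2 has several distinct input pairs as preimages, so observing *y* alone cannot recover which pair produced it.

```python
from collections import defaultdict

# Hypothetical small alphabets for the input x and the noise term eta.
xs = [0, 1, 2]
etas = [0, 1, 2]

# Group the compound input symbols (x, eta) by the output y = x + eta.
preimages = defaultdict(list)
for x in xs:
    for eta in etas:
        preimages[x + eta].append((x, eta))

# y = 2 is confounded: it can be produced by x = 0, x = 1, or x = 2.
print(preimages[2])  # → [(0, 2), (1, 1), (2, 0)]
```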

- The confounding of inputs can be minimized by normalization, improving the signal-to-noise ratio.

- Quality of information is the consideration of information in terms of its usefulness or uselessness. For example:
- Presence of noise is highly undesirable in communication systems.
- Addition is information lossy (2 + 3 = 3 + 2 = 5). Unlike in communication systems, this loss is useful for an adder because it is a controlled loss.
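The controlled loss of addition can be quantified directly against the inequality *H*(*Y*) < *H*(*C*). A sketch, assuming two independent uniform inputs over {0, 1, 2, 3}: the compound symbol (*x*, *η*) carries more entropy than the sum *y*, because addition merges pairs such as (2, 3) and (3, 2).

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given as counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts)

# Two independent uniform inputs; the compound symbol C is the pair (x, eta).
symbols = [(x, eta) for x in range(4) for eta in range(4)]
sums = Counter(x + eta for x, eta in symbols)

h_c = entropy(Counter(symbols).values())  # H(C): entropy of the compound symbol
h_y = entropy(sums.values())              # H(Y): entropy of the sum

print(h_c)  # → 4.0 bits: 16 equally likely pairs
print(h_y)  # strictly less than 4.0: the loss introduced by addition
```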

- Some information channels may be broken down into component information channels.
- The mutual information of the erasure channel is greater than that of the binary symmetric channel, i.e.,

*I*(*X*; *Y*^{3}) > *I*(*X*; *Y*^{2}).

This is because we can transmit through the erasure channel without any information loss, at the cost of sending more data symbols.
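A rough numerical check of the comparison, assuming the same error parameter *p* for both channels and using the standard capacity formulas (erasure channel: 1 − *p*; binary symmetric channel: 1 − *H*₂(*p*)): for any *p* in (0, 0.5), *H*₂(*p*) > *p*, so the erasure channel comes out ahead.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bec_capacity(p):
    """Capacity of a binary erasure channel with erasure probability p."""
    return 1 - p

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

# For p in (0, 0.5), H2(p) > p, so the erasure channel has higher capacity.
for p in (0.05, 0.1, 0.25, 0.4):
    print(p, bec_capacity(p), bsc_capacity(p))
    assert bec_capacity(p) > bsc_capacity(p)
```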