  • Concepts of Information:
    • Information is defined in terms of probability and surprise: the less likely a message is, the more information it carries.
    • Shannon defines the information content of a message as the negative base-two logarithm of its probability, I(x) = −log₂ p(x), which formalizes the idea of information as “surprise” (see the sketch after this list).
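
A minimal sketch of this definition in Python (our own illustration; the function name self_information is assumed, not from the source):

```python
from math import log2

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p (0 < p <= 1)."""
    return -log2(p)

# A fair coin flip carries 1 bit of information; a 1-in-1024 event carries 10 bits.
print(self_information(0.5))       # 1.0
print(self_information(1 / 1024))  # 10.0
```
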
  • Entropy:
    • Shannon’s “entropy” is the average information per symbol in a message: H = −Σᵢ pᵢ log₂ pᵢ, where pᵢ is the probability of symbol i. This concept is essential for understanding the efficiency and capacity of communication systems.
    • Entropy is thus the expected information content of a message given the symbol probabilities; it sets a lower bound on the average number of bits per symbol that any lossless code can achieve (see the sketch below).
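
A short sketch of the entropy calculation (again our illustration; the helper name entropy is assumed):

```python
from math import log2

def entropy(probs: list[float]) -> float:
    """Shannon entropy, in bits per symbol, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin has 1 bit of entropy per flip; a biased 90/10 coin has less,
# because its outcomes are less surprising on average.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.47
```
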
  • Coding Techniques:
    • Huffman coding is a widely used method for encoding messages whose symbols occur with varying probabilities: it assigns shorter codewords to more frequent symbols, minimizing the average code length. The algorithm builds an optimal prefix code by repeatedly merging the two least probable subtrees (see the sketch below).
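
A compact sketch of the merging idea, using Python’s heapq (our own illustration, not the source’s code; the tiebreaker field only keeps tuple comparisons well defined):

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build a Huffman code by repeatedly merging the two least-frequent nodes."""
    counts = Counter(text)
    # Heap entries: (frequency, tiebreaker, {symbol: codeword-so-far}).
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(counts.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prefix the two merged subtrees' codewords with 0 and 1 respectively.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

code = huffman_code("abracadabra")
print(code)  # one optimal code: {'a': '0', 'c': '100', 'd': '101', 'b': '110', 'r': '111'}
```

With these codeword lengths the 11-symbol message takes 23 bits, about 2.09 bits per symbol, just above the ≈2.04-bit entropy of its symbol distribution.
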
  • Error Handling:
    • Systems must be designed to keep working in the presence of errors and unreliable components.
    • Techniques that detect and correct errors in transmitted data are crucial for preserving the integrity of information in communication systems; they work by adding controlled redundancy to the message (see the sketch after this list).
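
As one minimal illustration (our example, not a scheme from the source), a three-fold repetition code corrects any single flipped bit per triple by majority vote:

```python
def encode(bits: str) -> str:
    """Repeat each bit three times: '10' -> '111000'."""
    return "".join(b * 3 for b in bits)

def decode(received: str) -> str:
    """Majority-vote each triple; corrects any single flipped bit per triple."""
    triples = (received[i:i + 3] for i in range(0, len(received), 3))
    return "".join("1" if t.count("1") >= 2 else "0" for t in triples)

sent = encode("1011")          # '111000111111'
corrupted = "110000111011"     # two separate single-bit errors
print(decode(corrupted))       # '1011' -- original message recovered
```

Repetition triples the message length; practical systems use codes such as Hamming or Reed–Solomon codes, which provide comparable protection with far less redundancy.
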