Claude Shannon is the father of information theory.
### Main Work
1. Turned circuit design from an art into a science - the starting point of **Digital Circuit Design**. Applied the mathematical discipline of Boolean Algebra to the analysis and synthesis of switching circuits
2. **Communication** -- Crafted a mathematical theory of communication, describing how information is produced and transferred
3. Introduced the **entropy rate of a probabilistic model** - Used in ergodic theory, the study of the long-term behavior of dynamical systems
> Shannon’s theory has now become the standard framework underlying all modern-day communication systems: optical, underwater, even interplanetary.
### Model of Communication
- Transmitter encodes information into a signal
- Signal is corrupted by noise and then decoded by the receiver
Two key insights (a minimal simulation sketch follows below):
- He isolated the information and noise sources from the communication system to be designed
- He modeled both of these sources probabilistically
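A minimal sketch of this model in Python. Everything here is illustrative, not Shannon's own construction: the source emits random bits, a 3× repetition code stands in for the encoder/decoder pair, and the noise is a binary symmetric channel that flips each bit with probability `p`.

```python
import random

def bsc(bits, p, rng):
    # Binary symmetric channel: noise flips each bit independently
    # with probability p -- the noise source, modeled probabilistically.
    return [b ^ (rng.random() < p) for b in bits]

def encode(bits, n=3):
    # Toy transmitter: an n-fold repetition code adds redundancy.
    return [b for b in bits for _ in range(n)]

def decode(bits, n=3):
    # Toy receiver: majority vote over each group of n received bits.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(1000)]  # the information source, also probabilistic
received = decode(bsc(encode(message), p=0.1, rng=rng))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual bit errors: {errors} / {len(message)}")  # well below the ~100 raw channel flips
```

The point of the model is visible in the structure of the code: both the source and the channel are random objects, and the encoder/decoder's job is to beat that randomness, not to undo a deterministic distortion.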
Before Shannon, the problem of communication was viewed primarily as a deterministic signal-reconstruction problem: how to transform a received signal, distorted by the physical medium, back into the original as accurately as possible.
> Shannon’s genius lay in his observation that **the key to communication is uncertainty**, and that uncertainty can be modeled with probability.
![[Pasted image 20210727191221.png]]
### Fundamental limits of communication
The bit is the basic unit of uncertainty.
1. **[[Entropy]] Rate (H)** - Minimum number of bits per second needed to represent the source - Quantifies the uncertainty involved in determining which message the source will generate; for a memoryless source, this is the per-symbol entropy −Σ pᵢ log₂ pᵢ times the symbol rate
- The lower the entropy rate, the less the uncertainty and the easier the source is to compress
- For example, texting at the rate of 100 English letters per minute means sending one out of 26^100 possible messages every minute, each represented by a sequence of 100 letters. One could encode all these possibilities into 470 bits, since 2^470 ≈ 26^100. If the sequences were equally likely, then Shannon’s formula would say that the entropy rate is indeed 470 bits per minute.
2. **System Capacity (C)** - Maximum number of bits per second that can be reliably communicated in the face of noise
- Speed limit for communication
- Max rate at which the receiver can resolve the message's uncertainty
3. Reliable communication is possible if and only if _H_ < _C_ (a sketch below checks this).
> Thus, information is like water: If the flow rate is less than the capacity of the pipe, then the stream gets through reliably.
> Information is the resolution of uncertainty
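A minimal sketch of both limits, assuming the uniform 26-letter source from the example above and a binary symmetric channel; the channel speed `uses_per_min` is a made-up number purely to illustrate the _H_ < _C_ test.

```python
import math

def h2(p):
    # Binary entropy function H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy rate of the source: 100 equally likely letters per minute from a
# 26-letter alphabet -> 100 * log2(26), the "470 bits per minute" above.
H = 100 * math.log2(26)
print(f"H ≈ {H:.1f} bits/min")

# Capacity of a binary symmetric channel with crossover probability p is
# 1 - H2(p) bits per channel use (a standard result of Shannon's theory).
p = 0.1
uses_per_min = 1200  # hypothetical channel speed, chosen only for illustration
C = (1 - h2(p)) * uses_per_min
print(f"C ≈ {C:.0f} bits/min")

# Shannon's noisy-channel theorem: reliable communication iff H < C.
print("reliable communication is", "possible" if H < C else "impossible")
```

With these numbers it prints H ≈ 470.0 and C ≈ 637 bits/min, so the water fits through the pipe; shrink `uses_per_min` below ~885 and the condition fails.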
---
#### Notes
- **Science** seeks the basic laws of nature.
- **Mathematics** searches for new theorems to build upon the old.
- **Engineering** builds systems to solve human needs.
**On Communication**
- From smoke signals to carrier pigeons to the telephone to television, humans have always sought methods that would allow them to communicate farther, faster and more reliably.
- Before Shannon, the engineering of communication systems was always tied to a specific source and medium.
---
##### Main Links
1. [Quanta](https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/)
2. [Bitplayer](https://thebitplayer.com/)
3. [Mathematical Theory of Communication](http://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf)
#### Reference
[[Communication Standards]] | [[Future of Optical Networks]] | [[Unveiling the Subsea Secrets - The Intricate Dance of Cables, Control, and Connectivity in the Digital Age]]