
"A Mathematical Theory of Communication" is a beautiful read. You easily find the PDF but once the latex source was available as well - in case you want to reformat for e-book reader: https://web.archive.org/web/20130129025547/http://cm.bell-la...


I cannot overstate this point or upvote it enough.

The reason Claude Shannon is a legend has a lot to do with the fact that his ideas are not just correct and drawn from his multidisciplinary knowledge, but also expertly communicated. It is an incredibly readable paper. The only part that strictly requires a bit of mathematics is where he describes how to convert a state machine for Morse code into a matrix, which lets you calculate how many bits per time unit can actually be transmitted in Morse code -- and even that is not essential: he also arrives at the value another way, and makes clear that you can simply take his word for whatever the number was. The arguments about fitting a stream of symbols into an encoding for noisy channels come with lovely diagrams that help elucidate exactly what he's talking about.
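To give a flavor of that calculation, here is a minimal sketch of the simplest case from the paper, leaving out the state-machine constraints (the durations below are made up, not Shannon's telegraph timings): for a noiseless channel whose symbols have durations t_1, ..., t_n and can be freely concatenated, the capacity works out to C = log2(X0), where X0 is the largest real root of X^(-t_1) + ... + X^(-t_n) = 1.

    import math

    def capacity(durations, tol=1e-12):
        # Capacity in bits per time unit: C = log2(X0), where X0 is the largest
        # real root of sum(x**-t for t in durations) == 1.  f(x) is strictly
        # decreasing for x > 1, so a simple bisection finds the root.
        f = lambda x: sum(x ** -t for t in durations) - 1.0
        lo, hi = 1.0 + 1e-9, 2.0
        while f(hi) > 0:            # widen the bracket until f changes sign
            hi *= 2.0
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if f(mid) > 0:
                lo = mid
            else:
                hi = mid
        return math.log2((lo + hi) / 2.0)

    # e.g. a "dot" lasting 2 time units and a "dash" lasting 4
    print(capacity([2, 4]))         # ~0.35 bits per time unit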

If you want to make that same sort of impact, it is not just important to be a great mind, but to spend a bunch of time practicing how you communicate that information.


It has been said that had the transistor not been invented at Bell Labs, it would have been developed somewhere else in the world within a few years. In fact, very similar work was taking place at Westinghouse in France under Herbert Mataré [0] and Heinrich Welker.

On the other hand, with Information Theory, Shannon was considered to have been ahead by decades.

The closest related work considered to come from an independent line of inquiry (in the USSR), and similarly fundamental, is Kolmogorov complexity (algorithmic entropy), introduced by Andrey Kolmogorov [1] in 1963.

While Shannon made the huge leap on his own, it should be noted that Harry Nyquist [2] (his colleague at Bell Labs) laid essential foundations through his studies of bandwidth and telegraph signaling rates. This came after harmonic analysis (the Fourier transform, etc.) emerged in the 1800s.

[0] https://en.wikipedia.org/wiki/Herbert_Matar%C3%A9

[1] https://en.wikipedia.org/wiki/Andrey_Kolmogorov

[2] https://en.wikipedia.org/wiki/Harry_Nyquist


As I mentioned elsewhere, I recall reading that Turing was starting to think about what we now call information theory, but shortly thereafter he met Shannon. After hearing Shannon describe his work, Turing was so impressed that he decided to leave information theory to Shannon and focus his efforts elsewhere.


Exactly. Effective communication is what made Feynman a legend.

Sadly, many extremely smart and profound scientists are quite incapable of (or not interested in) conveying their thoughts clearly to general audiences.


I guess it's fitting that a man who created a theory of communicating information was effective at communicating information.


It's actually not that common, and assuming it is is a common fallacy. We even have a saying, "the tailor's kids have the worst shoes," indicating that experts and geniuses don't necessarily apply their own insights, because application is different from theory.


If this is you, you should find a partner who is a level or a few below you in mathematical/scientific prowess and above you in communication skills, who can learn your theories and popularize them.


Ironically, Feynman's biggest contribution -- the path integral formalism -- was mainly regarded as incomprehensible until Dyson explained its relationship to the canonical formalism.


I don't think it's his biggest contribution to physics; how about QED?

The path integral formalism and the diagrams, albeit mathematically obscure at first, were quite clear from an intuitive viewpoint once he explained them.


The ones I know who are both capable of and interested in it do not usually have the time to do so, and if they do, it is because they became teachers who sell their output to students.


>If you want to make that same sort of impact, it is not just important to be a great mind, but to spend a bunch of time practicing how you communicate that information.

The most insidious cause of this is that getting a permanent position in a department still often depends on publishing in the top journals of that department's subject, which means explaining your work only to the point that it is coherent to that subject's experts. Any further elucidation is sometimes seen as simplification or over-analysis, to the detriment of getting an article accepted in subject-leading venues.

Or, to put it another way, putting more bits down the channel than are needed for comprehension by the editor is seen as unnecessary redundancy.


"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning..."

(emphasis in the original)

I find that second sentence humorous every time I read it... I wonder, had he seen the current state of internet discussion, whether he would have changed "frequently" to "occasionally".


> I wonder, had he seen the current state of internet discussion, whether he would have changed "frequently" to "occasionally"

Which would be accurate. A lot of Internet communication is a tracker phoning home, TLS handshakes, and other communication with content but no "meaning" per se. In Shannon's day, such industrial communications--over telegraph--were common enough to warrant mentioning but not prevalent enough to make up the bulk of communication traffic.


Metadata is data. When Shannon says "meaning", he is including control channel information as well. He is distinguishing variable messages from copying constants, which is a different and easier challenge.


Meaning in this case essentially means context. You have a communications channel, it delivers to you four 32-bit integers, what do they represent? Positional coordinates? One pixel in RGBA? In CMYK? IPv4 port/address 4-tuple with zero-padded 16-bit ports? A single IPv6 address? Big endian or little endian?

The bits are the same, the meaning is different.
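To make that concrete, here is a small, made-up illustration of the same 16 bytes decoded under a few of those conventions (the byte values themselves are arbitrary):

    import struct, ipaddress

    # The same 16 bytes "off the wire", read three different ways.
    data = bytes(range(16))

    print(struct.unpack(">4I", data))      # four big-endian 32-bit integers
    print(struct.unpack("<4I", data))      # the very same bytes, little-endian
    print(ipaddress.IPv6Address(data))     # or a single IPv6 address

Same bits, three entirely different "meanings", and nothing in the channel itself tells you which one is right.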


I was introduced to Shannon's theory through Pierce's text, which is also surprisingly good and cheap [1].

[1] https://www.amazon.com/Introduction-Information-Theory-Symbo...


I was introduced to Shannon through the same text, and also found it good and cheap. I second the recommendation!



