An anecdote on entropy

My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.”

When I first learned the laws of thermodynamics[1] and met entropy, I felt exactly what von Neumann said about it: “nobody knows what entropy really is.” Having taken only basic physics courses many years ago, I don’t know how the epistemology of this subject has evolved, but it was a very surprising moment when I came across the book by Cover and Thomas and Shannon’s paper (see my second comment on the blog post Model vs. Model for links to these references). I have been wondering what philosophy lies behind the story of sharing the same lexicon across disciplines (physics, information theory, and statistics, in that chronological order), and this quote somewhat alleviates my burden of curiosity.
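For what it’s worth, the formal parallel behind the shared name is easy to state (this is just the textbook correspondence, not anything specific to the references above): Shannon’s entropy of a discrete distribution and the Gibbs entropy of statistical mechanics are the same expression, up to Boltzmann’s constant and the choice of logarithm base:

$$
H(X) = -\sum_i p_i \log_2 p_i, \qquad S = -k_B \sum_i p_i \ln p_i .
$$

So the “uncertainty function” von Neumann recognized is, up to units, the quantity that had already carried the name entropy in physics.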

Wikiquote attributes Shannon’s words to “Energy and Information” by Tribus and McIrvine in Scientific American (Vol. 224, pp. 178–184, 1971). I didn’t know until now that Wikiquote existed.

  1. Wikipedia link