If you understand entropy (which can be thought of as a measure of disorder or randomness), then I think the Wikipedia definition is pretty clear.

  Systems with positive temperature increase in entropy as 
  one adds energy to the system. Systems with negative 
  temperature decrease in entropy as one adds energy to the 
  system.
http://en.wikipedia.org/wiki/Negative_temperature

http://en.wikipedia.org/wiki/Entropy
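
To see how that can happen, here is a toy Python sketch (my own illustration, not from the Wikipedia article) of N two-level spins with k_B = 1: past half filling, adding energy lowers the entropy, so 1/T = dS/dE goes negative.

  # Toy model: N two-level spins, each in the ground state (0) or excited
  # (1, costing one unit of energy). With n spins excited there are
  # W = C(N, n) microstates and S = ln W (units with k_B = 1).
  # Temperature follows from 1/T = dS/dE.
  from math import comb, log

  N = 100  # arbitrary toy size
  for n in (10, 49, 51, 90):
      dS = log(comb(N, n + 1)) - log(comb(N, n))  # entropy change per unit energy
      print(f"n={n:3d}  S={log(comb(N, n)):7.2f}  dS/dE={dS:+.3f}  T={1/dS:+7.2f}")
  # For n > N/2 (more than half excited), dS/dE < 0, i.e. negative temperature.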



You can't add a single bit of information to a system with negative temperature. Is that right or not?


No, and moreover you've mixed apples and oranges.

A system's entropy in statistical mechanics is k log W, where k is Boltzmann's constant and W is the number of microstates. A microstate is a configuration of, say, electrons in energy levels.

Information theory has a quantity that behaves much like entropy in stat mech, but is not actually entropy in stat mech.

BTW: the statement in stat mech would be: adding energy to a system with negative temperature reduces the number of microstates.
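
A brute-force illustration of that statement (a toy sketch with four two-level particles and k_B = 1; nothing specific to electrons):

  # Enumerate every microstate of 4 two-level particles; energy = number of
  # excited particles. Past half filling, adding energy shrinks W.
  from itertools import product
  from math import log

  W = {}  # energy -> number of microstates
  for config in product((0, 1), repeat=4):
      E = sum(config)
      W[E] = W.get(E, 0) + 1

  for E in sorted(W):
      print(f"E={E}  W={W[E]}  S=ln W={log(W[E]):.3f}")
  # W goes 1, 4, 6, 4, 1: beyond E=2, each added unit of energy reduces
  # the number of microstates, which is the negative-temperature regime.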


>> Information theory has a quantity that behaves much like entropy in stat mech, but is not actually entropy in stat mech.

I was under the impression that these are exactly the same, rather than analogous - http://en.wikipedia.org/wiki/Landauer%27s_principle , http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_i... and so on (including recent advances).

And when I'm talking about adding a bit of information to the system - that's similar to http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_i...

>>> Let's try a more specific question. Landauer's principle requires dissipating at least kT ln 2 of heat for every bit of randomness erased from the system. What about systems with negative T? Does that mean I can't erase bits?
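
(For scale, with standard constants and T assumed positive, around room temperature:)

  # Landauer bound: erasing one bit dissipates at least k_B * T * ln 2 of heat.
  from math import log

  k_B = 1.380649e-23  # J/K
  T = 300.0           # K, roughly room temperature
  print(f"{k_B * T * log(2):.3e} J per bit erased")  # ~2.87e-21 J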


The definitions in the second link are worth thinking about: http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_i...

For information-theoretic entropy, there is no requirement that information have a physical representation. Landauer's result assumes that it does, and derives some physical consequences.


Yes. You are right of course. There is no requirement that information necessarily has a physical representation.

But, if I'm not wrong, this requirement can always be satisfied for any system with two or more microstates.


A simple example where information-theoretic entropy is used with no physical representation: video codecs.

I'm guessing you've heard of MPEG and H.264 encoding. Which one encodes a movie better? One way of answering this question is to ask: which codec added less entropy (perhaps at the same compression ratio)?

For that matter: before Shannon's information entropy, one might have wondered whether some other codec could recover the information lost after MPEG encoding and decoding. Now you know that information entropy can only increase or stay the same, which tells you that a subsequent "correction codec" cannot remove the entropy introduced by the MPEG codec.
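
(To be clear about what is being computed: information entropy is just a functional of a probability distribution, with no physical system in sight. A minimal sketch, not tied to any real codec:)

  # Shannon entropy, in bits, of a discrete distribution.
  from math import log2

  def shannon_entropy(probs):
      return -sum(p * log2(p) for p in probs if p > 0)

  print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
  print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits
  print(shannon_entropy([0.25] * 4))   # 2.0 bits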


Thanks. I don't really like that example with codecs, because I could always argue that any codec can only be a physical system, operating in some environment at temperature T, and will be constrained by Landauer's principle, etc.

Either way, I think we've digressed. I'm actually very happy with your statement: "Adding energy to a system with negative temperature reduces the number of microstates," because this is clear and unambiguous.


How do you carry information without a physical medium?


I think his argument is that I have no right to talk about bits [edit: in the context of stat mech] before defining a physical representation of these bits, that is, a physical system, its states, and transitions between states.


> is that I have no right to talk about bits, before defining physical representation of these bits

I'm saying something slightly different: you can talk about bits in a system without a physical representation; that system can have an information entropy associated with it. Once you implement a physical system representing those bits, Landauer's comment applies.


>> A system's entropy in statistical mechanics is k log W, where k is Boltzmann's constant and W is the number of microstates. A microstate is a configuration of, say, electrons in energy levels.

We are not freshmen; in natural units (pretty much any system of natural units) Boltzmann's constant is 1, and entropy is measured in nats (or in bits, if you also set ln 2 = 1) ;). For a system with two microstates, the entropy would be 1 bit.
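
(Concretely, for two microstates:)

  # Entropy of a two-microstate system, with k_B = 1.
  from math import log, log2

  W = 2
  print(log(W))   # 0.693... nats
  print(log2(W))  # 1.0 bit, in the convention where ln 2 = 1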



