What Is Entropy? by jfantl

15 Comments

  • IIAOPSW
    Posted April 14, 2025 at 6:47 pm

    It's the name for the information bits you don't have.

    More elaborately, it's the number of bits needed to fully specify something that is known to be in some broad category of state, but whose exact details are unknown.
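
    A rough sketch of that idea in Python (my own toy numbers, not from the article): Shannon entropy in bits is the expected number of bits you would still need in order to pin down the exact state, given only a probability distribution over the states.

        import math

        def entropy_bits(probs):
            # Shannon entropy in bits: the average number of additional bits
            # needed to pin down the exact state, given only this distribution.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(entropy_bits([1/8] * 8))     # 8 equally likely states -> 3.0 bits missing
        print(entropy_bits([0.99, 0.01]))  # a nearly certain state -> ~0.08 bits missing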

  • alganet
    Posted April 14, 2025 at 7:15 pm

    Nowadays, it seems to be a buzzword used to confuse people.

    We IT folk should find another word for disorder that increases over time, especially when that disorder has human factors (number of contributors, number of users, etc.). It clearly cannot be treated in the same way as in chemistry.

  • bargava
    Posted April 14, 2025 at 7:23 pm

    Here is a good overview of entropy [1]

    [1] https://arxiv.org/abs/2409.09232

  • brummm
    Posted April 14, 2025 at 7:42 pm

    I love that the author clearly describes why saying entropy measures disorder is misleading.

  • glial
    Posted April 14, 2025 at 7:46 pm

    One thing that helped me was the realization that, at least as used in the context of information theory, entropy is a property of an individual (typically the person receiving a message) and NOT purely of the system or message itself.

    > entropy quantifies uncertainty

    This sums it up. Uncertainty is a property of a person, not of a system/message. That uncertainty is a function of both a person's model of the system/message and their prior observations.

    You and I may have different entropies about the content of the same message. If we're calculating the entropy of dice rolls (where the outcome is the 'message'), and I know the dice are loaded but you don't, my entropy will be lower than yours.
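
    A quick numerical sketch of that dice example (the loading probabilities below are made up): the same die carries different entropy under different observers' models.

        import math

        def entropy_bits(probs):
            return -sum(p * math.log2(p) for p in probs if p > 0)

        fair   = [1/6] * 6                       # your model: every face equally likely
        loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]  # my model: I know one face is favored

        print(entropy_bits(fair))    # ~2.58 bits of uncertainty for you
        print(entropy_bits(loaded))  # ~2.16 bits of uncertainty for me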

  • ponty_rick
    Posted April 14, 2025 at 7:54 pm

    As a software engineer, I learned what entropy was in computer science when I changed the way a function was called, which caused the system to run out of entropy in production and led to an outage. Heh.

  • DadBase
    Posted April 14, 2025 at 8:03 pm

    My old prof taught entropy with marbles in a jar and cream in coffee. “Entropy,” he said, “is surprise.” Then he microwaved the coffee until it burst. We understood: the universe favors forgetfulness.

  • NitroPython
    Posted April 14, 2025 at 8:32 pm

    Love the article, my mind is bending but in a good way lol

  • gozzoo
    Posted April 14, 2025 at 8:37 pm

    The visualisation is great, the topic is interesting and very well explained. Can somebody recommend some other blogs with a similar style of presentation?

  • nihakue
    Posted April 14, 2025 at 8:56 pm

    I'm not in any way qualified to have a take here, but I have one anyway:

    My understanding is that entropy is a way of quantifying how many different ways a thing could 'actually be' and yet still 'appear to be' how it is. So it is largely a result of an observer's limited ability to perceive / interrogate the 'true' nature of the system in question.

    So for example you could observe that a single coin flip is heads, and entropy will help you quantify how many different ways that could have come to pass. e.g. is it a fair coin, a weighted coin, a coin with two head faces, etc. All these possibilities increase the entropy of the system. An arrangement _not_ counted towards the system's entropy is the arrangement where the coin has no heads face, only ever comes up tails, etc.

    Related, my intuition about the observation that entropy tends to increase is that it's purely a result of more likely things happening more often on average.

    Would be delighted if anyone wanted to correct either of these intuitions.
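
    One toy way to make the first intuition concrete (my own sketch, not from the article): take "number of heads in 100 flips" as the macrostate and count how many exact sequences (microstates) are compatible with it. The near-50/50 macrostates dwarf the extreme ones, which is also why "more likely things happening more often" ends up looking like a one-way law.

        from math import comb, log2

        N = 100  # coin flips; take "total number of heads" as the macrostate
        for heads in (0, 25, 50):
            W = comb(N, heads)        # microstates (exact sequences) compatible with it
            print(heads, W, log2(W))  # log2(W) is a Boltzmann-style entropy in bits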

  • karpathy
    Posted April 14, 2025 at 9:43 pm

    What I never fully understood is that there is some implicit assumption about the dynamics of the system. So what if there are more microstates for some macrostate, as far as counting is concerned? We also have to make assumptions about the dynamics, and in particular about some property that encourages mixing.
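
    One toy way to see where the dynamics sneak in (my own sketch, not the article's): the Ehrenfest urn model. The random choice of which particle hops is the mixing assumption; with it, the occupancy drifts toward the macrostate with the most microstates, and without it (say, a rule that never moves anything) nothing relaxes at all, no matter how the microstates are counted.

        import random

        # Ehrenfest urn model: N particles in two boxes. Each step, one particle
        # is picked uniformly at random and hops to the other box.
        random.seed(0)
        N, steps = 1000, 20000
        left = N                            # start with every particle in the left box
        for t in range(steps + 1):
            if t % 5000 == 0:
                print(t, left)              # watch the count relax toward ~N/2
            if random.randrange(N) < left:  # the chosen particle was in the left box
                left -= 1
            else:
                left += 1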

  • TexanFeller
    Posted April 14, 2025 at 10:08 pm

    I don’t see Sean Carroll’s musings mentioned yet, so repeating my previous comment:

    Entropy got a lot more exciting to me after hearing Sean Carroll talk about it. He has a foundational/philosophical bent and likes to point out that there are competing definitions of entropy set on different philosophical foundations, one of them seemingly observer dependent:
    https://youtu.be/x9COqqqsFtc?si=cQkfV5IpLC039Cl5
    https://youtu.be/XJ14ZO-e9NY?si=xi8idD5JmQbT5zxN

    Leonard Susskind has lots of great talks and books about quantum information and calculating the entropy of black holes which led to a lot of wild new hypotheses.

    Stephen Wolfram gave a long talk about the history of the concept of entropy which was pretty good: https://www.youtube.com/live/ocOHxPs1LQ0?si=zvQNsj_FEGbTX2R3

  • jwilber
    Posted April 14, 2025 at 10:26 pm

    There’s an interactive visual of Entropy here in the Where To Partition section (midway thru the article): https://mlu-explain.github.io/decision-tree/

  • vitus
    Posted April 14, 2025 at 10:30 pm

    The problem with this explanation (and with many others) is that it misses why we should care about "disorder" or "uncertainty", whether in information theory or statistical mechanics. Yes, we have the arrow of time argument (second law of thermodynamics, etc), and entropy breaks time-symmetry. So what?

    The article hints very briefly at this with the discussion of an unequally-weighted die, and how by encoding the most common outcome with a single bit, you can achieve some amount of compression. That's a start, and we've now rediscovered the idea behind Huffman coding. What information theory tells us is that if you consider a sequence of two dice rolls, you can then use even fewer bits on average to describe that outcome, and so on; as you take your block length to infinity, your average number of bits for each roll in the sequence approaches the entropy of the source. (This is Shannon's source coding theorem, and while entropy plays a far greater role in information theory, this is at least a starting point.)
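
    A sketch of that block-coding effect (my own toy example, not the article's): Huffman-code a heavily biased coin one flip at a time versus two flips at a time, and compare the average bits per flip with the entropy.

        import heapq, itertools
        from math import log2

        def huffman_lengths(probs):
            # Codeword lengths of an optimal prefix (Huffman) code for `probs`.
            heap = [(p, i, (i,)) for i, p in enumerate(probs)]
            heapq.heapify(heap)
            lengths = [0] * len(probs)
            tiebreak = itertools.count(len(probs))
            while len(heap) > 1:
                p1, _, syms1 = heapq.heappop(heap)
                p2, _, syms2 = heapq.heappop(heap)
                for s in syms1 + syms2:
                    lengths[s] += 1           # every merge adds one bit to these symbols
                heapq.heappush(heap, (p1 + p2, next(tiebreak), syms1 + syms2))
            return lengths

        def entropy(probs):
            return -sum(p * log2(p) for p in probs if p > 0)

        p = [0.9, 0.1]                         # heavily biased coin
        pairs = [a * b for a in p for b in p]  # distribution over blocks of two flips

        print(entropy(p))                                                     # ~0.469 bits/flip
        print(sum(q * l for q, l in zip(p, huffman_lengths(p))))              # 1.0 bit/flip
        print(sum(q * l for q, l in zip(pairs, huffman_lengths(pairs))) / 2)  # ~0.645 bits/flip

    Coding single flips can never beat 1 bit per flip, pairs already get down to about 0.645 bits per flip, and longer blocks keep approaching the entropy of about 0.469 bits per flip.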

    There's something magical about statistical mechanics where various quantities (e.g. energy, temperature, pressure) emerge as a result of taking partial derivatives of this "partition function", and that they turn out to be the same quantities that we've known all along (up to a scaling factor — in my stat mech class, I recall using k_B * T for temperature, such that we brought everything back to units of energy).

    https://en.wikipedia.org/wiki/Partition_function_(statistica…

    https://en.wikipedia.org/wiki/Fundamental_thermodynamic_rela…
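
    A tiny numerical sketch of that "magic" (my own toy two-level system, in units where k_B = 1): the average energy computed directly from the Boltzmann weights matches the derivative -d(ln Z)/dβ of the partition function.

        import math

        eps, T = 1.0, 0.5            # level spacing and temperature (k_B = 1)
        beta = 1.0 / T
        E = [0.0, eps]               # two-level system

        Z = sum(math.exp(-beta * e) for e in E)            # partition function
        probs = [math.exp(-beta * e) / Z for e in E]       # Boltzmann weights
        U_direct = sum(p * e for p, e in zip(probs, E))    # <E> computed directly

        lnZ = lambda b: math.log(sum(math.exp(-b * e) for e in E))
        h = 1e-6
        U_from_Z = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)  # -d(ln Z)/d(beta)

        print(U_direct, U_from_Z)    # both ~0.119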

    If you're dealing with a sea of electrons, you might apply the Pauli exclusion principle to derive Fermi-Dirac statistics that underpins all of semiconductor physics; if instead you're dealing with photons which can occupy the same energy state, the same statistical principles lead to Bose-Einstein statistics.

    Statistical mechanics is ultimately about taking certain assumptions about how particles interact with each other, scaling up the quantities beyond our ability to model all of the individual particles, and applying statistical approximations to consider the average behavior of the ensemble. The various forms of entropy are building blocks to that end.

  • anon84873628
    Posted April 14, 2025 at 10:52 pm

    Nitpick in the article conclusion:

    >Heat flows from hot to cold because the number of ways in which the system can be non-uniform in temperature is much lower than the number of ways it can be uniform in temperature …

    Should probably say "thermal energy" instead of "temperature" if we want to be really precise with our thermodynamics terms. Temperature is not a direct measure of energy; rather, it is an intensive property describing the relationship between change in energy and change in entropy.
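
    For reference, the standard relation being paraphrased (at fixed volume and particle number), written in LaTeX:

        \frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_{V,N}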
