“Entropy and Information Control: the Key to Understanding How to Mount the Fightback Against Trump and Other Populists”

Yves here. This piece is so science-illiterate and otherwise incoherent that I thought our physics and chemistry-knowledgeable readers would have particular fun picking it apart. Even to a mere high school nerd, it's not hard to see what a complete hash the author makes of the concept of entropy. Entropy is the propensity of a system to move to a more disordered state over time. A good example of low entropy is the old-fashioned static noise from a TV when the signal went out. It's flat, unvarying, and with no informational content (unless those evil Rooskies are sending a mind-control signal at super high frequencies).

It's pretty telling that the author, a prof of mathematics, is mis-applying or mis-translating the concept of entropy (or what he thinks laypeople think it means) to confer authority on himself to talk about politics.

By contrast, this article tries to depict volatility as an increase in entropy. Volatility results from MORE energy being pumped into a system. It can lead to a state change and that higher energy system may look more disordered (steam versus water). But it takes energy to preserve or increase order in a system.1 Perhaps he does not want to consider that the populists want a differently ordered system (traditional values, smaller state, moar “freedom” or for very much hollowed out progressives, more taxing of the rich, more social services, a reduction in US and EU meddling with other countries)?

And that's before getting to the fact that the much-bemoaned polarization is to no small degree the result of amplification by social media algos of emotionally engaging, as in extreme, content. Elites demonizing populists is a tell that they are opposed to the messiness of democracy and would rather use force (information control) than do a better job of being in charge.

By Dorje C. Brody, Professor of Mathematics, University of Surrey. Originally published at The Conversation

The spectacular comeback of US president-elect Donald Trump has taken the world by surprise. No doubt people can point to various explanations for his election victory, but in my view, the science of information will pave the way towards deeper insights. Unless the Democrats – and their counterparts around the world – can develop a better understanding of how people receive and reject information, they will never fully understand what happened or successfully fight elections in the future.

There is a fundamental law of nature, known in physical science as the second law. This says that, over time, noise will overwhelm information and uncertainties will dominate. Order will be swamped by confusion and chaos. From a single particle to the whole universe, every system known to science obeys this law. That includes political systems, or societies.

Whenever there is progress in communication technology, people circulate more and more inessential or inaccurate information. In a political system, this is what leads to the noise domination described by the second law.

In science, the quantity that measures the degree of uncertainty is known as entropy. The second law therefore says that entropy can only increase, at least on average.

While entropy does not reduce spontaneously, it is possible to reduce it by spending energy – that is, at a cost. This is exactly what life is about – we create internal order, thus reducing entropy, by consuming energy in the form of food.

For a biological system to survive, it has to reduce uncertainties about the state of its environment. So there are two opposing trends: we don’t like uncertainties and try to reduce them. But we live in a world dominated by growing uncertainties. Understanding the balance of these two forces holds the key to appreciating some of the most perplexing social phenomena – such as why people would vote for a man who has been convicted of multiple crimes and strongly signalled his autocratic tendencies.

The world is filled with uncertainties and information technology is enhancing the level of that uncertainty at an incredible pace. The development of AI is only propelling the increase of uncertainty and will continue to do so at an unimaginable scale.

In the unregulated wild west of the internet, tech giants have created a monster that feeds us with noise and uncertainty. The result is rapidly-growing entropy – there is a sense of disorder at every turn.

Each of us, as a biological system, has the desire to reduce this entropy. That is why, for example, we instinctively avoid information sources that are not aligned with our views. They will create uncertainties. If you are a liberal or leftwing voter and have found yourself avoiding the news after Trump’s re-election, it’s probably linked to your desire to minimise entropy.

The Need for Certainty

People are often puzzled about why societies are becoming more polarised and information is becoming more segmented. The answer is simple – the internet, social media, AI and smartphones are pumping out entropy at a rate unseen in the history of Earth. No biological system has ever encountered such a challenge – even if it is a self-imposed one. Drastic actions are required to regain certainties, even if they are false certainties.

Trump has grasped the fact that people need certainty. He repeatedly offered words of reassurance – "I will fix it". Whether he will is a more complex question but thinking about that will only generate uncertainties – so it's better avoided. The Democrats, in contrast, merely offered the assurance of a status quo of prolonged uncertainties.

Whereas Trump declared he would end the war in Gaza, Kamala Harris remarked that she would do everything in her power to bring an end to the war. But the Biden-Harris administration has been doing exactly that for some time with little progress being made.

Whereas Trump declared he would end the war in Ukraine, Harris remarked that she would stand up against Putin. But the Biden-Harris administration has been merely sending weapons to Ukraine to prolong the war. If that is what “standing up against Putin” means, then most Americans would prefer to see a fall in their grocery prices from an end to the war.

Harris argued that Trump is a fascist. This may prove to be true, but what that means exactly is unclear to most Americans.

While Harris's campaign message of hope was a good initiative, the Democrats failed to deliver certainty and assurance. By the same token they failed to control the information space. Above all, they failed Americans because, while Trump may well bring an end to the wars in Ukraine and Gaza in some form, his climate policy will be detrimental to all Americans, with lasting impacts.

Without understanding the science of information, the blame game currently underway will not get the Democrats anywhere. And there are lessons to be learned for other centre-left governments, like the UK Labour government.

It is not entirely inconceivable that the former prime minister Boris Johnson, encouraged by the events in the US, hopes for a dramatic return to the throne at the next general election. If so, prime minister Keir Starmer must find a way to avoid following the footsteps of Biden and Harris. He must provide people with certainty and assurance.

____

1 These entries from a Reddit thread seemed helpful:

You can call it “disorder” to a layman, but it’s not very precise.

Entropy is a measure of how many “possibilities” (microstates, i.e. atomic configurations) correspond to something you can “see” (macrostate, i.e. temperature, pressure, etc).

I’ll use a world made of Legos as an example.

A single 2×4 yellow Lego piece has no entropy. It is exactly what you see.

A one layer sheet of Legos has no entropy either, because you can see them all. The exception is if you cannot see the seams between the pieces. There is entropy in that case because you don’t know if it was a 2×4 or two 2x2s.

A large block of Legos has entropy because all you can see is the outside.

Mathematically, entropy is S = k*ln(W), where ln is the natural log, W is the number of possible microstates that fit with the given macrostate, and k is a constant conversion factor, depending on what type of entropy you are talking about.
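
To make that formula concrete, here is a minimal Python sketch (my addition, not part of the Reddit thread); the function name and the microstate counts W for the Lego cases are invented purely for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(W: int) -> float:
    """Entropy of a macrostate compatible with W microstates: S = k*ln(W)."""
    return k_B * math.log(W)

print(boltzmann_entropy(1))       # a single visible 2x4 brick: W = 1, so S = 0
print(boltzmann_entropy(2))       # hidden seam, one 2x4 or two 2x2s: W = 2, S = k*ln(2)
print(boltzmann_entropy(10**20))  # opaque block with a vast number of hidden arrangements
```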

And:

Entropy is a convenient way to describe the state function of a system, which measures the number of ways you can rearrange a system and have it look “the same” (for some value of “the same”). The problem in thermodynamics is that you have a large-scale description of a system (like, say, a steam engine or a heat engine), and physics (particle collision theory) that describes systems like that in exquisite, impossible-to-measure detail. You want to extract the large scale physics from the system – how will it evolve on large, observable scales? (For example, will the steam condense, or will some mercury in contact with the system expand or contract?).

The state function is very useful in cases like that, because it tells you something about how well you understand the condition of the system. The state function is a measure of the number of different ways you could rearrange the inobservably small parts of your system (the water molecules in the steam boiler, for example) and still have it match your macroscopic observations (or hypothetical predictions). That is useful because you can use the state function to calculate, in a broad way, how the system is most likely to evolve, without actually cataloguing each of the myriad states it might be in and assigning a probability to each.

Entropy is just the logarithm of the state function. It's more useful because then, instead of dealing with a number of order 10^1000, you're dealing with a number of order 1000. Incidentally, the reason entropy tends to increase is that there are simply more ways to be in a high entropy state. Many, many more ways, since entropy is a logarithm of a huge number to begin with. So if there's roughly equal probability of a system evolving in each of many different ways, it's vastly more likely to end up in a state you would call "high entropy" than one you would call "low entropy".
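
The "many more ways to be in a high entropy state" point is easy to check with a toy system. The sketch below is mine, not the Reddit poster's, and is only an analogy: 100 coin flips stand in for particles, the macrostate is the total number of heads, and the microstates are the individual head/tail sequences:

```python
import math

N = 100  # coin flips standing in for particles

def W(heads: int) -> int:
    """Number of microstates (distinct sequences) with this many heads."""
    return math.comb(N, heads)

def entropy(heads: int) -> float:
    """Entropy up to the constant k: the natural log of the microstate count."""
    return math.log(W(heads))

print(W(0), entropy(0))    # all tails: 1 microstate, entropy 0
print(W(50), entropy(50))  # 50/50 split: ~1e29 microstates, entropy ~66.8
print(W(50) / 2**N)        # ~8% of all 2^100 sequences sit in that single macrostate
```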

Thermodynamically, the reason it takes energy to reduce entropy of a system is that you have to overcome the random excitation of each portion of the system to force it into a known state. Since you don’t know what state the system started in (otherwise its entropy would already be low, since you would have enough knowledge to reduce the value of the state function), you have to waste some energy that wouldn’t technically be needed if you knew more about the system, pushing certain particles (you don’t know in advance which ones) that are already going in the correct direction for your entropy reducing operation.

Maxwell’s Daemon is a hypothetical omniscient gnome who can reduce entropy without wasting any energy, by sorting particles on-the-fly. But with the advent of quantum mechanics we know that knowledge always has an energy cost, and a hypothetical Maxwell’s Daemon couldn’t measure which particles to sort where, without spending some energy to get that knowledge. So Maxwell’s Daemon turns out to require just as much energy to reduce entropy as would any normal physicist.

Anyway, entropy is closely related both to physics and to information theory, since it measures the amount of knowledge (or, more accurately, amount of ignorance) you have about a system. Since you can catalog S^n different states with a string of n symbols out of an alphabet of size S (for example, 2^n different numbers with a string of n bits), the length of a symbol string (or piece of memory) in information theory is analogous to entropy in a physical system. Physical entropy measures, in a sense, the number of bits you would need to fully describe the system given the macroscopic knowledge you already have.
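
And a short sketch of that bit-counting analogy (again my own illustration, with made-up function names): a string of n symbols from an alphabet of size S can label S^n distinct states, so singling out one of W equally likely states takes about log2(W) bits:

```python
import math

def num_states(alphabet_size: int, length: int) -> int:
    """How many distinct strings of this length the alphabet can form."""
    return alphabet_size ** length

def bits_needed(W: int) -> float:
    """Length of a bit string needed to single out one of W equally likely states."""
    return math.log2(W)

print(num_states(2, 8))     # 8 bits can label 2^8 = 256 states
print(bits_needed(256))     # 8.0 bits to pick one of them out
print(bits_needed(10**23))  # ~76.4 bits, even for a modest (by thermodynamic standards) W
```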

Incidentally, in the 19th century, entropy was only determined up to an additive constant because nobody knew where the “small limit” was in the state function, and therefore where the 0 was on the entropy scale. After the advent of quantum mechanics, we learned where the 0 is — pure quantum states have 0 entropy, because a system in a pure quantum state has only one physical state available to it, and the logarithm (base anything) of 1 is 0.


38 comments

  1. Ignacio

    Deliver certainty and assurance. They can’t. They are politically helpless and powerless by definition as per Matt Stoller. So: “Joy!” or “Nothing will fundamentally change”. Do not mess with thermodynamics please.

  2. The Rev Kev

    Yeah, I kinda got derailed when he wrote the following-

    ‘…Kamala Harris remarked that she would do everything in her power to bring an end to the war. But the Biden-Harris administration has been doing exactly that for some time with little progress being made.’

    So in what universe does shipping every bomb, rocket, artillery round, etc. that the US has in their arsenal help bring an end to the war? That's like trying to bonk your way to virginity. The Biden-Harris administration has done everything in their power to keep this genocide going on as long as possible and extending it to other countries, so why does this Professor of mathematics not recognize that?

    1. Polar Socialist

      Not that it matters as much as all the ammunition and money provided, but the Biden-Harris government has also blocked every attempt at the UN Security Council to stop the war.

      Cleaning campuses and streets of “antisemitic” anti-genocide protests is not a federal matter, I guess, but has had the full support of the current regime.

    2. Trees&Trunks

      “That’s like trying to bonk your way to virginity.”

      Thank you very much for an activity idea for the long-weekend. I have a growth and learning mindset, so it should be possible.

  3. BillS

    In reality, the description of entropy that is given in the article is not bad. S=k ln W is at least mentioned. Where things get unscientific is the use of such arguments to explain the state of political affairs – which is utter nonsense. How would one define "political microstate probability" W? In order to define this, one needs the state function of the system. How in spaghetti-monster's name can you define a "political polarization" state function? One could just as well argue that "political entropy" is decreasing as the Overton Window narrows and people self-group into ever smaller ranges of opinion. I should also mention that "information entropy" has a very precise definition that is different from thermodynamic entropy and is not even mentioned here: H = -sum(p_i * ln p_i).

    Also, the quantum ground state of a system does not necessarily have zero entropy.

    Overall IMHO, the explanation of entropy is OK, but the political argument looks like a half-baked brain-dump that looks and smells a bit like my morning movement. ;-)

    1. Revenant

      He could type in a definition from a textbook on statistical mechanics but the fact he wrote “pumping out entropy” like it is a thing rather than a relation / emergent property, shows that he’s not serious (and not a Marxist! Reification…).

      Idiotic article. Stupid scientism. Now politics has physics envy! But the author has serious mathematical physics chops, from his wiki entry, so I don't understand how he can write such nonsense.

      Yves, the footnotes are good but I think the pasting-in process has stripped some formatting. The paragraph on cataloguing states with a string of bits or symbols needs the exponentiation operator (the ^ symbol), e.g. S^A for S characters from an alphabet of size A or 2^n for the number of states described by n binary digits (bits).

      1. BillS

        I was trying to be as charitable as possible, but perhaps it is worth paraphrasing Wolfgang Pauli: "Das ist nicht nur nicht richtig; es ist nicht einmal falsch!" ("That is not only not right; it is not even wrong!")

        1. lyman alpha blob

          Indeed. This –

          “Whenever there is progress in communication technology, people circulate more and more inessential or inaccurate information. In a political system, this is what leads to the noise domination described by the second law.”

          – was some particularly muddled thinking. Lots of assumptions baked into that claim, making this scientist sound like the economist assuming the can opener. He seems to think that having fewer sources of information would lead to a better result. But what if there is just one source and that source is lying? "Remember the Maine!"

          And even if you take his argument at face value, he sure doesn’t have the slightest idea what to do about it. Just how might the unpopular and mendacious Keir Starmer go about providing the certainty and assurance necessary to prevent another equally unpopular and mendacious politician from taking charge?

      2. matt

        i am somewhat sympathetic to scientists who draw parallels between science and other phenomena. it is easier to understand concepts when you can draw parallels between other concepts. and when you spend all day thinking about science, of course you are going to draw science parallels. that’s just how your brain works.
        i think it’s more a matter of ‘this should have stayed in the diary’ instead of being published for the masses.

    2. KLG

      My teacher of Physical Chemistry of Macromolecules began with the relationship between entropy and information. I could manipulate the equations in this and my Physical Chemistry class (shudders) but didn’t really and truly understand the Second Law until we had children, who are “entropy generators.” Parents supply the free energy to reduce chaos to the tolerable level one learns to live with (note that one seldom recognizes the entropy s/he produces). Since our politicians are basically “children on the school playground” of my youth, maybe political entropy makes sense. But in general, “entropy” and “paradigm shift” make the people who use these terms out of any reasonable context sound smart, so there you go. And we could completely do without Thomas Kuhn and his paradigms...I’ll return to my corner now.

      1. debug

        Thanks KLG,

        Errol Morris does a pretty good job of debugging Kuhn’s mess. There are other critiques out there that hit the mark and go beyond Morris, too. For those who may be interested in exploring before buying Morris’ book here is a link to a series of essays that Morris wrote that cover a good bit of the ground in his book.

        https://archive.nytimes.com/opinionator.blogs.nytimes.com/tag/incommensurability/

        You have to scroll down and begin with the first part, since this link presents them in reverse chronological order.

  4. Anonted

    “…successfully fight elections in the future,” here I was thinking elections were about avoiding fights… but I suppose a man needs a mission.

    If one were to speak of water: sometimes you need steam, others liquid, etc, and ‘good’ management attempts to regulate the flow of energy to achieve the desired state (at cost, austerity comes to mind). Entropy therefore, is subjective, in that it depends on your use and interpretation of the system, and seems more a measure of our will than of the system itself (challenges to this perspective welcome, as i’m wading out of my depth here). So what looks like ‘high entropy’ to the author, could well look different from the perspective of, say, NSA, or God. Sometimes they want steam.

  5. bertl

    Schoolboy question from my commonplace book: what if the system is part of a larger, but unrecognised system, and that system is a part of an even larger system, etc, and the entropy of each system feeds a larger system which possibly produces feedstock for the initial system in entropy which enables it to arrange itself in a completely different way? Each system would be in a state of constant change but overall energy/mass will remain constant because leakage is ultimately contained. That is, we only view a system in boundaries and terms we have ourselves defined and we may be missing the wood for the trees.

    I never got an answer to that at school and was informed I read too much science fiction.

    1. Anonted

      Indeed, entropy isn’t a thing (opinion). The ‘systems’ are contrived, and never cease, but change definition, through the dynamism of creation (destruction). They care not for your awareness, though I suspect, your suffering.

    2. Revenant

      Your schoolboy hunch is correct. Life proceeds in islands where energy is available to a system that can be harnessed and used to move from high to low entropy states, e.g. sunlight falling on the Earth sustains life on Earth. If you look at the sun and Earth as a single system, I am sure that entropy increases overall. Entropy is a phenomenon of a defined system, whatever that definition is. We are generally interested in tiny systems that we treat for convenience as isolated. The entropy of the Universe is a fairly meaningless tool except as applied to the Universe as a whole.

    3. GramSci

      Well, we can’t see past our event horizon. Alas, for most Murikins, that’s about one light-second, which almost gets them to the moon. Their prophets foresee a future on Mars.

    4. lyman alpha blob

      My layman’s answer – the 2nd law is only valid in a closed system. You’re describing an open system, so entropy doesn’t necessarily always increase.

      Also, thinking about entropy too much makes my head hurt. I’m not really sure if my brain is now more or less disordered…

      1. KLG

        Yes! This was a thing 100 years ago regarding entropy and evolution. How could organisms get more complex as they evolved? The short answer is that as long as the sun shines planet Earth is not a closed system. And after our star expands as a red giant in a few billion years it won’t matter in this part of the universe. God can then try again. Still, the articles and books on entropy and evolution are very interesting reads, that no one reads anymore. Alas.

    5. matt

      niklas luhmann's systems theory does an excellent job of explaining this concept! essentially, there is no 'innate' system/environment boundary, the boundary by definition must be constructed by an observer. because all boundaries are pretty hazy- where does the atmosphere stop being earth and start being space? a bunch of scientists decided on that, and when they write their papers, they have to state the boundary conditions of the system. the system cannot be separated from the environment (just as you or i cannot be separated from our surroundings), we just make a simplification so we can get some sort of objective calculation.
      it comes back to the classic ‘all models are wrong but some models are useful.’ yes, all boundaries are technically fictitious, but if we didn’t have system boundaries it would be pretty hard to make sense of the world.
      as for if we are part of a larger system, well, now you’re talking about gods and such. i tend to cop out by going ‘well we can never really know for sure if we’re living in a simulation and whatnot.’ and make the simplifying assumption that jesus is the son of god so i can go about my days with less existential dread.

  6. Clark Landwehr

    I do think he’s right that the easier you make it to propagate ‘information,’ the more junk information you get. Producing ‘bad information’ is easy and cheap, producing ‘good information’ (=truth) is hard and expensive. You don’t need the entropy stuff. The more fundamental a scientific concept is (entropy), the harder it is to define even by ‘experts.’

    1. Revenant

      Even if there is a relevance, it’s a double-edged sword. He is not advocating that all politicians stop using modern media, just the nasty populists.

      Indeed, the dominant discourse was radical centrism until recently, so presumably that’s all a bunch of entropy too! :-)

      1. pjay

        – “He is not advocating that all politicians stop using modern media, just the nasty populists.”

        Yes. To the extent that I could figure out the point of this pseudo-scientific pap, that seems to be it. The Democrats “failed to control the information space.” They must understand the “science of information” if they are to be successful and control the “noise.” Not a very original argument these days. The author just dresses it up in ridiculous faux science jargon peppered with “entropy” and “the second law” and other such phrases.

        I admit to a knee-jerk hostile reaction when any “hard” scientist attempts to school us poor dimwits with a background in mushy social science or, worse yet, the humanities, by applying the “laws of physics” to human history or human societies. But this is a feeble effort.

    2. GramSci

      In Shannon’s sense of information, “redundant information” is an oxymoron, but the good professor seems to think it is/tends to ‘zero entropy’.

      IMO, it does not do so directly. In neocortex, whose wave function I take to be akin to the wave function of the Western speech community, this ‘redundant information’ of social media (let’s just call it ‘language’) creates resonances, or, if you will, “echo chambers”.

      These resonances have locked into an oscillatory, bivalent state akin to schizophrenia or bipolar disorder. It will likely take a zero entropy nervous breakdown to reset this system.

      1. Polar Socialist

        In Shannon's sense of information, the meaning is not the same as the one the writer of the article gives to it. It relates to the information content of a "message", but not to the quality of that information, and even less to any interpretation of that information.

        Nowhere in the information theory is there a consideration about the factuality or truthfulness of the information. Shannon’s entropy is basically just a measure of how much one can compress any piece of “information” without losing any of it. Or, to put it in other words, how structured that information is – as in if there are any repeating patterns.

        Or, to use a parallel, Kolmogorov's complexity, which is always equal to Shannon's entropy. Kolmogorov's complexity measures how long an algorithm is needed to produce the said piece of "information".

        Long story short, Shannon’s theory of information specifically states that the meaning of the message [information] is completely irrelevant.

          1. BillS

            The notions of order and disorder are also human constructs. For a physical system, a state can appear disordered, but if the system can only be in that state, then the entropy is low. Entropy is a function of the distribution of states, not a function of what we perceive these states to look like.

    3. bertl

      Karl Popper, whom I rarely invoke to support a sensible argument, did make one valid, if limited, point. The only statements we know to be true are those we know to be untrue: that is, what may appear to be true can be falsified by experience, so it follows that the only truth we can rely on is what we know evidentially to be false. This idea works a lot better than the concept of entropy in most fields other than the theology of zealots where anything goes but little gets through.

      1. hk

        Not so much something we know is untrue, but something for which evidence of its untruth can be articulated in a manner that a counterexample, if found, furnishes a sufficient proof of its untruth (my words, not Popper's, I think.) This leads to odd instances going in the "other" direction, too. People who believe X doesn't exist (X = usually God) usually distort the definition of X so that a counterexample can be easily found. But the theology of God is such that God is not epistemologically "true," thus the point of faith.

        What it does illustrate, I think, is that the requirements of epistemological "truth" are very high. Either the statement is general enough and its potential untruth can be very easily demonstrated or it is very specific and conditional and not exactly widely applicable. Even assuming the "epistemological" truth is all that relevant (in human communications, non-factual but "meaningful" statements are everywhere. Walter Kirn was mentioning, in the latest Taibbi/Kirn podcast, that fact checkers were "fact checking" his opinions, in opinion pieces.)

  7. Louis Fyne

    nothing personal against OP…..

    it cracks me up at the level of investment that Europeans have in US presidential politics….fair enough, western Europe is a satrap of DC.

    But the same tectonic forces that affect US politics are affecting UK-continental politics. OP thinks that Boris! is what the UK-equivalent-of-MAGA wants? OP needs a better clinically detached grasp of UK politics, then try commenting on US politics.

    in my opinion

  8. Bruce Elrick

    Static on a TV screen would be an example of high entropy, not low. Low would be left half black right half white. Or top/bottom. Or every second line. Or a checkerboard.

  9. Scramjett

    Mechanical Engineer here…I thought about adding my own description, but then I remembered a video on the subject by the physicist Sabine Hossenfelder. It is a fantastic and well done video that neatly explains the current understanding of entropy, the problem with it as a concept, and, more importantly, the fact that order doesn't have a well defined meaning to begin with. I don't know where the author's fixation with "certainty" comes from. Perhaps he was confusing it with the Heisenberg Uncertainty Principle? In any case, here is a link to the video:

    https://www.youtube.com/watch?v=89Mq6gmPo0s

  10. herman_sampson

    What came to my mind is the three body problem: a multitude of objects (actually active agents, that is, people) interacting with each other. Since the three body problem (of merely passive objects) remains unsolvable, an active multi-body 'system' would be as well, and would be uncontrollable. Politics and information are much more complex than the laboratory study of simple and controlled experiments.

  11. converger

    I’ve never been a fan of assuming that physics or information theory is more than a metaphor for social/cultural/political dynamics.

    That said, while they are conceptually similar, energy entropy (focused concentrated energy relative to random diffused energy) is *not* the exact same thing as information entropy (signal to noise). Yves is rarely ever wrong. But on this one, her critique doesn’t make any more sense than Brody’s analysis.

    Steam is high energy, but in terms of coherent information it’s static on a TV screen. Low energy laser beams can contain massive amounts of useful information. Contrary to the implication of the Reddit entry that Yves quotes, Legos contain the exact same amount of energy entropy whether they are laid out in a pleasing and highly informative single-layer pattern, or stacked in a cube, or as individual pieces randomly thrown into a box under a bed.

    Drowning out a quiet conversation with a 150 decibel megaphone that’s broadcasting static is using a lot of energy to destroy information by adding noise. So does a Trump cult screaming random lies that eliminate any structured attempt to focus on objective truth, or to seek a messy democratic consensus that has any correlation to something that might inform a genuinely helpful outcome.

    Pretty much every President since Reagan has accelerated the corruption, stupidity, and oppression of a failing American empire. Biden and Harris represented more of the same. I understand why people are sick of it. That said, anyone who assumes that Trump isn’t going to simply double down on corruption, stupidity and oppression is about to be deeply disappointed.

    1. Yves Smith Post author

      Again, I was referring to the sound, the TV buzz, not the image. But that may not make any difference. I suppose the old whine you would also get when the TV signal went down would be more apt, as it was a narrow range of tones maintained without change.
