How can we be so complex if the second law of thermodynamics is true?

By Rich Feldenberg:

There is no doubt that physics is a difficult subject to master, but certain areas of physics seem to be especially prone to misunderstanding and misapplication by the general public. One such area is quantum mechanics – a field within so-called modern physics – where complex mathematical structures provide hints of an underlying nature of the universe that is completely counterintuitive to our “common sense” notions of how things should be. Another area of physics that falls into this category of being frequently misunderstood is thermodynamics – specifically the second law of thermodynamics – a field coming out of classical physics that deals with changes in the entropy of physical systems. This article will focus on that second area – the second law of thermodynamics.


Thermodynamics has its roots in 17th-century studies of heat, and matured in the 19th century into a science of heat, energy, and work. Over time there came to be four laws of thermodynamics, labeled laws zero through three. The zeroth law states that if object A is in thermodynamic equilibrium with both objects B and C (meaning that there is no net heat exchange between them), then B and C are also in thermodynamic equilibrium with each other. Maxwell summarized this observation as, “all heat is of the same kind”. We rely on this principle whenever we take a temperature measurement with a thermometer. Once the thermometer is in thermodynamic equilibrium with the object of interest (there is no net heat exchange), the temperature of the thermometer gives you the temperature of the object being measured. A perfect thermometer would not change the temperature of the object in question.


The first law of thermodynamics is a conservation law: simply put, energy is a conserved quantity. The energy in a closed system is fixed, and while that energy can change form (e.g., from thermal to mechanical, kinetic, electromagnetic, or gravitational), the total amount stays the same, always. The only processes allowed are those in which the total energy of a closed system remains constant. This law tells us which processes can occur: if a process would require a change in the total energy of a closed system, then that process is forbidden by nature.


The third law of thermodynamics provides us with the simple statement that ‘the entropy of a perfect crystal at absolute zero temperature is zero’. We’ll define entropy in a moment, when we get to the second law of thermodynamics, but for now we’ll just remark that absolute zero is the lowest temperature theoretically possible. If you ignore the effects of quantum mechanics – where the momentum and position of a particle cannot both be known with perfect precision – then a classical object at absolute zero is in its lowest energy state, removing any disorder from the system. This law also indirectly implies that absolute zero can never actually be reached by any means.


We won’t comment further on thermodynamic laws zero, one, or three in this article, but will move on to the second law of thermodynamics. The second law governs which types of process are spontaneous – that is, which will occur without the input of energy from outside. It states that the entropy of a system as a whole must increase for any spontaneous or irreversible process. For a reversible process the entropy may remain constant, which is also allowed by the second law. Entropy (given the symbol S) can be described as the amount of disorder in a system. For a system to increase its entropy, the system must become more disordered. This is not to say that certain subparts within the system cannot become more orderly (i.e., decrease their entropy), but they do so at the expense of the system as a whole: if you added all the contributions to the change in entropy together (the pluses and the minuses), you would find that the sum is always positive (entropy has increased). This does not prohibit complex, highly ordered systems from developing, but they develop only by increasing entropy even more in some other part of the universe.
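The bookkeeping described above – adding the pluses and the minuses – can be sketched in a few lines of Python. The numbers here are purely illustrative, not taken from any real process; the point is only that a local entropy decrease is allowed so long as the total comes out positive:

```python
# Hypothetical entropy bookkeeping for a spontaneous process.
# Values are illustrative only; units are J/K.
delta_S = {
    "subsystem (becomes more ordered)": -25.0,  # local entropy decrease
    "surroundings (absorb waste heat)": +40.0,  # larger entropy increase
}

# The second law constrains only the TOTAL change.
total = sum(delta_S.values())
print(f"Total entropy change: {total:+.1f} J/K")
assert total > 0, "a spontaneous process must increase total entropy"
```

A process for which the total came out negative would simply never happen on its own.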


If the second law really forbade anything from becoming more ordered or complex, then we would break it every time we made a bed, cleaned the living room, baked a cake, or put together a Lego model. When we use a refrigerator to cool the freezer compartment, we are decreasing the entropy inside the fridge. Even the ancients were skilled at producing order when they built the pyramids out of clay and stone, mined and separated metal ore from the earth, and grew crops. To the uninitiated, all these things, at least on the surface, would seem to break the second law of thermodynamics. But we know the second law cannot be broken – at least, no one has ever observed an exception to the rule.


As impressive as these examples of using technology to increase complexity are, they pale in comparison to the complexity of a living system. Consider how complex, orderly, and precisely organized a living cell is. Even a lowly bacterium is a little pocket of highly organized molecular structures, far out of thermal and chemical equilibrium with its environment (one requirement for life, even if not a complete definition). The cell has a very improbable structure based on random chance alone – that the atoms of the cell would randomly assemble, through thermal motion, into its complex set of proteins, nucleic acids, and so forth – but we’ll see that it was not random chance that led to living cells. A cell functions as a living thing precisely because its entropy is so low. So how could such a thing exist in a universe where the second law of thermodynamics is in effect? If entropy (disorder) has to increase, then how can there be even the simplest of cells?


Well, the complex and organized structure of the living cell can be maintained because the cell creates an even greater amount of disorder in its surroundings. The heat generated by metabolism is transferred to the surroundings, where it loses its potential to do useful work. The power supplied by the sun to nearly all ecosystems provides energy that living things can harness to keep their entropy low and stay far from equilibrium with their surroundings. If the sun went out, that supply of energy would be cut off, and without a continuous supply of renewed energy the entropy of the ecosystem would certainly increase as organisms die, losing order as their molecular parts are dispersed. The sun itself, the power supply, has low entropy due to its dense concentration of hydrogen, and it increases the entropy of the universe as it fuses hydrogen into helium, releasing less orderly radiation and neutrinos into space. It’s taking a nice condensed ball of hydrogen gas and producing a sea of radiation spreading out in all directions – in other words, it’s making a real mess of things! The entropy of the universe is ever increasing, as a consequence of all the processes, both living and non-living, that the universe is so good at performing.


The total amount of energy in the universe remains unchanged throughout time (first law), but that energy becomes less and less usable due to increasing entropy (second law). The quality of that energy (how useful it is for doing work) does change, and the quality of the universe’s energy worsens as time goes on. In fact, it is entropy that seems to give time its direction – what some call the arrow of time. The difference between past and future is not the amount of energy in the universe (which is constant) but the direction in which disorder increases: the past is always more ordered, the future always more disordered.

This increasing disorder is a natural consequence of the number of micro-states a system has. What we mean by this is simply that if you imagine a container filled with helium gas (this is our closed system), each helium atom can occupy any particular point in the box, so long as another helium atom is not already taking that spot. Even a small box holds an enormous number of helium atoms – atoms being so tiny, a mere 4 grams of helium contains 6.02×10^23 atoms, a truly astronomical number. The positions of all the helium atoms in the box at some given instant constitute one micro-state. The atoms have thermal energy and move in random directions, bouncing off the walls of the box and off each other, so a moment later each atom will be in a new location. This is a new and completely different micro-state, but the two micro-states will look essentially indistinguishable – both appear to us as just a completely random mix of helium atoms. That is, they belong to the same macro-state, because there is no way to tell the different micro-states apart.
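The micro-state counting described above is usually captured by Boltzmann’s relation, S = k·ln W (not stated explicitly in the text, but it underlies the picture). A minimal Python sketch for a hypothetical toy box of 100 atoms – a tiny stand-in for the 6.02×10^23 atoms of a real gas – shows how lopsidedly the near-even arrangements dominate:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

N = 100  # toy atom count; a real gas would have ~1e23

# Number of micro-states for two macro-states of a box split into halves:
W_even = comb(N, N // 2)   # 50 atoms left / 50 right
W_corner = comb(N, 0)      # all 100 atoms crowded into one half: 1 way

print(f"micro-states for 50/50 split:  {W_even:.3e}")
print(f"micro-states for 100/0 split: {W_corner}")

# Boltzmann's relation assigns higher entropy to the macro-state
# with more micro-states.
S_even = k_B * log(W_even)
print(f"S(50/50) = {S_even:.3e} J/K")
```

Even with only 100 atoms, the evenly mixed macro-state has about 10^29 times as many micro-states as the all-in-one-corner state; with 10^23 atoms the disparity becomes unimaginably larger.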


Now a micro-state could appear different, however, if all the helium atoms suddenly moved to one corner of the box and left the remaining volume an empty vacuum, or if they all huddled together into the shape of a little arrow in the middle of the box. Nothing says that such micro-states are impossible; it is just that, with the huge number of micro-states available, those with random-looking properties far outnumber the few with non-random-looking properties. The non-random-looking states really are just as random as any other, but they are very unlikely to occur, by statistics alone. There are many, many micro-states in which the atoms look randomly distributed in space and, in comparison, very few in which the atoms look non-randomly arranged. It’s just a statistical argument, nothing more.
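That statistical argument can be made concrete. If each atom, moving independently, is equally likely to be found in either half of the box, the chance that all N of them are in one half at the same moment is (1/2)^N. A quick sketch with illustrative atom counts:

```python
# Probability that N independently-moving atoms all happen to be in
# the same half of the box at once: (1/2)**N. Even modest N makes
# this vanishingly small - the statistical argument in a nutshell.
for n_atoms in (10, 100, 1000):
    p = 0.5 ** n_atoms
    print(f"N = {n_atoms:4d}: probability ~ {p:.3e}")
```

For just ten atoms the odds are about 1 in 1000; for a mere thousand atoms the probability is already smaller than 10^-300, and a real gas has ~10^23 atoms.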


So how can order be increased (entropy decreased) so that living things can exist, evolution can take place, and so forth? We could force all the atoms in our box into one small corner, but it would require work to be done on the system, and this would lead to an increase in entropy somewhere else. For example, we could have a piston in the box and push it down, driving the helium atoms closer together, making the gas denser and decreasing the entropy inside the box. To do this, energy has to be supplied to the piston, and some of that energy must be wasted as heat (it is thermodynamically impossible for the efficiency of the piston, or any machine, to be 100%), leading to increased entropy in the surroundings.
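For an ideal gas compressed slowly at constant temperature, this trade can be worked out with the standard textbook formula ΔS = nR·ln(V₂/V₁). The numbers below are illustrative, not from the article; in the ideal (reversible) limit the expelled heat raises the surroundings’ entropy by exactly what the gas loses, and any real, imperfect piston wastes extra heat and does worse:

```python
from math import log

R = 8.314  # ideal gas constant, J/(mol*K)

n = 1.0            # moles of helium (illustrative)
T = 300.0          # K, isothermal compression
V1, V2 = 1.0, 0.5  # m^3, the piston halves the volume

# Entropy change of the gas: negative, since it becomes more ordered.
dS_gas = n * R * log(V2 / V1)

# Heat expelled to the surroundings during the compression.
Q_out = -T * dS_gas

# Reversible limit: surroundings gain exactly what the gas loses,
# so the total is zero. Any real piston pushes the total above zero.
dS_surroundings = Q_out / T
dS_total = dS_gas + dS_surroundings

print(f"gas:          {dS_gas:+.2f} J/K")
print(f"surroundings: {dS_surroundings:+.2f} J/K (at least)")
print(f"total:        {dS_total:+.2f} J/K (never negative)")
```

The gas in the box loses about 5.8 J/K of entropy, but only because the surroundings gain at least that much.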


Living systems are able to harness energy from their environment to remain in their low-entropy ‘alive’ state. That energy may come directly from the sun, driving the process of photosynthesis, or it may be chemical energy derived from high-energy chemical bonds in the biomolecules consumed by animals, for instance. As stated before, the living system remains in its highly ordered, low-entropy state at the expense of an even greater increase in the entropy of the universe.


Creationists have been known to invoke the second law of thermodynamics to argue that evolution breaks the laws of physics, but this only reveals their lack of understanding of the second law. One consequence of evolution is that over geological time the complexity of organisms has increased. That is not the “goal” of evolution, whose only objective is to pass genes on to the next generation, but in the process of producing more efficient gene-passing devices (i.e., organisms that survive and reproduce more effectively in their environment), some lineages have proceeded down a road of increasing complexity. (Keep in mind that many remain simple if they find other ways of being good reproducers; some may even regress, as parasitic worms have, needing little more than a gut and a reproductive tract to be successful.) The second law does not forbid evolution, or the evolution of increasing complexity. Organisms, in the process of survival, reproduction, and natural selection, are simply taking the energy stored from sunlight and using it in a multitude of different ways. The universe at large pays the price for all the things living things do, including evolving, by increasing its overall entropy.


We know that the entropy of the universe today is more than it was yesterday, and less than it will be tomorrow. If we extend this line of reasoning to the universal extremes, it stands to reason that entropy was at its minimum at the beginning of the universe and will be at its maximum at the end of the universe (if there is such a thing). The Big Bang was a very orderly state when you consider that all energy was packed into a tiny subatomic space. What about our cosmic destiny? Most cosmologists believe the evidence shows that the universe will continue to expand forever, and entropy will eventually reach a maximum. At that point no further processes or reactions (whether chemical or nuclear) will occur. This is called the ‘heat death’ of the universe, as there can be no net heat transfer, and hence no way to increase entropy further. When this happens there will be no more stars or living things – just a sea of ever more dilute radiation as space-time continues to expand.


The second law is fundamental to our understanding of how things work. It also explains why some things can never be possible – like perpetual motion machines, which would never lose heat energy to their surroundings – impossible! The law makes sense once you begin to understand it as a consequence of what is happening on a microscopic scale. It is certainly not an argument against complexity arising, but it does tell us that every complex system comes with a universal cost that has to be paid. As long as we have a ready source of incoming power – the sun, in our case – things can continue to remain ordered for billions of years. That’s good news for those of us who have no choice but to obey the law!