The robustness of the entropy concept

Matters concerning the entropy of the entire cosmos can be left aside for the time being. For the moment, we can just appreciate the value of Boltzmann's formula, for it provides us with an excellent notion of what the entropy of a physical system should actually be defined to be. Boltzmann put forward this definition in 1875, and it represented an enormous advance on what had gone before,[15] so that it now becomes possible to apply the entropy concept in completely general situations, where no assumption need be made, such as the system in question having to be in some kind of stationary state. There are, nevertheless, still certain aspects of vagueness in this definition, associated, primarily, with the notion of what is to be meant by a 'macroscopic parameter'. We might, for example, imagine that it will become possible, in the future, to measure many detailed aspects of the state of a fluid, where these would today be considered to be 'unmeasurable'. Rather than being merely able to determine, say, what the pressure, density, temperature, and overall fluid velocity might be at various locations of the fluid, it might become possible in the future to ascertain the motions of the fluid molecules in a great deal more detail, where we might perhaps be able even to measure the motions of specific molecules in the fluid. Accordingly, the coarse-graining of the phase space would then have to be taken rather more finely than it had been before. Consequently, the entropy of a particular state of the fluid would be considered to be somewhat smaller, as judged in the light of this newer technology, than it would have been previously.

Some scientists have argued[16] that the use of such technology to ascertain finer details of a system in this way would always entail an entropy increase in the measuring apparatus which more than compensates the effective entropy reduction that would be ascertained to be taking place in the system under examination, by virtue of the detailed measurements. Accordingly, such detailed measurement of a system would still result in an increase in the entropy overall. This is a very reasonable point, but even if we take it into account, there is still a muddying of the Boltzmann entropy definition, as the lack of objectivity in what constitutes a 'macroscopic parameter' for the system as a whole is hardly clarified by such considerations.
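One way to make this compensation quantitative is Landauer's principle, which bounds from below the entropy generated when recorded information is erased: resetting a memory of n bits must produce at least nk log 2 of entropy in the surroundings. (The connection to the argument cited in the text is my gloss, and the instrument's parameters below are invented purely for illustration.) A rough sketch of the arithmetic:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K


def landauer_entropy_cost(n_bits):
    """Minimum entropy (J/K) generated in erasing n_bits of recorded data."""
    return n_bits * k_B * math.log(2)


# Suppose a hypothetical future instrument records the velocities of
# 10^20 fluid molecules at 32 bits each; resetting its memory afterwards
# must generate at least this much entropy in the apparatus:
n_bits = 1e20 * 32
print(landauer_entropy_cost(n_bits))  # about 0.03 J/K
```

However small this looks on everyday scales, it is guaranteed to at least match the effective entropy reduction that the finer description of the fluid would represent.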

An extreme example of this sort of thing was envisaged by the great nineteenth-century mathematical physicist James Clerk Maxwell (whose equations for electromagnetism have been referred to earlier; §1.1, §1.3). Maxwell imagined a tiny 'demon' able to direct individual gas molecules one way or another, by opening or closing a little door, thereby enabling the Second Law, as applied to the gas itself, to be violated. Yet, if we consider the entire system, including the body of Maxwell's demon itself, as a single physical entity, then the actual sub-microscopic composition of the demon would have to be brought into the picture, and the Second Law should be restored once this is done.

In more realistic terms, we might imagine the demon to be replaced by some minute mechanical device, and we can argue that the Second Law still holds good for the entire structure. The issue of what constitutes a macroscopic parameter does not seem to me to be properly resolved by such considerations, however, and the very definition of entropy, for such a complicated system, remains somewhat enigmatic. It might indeed seem odd that an apparently well-defined physical quantity like the entropy of a fluid should be dependent upon the specific state of technology at the time!

Yet, it is remarkable how little the entropy values that would be assigned to a system are affected, in a general way, by developments in technology such as this. The entropy values that would be attributed to a system would, on the whole, change very little as a result of redrawing the boundaries of the coarse-graining regions in this kind of way, as might result from improved technology. We must indeed bear in mind that there is likely to be always some measure of subjectivity in the precise value of the entropy that one might assign to a system, on account of the precision that might be available in measuring devices at any one time, but we should not adopt the point of view that the entropy is not a physically useful concept for that kind of reason. In practice, this subjectivity will, in normal circumstances, amount to a very small factor. The reason for this is that the coarse-graining regions tend to have volumes that are absolutely stupendously different from one another, and the detailed redrawing of their boundaries will normally make no discernible difference to the entropy values that are assigned.

To get some feeling for this, let us return to our simplified description of the mixture of red and blue paint, where we modelled this by considering 10^24 compartments, occupied by equal total numbers of red and blue balls. There, we considered the colour at the various locations to be purple if the ratio of blue to red balls in a 10^5 × 10^5 × 10^5 cubical crate lay in the range 0.999 to 1.001. Suppose that, instead, by the use of finer precision instruments, we are able to judge the red/blue ratio of the balls at a much finer scale than before, and much more precisely. Let us suppose that the mixture is now judged to be uniform only if the ratio of red balls to blue balls is between 0.9999 and 1.0001 (so that the numbers of red and blue balls are now to be equal to an accuracy of one hundredth of a per cent), which is ten times as precise as we had demanded before, and that the region examined need now only be one half of the dimension—and therefore one eighth of the volume—that we had had to examine before in order to determine the hue. Despite this very considerably increased precision, we find that the 'entropy' we must assign to the 'uniformly purple' state ('entropy' in the sense of the log of the number of states that now satisfy this condition) hardly changes from the value that we had previously. Consequently, our 'improved technology' makes effectively no difference to the sort of entropy values that we get in this kind of situation.
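The claim can be checked numerically at a smaller scale. The sketch below (my own illustration, not from the text) uses 10^6 balls rather than 10^24, and for simplicity keeps the examined region fixed rather than also shrinking it; it counts the arrangements whose blue-to-red ratio falls inside each tolerance window, working in log space throughout:

```python
import math


def log_binom(n, k):
    """ln C(n, k) via lgamma, safe for large n."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)


def log_num_states(n, lo_ratio, hi_ratio):
    """ln of the number of arrangements of n balls whose blue/red ratio
    k/(n-k) lies in [lo_ratio, hi_ratio]; log-sum-exp over the window."""
    k_lo = math.ceil(n * lo_ratio / (1 + lo_ratio))
    k_hi = math.floor(n * hi_ratio / (1 + hi_ratio))
    terms = [log_binom(n, k) for k in range(k_lo, k_hi + 1)]
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))


n = 10**6  # scaled down from the text's 10^24 balls
coarse = log_num_states(n, 0.999, 1.001)   # old tolerance
fine = log_num_states(n, 0.9999, 1.0001)   # ten times tighter
print(coarse, fine, (coarse - fine) / coarse)
```

Even though the tighter window admits only a tenth as many ratios, the two log-counts (the toy 'entropies') differ by just a few parts in a million, both sitting close to n log 2; at n = 10^24 the discrepancy would be more minuscule still.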

This is only a 'toy model', however (and a toy model of configuration space rather than phase space) but it serves to emphasize the fact that such changes in the precision of our 'macroscopic parameters' in defining the 'coarse-graining regions' tend not to make much difference to the entropy values that are assigned. The basic reason for this entropy robustness is simply the enormous size of the coarse-graining regions that we encounter and, more particularly, of the vastness of the ratios of the sizes of different such regions. To take a more realistic situation, we might consider the entropy increase that is involved in the commonplace action of taking a bath! For simplicity, I shall not attempt to place an estimate on the not inconsiderable raising of entropy that occurs in the actual cleansing process(!), but I shall concentrate only on what is involved in the mixing together of the water that emerges from the hot and cold taps (either in the bath itself or in the interior of a mixer tap which might be attached to the bath). It would be not unreasonable to suppose that the hot water emerges at a temperature of around 50 °C and the cold, around 10 °C, where the total volume of water that finds itself in the bath is being taken to be 150 litres (made half from the hot water and half from the cold). The entropy increase turns out to be about 21407 J/K, which amounts to our point in phase space moving from one coarse-graining region to another that is about 10^(10^27) times larger! No reasonable-looking change in precisely where the boundaries of the coarse-graining regions are to be drawn would make any significant impression on numbers of this scale.
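The size of that jump follows directly from Boltzmann's formula: the ratio of the volumes of the two coarse-graining regions is e raised to the power ΔS/k. A quick check of the arithmetic, taking the stated entropy increase for the bath on trust:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
delta_S = 21407.0   # entropy increase from the bath-mixing example, J/K

# The volume ratio of the two coarse-graining regions is exp(delta_S / k_B);
# we work with its base-10 logarithm, since the number itself is unwritable.
log10_ratio = delta_S / (k_B * math.log(10))
print(log10_ratio)              # ~6.7e26: the ratio is about 10^(6.7e26)
print(math.log10(log10_ratio))  # ~26.8, i.e. roughly 10 to the power 10^27
```

The base-10 exponent of the volume ratio is itself a 27-digit number, which is why no plausible redrawing of region boundaries can leave a visible mark on the entropy.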

There is another related issue that should be mentioned here. I have phrased things as though the coarse-graining regions are well defined, with definite boundaries, whereas strictly speaking this would not be the case, no matter what plausible family of 'macroscopic parameters' might be adopted. Indeed, wherever the boundary of a coarse-graining region might be drawn, if we consider two very close points in the phase space, one on either side of the boundary, the two would represent states that are almost identical, and therefore macroscopically identical; yet they have been deemed to be 'macroscopically distinguishable' by virtue of their belonging to different coarse-graining regions![17] We can resolve this problem by asking that there be a region of 'fuzziness' at the boundaries separating one coarse-graining region from the next and, as with the issue of subjectivity about what precisely is to qualify as a 'macroscopic parameter', we may choose simply not to care what we do with the phase-space points lying within this 'fuzzy boundary' (see Fig. 1.12). It is reasonable to consider that such points occupy a very tiny phase-space volume in comparison with the vast interiors of these coarse-graining regions. For this reason, whether we consider a point close to the boundary to belong to one region or to the other will be a matter of little concern, as it makes effectively no difference to the value of the entropy that would normally be assigned to a state. Again, we find that the notion of the entropy of a system is a very robust one—despite the lack of complete firmness in its definition—owing to the very vastness of the coarse-graining regions, and of the enormous imbalance between their sizes.

Fig. 1.12 'Fuzziness' at the boundaries separating one coarse-graining region from the next.

All this having been said, it must however be pointed out that there are various particularly subtle situations where such crude notions of 'macroscopic indistinguishability' would appear to be inadequate, and even seem to give us quite wrong answers for the entropy! One such situation occurs with the phenomenon of spin echo (first noticed by Erwin Hahn in 1950) that is made use of in connection with nuclear magnetic resonance (NMR). According to this phenomenon, some material with an initial specific state of magnetization, with nuclear spins[18] closely aligned, can lose this magnetization under the influence of a varying external electromagnetic field, the nuclear spins then taking up a much more higgledy-piggledy-appearing configuration owing to a complicated collection of spin precessions occurring at different rates. But if the external field is then carefully reversed, the nuclear spins all return to their original states, so that, very strikingly, the specific original magnetization state is retrieved! As far as macroscopic measurements are concerned, it would appear that the entropy had increased in the transition to this intermediate stage (with the higgledy-piggledy nuclear spins)— consistently with the Second Law—but when the nuclear spins regain the order that they had lost in the intermediate stage, as a result of the application of the reversed external electromagnetic field, it would appear that the Second Law has been grossly violated, owing to an entropy decrease during this final process![19]

The fact is that even though the spin states would appear to be very disordered in the intermediate situation, there is actually a very precise 'hidden order' in the apparently higgledy-piggledy arrangement of spins, this order being revealed only when the pattern of external magnetic field movements is reversed. Something analogous occurs with a CD or DVD, where any ordinary crude 'macroscopic measurement' would be likely not to reveal the very considerable stored information on such a disc, whereas an appropriate playing device specifically designed to read the disc would have no trouble in revealing this stored information. To detect this hidden order, one needs 'measurements' of a much more sophisticated type than the 'ordinary' macroscopic measurements that would be adequate in most situations.

We do not really need to consider anything so technically sophisticated as the examination of tiny magnetic fields to find 'hidden order' of this general kind. Something essentially similar occurs with a much simpler-looking apparatus (see Fig. 1.13, and for further information Note 1.10). This consists of two cylindrical glass tubes, one of which fits very snugly inside the other, there being a very narrow space between the two. Some viscous fluid (e.g. glycerine) is inserted uniformly into this thin space between the two cylinders, and a handle is attached appropriately to the inner one, so that it can be rotated with respect to the outer one which remains fixed. Now the experiment is set up so that there is a thin straight line of bright red dye inserted in the fluid, parallel to the axis of the cylinder (see Fig. 1.14). The handle is then turned around several times, the line of dye spreading as a consequence of this, until it is observed to be distributed uniformly around the cylinder so that no trace of its original concentration along a line is now seen, but the fluid acquires a very faint pinkish hue. By any reasonable-looking choice of 'macroscopic parameters' to ascertain the state of the dyed viscous fluid, the entropy would appear to have gone up, the dye being now uniformly spread over the fluid. (The situation might appear to be very similar to what happened with the stirred combination of red and blue paint that we considered in §1.2.) However, if the handle is now rotated in the reverse direction, by just the same number of turns as had been used before, we find, rather miraculously, that the line of red dye reappears, and becomes almost as clearly defined as it had been in the first place! 
If the entropy had indeed been raised in the initial winding, by the amount that had appeared to be the case, and if the entropy is considered to have returned to close to its original value after the rewinding, then we have a severe violation of the Second Law as a result of this rewinding process!

Fig. 1.13 Two snug-fitting glass tubes and viscous fluid between, with a line of red dye.

In both these situations, it would be the common viewpoint that the Second Law has not, in actuality, been violated, but that in such situations the entropy definition has not been refined enough. In my opinion, there is a 'can of worms' here, if one demands that there should be a precise objective definition of physical entropy, applicable in all circumstances, with respect to which the Second Law is to be universally valid.


Fig. 1.14 The handle is turned several times spreading out the line of dye. The handle is then turned back the same number of times and the line reappears in apparent violation of the Second Law.

I do not see why one should demand that there always be a well-defined, physically precise notion of 'entropy', that is entirely objective and consequently 'out there' in Nature, in some absolute sense,[1.11] where this 'objective entropy' almost never decreases as time progresses. Must there always be an actual entropy concept that applies to the slightly tinted viscous fluid between the cylinders, or to the configurations of nuclear spins that had appeared to become totally disorganized, though retaining a precise 'memory' of the order that they had had before? I do not see that this need be the case. Entropy is clearly an extremely useful physical concept, but I do not see why it need be assigned a truly fundamental and objective role in physics. Indeed, it seems reasonable to me that the usefulness of the physical notion of entropy has its origin largely in the fact that, for systems that we tend to encounter in the actual universe, it turns out that the normal measures of 'macroscopic' quantities give rise to coarse-graining volumes that do in fact differ from one another by stupendously large factors. There is a profound issue, however, as to why, in the universe that we know, they should differ by such enormous factors. These enormous factors reveal a remarkable fact about our universe that does seem to be clearly objective and 'out there'—and we shall be coming to this shortly—despite the admittedly confusing issues of subjectivity that are involved in our concept of 'entropy', these serving merely to cloud the central mystery that underlies the profound usefulness of this remarkable physical notion.
