Sunday, September 24, 2023

Space, particles, forces, entropy


By Herbert von Irksome



I.  Space, particles, forces



People writing about physics as we now understand it are brushing up against a larger, undiscovered conceptual territory.  New ideas that will change our entire way of thinking about forces, energy and matter are waiting in the wings for their moment to be discovered. 



Action-at-a-distance.  Reaction-at-a-distance.  What do words mean in physics?  What do letters and other symbols mean in physics?  F = ma.  Net force equals mass times acceleration.  Hoo boy.  Right as rain.  Right, left, front, back.  These have a meaning easy to visualize.  Force, however, is a different matter.  It is carried by a field, but the action of the field is caused or carried by “particles.”  Not by real particles, but by virtual particles.
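To pin the symbols down before I fuss over them, here is the equation and a worked example in LaTeX (the numbers are mine, chosen only for illustration):

    \vec{F}_{\text{net}} = m\vec{a}, \qquad m = 2\ \text{kg},\quad a = 3\ \text{m/s}^2 \;\Rightarrow\; F = 6\ \text{N}

Easy to calculate.  Saying what it means is the hard part.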



The seat of force, however, is in a sense matter itself.  On the large scale, where electric charge sums to zero, matter is uncharged mass.  But our basic stable, small-scale matter, outside the nucleus, is electrically charged, and creates its own fields of the electromagnetic type.  I’m speaking of the electron and the proton.  The neutron is not stable outside the nucleus: it decays into a proton, an electron and an antineutrino, with a mean lifetime of about fifteen minutes.
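Written out in LaTeX, the decay in question is:

    n \;\rightarrow\; p + e^- + \bar{\nu}_e

One neutral particle in; one positive and one negative particle (which cancel) plus an antineutrino out, so charge is conserved.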



But as big Al Einstein claimed and many believe, matter don’t matter no more as much as the field produced by matter or by electric charge.  The action is through the field, thusly re-action is through the field, also.  What the hell is really going on?  That’s the big idea.  To figure out.  In a new way. And also find out where to get pi.  This time of day.



Space, spacetime, field, matter.  Toasty, toasty, toasty.



And the mistaken concept of “particles.”  A little tiny sphere—not.  The hydrogen atom as a little planetary system—not.  Even the concept of “little” needs to be re-examined.  The atomic realm is a non-viewable realm of space.  Maybe not even a realm of space, but a realm of the breakdown of space into what we call elementary particles.  There should be no expectation that a linear distance scale continues down into the atomic realm.  Length—space, in other words—may someday be considered an unrealistic physical quantity.







II.  Entropy



As we understand it presently, disposing of accumulated data is what increases entropy.  Creating and disposing of the garbage.  But of course we are quite aware these days that throwing stuff away is not the only alternative.  We also have recycling.  Can we recycle information?  Meaning reuse some of it.  Well, how is information or data used in the first place?  And what is used data?  Ha Ha Ho.



Well, it makes sense.  Data that has been collected but not observed by an observer is unused, we could say.  We are talking about measurement and detection.  Length detection is one thing I’m writing about in another story.  What is an observer?  An entity or person that has a use or purpose for the data.



See there, it’s all right!  Once that observer uses the data for whatever purpose he had in mind in the first place, then the data is by definition “used.”  And entropy increases, mon ami.  Je ne parle pas français.  Maybe some day, however.



If the second law of thermodynamics is going to apply to the situation, it’s gotta be a closed system, no energy comin’ in or going out.  Or perhaps in the more modern view, no info coming in or going out.  With regard to the disposal of information, that seems rather odd.  Because you are, I do think, taking info out of the system when you dispose of it permanently.  Right?  Have y’all even thought about that?  Not I, not yet, anyway.  Until now.



Why are you collecting the data?



H.C. von Baeyer says, p. 150:  Boltzmann’s entropy, it turns out, is “missing information.”  For suppose we knew the exact velocity and position of every molecule in a quart of air.  Our information would be maximal, order would be perfect, entropy would be at its very minimum.  But of course we don’t have this information: It is missing.  When we count all the possible combinations of values of the speeds and positions that the air molecules could take on, we get M, the monster number estimated by David Ruelle.  The letter M takes on a new significance: In addition to “monster” it stands for “missing.”  The number of its digits is the entropy.
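My gloss, not von Baeyer’s, on “the number of its digits is the entropy”:  Boltzmann’s formula in LaTeX is

    S = k_B \ln M = (k_B \ln 10)\,\log_{10} M

and log base ten of M is, to within one, the number of digits of M.  So the digit count is the entropy, up to a constant factor.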



Page 153: Computation, Charles Bennett of IBM realized, requires temporary storage of information, whether it is on a ribbon of paper, an electronic memory, or a magnetic tape.  According to Rolf Landauer the destruction of this information by erasure, by clearing the register, or by resetting the memory, is irreversible.  If the Demon commanded an infinite memory, or an infinite store of blank paper, he could indeed violate the second law. …



In the end, Bennett’s vision of the Demon was this:  As he has for a century, the goblin squats at his trapdoor between two gas-filled boxes, watching, measuring, figuring.  But instead of processing all his observations in his head, he works with a little hand-driven calculator that prints its output on a ribbon of blank paper.  As long as fresh ribbon is fed in from the outside, and used ribbon tumbles from the machine in an unending stream, the Demon can sort its molecules and draw on the energy contained in their random motion.  If the paper ribbon is ignored in the analysis, the Demon violates the second law by extracting work from the gas without wasting any heat. …



Page 154: However, if the Demon decides to work with a short ribbon that he recycles by periodic erasure, he will dissipate enough energy, generate enough entropy, and create enough disorder to save the validity of the second law. …



Page 159: … So whether or not the second law holds seems to depend on the whims of that obstreperous goblin.  Surely the supreme law of nature should be formulated in a more robust fashion!



Wojciech Zurek’s new definition of entropy, which is designed to remedy this ambiguity, makes use of algorithmic randomness (aka algorithmic complexity)—a mathematical concept that was introduced by Russian and American mathematicians in the 1960s for the purpose of measuring the degree of randomness of a number, or any collection of data, without recourse to probability. …



Algorithmic randomness is defined as the length of the shortest computer program that can generate the number. … In practice the way to measure length is to translate the computer program into a standard numerical form and then to count its digits.
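That “shortest program” length is uncomputable in general, but a compressor gives a rough, computable stand-in.  Here is a toy Python sketch; using zlib compression as the proxy is my substitution, not Zurek’s definition:

    import os
    import zlib

    def proxy_complexity(data: bytes) -> int:
        # Compressed length in bytes: a computable upper bound on the
        # length of the shortest description that reproduces the data.
        return len(zlib.compress(data, level=9))

    ordered = b"01" * 5000         # highly regular: a short rule generates it
    randomish = os.urandom(10000)  # patternless bytes of the same length

    print(proxy_complexity(ordered))    # small: the regularity compresses away
    print(proxy_complexity(randomish))  # near 10000: almost incompressible

The regular string scores low and the patternless one scores high, which matches the intuition behind algorithmic randomness.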

       





So that’s what von Baeyer has to say.  A question that arises is:  Do the two stipulations of bringing in paper tape from the outside and disposing of the information to the outside—erasing it—violate the assumption of a closed system?  Seems to me they do.  Only a recycling of the tape can be considered still inside the system.  But the info itself, when erased, leaves the system.
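For scale, there is Rolf Landauer’s bound, the one Bennett’s analysis leans on: erasing one bit at temperature T dissipates at least

    E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\ \text{J/K})(300\ \text{K})(0.693) \approx 2.9\times10^{-21}\ \text{J}

per bit at room temperature.  Tiny, but not zero, and that not-zero is what saves the second law.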



This is related to the first law of thermo, and raises the question of how the 2nd and 1st laws are related.  No energy leaves or enters the system—that’s the whaddaya call it, in mathematical terms?  The postulate? Axiom? Assumption, let’s call it, of the two laws.  So also, I would think, no information enters or leaves the system, but is only changed from one type of info to another…  how can that be?



What “types” of information are there?   So far, in the above discussions, we have missing and non-missing info…  hoo weee!



Another related issue:  What about missing info as a result of the limit imposed by the Heisenberg indeterminacy principle?  Which would relate entropy to uncertainty…  how can that be?  Well, the info is not so much missing as it is unavailable, or unattainable, or hidden.  How now brown cow?  Well, missing, in fact, sounds good.  Ol’ von Baeyer was speaking of positions and velocities of molecules of a gas and how that could be missing info.  Sure.  But in one case, the classical case, the info is theoretically available, and in the quantum case, it ain’t available.  Caught on the horns of a dilemma, I am.  Where can I get pie?  This time of day.
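The limit in question, in LaTeX:

    \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}

No cleverness in measurement gets around it, which is why the quantum info is not just missing but unattainable.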



Collecting information—what does that do to the system?  Does it actually bring something new in, thus making it not a closed system?  Could be!  It could be that our definition of entropy, and even the idea of a closed system in which computations are done, is a self-contradictory notion.  Creating new info?  Then there’s also disposing of it, as discussed above, with the question of the closed system coming up again.

---------------

Update, update, update! Here are my newest entropy examples, as of 2 August 2023, Wednesday night, at home. A faded roadside restaurant sign from long ago, like the 1980s or 90s, still exists on Hwy 65 south of Pine Bluff, near the Grider Field Road turnoff. The name on it is unreadable, or unreadable to me anyway: I don't remember the name of the restaurant, the building itself is now just a concrete slab, and only one or two of the ten or so letters on the faded sign next to the highway are still discernible. The name isn't written in block letters; it's written in cursive. The other entropy example also comes from Hwy 65 south of Pine Bluff, because I thought of it while driving a friend's mother from Dermott in southeast Arkansas to a chemotherapy appointment in Little Rock. The radio was on in the car, but at very low volume, and I didn't want to interfere with the conversation by turning it up. I couldn't tell what song was playing on the Sirius XM Classic Vinyl station--well, yeah, modern technology of course had the name of the song right there on the little screen, but I was thinking of the case where I hear a song I can't turn up, and about how to go about giving my best guess.

So these are my two examples: a sign that wasn't quite readable and a song on the radio that wasn't quite identifiable because of the low sound level. These are both signals or messages with enough noise in them to obscure their content--just the kind of example that ol' juggler Claude Shannon was thinking of when he arrived at a modern definition of entropy in 1948. How would I estimate the entropy of the sign and the song? One way would be to assign probabilities to my best guesses for the name on the sign and the name of the song. So with that in mind, I'll come back later (sooner than I did this time).
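Before I go, here is what that assignment of probabilities would look like, sketched in Python. The candidate names and the numbers are invented for illustration; the real answers turn up below.

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)), in bits; terms with p = 0 contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical guesses for the song, with probabilities I made up:
    song_guesses = {"song A": 0.5, "song B": 0.3, "something else": 0.2}
    print(shannon_entropy(song_guesses.values()))  # about 1.49 bits of uncertainty

Zero bits would mean I already knew the song; more alternatives, spread more evenly, would mean more bits.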

It's later now and I'm back! Sunday 24Sept23. First, I'll mention Shannon's definition of entropy as given by Harry Robertson--namely, the maximum uncertainty in a probability distribution--and also the informal definition of possessing information as given by Ben Schumacher: the ability to reliably distinguish between two or more alternatives.* You possess info if you can distinguish between two or more alternatives presented to you. If you have a definite message, such as being able to read a highway road sign or hear what song is playing on the radio, you have no need to distinguish between alternatives, because you have complete information. This can also be called 100% reliability; when that isn't the case, there's a distribution of probabilities over the alternatives that has to be chosen.

Robertson's probability distribution is composed of Schumacher's alternatives, with a probability attached to each alternative, because we can't otherwise reliably distinguish between them. The question is, what are those various probabilities? The answer is, they are found by maximizing the uncertainty in the probability distribution. And once that's done, you have calculated the entropy.
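That maximization is a standard calculation (my LaTeX, not Robertson's): with no constraint except that the probabilities sum to one,

    \max_{p}\; H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i \quad\text{subject to}\quad \sum_{i=1}^{n} p_i = 1 \quad\Longrightarrow\quad p_i = \tfrac{1}{n},\quad H_{\max} = \log_2 n

So with n alternatives and nothing else to go on, each alternative gets probability 1/n, and the entropy is log2 of n bits. Any extra knowledge enters as a constraint and lowers that maximum.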

Now that I've said all that, I won't try to do any more with it. I'll just say I didn't attempt an entropy calculation for my above examples of the almost unreadable roadside sign and the almost inaudible song on Sirius XM's Classic Vinyl. I found out the name of the song by looking at the display on the dashboard: My Generation. I figured out the name on the sign by asking a Pine Bluff friend about the restaurant that was once there. He's lived here all his life, and he told me that the former restaurant, which was called The Hush Puppy, had become a strip club before the building was torn down. (Strip clubs seem to be not as popular in Arkansas as they are in Texas.) So the next time I passed by the sign, I had that very significant information, and was able to reliably distinguish between one name and all possible other ones: The sign says "Centerfold's," but maybe without the apostrophe.

18Dec23. I was planning to take a picture of the sign and post it here, but it had been torn down by the time I got a chance to do so. A new sign had taken its place--a "property for sale" sign. Been a long time coming.

*Ben says this definition came from the ol' juggler himself. On page six of the course guidebook for the Great Courses DVD "The Science of Information: From Language to Black Holes," Ben says "Thus, Shannon's definition of information is as follows: 'Information is the ability to distinguish reliably among possible alternatives.'"