Sunday, September 24, 2023

Space, particles, forces, entropy


By Herbert von Irksome



I.  Space, particles, forces



People writing about physics as we now understand it are brushing up against a larger, undiscovered conceptual territory.  New ideas that will change our entire way of thinking about forces, energy and matter are waiting in the wings for their moment to be discovered. 



Action-at-a-distance.  Reaction-at-a-distance.  What do words mean in physics?  What do letters and other symbols mean in physics?  F = ma.  Net force equals mass times acceleration.  Hoo boy.  Right as rain.  Right left front back.  These have a meaning easy to visualize.  Force, however, is a different matter.  It is carried by a field, but the action of the field is caused or carried by “particles.”  Not by real particles, but by virtual particles.



The seat of force, however, is sort of matter itself.  On the large scale, where electric charge sums to zero, matter is uncharged mass.  But our basic stable, small-scale matter, outside the nucleus, is electrically charged, and creates its own fields of the electromagnetic type.  I’m speaking of the electron and the proton.  The neutron is not stable outside the nucleus, because it decays into a proton, electron and antineutrino after an average lifetime of about 15 minutes. 



But as big Al Einstein claimed and many believe, matter don’t matter no more as much as the field produced by matter or by electric charge.  The action is through the field, thusly re-action is through the field, also.  What the hell is really going on?  That’s the big idea.  To figure out.  In a new way. And also find out where to get pi.  This time of day.



Space, spacetime, field, matter.  Toasty, toasty, toasty.



And the mistaken concept of “particles.”  A little tiny sphere—not.  The hydrogen atom as a little planetary system—not.  Even the concept of “little” needs to be re-examined.  It is a non-viewable realm of space.  Maybe not of space, even, but a realm of the breakdown of space into what we call elementary particles.  There should be no expectation that a linear distance scale continues down into the atomic realm.  Length—space, in other words—may someday be considered an unrealistic physical quantity. 







II.  Entropy



As we understand it presently, disposing of accumulated data is what increases entropy.  Creating and disposing of the garbage.  But of course we are quite aware these days that throwing stuff away is not the only alternative.  We also have recycling.  Can we recycle information?  Meaning reuse some of it.  Well, how is information or data used in the first place?  And what is used data?  Ha Ha Ho.



Well, it makes sense.  Data that has been collected but not observed by an observer is unused, we could say.  We are talking about measurement and detection.  Length detection is one thing I’m writing about in another story.  What is an observer?  An entity or person that has a use or purpose for the data.



See there, it’s all right!  Once that observer uses the data for whatever purpose he had in mind in the first place, then the data is by definition “used.”  And entropy increases, mon ami.  Je ne parle pas Francais.  Maybe some day, however.



If the second law of thermodynamics is going to apply to the situation, it’s gotta be a closed situation, no energy comin’ in or going out.  Or perhaps in the more modern view, no info coming in or going out.  With regard to the disposal of information, that seems rather odd.  Because you are, I do think, taking info out of the system when you dispose of it permanently.  Right?  Have y’all even thought about that?  Not I, not yet, anyway.  Until now.



Why are you collecting the data?



H.C. von Baeyer says, p. 150:  Boltzmann’s entropy, it turns out, is “missing information.”  For suppose we knew the exact velocity and position of every molecule in a quart of air.  Our information would be maximal, order would be perfect, entropy would be at its very minimum.  But of course we don’t have this information: It is missing.  When we count all the possible combinations of values of the speeds and positions that the air molecules could take on, we get M, the monster number estimated by David Ruelle.  The letter M takes on a new significance: In addition to “monster” it stands for “missing.”  The number of its digits is the entropy.
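As a toy illustration of that last remark--the number of digits of M is the entropy--here's a sketch (my own, nothing from the book, and a deliberately tiny system rather than a quart of air):

```python
import math

def microstate_count_digits(n_particles: int, states_per_particle: int) -> int:
    """Number of decimal digits of M = states**n_particles, a stand-in for
    entropy in von Baeyer's 'digits of the monster number' picture."""
    # log10(M) = n * log10(states); digit count = floor(log10 M) + 1
    return math.floor(n_particles * math.log10(states_per_particle)) + 1

# Toy system: 100 coins, 2 states each, so M = 2**100 ~ 1.27e30.
print(microstate_count_digits(100, 2))  # 31 digits
```

Note how the digit count grows linearly with the number of particles even though M itself grows exponentially, which is exactly why a logarithmic measure like entropy is the manageable one.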



Page 153: Computation, Charles Bennett of IBM realized, requires temporary storage of information, whether it is on a ribbon of paper, an electronic memory, or a magnetic tape.  According to Rolf Landauer the destruction of this information by erasure, by clearing the register, or by resetting the memory, is irreversible.  If the Demon commanded an infinite memory, or an infinite store of blank paper, he could indeed violate the second law. …



In the end, Bennett’s vision of the Demon was this:  As he has for a century, the goblin squats at his trapdoor between two gas-filled boxes, watching, measuring, figuring.  But instead of processing all his observations in his head, he works with a little hand-driven calculator that prints its output on a ribbon of blank paper.  As long as fresh ribbon is fed in from the outside, and used ribbon tumbles from the machine in an unending stream, the Demon can sort its molecules and draw on the energy contained in their random motion.  If the paper ribbon is ignored in the analysis, the Demon violates the second law by extracting work from the gas without wasting any heat. …



Page 154: However, if the Demon decides to work with a short ribbon that he recycles by periodic erasure, he will dissipate enough energy, generate enough entropy, and create enough disorder to save the validity of the second law. …



Page 159: … So whether or not the second law holds seems to depend on the whims of that obstreperous goblin.  Surely the supreme law of nature should be formulated in a more robust fashion!



Wojtek Zurek’s new definition of entropy, which is designed to remedy this ambiguity, makes use of algorithmic randomness (aka algorithmic complexity)—a mathematical concept that was introduced by Russian and American mathematicians in the 1960s for the purpose of measuring the degree of randomness of a number, or any collection of data, without recourse to probability. …



Algorithmic randomness is defined as the length of the shortest computer program that can generate the number. … In practice the way to measure length is to translate the computer programs into a standard numerical form and then to count its digits.
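True algorithmic randomness is uncomputable in general, but a crude, practical stand-in is the length of the data after compression. A sketch (my own, using Python's standard zlib module, not anything from Zurek or von Baeyer):

```python
import random
import zlib

def compressed_length(data: bytes) -> int:
    """Length in bytes of zlib-compressed data: a rough, computable
    proxy for the 'shortest program that generates the data'."""
    return len(zlib.compress(data, level=9))

ordered = b"01" * 500                      # a highly regular 1000-byte string
random.seed(0)                             # reproducible pseudo-randomness
noisy = bytes(random.getrandbits(8) for _ in range(1000))

# The regular string has a short "program" (it compresses well);
# the pseudo-random one is nearly incompressible.
print(compressed_length(ordered) < compressed_length(noisy))  # True
```

The regular string squeezes down to a few dozen bytes while the noisy one stays near its full kilobyte, mirroring the idea that a random number's shortest description is roughly the number itself.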

       





So that’s what von Baeyer has to say.  A question that arises is:  Do the two stipulations of bringing in paper tape from the outside and disposing of the information to the outside—erasing it—violate the assumption of a closed system?  Seems to me they do.  Only a recycling of the tape can be considered still inside the system.  But the info itself, when erased, leaves the system.



This is related to the first law of thermo, and raises the question of how the 2nd and 1st laws are related.  No energy leaves or enters the system—that’s the whaddaya call it, in mathematical terms?  The postulate? Axiom? Assumption, let’s call it, of the two laws.  So also, I would think, no information enters or leaves the system, but is only changed from one type of info to another…  how can that be?



What “types” of information are there?   So far, in the above discussions, we have missing and non-missing info…  hoo weee!



Another related issue:  What about missing info as a result of the limit imposed by the Heisenberg indeterminacy principle?  Which would relate entropy to uncertainty…   how can that be?  Well, the info is not as much missing as it is unavailable, or unattainable, or hidden.  Now how brown cow?  Well, missing, in fact, sounds good.  Ol’ von Baeyer was speaking of positions and velocities of molecules of a gas and how that could be missing info.  Shure.  But in one case, the classical case, the info is theoretically available, and in the quantum case, it ain’t available.  Caught on the horns of a dilemma, I am.  Where can I get pie?  This time of day. 



Collecting information—what does that do to the system?  Does it actually bring something new in, thus making it not a closed system?  Could be!  It could be that our definition of entropy and even the idea of a closed system in which computations are done is a self-contradictory notion. Creating new info?  Then also disposing of it as discussed above, and the question of closed system coming up there again.

---------------

Update, update, update! Here are my newest entropy examples, as of 2 August 2023, Wednesday night, at home. A faded roadside restaurant sign from long ago, like the 1980s or 90s, still exists on Hwy 65 south of Pine Bluff, near the Grider Field Road turnoff. The name on it is unreadable, to me anyway, because I don't remember the name of the restaurant, and the restaurant itself is now just a concrete slab, plus the faded sign next to the highway with only one or two letters out of ten or so still discernible. The name isn't written in block letters, it's written in cursive.  The other entropy example also comes from Hwy 65 south of Pine Bluff, because I thought of it while driving a friend's mother from Dermott in southeast Arkansas to a chemotherapy appointment in Little Rock. The radio was on in the car, but at very low volume, and I didn't want to interfere with our talking by turning it up. I couldn't tell what song was playing on the Sirius XM Classic Vinyl station--well, yeah, modern technology of course had the name of the song right there on the little screen, but I was thinking of the case where I hear a song that I can't turn up, and about how to go about giving my best guess.

So these are my two examples: a sign that wasn't quite readable and a song on the radio that wasn't quite identifiable because of the low sound level. These are both signals or messages with enough noise in them to obscure their content--just the kind of example that ol' juggler Claude Shannon was thinking of when he arrived at a modern definition of entropy in 1948. How would I estimate the entropy of the sign and the song? One way would be to assign probabilities to my best guesses for the name on the sign and the name of the song. So with that in mind, I'll come back later (sooner than I did this time).

It's later now and I'm back! Sunday 24Sept23. First, I'll mention Shannon's definition of entropy as given by Harry Robertson--namely, the maximum uncertainty in a probability distribution--and also the informal definition of possessing information as given by Ben Schumacher: the ability to reliably distinguish between two or more alternatives.* You possess info if you can distinguish between two or more alternatives presented to you. If you have a definite message, such as being able to read a highway road sign or hear what song is playing on the radio, you have no need to distinguish between alternatives, because you have complete information. This can also be called 100% reliability, and when this isn't the case, there's a distribution of reliabilities that has to be assigned among the alternatives.

Robertson's probability distribution is composed of Schumacher's alternatives, with a different probability attached to each alternative, because we can't otherwise reliably distinguish between them. The question is, what are those various probabilities? The answer is, they are found by maximizing the uncertainty in the probability distribution. And once that's done, you have calculated the entropy.
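As a minimal sketch of that calculation (my own illustration, with made-up numbers for the song-guessing example): with no constraint beyond the list of alternatives, maximizing the uncertainty gives the uniform distribution, whose entropy is log2 of the number of alternatives; any hunch lowers it.

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum(p * log2 p), in bits; the p = 0 terms contribute zero."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four hypothetical candidate titles for the barely audible song.
# With no other constraint, the maximum-uncertainty distribution is uniform,
# and the entropy is log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy_bits(uniform))  # 2.0

# A strong hunch about one title pulls the entropy below the 2-bit maximum.
hunch = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy_bits(hunch) < 2.0)  # True
```

A definite message, in this picture, is the distribution [1, 0, 0, 0], whose entropy is exactly zero--Schumacher's "complete information" case.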

Now that I've said all that, I won't try to do any more with it. I'll just say I didn't attempt an entropy calculation for my above examples of the almost unreadable roadside sign and the almost inaudible song on Sirius XM's Classic Vinyl. I found out the name of the song by looking at the display on the dashboard: My Generation. I figured out the name on the sign by asking a Pine Bluff friend about the restaurant that was once there. He's lived here all his life, and he told me that the former restaurant, which was called The Hush Puppy, had become a strip club before the building was torn down. (Strip clubs seem to be not as popular in Arkansas as they are in Texas.) So the next time I passed by the sign, I had that very significant information, and was able to reliably distinguish between one name and all possible other ones: The sign says "Centerfold's," but maybe without the apostrophe.

18Dec23. I was planning to take a picture of the sign and post it here, but it had been torn down by the time I got a chance to do so. A new sign had taken its place--a "property for sale" sign. Been a long time coming.

*Ben says this definition came from the ol' juggler himself. On page six of the course guidebook for the Great Courses DVD "The Science of Information: From Language to Black Holes," Ben says "Thus, Shannon's definition of information is as follows: 'Information is the ability to distinguish reliably among possible alternatives.'"




Wednesday, August 2, 2023

Length detection, simultaneity, and the uncertainty principle


(I didn't update this post on Aug 2, just moved the publication date forward by two years and a few months, and added this confessional note. But later I'll add more of a substantial something to it.)

Heisenberg’s idea is that only measurable quantities should enter the theory. Is the length of a moving meter stick a measurable quantity?

At first we all say, sure, it’s measured by two measurers who determine where the ends are, simultaneously.

Then we say, how exactly? How do they locate the two ends of a moving rod simultaneously when they don't know where the ends are? If there were no length contraction phenomenon, they would know where the ends of the rod are supposed to be—one meter apart for a meter stick.

Now there’s a difference in asking for the measurement to be done in principle, and in actually carrying out that measurement. Given a repeatable sequence of experiments attempting to measure the location of the two ends simultaneously, one can zero in on the correct positions. How would one—two, actually—know when the correct measurement had occurred?

An air track with gliders provides a good thought experiment. The measuring devices are the usual photogates, with one capability added: they must be synchronized. The experiment begins with the starting of the photogate timers, which are positioned so that they are separated by a distance equal to the length of the glider—the “rest length.” Then a glider comes on through. The first photogate is programmed to ignore the passage of the front of the glider and to stop timing when the back of the glider passes through it. The second photogate is programmed to stop timing when the front of the glider passes through it.

A successful length measurement is achieved when the gates stop timing simultaneously. Which is more of a detection of the presence of something—the ends of the glider—than a measurement of some quantity.

In an uncontrolled experiment, where you’re trying to measure the length of some object whose velocity and rest length you don’t know and where you only have one chance to make your measurement—this is basic kinematics—can you find its two ends simultaneously? Not with just two observers.

One possible realistic way would be to have two SETS of synchronized photogates placed with arbitrarily precise separation. Each set would be programmed just like the two single photogates discussed above. The first set ignores the leading edge of the object, but stops timing as the trailing edge passes. The second set of photogates stops timing as the leading edge of the object passes. Here's a drawing of the set-up:



===========================

                                                            <<<<<<<<<<               >>>>>>>>>>

The double line represents the meter stick, “<” is a trailing-edge detector and “>” is a leading-edge detector. The total length of this setup is one meter, corresponding to the longest possible length to be measured. The 10 detectors in each set are spaced at, let’s say, one-millimeter intervals. Thus, with a precision of one millimeter, lengths ranging from 1.000 (outermost detectors) to 0.982 meters (innermost detectors) can be measured. (The separation of the innermost detectors is 0.980 meters, but each one is itself one millimeter in width, making the smallest measurable length 0.982. See table below.)

Each detector on the left will ignore the leading edge of the meter stick and will stop timing as the trailing edge passes it. Each detector on the right will stop timing as the leading edge passes it (and ignore the trailing edge). Length detection is achieved when a pair (or more than one pair) of detectors is triggered simultaneously. For all except the smallest and longest lengths measurable, more than one pair of detectors will be triggered simultaneously, as shown by the table.

Let the photogate/timer detectors be numbered 1-20 from left to right. L' is the contracted length.

  L'       gates triggered simultaneously

 1.000     (1,20)
 0.999     (1,19) (2,20)
 0.998     (1,18) (2,19) (3,20)
 0.997     (1,17) (2,18) (3,19) (4,20)
 0.996     (1,16) (2,17) (3,18) (4,19) (5,20)
 0.995     (1,15) (2,16) (3,17) (4,18) (5,19) (6,20)
 0.994     (1,14) (2,15) (3,16) (4,17) (5,18) (6,19) (7,20)
 0.993     (1,13) (2,14) (3,15) (4,16) (5,17) (6,18) (7,19) (8,20)
 0.992     (1,12) (2,13) (3,14) (4,15) (5,16) (6,17) (7,18) (8,19) (9,20)
 0.991     (1,11) (2,12) (3,13) (4,14) (5,15) (6,16) (7,17) (8,18) (9,19) (10,20)
 0.990     (2,11) (3,12) (4,13) (5,14) (6,15) (7,16) (8,17) (9,18) (10,19)
 0.989     (3,11) (4,12) (5,13) (6,14) (7,15) (8,16) (9,17) (10,18)
 0.988     (4,11) (5,12) (6,13) (7,14) (8,15) (9,16) (10,17)
 0.987     (5,11) (6,12) (7,13) (8,14) (9,15) (10,16)
 0.986     (6,11) (7,12) (8,13) (9,14) (10,15)
 0.985     (7,11) (8,12) (9,13) (10,14)
 0.984     (8,11) (9,12) (10,13)
 0.983     (9,11) (10,12)
 0.982     (10,11)
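The table can be generated mechanically. Here's a sketch (my own; it just encodes the gate numbering and one-millimeter spacing described above):

```python
def triggered_pairs(L_mm: int):
    """All (trailing, leading) detector pairs that fire simultaneously for
    a contracted length of L_mm millimeters (982 <= L_mm <= 1000).
    Trailing-edge gates are numbered 1-10, leading-edge gates 11-20; moving
    either gate of a pair inward by one slot shortens the measured length
    by one millimeter."""
    k = 1000 - L_mm                  # total millimeters of shortening
    pairs = []
    for i in range(1, 11):           # left (trailing-edge) gates 1..10
        j = 20 - (k - (i - 1))       # right gate that makes up the rest
        if 11 <= j <= 20:            # keep only real right-hand gates
            pairs.append((i, j))
    return pairs

print(triggered_pairs(1000))  # [(1, 20)]
print(triggered_pairs(999))   # [(1, 19), (2, 20)]
print(triggered_pairs(982))   # [(10, 11)]
```

Running it for every length from 1000 mm down to 982 mm reproduces the table row by row.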

In practice, especially for a very fast meter stick, which is what we’re interested in, it’s likely there will not be simultaneous times on any pair of (trailing, leading) detectors. This imprecision in time corresponds to a little "delta-x" imprecision in length. What does this little delta-x imprecision mean? Classically, it can be improved upon indefinitely. Quantumly, the limit is set by (delta-x)(delta-p) > h-bar/2, the Heisenberg uncertainty principle for simultaneous position and momentum measurements.

How precise the length measurement can be depends on how fast the meter stick is traveling. Linear momentum, p, is mass times speed in Newtonian physics. In relativistic physics, the "gamma factor" g = (1 - v^2/c^2)^(-1/2) must be included, so p = g mv. And we are assuming, or at least desiring, a very fast-moving meter stick, so we need the gamma factor and we expect a length contraction.

We know the rest length, L. The contracted length we're trying to measure is given by L’ = L/g. This is one equation in two unknowns. Do you (we, I, or whoever) have some other equation to use? I don't know, but when or if we do measure L', then v can be calculated to the same precision as L' was measured. The uncertainty principle seems to be violated by successful length detection.

Wait. The gamma factor in relativistic momentum and the gamma factor in length contraction cancel each other when multiplied together in the uncertainty principle inequality! What does this mean!? More musing will be required.
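The cancellation can be put in symbols. As a rough sketch (my own, under the simplifying assumption that the momentum spread is Δp = γmΔv with γ treated as a constant), the contracted position uncertainty Δx' = Δx/γ meets the momentum's γ head-on:

```latex
\Delta x' \,\Delta p \;=\; \frac{\Delta x}{\gamma}\,\bigl(\gamma m\,\Delta v\bigr)
\;=\; m\,\Delta x\,\Delta v \;\geq\; \frac{\hbar}{2}
```

One caveat: differentiating p = γmv with respect to v actually gives Δp = γ³mΔv, which would leave a factor of γ² uncancelled, so the clean cancellation depends on which uncertainties one takes as fundamental.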
 
_____________________


But, in the meantime, let us not forget Dr. W.G.V. Rosser, who says there ain’t no conception of “length contraction” unless or until you make a transformation to another reference frame!  The length is whatever you measure, or as Dr. Rosser says, whatever you “detect.”  However, what does that actually tell you, information-wise?  You got data, which you assume to be the times that the ends of the stick passed these locations.

What are these hierarchies of motion, anyway? The speed, acceleration, rate of acceleration, etc, things? And then there’s a hierarchy within each of these: rest frames moving at constant relative velocities with respect to each other; frames moving with constant acceleration with respect to each other, etc. Imaginary, mostly. And what about rotating frames? Can we imagine a hierarchy of these? The easiest hierarchy to imagine is concentric rotating frames—all centered about the same point, rotating at different angular velocities. BUT, then we have isolated a special point in space, the center of rotation. Hmmm. So, immediately, the rotation situation is dramatically different in one simple way.

But we don’t have a conception of space to begin with, except a space that exists with respect to a given mass. Hoo wee. Can space itself rotate? Ha Ha. Happens all the time.

What is the connection between spaces and mathematics? We know this. Topology. But what is the relation between real space and mathematics? Undiscovered. Doglegs and cateyes. Hornye toad surfaces.

Start the AJP paper, please, okay? Kay.

Length measurement is a straightforward process when the object whose length is to be measured is at rest. Length measurement for a moving object is not a determination of distance from end to end that can be characterized as having the property of length. This property is for sale now, since private companies have bought out the metric system.

No, really.

Another conundrum: If you measure the time an object takes to pass a single detector, what information do you have? Time! How do you interpret this time? If you know the rest length of the object, and you don’t worry about length contraction, then you have a speed measurement—an average speed measurement: length/time. If you are worried about length contraction—if you’re the worrying type, which might be good in this case—then you have two unknowns: the contracted length and the velocity. And you only have the one time measurement as your experimental data. What’s a worrywart to do?

Update------Update-----Update------Update:  Let's continue with the idea of measuring the time the meter stick takes to pass a certain point, but let's make it TWO points instead of one.  Two timers are spaced like so:

                             |                                              |
                            1<------------ x ---------------->2


The average speed of the meter stick can be calculated thusly: distance x divided by Δt, the time it takes the leading edge of the meter stick to get from 1 to 2, or

                                         v = x / Δt.

There's also a time ΔT that the stick takes to pass by a single point, such as timer 1 or timer 2. This is the time I was discussing above in the "conundrum" paragraph.  But now we have an independent measurement of the average velocity, and so we can find L', the assumed-contracted length, by a simple calculation.  The timers are set up so they can measure both  Δt  and   ΔT .  Then we can use

                               L' = v ΔT  =  (x/Δt) ΔT

                                   =  x (ΔT/Δt).

The right-hand side contains three measured quantities, so this is another way to find the length of the moving meter stick.  Has it ever been done for real.....?
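Not that I know of, but the arithmetic is easy to simulate. A sketch (my own, with made-up numbers: a 1 m rest-length stick at 0.6c and gates 10 m apart), checking that L' = x(ΔT/Δt) recovers the contracted length L/γ:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def contracted_length(x: float, dt: float, dT: float) -> float:
    """L' = x * (dT/dt): contracted length from the gate separation x,
    the leading-edge transit time dt between the two gates, and the
    time dT the stick takes to pass a single gate."""
    return x * (dT / dt)

# Hypothetical setup: 1 m rest-length stick moving at v = 0.6c.
L_rest = 1.0
v = 0.6 * C
gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)   # ≈ 1.25 at 0.6c
L_true = L_rest / gamma                        # ≈ 0.8 m, the "right answer"

x = 10.0            # gate separation, m
dt = x / v          # leading-edge transit time between gates
dT = L_true / v     # time for the whole stick to pass one gate

print(contracted_length(x, dt, dT))  # ≈ 0.8 m
```

The simulated times hand back the contracted length, and with v known from x/Δt, γ and the rest length follow too, so in this idealized setup the two-timer scheme really does determine both unknowns.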