You appear to be confusing measurement with meaning.
I'm arguing strenuously against confusing measurement with meaning, but I see I'm not making myself clear. No matter, I can't think of anything new to say.
The existence of the Planck scale rests on exactly the same metaphysical basis as my post - it assigns numbers to distances.
Not understanding this sentence. The Planck scale sets the limits of our ability, even in principle, to measure time and space. I believe that is physics and not metaphysics. I'm not following your meaning here.
Machinery, physical entities set in motion by anything.
When you wrote that, didn't you think I might ask you what sets the machinery in motion? And didn't you realize you'd get stuck in an infinite regress and either have to say, "Damn, I guess God did it after all," or, "Damn, I have no freaking idea why anything happens or what causality is"?
Please consider yourself so asked.
I'm connecting metaphysics with physics, which is kind of a normal thing to do.
Well, earlier Q-reeus characterized me as "intensely philosophical and mathematical." Which I would frankly take as a great compliment, but I gather it was intended to be mildly pejorative. If you are connecting metaphysics with physics, by definition you are doing philosophy and not science. Which is ok by me, but to the extent anyone is doing metaphysics, they are not doing science.
What's normal, I think, is for some scientists to be unaware they are doing metaphysics when they say, "The world is the way I have modeled it," rather than, "I've just developed a kickass model that gets 6 decimal places of accuracy when I slam a proton into a wall of taxpayer money."
I read in Feynman's QED, I think I mentioned this earlier, about an experiment; it may have been about the crazy renormalization I've heard of -- "getting the infinities out of the equations" -- exactly what we've been talking about. After Feynman, Schwinger, and Tomonaga got the Nobel prize (and Freeman Dyson didn't), an experiment agreed with their theory to 12 decimal places. 12 decimal places. Now that's great if you're an engineer or a physicist. But to the extent you think the real numbers are actually real, you need to realize that a real number has infinitely many decimal places, and that your theory in no way describes what might be happening out there. Some genius not yet born is going to shock everyone with a new paradigm and it's off to the races again. Science is never final.
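To put a number on that: agreement to 12 decimal places only confines the true value to an interval of width 10^-12 around the measured value m, and every interval of positive width contains uncountably many reals,

$$\left|\,(m - 10^{-12},\; m + 10^{-12})\,\right| \;=\; 2^{\aleph_0},$$

so no finite list of measured digits ever singles out one real number.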
I want to communicate this: The mathematical real numbers are exceedingly strange, and mathematicians don't even have a complete handle on them. For example, we don't know how many real numbers there are. That's the Continuum Hypothesis, which has driven the development of set theory from its very beginnings to the state-of-the-art research of today.
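To state it precisely (these are standard facts, not my own gloss): the Continuum Hypothesis says there is no cardinality strictly between the naturals and the reals,

$$\aleph_0 = |\mathbb{N}| \;<\; |\mathbb{R}| = 2^{\aleph_0}, \qquad \text{CH: no set } S \text{ has } \aleph_0 < |S| < 2^{\aleph_0}.$$

Gödel showed in 1940 that CH can't be disproved from the usual ZFC axioms, and Cohen showed in 1963 that it can't be proved from them either. It's formally independent of the axioms we do math with.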
So when you as physicists build models of the world that have time as a continuous real-valued variable, you are making a choice of modeling tool, and you need to be very careful of the ontological assumptions you're unconsciously making. Because if you studied the real numbers the way mathematicians do, you might not believe in them at all as a model of time.
And there are alternatives: Brouwer's intuitionistic real line and its modern incarnation, the constructive real line. The hyperreals, with their cloud of infinitesimals, dubbed a "monad" after Leibniz, floating around each real.
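To give a flavor of the constructive view, here's a minimal Python sketch -- my own toy illustration; the names ConstructiveReal and sqrt2_approx are made up, not any standard library. The idea: a constructive real is a recipe that, handed a precision 2^-n, returns a rational that close to the number.

```python
# A toy constructive real: a function that, given n, returns a rational
# within 2**-n of the number it names. The names here are my own
# illustration, not any standard library.
from fractions import Fraction

class ConstructiveReal:
    def __init__(self, approx):
        # approx(n) must return a Fraction within 2**-n of the target
        self.approx = approx

    def __add__(self, other):
        # to know x + y to within 2**-n, ask each term for 2**-(n+1)
        return ConstructiveReal(
            lambda n: self.approx(n + 1) + other.approx(n + 1))

    def digits(self, k):
        # decimal string good to about k places: 2**-n < 10**-k
        # needs n > k * log2(10), roughly 3.33 * k
        n = int(3.33 * k) + 2
        return f"{float(self.approx(n)):.{k}f}"

def sqrt2_approx(n):
    # Newton's method on x**2 = 2 in exact rational arithmetic;
    # |x - sqrt(2)| <= |x**2 - 2| / 2, so stop when that's small enough
    x = Fraction(3, 2)
    while abs(x * x - 2) > Fraction(1, 2 ** (n + 1)):
        x = (x + 2 / x) / 2
    return x

sqrt2 = ConstructiveReal(sqrt2_approx)
print(sqrt2.digits(10))            # 1.4142135624
print((sqrt2 + sqrt2).digits(10))  # 2.8284271247
```

The point isn't the code; it's that on this view a real number just IS its approximation procedure.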
So the standard real numbers aren't the last word in real numbers. There's no reason at all to think the standard real numbers we use in math and physics have anything to do with the true nature of the continuum, and there are good reasons to think they don't. So just stop putting such religious faith in the real numbers. They are very strange, and if you really believed they represented anything physically real, you'd have to be prepared to answer a lot of hard questions about which set-theoretic principles you regard as having ontological truth.
There is no way to observe a non-computable outcome.
I think we are all in agreement about this.
I am fascinated by the noncomputable numbers. They're not very well known. The computable numbers, by contrast, include all the rationals and a lot of the familiar irrationals. In fact any real number you have ever heard of and ever WILL hear of is computable. That's because the noncomputable numbers can never have names. How weird is that!
[I lied slightly for brevity. Some noncomputable numbers can nevertheless be defined, which is a subtly different concept. Chaitin's Omega is one such. It's the probability that a randomly generated program halts. If it were computable it would solve the halting problem, which can't be solved. So Omega can't be computable. But we can easily describe it and talk about it. It's as naturally occurring a noncomputable number as you'll find.
https://en.wikipedia.org/wiki/Chaitin's_constant]
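For the curious, Omega's standard definition is a sum over the programs p, of length |p| bits, that halt on a prefix-free universal machine U:

$$\Omega \;=\; \sum_{p\,:\;U(p)\ \text{halts}} 2^{-|p|}.$$

If you knew the first n bits of Omega, you could run all programs in parallel until the ones that have halted account for that much probability mass; any program of length at most n still running at that point will never halt. So computing Omega would solve the halting problem, which is why it can't be computable.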
Now if you take the real line, the continuous real line of our imagination, it has no holes in it. If you take out all of the noncomputable numbers, all that's left is a paltry countable set of computable numbers, a set of measure zero. If you threw a dart at the real line you'd hit a noncomputable real with probability 1.
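The measure-zero claim is a one-line covering argument: list the computable reals as x_1, x_2, x_3, ... (there are only countably many, since each needs its own finite program) and cover x_n with an interval of width epsilon/2^n:

$$\mu\big(\{\text{computable reals}\}\big) \;\le\; \sum_{n=1}^{\infty} \frac{\varepsilon}{2^n} \;=\; \varepsilon \quad \text{for every } \varepsilon > 0.$$

Since epsilon was arbitrary, the measure is zero, and that's exactly why the dart lands on a noncomputable real with probability 1.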
I think this should be taken into account by proponents of the computable universe hypothesis. In a computable universe, your real line must be extremely sparse: a virtual desert of computable locations in a sea of nonexistence. And what does that say about your physics? Well, nothing really, since physics can never measure a noncomputable real anyway. So you can always build a MODEL with only computable numbers and functions. But you can't conclude anything about the true nature of the world.