Meaning of Non-Locality?

Theory

If I am understanding the history correctly, in the 1920s the standard mathematical process (converting measurements, each carrying some measurement error, into definite numbers and then doing the calculation) was not doing a very good job of predicting experimental outcomes in atomic physics.

In the late 1920s (1927, to be precise), Werner Heisenberg suggested dispensing with the conversion-to-definite-values step and doing all of the math in terms of probabilities.

This change in process produced much better results. However, Albert Einstein, Boris Podolsky, and Nathan Rosen argued that the statistical math must be incomplete. They constructed a hypothetical example, two particles produced by the same event, for which the answers given by the new statistical approach seemed to be nonsense.

In this framework, there is a limit to the ability to measure two conjugate quantities (such as position and momentum). Whatever device we use to measure one of these properties disturbs the other.

Or, mathematically, in terms of the product of the standard deviations (the errors):

$ \delta p \, \delta q \sim h $ (roughly the form in the original paper), or

$ \sigma_{x} \sigma_{p} \ge {\hbar \over 2} $ (modern)
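
To put a rough scale on the modern relation, here is a small Python sketch. The 1 nm confinement length is an arbitrary illustrative value, not something from the question; the point is only that pinning down position forces a minimum spread in momentum.

```python
# Numeric illustration of sigma_x * sigma_p >= hbar / 2.
# The 1 nm confinement length is an arbitrary example value.
HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
M_E = 9.109_383_7015e-31   # electron mass, kg

sigma_x = 1e-9                        # suppose an electron's position is known to ~1 nm
sigma_p_min = HBAR / (2 * sigma_x)    # minimum momentum spread, kg*m/s
v_spread = sigma_p_min / M_E          # corresponding velocity spread, m/s

print(f"sigma_p >= {sigma_p_min:.3e} kg*m/s")
print(f"velocity spread >= {v_spread:.3e} m/s")   # roughly 6e4 m/s
```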

In the Einstein, Podolsky, Rosen test case, two particles are generated with complementary velocities, equal and opposite to one another; but the actual momenta could be anything between $a$ and $b$, with the probability of landing in any given range determined by $ P[a \le X \le b] = \int_{a}^{b} | \psi(x) |^{2} \, dx $.
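
As an illustration of what that integral computes, here is a minimal sketch using a simple Gaussian wave packet; the packet, its width, and the interval are arbitrary stand-ins, not the actual EPR state.

```python
# Sketch of the Born rule P[a <= X <= b] = integral of |psi(x)|^2 dx,
# using an illustrative Gaussian wave packet.
import numpy as np

sigma = 1.0                          # width of the example packet (arbitrary units)
dx = 1e-4
x = np.arange(-10, 10, dx)           # grid wide enough to hold essentially all the probability
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

a, b = -1.0, 1.0                     # example interval
mask = (x >= a) & (x <= b)
prob = np.sum(np.abs(psi[mask]) ** 2) * dx   # Riemann-sum approximation of the integral

print(f"P[{a} <= X <= {b}] ~= {prob:.4f}")                         # ~0.683 for a one-sigma interval
print(f"total probability ~= {np.sum(np.abs(psi)**2) * dx:.4f}")   # ~1.000
```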

Because you know they are equal and opposite, the moment you measure the velocity of particle 1, you also know the velocity of particle 2: it is equal to and opposite of what you just measured.

If there was any "realness" to the probability distributions of particle 1 and particle 2, it just vanished for both particles - even though you never measured particle 2, and even if particle 2 is light years away from particle 1 when you measure particle 1.

The obvious conclusion is that this is an artifact of the math. That was Einstein, Podolsky, and Rosen's interpretation of the result. They suggested physicists do more work to come up with a more complete expression.
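
As a toy illustration of the more complete picture they had in mind, here is a minimal "hidden variable" sketch: each pair carries a definite momentum fixed at creation, so measuring particle 1 merely reveals a value that was always there. The momentum range and sample count are arbitrary.

```python
# Toy "hidden variable" picture of the EPR pair: each pair is created with a
# definite momentum drawn from a range, and particle 2 carries the opposite value.
import random

a, b = 1.0, 2.0                     # allowed momentum range (arbitrary units)
for _ in range(3):
    p1 = random.uniform(a, b)       # "hidden" value fixed at creation
    p2 = -p1                        # equal and opposite partner
    measured_p1 = p1                # measuring particle 1 just reveals the value...
    inferred_p2 = -measured_p1      # ...and tells us what we would find for particle 2
    assert abs(inferred_p2 - p2) < 1e-12
    print(f"measured p1 = {measured_p1:+.3f}, inferred p2 = {inferred_p2:+.3f}")
```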

Experiment

It wasn't until 1972 that John Clauser demonstrated experimentally that the entangled particle pairs in Einstein and company's thought experiment could actually be produced and measured.

In the meantime, John Stewart Bell had proposed a test, to be run if entangled particles were ever produced, to settle whether the edge case Einstein, Podolsky, and Rosen identified was a real disproof of Heisenberg's statistical treatment, indicating that more physics was required to get a complete answer, or whether the math was valid ... even though valid math seems to make no sense.

If the use of statistical math is wrong, then both particles always had exact values. This is also sometimes called "hidden variables".

The proposed experiment involved being able to control the range of allowed values in $ P[a \le X \le b] = \int_{a}^{b} | \psi(x) |^{2} \, dx $ (in Clauser's version, by rotating the relative angle of two polarization analyzers). If you plot the results across that range, "hidden variables" predicts a straight-line, $Y = mX$, relationship between the adjustment an experimenter makes and the value an experimenter observes. The distribution predicted by quantum mechanics is a distinctly different curved line, looking a bit like a sine wave: in the form Clauser tested, local hidden variables require $ \delta \le 0 $, while quantum mechanics predicts $ \delta = {\sqrt{2} - 1 \over 4} \approx 0.104 $.
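
Here is a sketch of that comparison, assuming the idealized, loss-free form of the cascade test; real versions of the inequality include polarizer and detector efficiencies, which are omitted here.

```python
# Sketch of the quantity Clauser tested (Freedman's form of Bell's inequality).
# In the ideal case the quantum coincidence rate is R(phi)/R0 = (1 + cos 2*phi)/4,
# while any local hidden-variable model must give delta <= 0.
import math

def R_over_R0(phi_deg: float) -> float:
    """Idealized quantum coincidence rate at relative analyzer angle phi."""
    return 0.25 * (1 + math.cos(math.radians(2 * phi_deg)))

delta_qm = abs(R_over_R0(22.5) - R_over_R0(67.5)) - 0.25
print(f"quantum prediction  delta = {delta_qm:.4f}")             # ~0.1036
print(f"(sqrt(2) - 1) / 4         = {(math.sqrt(2) - 1) / 4:.4f}")
print("local hidden variables require delta <= 0")
```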

Results

50 years of experiments have demonstrated again and again that the ${\sqrt{2} - 1 \over 4}$ prediction holds. In experiment after experiment, it seems, the two particles absolutely did not have exact values at the time they were created. Or even several seconds later.

Another possibility is that measuring one particle influences the other through some as-yet-unknown physics. Maybe the measurement generates some small wave of disturbance that travels through normal space and interferes with particle 2.

Experimenters have separated the particles by distances so great that this "influence wave" would have to be traveling faster than light to do its work. Experimenters have also placed particle 2 behind layers of shielding, moved particle 2 so that it isn't on a straight-line path, and even run particle 2 through a maze of other apparatuses, so that an "influence wave" would not only have to travel faster than light, it would also need to magically know where particle 2 was.
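
To see why "faster than light" follows, a back-of-the-envelope sketch; the 10 km separation and 3 ns timing window are illustrative numbers, not figures from any specific experiment.

```python
# Rough "how fast would the influence have to be" arithmetic.
C = 299_792_458.0          # speed of light, m/s

separation_m = 10_000.0    # illustrative distance between the two measurement stations
window_s = 3e-9            # illustrative time between the two measurements being completed

required_speed = separation_m / window_s
print(f"required speed ~ {required_speed:.2e} m/s "
      f"(~{required_speed / C:.0f}x the speed of light)")
```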

We're left facing ... something.

Either -

  • Certain things just are not real until they are observed (non-realism). This is the answer preferred by physicists because, by gosh, they've spent nearly one hundred years experimentally proving it over and over. But it's not that easy: what is an observer? What makes an observer special? And who is observing the observer?

    • Parallel universes have been offered as an explanation. There is a very good video explaining this in greater detail, but the very short version is: the world has one extra dimension, so $t$, $x$, $y$, $z$, and $d$, where $d$ is a constrained dimension (sometimes with only one possible value, sometimes with more) that describes the possible branches in space and time at the time and location $\langle t, x, y, z \rangle$ when a natural decoherence event of any sort causes entanglement to collapse. I believe the one extra dimension is all you need; there need not necessarily be "complete copies" of the universe, because the existence (or non-existence) of conservation functions that would give $\langle t, x, y, z, d \rangle$ a bigger shape is completely unknown at the moment.

, or

  • Something connects these two particles intimately (non-locality). This would preserve the "hidden variables" picture, in which everything has an exact value, but it would mean something new is happening here. Both particles somehow still share the event that created them, even though they exist at two distinct points in $t, x, y, z$.

    • Wormholes have been offered as an explanation. However, while working on proposals for warp drives, Miguel Alcubierre, Sonny White, and Erik Lentz have all demonstrated that the amount of energy on the inside of an arbitrarily-shaped Schwarzschild radius can be estimated from its geometry: $ E = {{r c^4} \over {2 G}} $. The amount of energy required to keep entangled particles connected over a few kilometers (a separation experimenters have already tried) would almost certainly be noticeable; a rough estimate follows this list.
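
Here is a rough version of that estimate, assuming a 3 km separation (the question only says "a few kilometers"); the point is just the order of magnitude.

```python
# Quick estimate of E = r * c^4 / (2 * G) for a separation of a few kilometers.
C = 299_792_458.0        # speed of light, m/s
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass in kg, for scale

r = 3_000.0              # illustrative separation, metres
energy = r * C**4 / (2 * G)          # joules
mass_equivalent = energy / C**2      # kg

print(f"E ~ {energy:.2e} J")
print(f"mass equivalent ~ {mass_equivalent:.2e} kg "
      f"(~{mass_equivalent / M_SUN:.1f} solar masses)")
```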

The Question

What else could non-locality mean?

1 comment thread

The non-realist observer
honnza wrote 11 months ago

One pretty interesting proposed solution is: the real reason it is so hard to develop a working model of quantum gravity is that gravity is not, in fact, quantum, but classical. There's no way to avoid everything measuring the total relative position of every other mass in the universe all the time. Therefore the mass distribution of stuff in the universe simply cannot be in superposition. And as all other properties of matter try to entangle with the gravity field, they too find themselves having definite values sooner or later. Notice how electron orbitals are their energy eigenstates in the potential well of their nucleus? It's because that's the exact thing that's being measured of them all the time. By gravity.