Post History
Question
physics
#6: Post edited
### Theory ###

If I am understanding the history of this correctly, in the 1920s the mathematical process of converting measurements with a certain amount of measurement error into exact numbers, then doing your calculation, wasn't doing a very good job of predicting the experimental outcomes of nuclear chemistry.

In the late 1920s (1927, to be precise), [Werner Heisenberg](https://infogalactic.com/info/Werner_Heisenberg) suggested dispensing with the conversion-to-discrete-values step and doing all of the math in terms of probabilities.

This change in process produced much better results. However, in 1935 Albert Einstein, Boris Podolsky, and Nathan Rosen published a criticism arguing that the statistical math must be incomplete. They constructed a hypothetical example of two particles produced by the same event, for which the answers given by the new statistical treatment were nonsense.
In this case, there is a limit on the ability to measure two conjugate quantities (position and momentum) at once. Whatever device we use to measure one of these properties changes the other one.

Or, mathematically, as a bound on the product of the standard deviations (the measurement errors):
$ \delta p \, \delta q \sim h $ (the original paper), or

$ \sigma_{x} \sigma_{p} \ge {\hbar \over 2} $ (the modern form)
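A quick sanity check of the modern form, using a worked example of my own rather than anything from the original paper: for a Gaussian wavepacket $ \psi(x) = (2 \pi \sigma^2)^{-1/4} e^{-x^2 / 4 \sigma^2} $, the spreads work out to $ \sigma_x = \sigma $ and $ \sigma_p = {\hbar \over 2 \sigma} $, so $ \sigma_x \sigma_p = {\hbar \over 2} $ exactly. The Gaussian saturates the bound; every other wavefunction does strictly worse.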
In the Einstein, Podolsky, Rosen test case, two particles are generated with complementary velocities, equal and opposite to one another; but the actual momenta could be anything between $a$ and $b$, with the chance of any value in between given by $ P[a \le X \le b] = \int_{a}^{b} |\psi(x)|^2 \, dx $.
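To pin down that notation, here is a minimal numerical sketch of the probability rule. The Gaussian $\psi$, the interval $[a, b]$, and the grid are all made-up illustration parameters, not anything from the EPR paper:

```python
import numpy as np

# A minimal sketch of P[a <= X <= b] = integral of |psi(x)|^2 dx,
# using a hypothetical Gaussian wavefunction purely for illustration.
sigma = 1.0                                   # made-up width parameter
x = np.linspace(-10, 10, 20_001)              # position grid
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-(x**2) / (4 * sigma**2))

density = np.abs(psi) ** 2                    # probability density |psi|^2
print(np.trapz(density, x))                   # total probability, ~1.0

a, b = -1.0, 2.0                              # arbitrary measurement window
mask = (x >= a) & (x <= b)
print(np.trapz(density[mask], x[mask]))       # P[a <= X <= b], ~0.819
```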
Because you know they are equal and opposite, the moment you know the velocity of particle 1, you also know that the velocity of particle 2 is equal and opposite to what you just measured.

If there was any "realness" to the probability distributions of particle 1 and particle 2, it just vanished for both particles, even though you never measured particle 2, and even if particle 2 is light-years away from particle 1 when you measure particle 1.

**The obvious conclusion is that this is an artifact of the math.** That was Einstein, Podolsky, and Rosen's interpretation of the result. They suggested physicists do more work to come up with a more complete expression.
### Experiment ###

It wasn't until 1972 that [John Clauser](https://scitechdaily.com/first-experimental-proof-that-quantum-entanglement-is-real/#:~:text=In%201972%2C%20John%20Clauser%20and,experimental%20proof%20of%20quantum%20entanglement.) demonstrated experimentally that the entangled particle pairs of Einstein and company's thought experiment even exist.

In the meantime, John Stewart Bell had [proposed a test](https://infogalactic.com/info/Bell%27s_theorem), to be run if entangled particles were ever produced, that would settle whether the edge case Einstein, Podolsky, and Rosen identified was a real disproof of Heisenberg's statistical treatment, indicating that more physics was required for a complete answer, or whether the math was valid ... even though the valid math makes no sense.

**If the use of statistical math is wrong, then both particles always had exact values.** This position is also sometimes called "hidden variables".

The proposed experiment involved being able to control the range of allowed values in $ P[a \le X \le b] = \int_{a}^{b} |\psi(x)|^2 \, dx $. If you plot the results across that range, "hidden variables" predicts a straight-line ($y = mx$) relationship between the adjustment an experimenter makes and the value the experimenter observes. Quantum mechanics predicts a distinctly different curved line, looking a bit like a sine wave. (In the form of the inequality Freedman derived for Clauser's experiment, local hidden variables require $\delta \le 0$, while quantum mechanics predicts $4\delta = \sqrt{2} - 1$ at the optimal settings.)
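To make the contrast concrete, here is a minimal Monte Carlo sketch, assuming the idealized cos²-law quantum correlation and one toy hidden-variable model of my own construction; the 22.5°/67.5° settings come from Freedman's form of the inequality, everything else is illustrative:

```python
import numpy as np

deg = np.pi / 180

# Quantum mechanics: for polarization-entangled photon pairs, the
# coincidence rate falls off as cos^2 of the relative polarizer angle.
def quantum_rate(angle):
    return 0.5 * np.cos(angle) ** 2

# A toy local hidden-variable model: each pair carries one definite
# polarization lam; a polarizer at angle theta passes its photon iff
# lam lies within 45 degrees of theta (mod 180). This model produces
# the straight-line correlation described above.
def hidden_variable_rate(angle, n=200_000):
    rng = np.random.default_rng(0)
    lam = rng.uniform(0, np.pi, n)            # the hidden variable
    def passes(theta):
        d = np.abs(lam - theta)
        return np.minimum(d, np.pi - d) < np.pi / 4
    return np.mean(passes(0.0) & passes(angle))

# Freedman's form of Bell's inequality: any local hidden-variable
# theory gives delta <= 0; quantum mechanics predicts
# delta = (sqrt(2) - 1) / 4, roughly 0.104. The toy model lands at
# ~0, the classical boundary, up to sampling noise.
for rate in (quantum_rate, hidden_variable_rate):
    delta = rate(22.5 * deg) - rate(67.5 * deg) - 0.25
    print(rate.__name__, round(float(delta), 4))
```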
### Results ###

50 years of experiments have demonstrated again and again that the $\sqrt{2} - 1$ relationship holds. In experiment after experiment, it seems, the two particles absolutely did not have exact values at the time they were created, or even several seconds later.

**Another possibility is that measuring one particle influences the other through some yet-unknown physics.** Maybe the measurement generates some small wave of disturbance that travels through normal space and interferes with particle 2.

Experimenters have separated the particles by distances so great that this "influence wave" would have to travel faster than light to do its work. Experimenters have also placed particle 2 behind layers of shielding, moved particle 2 so that it isn't on a straight-line path, and even run particle 2 through a maze of other apparatus, so that an "influence wave" would not only have to travel faster than light, it would also need to magically know where particle 2 was.

We're left facing ... something.

Either -
* **Certain things just are not real until they are observed.** (non-realism) This is the answer preferred by physicists because, by gosh, they've spent nearly one hundred years experimentally confirming it over and over. But it's not that easy: what is an observer? What makes an observer special? And who is observing the observer?

* **Parallel universes have been offered as an explanation.** This is a [very good video](https://youtu.be/kTXTPe3wahc) explaining it in greater detail, but the very short version is: the world has one extra dimension: $t$, $x$, $y$, $z$, and $d$, where $d$ is a constrained dimension, sometimes with only one possible value, sometimes with more, that describes the possible branches in space and time at location $\langle t, x, y, z \rangle$ when a natural decoherence event of any sort causes entanglement to collapse. I believe the one extra dimension is all you need; there need not necessarily be "complete copies" of the universe, because the existence (or non-existence) of conservation functions that would give $\langle t, x, y, z, d \rangle$ a bigger shape is completely unknown at the moment.

, or

* **Something connects these two particles intimately.** (non-locality) This would preserve "hidden variables", the idea that everything has an exact value. But it would mean something new is happening here: both particles somehow still exist at the event that created them, even though they occupy two discrete points in $\langle t, x, y, z \rangle$.

* **Wormholes have been offered as an explanation.** However, while working on proposals for warp drives, Miguel Alcubierre, Sonny White, and Erik Lentz have all demonstrated that the amount of energy inside an arbitrarily-shaped Schwarzschild radius can be estimated from its geometry: $ E = {{r c^4} \over {2 G}} $. The amount of energy required to keep entangled particles connected over a few kilometers (separations experimenters have already tried) would almost certainly be noticeable; a rough scale check follows this list.
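For a sense of scale, here is a back-of-the-envelope evaluation of that estimate; the one-kilometer radius is a made-up stand-in for the lab-scale separations mentioned above:

```python
# Back-of-the-envelope scale check of E = r c^4 / (2 G).
c = 2.998e8      # speed of light, m/s
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2

r = 1_000.0      # hypothetical 1 km "Schwarzschild radius", meters

E = r * c**4 / (2 * G)                    # enclosed energy, joules
print(f"E ~ {E:.1e} J")                   # ~6.1e46 J
print(f"~ {E / c**2 / 1.989e30:.2f} solar masses of mass-energy")  # ~0.34
```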
### The Question ###

What else could non-locality mean?
#5: Post edited
#4: Post edited
#3: Post edited
#2: Post edited
Title changed from "Meaning of Non-Locality or Non-Realism?" to "Meaning of Non-Locality?"
- ### Theory ###
- If I am understanding this history of this correctly, in the 1920s the mathematical process of converting measurements with a certain amount of measurement error into numbers, then doing your calculation, wasn't doing a very good job of predicting the experimental outcomes of nuclear chemistry.
- In the late 1920s (1927 to be precise), [Werner Heisenberg](https://infogalactic.com/info/Werner_Heisenberg) suggested dispensing with the conversion to discrete values step, and doing all of the math in terms of probabilities.
- This change in process produced much better results. However, Albert Einstein, Boris Podolsky, and Nathan Rosen produced a criticism that the statistical math must be incomplete. They generated a hypothetical example of two particles being produced by the same event, where the answers given by the new statistical physics process were nonsense.
- In this case, there is a limit to the ability to measure two co-variant terms (position and momentum). Whatever device we use to measure one of these properties changes the other one.
- Or, mathematically, as the sum or standard deviations (error)
- $ \psi \psi \delta x = \delta x $ (the original paper), or
- $ \sigma_{x} \sigma_{y} \ge {h \over 2} $ (modern)
- In the Einstein, Podolsky, Rosen test case, two particles are generated with complementary velocities - equal and opposite to one another; but the actual momentums could be anything between "a" and "b" with an equal % chance of any value in between, or mathematically $ P [ a \le X \le b] = \int_{a}^{b} | \psi(x^2)| dx$
- Because you know they are equal and opposite, the moment you know the velocity of particle 1, you also know the velocity of particle 1 is equal to and opposite of what you just measured.
- If there was any "realness" to the probability distribution of particle 1 and particle 2, it just vanished for both particles - even though you never measured particle 2, and even if Particle 2 is light years away from Particle 1 when you measure Particle 1.
- **The obvious conclusion is that this is an artifact of the math.** That was Einstein, Podolsky, and Rosen's interpretation of the result. They suggested physicists do more work to come up with a more complete expression.
- ### Experiment ###
- It wasn't until 1972 that [John Clauser](https://scitechdaily.com/first-experimental-proof-that-quantum-entanglement-is-real/#:~:text=In%201972%2C%20John%20Clauser%20and,experimental%20proof%20of%20quantum%20entanglement.) proved experimentally that Albert Einstein and company's thought experiment even existed.
- In the meantime, John Stewart Bell had [proposed a test](https://infogalactic.com/info/Bell%27s_theorem) , if entangled particles were ever found to exist, of proving whether Einstein, Podolsky, and Rosen's identified edge case was a real disproof of Heisenberg's statistical treatment, indicating that more physics were required to get a complete answer, or that the math was valid ... even though valid math makes no sense.
- **If the use of statistical math is wrong, then both particles always had exact values.** This is also sometimes called "hidden variables".
- The proposed experiment involved being able to control the range of allowed values $ P [ a \le X \le b] = \int_{a}^{b} | \psi(x^2)| dx$. If you plot all of these along a range, "hidden variables" predicts an Y = mX straight-line relationship between the adjustment an experimenter makes, and the value an experimenter observes. The predicted distribution by quantum mechanics is a distinctly different curved line, looking a bit like a sine wave. $4c \ge \sqrt{2} - 1$
- ### Results ###
- 50 years of experiments have demonstrated again and again that the $\sqrt{2} - 1$ relationship holds. The two particles in experiment after experiment, it seems, absolutely did not have exact values at the time they were created. Or even several seconds later.
- **Another possibility is that measuring one particle influences the other through some yet unknown physics.** Maybe the measurement generates some small wave of disturbance that travels through normal space and interferes with Particle 2.
- Experimenters have separated the particles by distances so great that this "influence wave" would have to be traveling faster than light to do its work. Experiementers have also placed Particle 2 behind layers of shielding, moved Particle 2 so that it isn't on a straight line path, even run Particle 2 through a maze of other apparatuses so that an "influence wave" would not only have to travel faster than light, it would also need to magically know where Particle 2 was.
We're left facing .. something. Either -* **Certain things just are not real until they are observed**, or* **Something connects these two particles intimately.** Both particles are still somehow existing at the event that created them both, even though existing in two discrete points in t,x,y,z.- ### The Question ###
What could non-locality or non-realism mean?
#1: Initial revision
Meaning of Non-Locality or Non-Realism?