Could it be possible to quantum entangle particles on a mass scale?
So in the world I'm building, I'm considering giving one of my characters the ability to quantum entangle specific particles on demand. Now although for many years we thought quantum entanglement was a random and uncontrollable phenomenon, it turns out that scientists have actually been able to do this deliberately, to a certain degree: https://www.sciencealert.com/new-production-line-method-for-quantum-entanglement-on-demand.
Sadly, however, I have a tendency to misinterpret the science in these sorts of articles, but I'll try my best here. So essentially what these scientists have done is somehow entangled photons with electrons using a method that can generate 40 entanglements on demand in a single second.
Of course, I'm no physicist, let alone a quantum physicist, so I'm not going to pretend I can fully understand this. This, of course, is where you guys come in. So overall, my question is: what kind of ability or device would my character need to do this sort of thing, and what would be the limitations of this quantum entanglement?
This post was sourced from https://worldbuilding.stackexchange.com/q/174661. It is licensed under CC BY-SA 4.0.
1 answer
Just reduce the rate at which you lose entanglement
(The paper, for anyone wanting to read it, is Humphreys et al. 2018.)
The key problem here isn't entangling particles, per se - the problem is keeping them entangled. The authors make the point that what we're interested in isn't just the rate at which we entangle particles $r_{\text{ent}}$, but also the decoherence rate $r_{\text{dec}}$, the rate at which particles decohere, often due to interactions with their immediate environment. Decoherence breaks the entanglement and is a huge problem in quantum computing, for a number of reasons. Now, you need to have $r_{\text{ent}}>r_{\text{dec}}$ to have a net positive gain of entangled pairs, or, as they put it, a quantum link efficiency of $\eta\equiv r_{\text{ent}}/r_{\text{dec}}>1$.
The study in question produces entanglement rates of $r_{\text{ent}}=39\text{ Hz}$. Previous work (Stockhill et al. 2017) has produced entanglement rates as high as $r_{\text{ent}}\sim1000\text{ Hz}$ using objects called quantum dots - but at the cost of $r_{\text{dec}}\sim10^7\text{ Hz}$, for an efficiency of only $\eta\sim10^{-4}$, a net loss. The big jump here was achieving a significantly larger $\eta$ by keeping decoherence rates low. With $r_{\text{dec}}=5\text{ Hz}$, the researchers achieved an efficiency of $\eta=39/5\approx8$.
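To make the comparison concrete, here's a minimal sketch of the efficiency arithmetic, using only the rates quoted above (the helper function name is mine, not from the paper):

```python
# Quantum link efficiency eta = r_ent / r_dec, as defined in the answer.
# Net gain of entangled pairs requires eta > 1.

def link_efficiency(r_ent_hz, r_dec_hz):
    """Ratio of entanglement rate to decoherence rate."""
    return r_ent_hz / r_dec_hz

# Humphreys et al. 2018: 39 Hz entanglement vs. 5 Hz decoherence
eta_new = link_efficiency(39, 5)       # 7.8, i.e. roughly 8: net gain

# Quantum dot setup: ~1000 Hz entanglement vs. ~10^7 Hz decoherence
eta_dot = link_efficiency(1000, 1e7)   # 1e-4: net loss
```

The point of the comparison: a thousandfold higher entanglement rate is useless if decoherence outpaces it by four orders of magnitude.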
Let's go back to quantum dots. Stockhill et al. were able to entangle qubits at a rate of $r_{\text{ent}}=7300\text{ Hz}$ - almost 200 times as many each second! But there are limits to how quickly you can entangle particles, and those limits are set by your experimental setup. For instance, the quantum dot experiment required lasers, detectors, a magnetic field, and plenty of additional equipment. So you're going to have to improve your experimental setup to increase $r_{\text{ent}}$. The authors speculated that they could reach $r_{\text{ent}}\approx130000\text{ Hz}$ with particular upgrades.
Again, though, you'd still have to deal with decoherence rates. But raising $r_{\text{ent}}$ by that much - nearly a factor of 20 - would make the quantum dot method potentially more feasible, in terms of efficiency $\eta$. You just have to deal with decoherence.
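Here's the catch made explicit, as a bit of illustrative arithmetic using the figures above (the variable names are mine):

```python
# Even at the speculated upgraded entanglement rate, the quantum dot
# method still loses unless decoherence improves too.

r_ent_upgraded = 130_000   # Hz, the speculated upgraded rate
r_dec_current = 1e7        # Hz, the current quantum dot decoherence rate

eta_upgraded = r_ent_upgraded / r_dec_current   # 0.013: still a net loss

# For eta > 1, the decoherence rate must drop below the entanglement rate,
# i.e. below 130,000 Hz - roughly a 77-fold improvement:
reduction_needed = r_dec_current / r_ent_upgraded
```

So the speculated upgrades buy you two orders of magnitude, but decoherence still has to fall by nearly two more before the quantum dot approach breaks even.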
Folks have pointed out that, compared to the number of particles we interact with on macroscopic scales, you could only entangle small numbers of particles within reasonable timescales. This is true, but it overlooks the fact that you probably don't need absurdly large numbers of entangled pairs. For example, quantum computers can perform pretty powerful computations with a few thousand qubits - and even dozens of qubits would allow for excellent performance on some tasks. So I suspect this isn't really an issue at all.