If our universe were a simulation, what could a bug look like?

Let's assume, without contradicting any of today's science, that the world is a simulation.

What would a bug look like?

I'm assuming that "the Eiffel Tower suddenly being bent at 45°" is rather unlikely, the same way you don't see a bunch of clowns appearing in the middle of a game of Need for Speed. So what is likely?

  • repetitions / déjà vu?
  • changes in gravity / speed of light?
  • ...

This post was sourced from https://worldbuilding.stackexchange.com/q/3136. It is licensed under CC BY-SA 3.0.

2 answers

If a universe is a simulation, then, logically, it must have all the natural laws built into it. Agreed? Now, if it is a deterministic universe - that is, a universe where, theoretically, you could predict its entire future if you knew everything about it at a certain point in time - these laws would be all that is needed to run the universe. It's sort of like The Game of Life - you input some data and let the thing go.
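To make that analogy concrete, here is a minimal sketch in C (my own illustration; the grid size, starting pattern, and printed quantity are arbitrary choices): every future state follows mechanically from the starting state and the update rule, with no randomness anywhere.

```c
#include <stdio.h>
#include <string.h>

#define N 8   /* tiny toroidal grid, purely for illustration */

/* One deterministic update step of Conway's Game of Life: the next
   state is completely fixed by the current state and the rule. */
static void step(int cur[N][N], int next[N][N]) {
    for (int y = 0; y < N; y++)
        for (int x = 0; x < N; x++) {
            int n = 0;  /* count the eight neighbours, wrapping at the edges */
            for (int dy = -1; dy <= 1; dy++)
                for (int dx = -1; dx <= 1; dx++)
                    if (dx || dy)
                        n += cur[(y + dy + N) % N][(x + dx + N) % N];
            next[y][x] = (n == 3) || (cur[y][x] && n == 2);
        }
}

int main(void) {
    int a[N][N] = {0}, b[N][N];
    /* "Input some data": a glider pattern. */
    a[1][2] = a[2][3] = a[3][1] = a[3][2] = a[3][3] = 1;

    /* "Let the thing go": run four generations. */
    for (int t = 0; t < 4; t++) {
        step(a, b);
        memcpy(a, b, sizeof a);
    }

    int alive = 0;
    for (int y = 0; y < N; y++)
        for (int x = 0; x < N; x++)
            alive += a[y][x];
    printf("live cells after 4 steps: %d\n", alive);  /* always 5, every run */
    return 0;
}
```

Run it twice and you get exactly the same answer; that is what "deterministic" buys you.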

Now, we live in a universe where quantum mechanics exists, and thus probability exists. This has given a lot of people a lot of headaches, because there are loads of events we can't predict. In other words, you would have a harder time programming in natural laws than you would in a deterministic universe, because you would have to determine some random variables. If a universe is a simulation, then there would have to be an algorithm running in the computer(s) controlling it that determines these random variables - which would not make them random at all.
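As a minimal sketch of that point (the constants and seed below are arbitrary choices of mine): a pseudorandom generator such as a linear congruential generator produces values that look random from inside the simulation, yet every one of them is completely determined by the seed.

```c
#include <stdio.h>
#include <stdint.h>

/* A linear congruential generator: each "random" draw is a fixed
   function of the previous state, so the whole sequence is determined
   by the seed. (Constants are the common Numerical Recipes choices.) */
static uint32_t state = 42u;   /* the seed; any value works */

static uint32_t next_random(void) {
    state = state * 1664525u + 1013904223u;
    return state;
}

int main(void) {
    /* Reseeding with the same value reproduces the exact same "randomness". */
    for (int i = 0; i < 5; i++)
        printf("%u\n", next_random());
    return 0;
}
```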

In a deterministic universe, it would be easy to see a glitch. In a certain spot at a certain time, some phenomenon would occur that violates at least one law of science. For example, perhaps a falling ball moves a few nanometers to one side when it shouldn't have. Given the complexity of a large enough simulation, this could happen quite a bit at small scales. Maybe a photon travels in a vacuum at a slightly slower or faster speed than it should have. Perhaps a new particle appears (or disappears) into (or out of) thin air. Any of these things could be a bug, and they would probably happen a lot, but they would be very minor. It would be rare for a large-scale bug (e.g. the Earth suddenly moving 10 million miles in one direction) to happen.

But we live in a universe where quantum mechanics rules on some scales, which gives us a very nice little loophole. If there was a bug, it could actually follow the rules of quantum mechanics. How? Well, the Heisenberg Uncertainty Principle says in part that conservation of energy can be violated on tiny scales for tiny amounts of time. So a particle suddenly appearing and disappearing could actually fit right in. There is a tiny probability in the universe that a lot of odd things could happen - quantum tunneling, for instance - that shouldn't. A bug could masquerade as any of these.

So it's fair to say that small bugs could happen that would merely appear to be quantum phenomena. We would write them off as products of uncertainty and chance, and they would go by without anybody thinking that they were bugs. And in a simulation, small bugs would probably be very likely.


I'm a bit bored, so I thought I might come up with a list of some of the bugs that might show up in the simulation. Taking some inspiration from the Wikipedia article on software bugs:

  • Infinite loop - I guess the equivalent here would include time travel and all the assorted issues that come with it. This could include time paradoxes, which give everyone headaches, or closed timelike curves, which also give people headaches. Both would involve odd problems with causality - that is, either one thing causes another thing which causes the first thing or one thing causes another thing that makes the first thing impossible. Savvy?
  • Division by zero - This would be guaranteed to annoy the runners of the simulation. It annoys the heck out of me when I accidentally do it with a pocket calculator; on a scale like this, it would be catastrophic. But what would a manifestation of division by zero look like? Well, a singularity, probably. If they try to simulate what happens at the exact center of a black hole... ouch. The computer wouldn't be able to handle it - just like asking a computer to figure out $f(0)$, where $f(x)=\frac{1}{x}$, if it wasn't pre-programmed to know that such a calculation always leads to an undefined quantity. (There is a small sketch after this list of what a real machine actually does with a division by zero.)
  • Incorrect code transfer - This isn't really a bug so much as an error on the part of one of the programmers, and it might not even turn out to cause a problem. Say I (one of the people working on the simulation) was assigned to transcribe the equations of what we, the simulated people, know as general relativity, to the final program. I would have to transfer the main equation,

    $$R_{ab}-\frac{1}{2}Rg_{ab}+\Lambda g_{ab}=\frac{8 \pi G}{c^4}T_{ab}$$

to the program. Now, I would also have to transfer some of the intermediate steps, such as calculating the Christoffel symbols. Let's say, though, that I didn't use Einstein summation notation but wrote everything out by hand. Let's also say that while I translated (in spherical coordinates) $$\Gamma_{abc}=\frac{1}{2}\left(\frac{\partial}{\partial x^c}g_{ab}+\frac{\partial}{\partial x^b}g_{ac}-\frac{\partial}{\partial x^a}g_{bc}\right)$$ correctly for $\Gamma_{ttt}$, $\Gamma_{ttr}$, $\Gamma_{tt\theta}$, and so forth, I made a mistake for the case of, say, $\Gamma_{rt\phi}$. This would mean that the computer would make weird calculations that it shouldn't, which could throw everything off. Now, the reason I said that this might not count as a bug is that this program is like The Game of Life: you write up the laws and click 'start'. So that error would simply become part of the physical laws in the simulation. It wouldn't make sense, but it would be a law nonetheless.
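As promised above, here is a small sketch of the division-by-zero point (illustrative only): floating-point hardware quietly turns the division into infinity or NaN, while integer division by zero is undefined in C and typically aborts the program; that is roughly the choice the simulation's designers would face at a singularity.

```c
#include <stdio.h>

int main(void) {
    /* IEEE-754 floating point does not crash on division by zero:
       it produces infinity (or NaN for 0/0) and carries on. */
    volatile double zero = 0.0;     /* volatile keeps the compiler from folding it away */
    printf("1.0/0.0 = %g\n", 1.0 / zero);   /* inf */
    printf("0.0/0.0 = %g\n", zero / zero);  /* nan */

    /* Integer division by zero, by contrast, is undefined behaviour in C
       and on most platforms aborts the program (SIGFPE), so it is guarded here. */
    volatile int izero = 0;
    if (izero != 0)
        printf("%d\n", 1 / izero);
    return 0;
}
```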

One quite nasty type of bug (which can be especially hard to find, and can give quite inconsistent results) is an out-of-range index, in a language which doesn't range-check indices (likely to be used in a simulation, because range checks cost valuable computing time and do nothing useful if your code is correct).

An out-of-range index ultimately means that values are read or written in a place where they should not have been read or written; this place may be completely unrelated to the place where the data is meant to go. Indeed, the infamous buffer overrun is a special case of out-of-range indices.

Inside the simulation, such out-of-range indices could, for example, manifest as strange influences between completely unrelated events (because the out-of-range read reads data belonging to the other event, or the out-of-range write alters data belonging to the other event). Such influences could violate otherwise strict laws (for example, they could easily result in faster-than-light effects if the erroneously accessed memory belongs to a far-away event; after all, far away in spacetime doesn't need to mean far away in computer memory).
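As a rough illustration of how an unchecked index couples two "unrelated" pieces of state (a deliberately simplified sketch; the struct, field names, and values are made up, and the out-of-bounds write is undefined behaviour that merely tends to behave this way with a typical memory layout):

```c
#include <stdio.h>

/* Two pieces of simulation state that are unrelated in-simulation but
   happen to be neighbours in memory (hypothetical layout). */
struct sim_state {
    double particle_position[4];   /* event A */
    double speed_of_light;         /* event B, "far away" in the simulation */
};

int main(void) {
    struct sim_state s = { {0.0, 0.0, 0.0, 0.0}, 299792458.0 };

    /* Off-by-one bug: index 4 is one past the end of the array.  With no
       range check, the store lands on whatever sits next in memory, in
       this layout the completely unrelated speed_of_light.  (Undefined
       behaviour in C; shown only to illustrate the mechanism.) */
    volatile int i = 4;
    s.particle_position[i] = 1.0;

    printf("speed of light is now %f\n", s.speed_of_light);
    return 0;
}
```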

Similar effects could be caused by reads of uninitialized variables which happen to contain unrelated data belonging to a different point in space.

Finally, while not really a bug, bit flips in memory (caused e.g. by real, not simulated, cosmic-ray particles crossing the memory chip and altering the charge of a memory cell) might also cause quite interesting effects in the simulation. Such events would be rare (though if the simulation runs quite slowly and the computer uses non-ECC memory, they might not be that rare when measured in simulated time). Since bit flips can also cause rather large differences in values, this would give random events that may well be measurable in-simulation (but of course would not be predictable; after all, they are not even predictable in the "outer" world).
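To get a feel for the size of such an event, here is a small sketch (illustrative; which bit flips, and in which variable, is of course arbitrary): flipping a low mantissa bit of a double barely changes it, while flipping an exponent bit changes it beyond recognition.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Flip one bit in the stored representation of a double, the way a
   cosmic-ray hit on non-ECC memory might. */
static double flip_bit(double v, int bit) {
    uint64_t raw;
    memcpy(&raw, &v, sizeof raw);   /* reinterpret the bytes safely */
    raw ^= (uint64_t)1 << bit;
    memcpy(&v, &raw, sizeof v);
    return v;
}

int main(void) {
    double mass = 1.0;
    /* A low mantissa bit: the value changes by about one part in 10^16. */
    printf("bit 0 flipped:  %.17g\n", flip_bit(mass, 0));
    /* A high exponent bit: the value is changed beyond recognition
       (for 1.0 this particular flip even yields infinity). */
    printf("bit 62 flipped: %.17g\n", flip_bit(mass, 62));
    return 0;
}
```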
