Size of the universe, zipped
In computer science, compression algorithms reduce data size by, from one perspective, minimizing the repetition within the data.
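As a toy illustration of what I mean (my own sketch, not part of the question itself), here is a minimal Python example using the standard `zlib` module. It shows that a repetition-based compressor shrinks highly repetitive data dramatically while leaving patternless random data almost untouched:

```python
import os
import zlib

SIZE = 1_000_000  # one million bytes of input

# Highly repetitive data: one short pattern repeated many times.
repetitive = b"abcd" * (SIZE // 4)

# Patternless data: uniformly random bytes, no repetition to exploit.
random_bytes = os.urandom(SIZE)

for name, data in [("repetitive", repetitive), ("random", random_bytes)]:
    compressed = zlib.compress(data, 9)  # level 9 = maximum compression
    ratio = len(compressed) / len(data)
    print(f"{name}: {len(data)} -> {len(compressed)} bytes "
          f"(relative size {ratio:.4f})")
```

The repetitive input collapses to a tiny fraction of its size; the random input stays at a relative size of roughly 1.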
Take each particle in the whole universe (to any precision you prefer), across the different fields, at a single point in time, and represent it as a vector in its field. If these vectors are put together to form one enormous file (whether a disk could actually hold such a file is irrelevant to this question), and that file is then compressed with a repetition-minimizing algorithm of the kind described above, how much smaller can it become?
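To make the thought experiment concrete, here is a hedged sketch of the setup (again my own construction; the particle count `N` and the three components per vector are arbitrary stand-ins, and the random values are a placeholder for real physical states). It packs simulated state vectors into one flat byte string and measures how much a general-purpose compressor can shrink it:

```python
import random
import struct
import zlib

N = 100_000  # stand-in for the (vastly larger) number of particles

# Simulated state vectors: three float64 components per "particle".
# Real physical states would replace this random stand-in data.
vectors = [(random.random(), random.random(), random.random())
           for _ in range(N)]

# Concatenate all vectors into one flat binary "file".
blob = b"".join(struct.pack("<3d", *v) for v in vectors)

compressed = zlib.compress(blob, 9)
print(f"original:   {len(blob)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"relative size: {len(compressed) / len(blob):.4f}")
```

With random stand-in data the relative size stays close to 1; any real gain would have to come from physical regularities in the actual state vectors, which is exactly what the question asks about.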
In other words, do the state vectors of particles across the different fields, at a discrete point in time, exhibit a pattern we can exploit, so that when processing data about the whole universe there is no need to actually compute things per particle?
Note: I'm referring to the relative compression size (the ratio of compressed size to original size), not the absolute number of bits it would consume. I'm asking about the average over the universe, so whether the universe is infinitely large is irrelevant.
This post was sourced from https://worldbuilding.stackexchange.com/q/78638. It is licensed under CC BY-SA 3.0.