This stuff is really fascinating to me. Will we ever hit a point in data storage where you can fit all the data about a volume of space into a smaller volume of data storage? I'm sure the Uncertainty Principle plays into it, but could it theoretically be done: storing 1 cubic meter's worth of data in 1 cubic micrometer of storage?
No, since you could fill the cubic meter with storage media itself. Then you'd need to record both the data on that media and the structure of the media, which is at least as much information as the cubic meter can hold.
> If file sizes could be specified accurate to the bit, for any file size N, there would be precisely 2^(N+1)-1 possible files of N bits or smaller. In order for a file of size X to be mapped to some smaller size Y, some file of size Y or smaller must be mapped to a file of size X or larger. The only way lossless compression can work is if some possible files can be identified as being more probable than others; in that scenario, the likely files will be shrunk and the unlikely ones will grow.
As a simple example, suppose one wishes to losslessly store a file in which the bits are random and independent, but instead of 50% of the bits being set, only 33% are. One could compress such a file by taking each pair of bits and writing "0" if both bits were clear, "10" if the first bit was set and the second was not, "110" if the second was set and the first was not, or "111" if both bits were set. Each pair of bits would then become one bit 44% of the time, two bits 22% of the time, and three bits 33% of the time. While some strings of data would grow, others would shrink; if the probability distribution was as expected, the pairs that shrank would outnumber those that grew: 4/9 of the two-bit pairs would shrink by a bit, 2/9 would stay the same size, and 3/9 would grow by a bit.
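To make that concrete, here's a quick Python sketch of the scheme described above (my own illustration; the names and the random test are mine, not the parent's):

    import random

    # The counting fact from the quote: there are 2^(N+1) - 1 bit strings
    # of length N or less, so no lossless scheme can shrink all of them.
    N = 3
    assert sum(2 ** n for n in range(N + 1)) == 2 ** (N + 1) - 1

    # The pair code from the example: a prefix-free code that gives the
    # common "00" pair the short codeword when only ~33% of bits are set.
    CODE = {(0, 0): "0", (1, 0): "10", (0, 1): "110", (1, 1): "111"}

    def compress(bits):
        """Encode an even-length sequence of 0/1 values pair by pair."""
        return "".join(CODE[(bits[i], bits[i + 1])]
                       for i in range(0, len(bits), 2))

    def decompress(encoded):
        """Invert compress(); the code is prefix-free, so this is unambiguous."""
        bits, i = [], 0
        while i < len(encoded):
            if encoded[i] == "0":
                bits += [0, 0]; i += 1
            elif encoded[i + 1] == "0":
                bits += [1, 0]; i += 2
            elif encoded[i + 2] == "0":
                bits += [0, 1]; i += 3
            else:
                bits += [1, 1]; i += 3
        return bits

    # With P(bit set) = 1/3: expected output per 2-bit pair is
    # 1*(4/9) + 2*(2/9) + 3*(3/9) = 17/9 bits, a ratio of 17/18.
    random.seed(0)
    data = [1 if random.random() < 1 / 3 else 0 for _ in range(100_000)]
    out = compress(data)
    assert decompress(out) == data
    print(len(out) / len(data))  # ~0.944

The printed ratio hovers around 17/18 ≈ 0.944, which matches the 44/22/33 breakdown above: likely inputs shrink, unlikely ones grow, and on average this particular distribution comes out ahead.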
You can't really, by definition. By the same logic you should be able to store a 1-cubic-meter data storage cube's contents in a 1-cubic-micrometer data storage cube, which in turn could be stored in a still smaller cube, ad infinitum.
Going by the same logic, you could store all the entropy of the universe in an infinitely small cube, which is not possible unless the universe had zero entropy.
Sort of, according to the holographic principle. Since there is a limit on information density in space, you should be able to reduce a 3-dimensional volume to a 2-dimensional digital data structure. https://en.wikipedia.org/wiki/Holographic_principle
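To put a number on that limit: as I understand the standard statement, the Bekenstein–Hawking result says the maximum entropy of a region scales with its boundary area measured in Planck units, roughly S_max = k_B * A / (4 * l_p^2), so the information you can pack into a volume grows with its surface area rather than its volume.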
Teensy nitpick: the universe technically didn't exist back then (at least not in the form of spacetime), I believe. Existence itself was confined to a singularity, which then expanded with tar -xzf singularity.tar.gz
It doesn't even matter in the first place whether it could store 'all' the information about a space, because it's physically impossible to measure all that data.
But when you talk about recreating a volume in a manner that is at least vaguely possible: you could store a record of every atom, at angstrom-level resolution, in much less space than the original occupies.
I am not a physicist, but isn't that already possible, depending on the contents of the volume? I can store all the data about a 1km^3 vacuum containing a single atom at the centre in a volume much smaller than 1km^3.