Hacker News

There is enough information if you assume reality is continuous. Pick a point A to be the origin. Then you can encode the number N by placing something at 1/N meters from the origin.
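Purely as math (ignoring the physics objections raised below), the scheme the parent describes can be sketched with exact rationals:

```python
from fractions import Fraction

def encode(n: int) -> Fraction:
    """Encode a positive integer n as a distance of 1/n meters from the origin."""
    return Fraction(1, n)

def decode(position: Fraction) -> int:
    """Recover n from the exact position."""
    return int(1 / position)

# Any integer, however large, maps to a distinct point in (0, 1]:
assert decode(encode(10**100)) == 10**100
```

The whole trick relies on the position being an exact rational, which is exactly what the replies attack.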


No, because nothing can have a position defined that precisely.


You can - you merely need enough energy to pin down its position that precisely (per the Heisenberg uncertainty principle)!

Unless...confining that much energy to so small a region puts it inside its own Schwarzschild radius, i.e. it collapses into a black hole...
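Working out where those two constraints cross gives the Planck length. An order-of-magnitude sketch (CODATA constants; factors of 2 dropped, so this is a scale estimate, not a derivation):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Localizing to dx needs momentum dp ~ hbar/dx, i.e. energy E ~ hbar*c/dx.
# That energy has a Schwarzschild radius r_s ~ G*E/c^4 ~ G*hbar/(c^3 * dx).
# Setting r_s ~ dx and solving gives dx ~ sqrt(hbar*G/c^3), the Planck length:
planck_length = math.sqrt(hbar * G / c**3)
print(f"{planck_length:.3e} m")  # 1.616e-35 m
```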


So you can’t.


You can't halve a Planck length, so you're limited to ~1.6×10^−35 m.
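If you take that resolution limit at face value, the 1/N scheme caps out fast: adjacent codes 1/N and 1/(N+1) sit only ~1/N² meters apart, so they blur together once 1/N² drops below the Planck length. A back-of-envelope check (my arithmetic, not from the thread):

```python
import math

planck_length = 1.616e-35  # meters

# Adjacent codes 1/N and 1/(N+1) are 1/(N*(N+1)) ~ 1/N^2 meters apart.
# They become indistinguishable once that gap shrinks to the Planck length,
# i.e. at N ~ sqrt(1 / planck_length):
n_max = math.isqrt(round(1 / planck_length))
bits = math.log2(n_max)
print(f"N_max ~ {n_max:.2e}, about {bits:.0f} bits")
```

So even granting perfect placement, a one-meter baseline stores only ~58 bits this way, far less than the "any number" the scheme promises.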


I think current theories break down at less than a Planck length, but they are not constrained to integer multiples of it.


Relativistic speeds contract the length of any object as measured by an outside observer. If an object the size of 1 Planck length travels fast enough, you won't be able to measure it, as from your position it would be smaller than the Planck length as it passes by.

It's not impossible (afaik) for things to be smaller than the Planck length. We just don't have the ability (maybe ever) to measure something smaller than this limit.

Now, good luck finding something the size of 1 Planck length, and good luck accelerating it to relativistic speeds.
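Length contraction is just L = L₀·√(1 − v²/c²), so any nonzero speed pushes a Planck-sized object below the Planck length in the observer's frame. A hypothetical numeric check:

```python
import math

c = 2.99792458e8           # speed of light, m/s
planck_length = 1.616e-35  # meters

def contracted(rest_length: float, v: float) -> float:
    """Length measured by an observer the object passes at speed v."""
    return rest_length * math.sqrt(1 - (v / c) ** 2)

# At 0.6c the measured length is 0.8 * rest length, already below Planck scale:
assert contracted(planck_length, 0.6 * c) < planck_length
print(contracted(planck_length, 0.6 * c))
```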


By definition you can if you accept it's continuous.


No, because space might be continuous, but that doesn't mean the uncertainty principle and the Planck limit disappear.


The Compton wavelength will probably cause trouble for the storage scheme long before gravity becomes a problem.
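A quick comparison backs this up (standard formula λ_C = h/(mc); using the electron as the marker particle is my choice, not the commenter's):

```python
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
m_e = 9.1093837015e-31  # electron mass, kg

compton = h / (m_e * c)    # electron Compton wavelength, ~2.43e-12 m
planck_length = 1.616e-35  # meters

# The electron's Compton wavelength sits ~23 orders of magnitude above the
# Planck length, so quantum delocalization bites long before gravity does:
print(f"ratio: {compton / planck_length:.1e}")
```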


Those are only relevant for decoding it.


Layman question, but can space be quantized (not sure what the proper term is)? Like, is there a finite number of positions for a particle between two points?


Frustratingly, attempts to discretize space invariably run into problems with relativity, since they effectively impose a preferred frame of reference. I.e. you can impose a minimum distance, but relativistic length contraction means that observers measure different minima and in different directions.

Apparently, under some of these models, the speed of light ends up depending on wavelength, which makes them amenable to empirical tests. My understanding is that these discrete-space models have failed to line up with experiment, at least within the limits of measurement.


That's currently unknown. For all current practical purposes, kind of: the Planck length sets a limit on the spatial resolution of any information, so a finite region with (universally) bounded entropy per conceivable bucket at that scale still has finite entropic capacity.
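As a toy illustration of that finite capacity (a naive volume count of Planck-scale cells; the actual Bekenstein/holographic bound scales with area rather than volume, but either way it's finite):

```python
import math

planck_length = 1.616e-35  # meters
L = 1.0                    # side of a 1-meter cube, chosen for illustration

# Count Planck-scale cells in the volume; the log gives the address space:
buckets = (L / planck_length) ** 3
bits = math.log2(buckets)
print(f"~{buckets:.1e} cells, log2 ~ {bits:.0f} bits of address space")
```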


The current theories use continuous space and time. However, we can't encode information into space itself. We would have to use some configuration of matter, and then there are limits to how well-defined a particle's position can be coming from the uncertainty principle.

On the other hand, general relativity implies that if you put enough matter in a small enough space it becomes a black hole, and then we can't access the information in it.

IANA physicist but I think this line of thought is probably speculative at the moment because it involves both general relativity and quantum mechanics and it isn't known how they should work together.



