What are the practical limits of computation? Hans-Joachim Bremermann imagined a computer with the mass of the Earth, running for the entire age of the Earth. It could process no more than about 10^93 bits.
The so-called Bremermann’s limit is a maximum rate — roughly 10^50 bits per second per kilogram of computer — so multiplying by a mass and a running time gives you a total bit budget. I guess he chose the Earth as the example computer because anything larger seems especially implausible (or, alternatively, he’s a Douglas Adams fan).
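The back-of-the-envelope version of that calculation, as a sketch. The constants below are commonly quoted round figures (not Bremermann's original numbers), and with them you land within an order of magnitude of the 10^93 figure:

```python
import math

# Bremermann's limit: roughly 1.36e50 bits per second per kilogram.
# Mass and age of the Earth are approximate, commonly quoted values.
BREMERMANN_BITS_PER_S_PER_KG = 1.36e50
EARTH_MASS_KG = 5.97e24
EARTH_AGE_S = 4.54e9 * 365.25 * 24 * 3600  # ~4.54 billion years in seconds

# Total bit budget = rate per kg * mass * running time.
total_bits = BREMERMANN_BITS_PER_S_PER_KG * EARTH_MASS_KG * EARTH_AGE_S
print(f"~10^{math.log10(total_bits):.0f} bits")
```

Depending on which round values you plug in, the exponent comes out around 92–93 — the point is the order of magnitude, not the last digit.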
The limit is used to work out things like how easy it would be to crack an encryption key. Anything beyond this limit is considered to be a “transcomputational problem,” meaning that you can’t really brute-force solve it.
Here’s a good example from Wikipedia: the human retina has a million light-sensitive cells, give or take. If you wanted to use a computer to analyze what it sees, and treated each cell as a “bit,” you’d be facing 2^1,000,000 — more than 10^300,000 — possible combinations of inputs. So yeah, that’s not happening.
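You can check that combination count without ever building the million-digit number itself, just by working in logarithms (the one-million-cells figure is the same round number as above):

```python
import math

CELLS = 1_000_000  # rough count of light-sensitive retinal cells

# Each cell is treated as one on/off bit, so there are 2**CELLS
# possible input patterns. Computing the number directly would
# produce an integer with ~300,000 digits, so take log10 instead:
# log10(2**CELLS) = CELLS * log10(2).
digits = CELLS * math.log10(2)
print(f"2^{CELLS:,} is about 10^{digits:,.0f}")
```

Since log10(2) is about 0.301, a million bits gives roughly 10^301,030 patterns — comfortably past the transcomputational line.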
(Side note: if it’s beyond a computer, how do our brains process visual information? I assume that it’s because we don’t try to process every combination of “bits” from the light-sensitive cells. I went down the Wikipedia rabbit-hole on visual processing, and it’s still a bit too dense for me to make out. If you want to read about the functions of the five areas of the human visual cortex, be my guest. Then come back and explain it to me.)
I live in Auckland, New Zealand, and am curious about most things.