This was a 3rd-beer idea, so please bear with me. What if the universe isn't actually expanding, but the speed of light is slowing? Wouldn't that be indistinguishable in our observations? Either way, we should see redshift increase the further a light source is from us, right?
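For concreteness, here's the degeneracy I'm imagining (my own naive framing, not a worked-out variable-c model). In an expanding universe the redshift is set by the scale factor:

$$1 + z = \frac{a(t_\text{obs})}{a(t_\text{emit})}$$

In a static universe where only $c$ changes, if you naively freeze each photon's wavelength in transit and measure its frequency $\nu = c/\lambda$ against local clocks, you'd get

$$1 + z = \frac{\nu_\text{emit}}{\nu_\text{obs}} = \frac{c(t_\text{emit})}{c(t_\text{obs})}$$

so $c$ would need to have been larger in the past by exactly the expansion factor for the two pictures to look the same.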
Suppose we're in a giant, extremely advanced compute cluster, and the speed of light has to do with how long it takes data to travel between compute resources on the network/bus. As more data travels across this bus (entropy increases in the universe), it takes longer to reach its destination (the speed of light slows). This seems vaguely reasonable considering fundamental limitations, past and present, on computers of all kinds: there's always a trade-off between the size of data and the speed of access, and that trade-off has its roots in physics.
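To make the analogy concrete, here's a throwaway toy (every number is invented, nothing physical about it): effective signal speed is distance divided by latency, and latency grows with the amount of data in flight.

```python
# Toy "busy bus" model: effective signal speed falls as in-flight data grows.
# All numbers are invented; this illustrates the analogy, not actual physics.
base_latency = 1.0   # time to cross the link when idle (arbitrary units)
distance = 1.0       # link length (arbitrary units)

for in_flight in [0, 10, 100, 1000]:                  # stand-in for "entropy"
    latency = base_latency * (1 + 0.01 * in_flight)   # contention adds delay
    c_eff = distance / latency                        # effective "speed of light"
    print(f"data in flight {in_flight:>5}: effective speed {c_eff:.4f}")
```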
So I know it's a bit wacky, but it sounds easily falsifiable to me: simply plot entropy over time and dark energy over time, and see how well they correlate (or, most likely, don't).
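Here's a minimal sketch of what I mean, with placeholder curves standing in for real data (the entropy curve, the Omega values, and the matter-era scaling are all toys; real inputs would be published estimates):

```python
import numpy as np

# Placeholder curves -- every number below is illustrative, not real data.
t = np.linspace(1.0, 13.8, 100)    # cosmic time in Gyr

# Entropy proxy: rises and saturates (a real curve would come from
# published budgets, e.g. Egan & Lineweaver 2010).
S = 1.0 - np.exp(-t / 4.0)

# Dark-energy proxy: Lambda's share of the energy budget, using the rough
# matter-era scaling a ~ t^(2/3), so matter density ~ a^-3 ~ (t0/t)^2.
omega_m0, omega_l0, t0 = 0.3, 0.7, 13.8
omega_l = omega_l0 / (omega_l0 + omega_m0 * (t0 / t) ** 2)

r = np.corrcoef(S, omega_l)[0, 1]
print(f"Pearson r = {r:.3f}")
```

One caveat I'll flag on my own idea: any two monotonically increasing series correlate strongly, so a high Pearson r alone wouldn't mean much. Comparing the actual shapes of the two curves would matter more than the raw coefficient.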
Can anyone help me test my armchair theory? I think that would be a fun exercise. I know dark energy has been plotted over time, but what about entropy? Is that even possible?
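From what I can find, yes: Egan & Lineweaver (2010) estimated the entropy budget of the observable universe (roughly 10^104 k_B), and it's dominated by supermassive black holes via the Bekenstein-Hawking formula. A minimal sketch of that dominant term (the 10^9 solar-mass example is just illustrative):

```python
import math

# Physical constants (SI)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m s^-1
M_sun = 1.989e30   # solar mass, kg

def bh_entropy(mass_kg):
    """Bekenstein-Hawking entropy S = 4*pi*G*M^2 / (hbar*c), in units of k_B."""
    return 4 * math.pi * G * mass_kg**2 / (hbar * c)

# A 10^9 solar-mass black hole, a typical large SMBH: ~1e95 k_B
print(f"{bh_entropy(1e9 * M_sun):.2e} k_B")
```

A single big SMBH already dwarfs everything else (photons, neutrinos, stars), so a crude entropy-vs-time curve might just be the black hole mass function integrated over cosmic history.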