-2

This was a third-beer idea, so please bear with me. What if the universe is not actually expanding, but the speed of light is slowing? Wouldn't that be indistinguishable in our observations? Either way, we should see an increase in redshift the farther a light source is from us, right?

Suppose we're in a giant, extremely advanced compute cluster, and the speed of light has to do with how long it takes data to travel between compute resources on the network/bus. As more data travels across this bus (entropy increases in the universe), it takes longer to reach its destination (the speed of light slows). This seems vaguely reasonable considering fundamental limitations, past and present, on computers of varying types: there's always a trade-off between the size of data and the speed of access that has its roots in physics.

So I know it's a bit wacky, but it sounds easily falsifiable to me: simply plot entropy over time and dark energy over time and see how well they correlate (or, most likely, don't).
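Here's a rough sketch of that correlation test in Python. The entropy and dark-energy curves are made-up placeholders (assumptions, not data), just to show the shape of the calculation; real values would have to come from cosmological models or published estimates.

    import numpy as np

    # Hypothetical cosmic-time grid in Gyr (placeholder, not data).
    t = np.linspace(1.0, 13.8, 100)

    # Placeholder curves: entropy rising and saturating, dark-energy
    # density nearly constant with a tiny made-up drift so that the
    # correlation coefficient is well defined. Swap in real estimates.
    entropy = 1.0 - np.exp(-t / 5.0)
    dark_energy = 1.0 + 0.001 * t

    # Pearson correlation between the two time series.
    r = np.corrcoef(entropy, dark_energy)[0, 1]
    print(f"Pearson r = {r:.3f}")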

Can anyone help me test my armchair theory? I think that would be a fun exercise. I know dark energy has been plotted over time, but what about entropy? Is that possible?

Qmechanic
  • 201,751
Eloff
  • 226
  • "As more data travels across this bus it takes longer to reach its destination" I'm not too sure about this statement. Maybe this is just a quibble, but data will travel at the same speed regardless of how much there is. It just takes longer for the full data packet to be transmitted. – rurouniwallace May 04 '13 at 17:08
  • Related: http://physics.stackexchange.com/q/62146/2451 , http://physics.stackexchange.com/q/2110/2451 and links therein. – Qmechanic May 04 '13 at 17:18
  • This seems related to "tired light" theories, first proposed by Zwicky back in 1929. Wiki tells me that these theories are on the "fringes of astrophysics", because they don't agree with experimental data. – innisfree May 04 '13 at 17:41
  • @ZettaSuro Well yes, excluding interference like packet collisions, it would travel at the same speed still, but my badly stated point is that it would be processed slower as the load increased. Latency of a packet can be defined as travel time + processing time at both ends. More data to process = longer processing time = larger latency. – Eloff May 04 '13 at 17:46
  • @innisfree well that would answer the question, if you could say why "tired light" doesn't agree with the evidence. – Eloff May 04 '13 at 17:47
  • It doesn't seem to resemble past or present computers, for why would internal simulation-time (ours) be synchronized with external 'meta'-time (server)? If it isn't, there would be no internally observable effects due to increasing amount of data on the network/bus, just a slowdown for whoever is running the server. – Stan Liou May 05 '13 at 11:12
  • @StanLiou yeah I thought about that too the other day, most likely you'd be right and from inside the simulation we wouldn't notice the slow-down. – Eloff May 06 '13 at 22:33
  • The idea that light gets slower because of latency in a grid we live in doesn't make any sense. Latency is ultimately caused by the speed of light, thus the reasoning is completely circular. – Sklivvz Jun 22 '13 at 18:17
  • This question has been bothering me since it was posted, has been flagged as "not a real question", and been looked at by all the mods. I've opted to treat it as not relating to mainstream thought in the physics community and not being sufficiently well developed to be testable. – dmckee --- ex-moderator kitten Jun 25 '13 at 15:47

1 Answer

0

How do we know the universe is expanding and the speed of light isn't slowing instead? (Thanks to innisfree for the idea of where to look.) From Wikipedia:

By the 1990s and on into the twenty-first century, a number of falsifying observations have shown that "tired light" hypotheses are not viable explanations for cosmological redshifts.[2] For example, in a static universe with tired light mechanisms, the surface brightness of stars and galaxies should be constant, that is, the farther an object is, the less light we receive, but its apparent area diminishes as well, so the light received divided by the apparent area should be constant. In an expanding universe, the surface brightness diminishes with distance. As the observed object recedes, photons are emitted at a reduced rate because each photon has to travel a distance that is a little longer than the previous one, while its energy is reduced a little because of increasing redshift at a larger distance. On the other hand, in an expanding universe, the object appears to be larger than it really is, because it was closer to us when the photons started their travel. This causes a difference in surface brilliance of objects between a static and an expanding Universe. This is known as the Tolman surface brightness test that in those studies favors the expanding universe hypothesis and rules out static tired light models.
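To put that in rough quantitative terms (my summary, not part of the quote): for bolometric surface brightness $B$ at redshift $z$, an expanding universe predicts $B \propto (1+z)^{-4}$, while a static universe with a tired-light mechanism predicts $B$ to stay nearly constant, dimming only by the single factor $(1+z)^{-1}$ from the photons' energy loss. Observations of distant galaxies follow the steep expanding-universe scaling rather than the shallow tired-light one, which is why the test rules out the slowing-light idea in my question.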

Eloff
  • 226