TED Conversations

Matt Hintzke

Student, Coffman Engineers, Inc

This conversation is closed.

Refuting a quantum mechanics theory

There is a fairly popular theory, first developed in the 1950s I believe, which states that the universe we are all accustomed to is only one of an infinite number of parallel universes, and that because of the concept of locality and the fact that, due to quantum mechanics, all particles (and essentially objects) can be in two or more places at the same time, these "other places" are actually other universes. This would mean there are an infinite number of you and me doing all different things at the same time.
However, simple cause-and-effect logic suggests that such a thing is impossible. Every action (or effect) that happens in the universe is governed by a cause. Essentially, I believe that all actions by myself, other people, animals, and inanimate objects can be traced back to the Big Bang itself. If all the constants of math and physics have fixed values (things like the gravitational constant), then everything, including brain function, can be completely determined by a previous cause. Every process is shaped by the environment around it, whether physically, emotionally, psychologically, or habitually, and because of this, it appears that there is only one way space-time can unfold, through an infinite chain of causes and effects.

Overall, what I am saying is that it seems logical to conclude that if we could re-enact the Big Bang with 100% precision, that universe's history would be identical to ours in every single way.

What do you think about this theory?

An example I thought of was this:
Are there any scientific experiments that truly give randomized results given very precise initial conditions? If you performed an experiment 1,000,000 times with every initial condition exactly the same, should you not get the same result every single time? The same reasoning can be applied to the Big Bang's initial conditions.
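The repeated-experiment idea can be put into code: a purely deterministic update rule, started from identical initial conditions, produces the identical outcome on every run. This is only an illustrative sketch of the argument; the toy system and names below are mine, not from the original post.

```python
# A minimal sketch of the repeated-experiment thought experiment:
# a deterministic rule with fixed "constants of physics" and a fixed
# initial condition yields the same result on every repetition.
# (The update rule here is an arbitrary illustrative choice.)

def run_experiment(x0: float, steps: int = 100) -> float:
    """Evolve a toy deterministic system from initial condition x0."""
    x = x0
    for _ in range(steps):
        x = 3.7 * x * (1.0 - x)  # fixed rule, no randomness anywhere
    return x

# Repeat the "experiment" many times with exactly the same initial condition.
results = {run_experiment(0.5) for _ in range(10_000)}
print(len(results))  # a single distinct outcome across all repetitions
```

If any genuine randomness entered the update rule, the set would contain more than one outcome; with none, repetition can never disagree with itself.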


Showing single comment thread. View the full conversation.

  • Jun 19 2011: Dear Matt,
If we re-enacted the big bang with 100% precision (having overcome all the difficulties), there would still be one catch: one initial condition would be different. It would not occur at the same time t=0 as the original, and the associated history (of an observer recreating the big bang) would change the equations. Hence recreating the big bang with the exact initial conditions would not be possible (even if we travelled back in time to t=0 to recreate it in a different but identical dimension). Since recreating the initial conditions seems impossible, replicating the exact same result would elude us. The difference, I think, between this and the repeatability of an ordinary experiment lies in the R factor. An R of 95% is good enough to show that, say, two objects dropped from the same height fall together, the rest being attributed to shape, wind, etc. But to reproduce the exact same high-entropy state of the universe from the low-entropy state would require an R of 100% and nothing less, and t>0 would introduce a variance factor (however infinitesimally small) which progresses geometrically with time.
    Therefore, primarily, the law of cause and effect stays intact, but the effort to recreate an event does not.
    Extending this case beyond t=0 to now: I propose that even if you could achieve R=100% for everything that happens, it should at first follow that the universe must be deterministic and its future predictable. However, the prediction must take into its calculation pre-knowledge of the first iterative future generated by the model. Thus the calculation for the second iteration would be influenced by the first (which would in most cases now be incorrect). If the second iteration should throw up a future x, it would still risk a variance (caused by the pre-knowledge consideration, at every second, by every sentient being possessing the model).
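The "variance factor (however infinitesimally small) which progresses geometrically with time" is exactly the behaviour of a chaotic system: a standard example is the logistic map, where two initial conditions differing by a vanishingly small amount diverge until the difference is of order one. This sketch is my illustration of that point, not something from the comment above.

```python
# Sketch of the "variance factor which progresses geometrically with time":
# in a chaotic system (the logistic map with r = 3.9), two trajectories
# whose initial conditions differ by only 1e-12 separate roughly
# exponentially, so even an infinitesimal mismatch in the re-enacted
# initial conditions eventually dominates the outcome.

def logistic_trajectory(x0: float, steps: int) -> float:
    """Iterate the logistic map x -> r*x*(1-x) from x0."""
    x = x0
    for _ in range(steps):
        x = 3.9 * x * (1.0 - x)
    return x

eps = 1e-12
a = logistic_trajectory(0.4, 60)
b = logistic_trajectory(0.4 + eps, 60)
print(abs(a - b))  # vastly larger than the initial difference of 1e-12
```

This also illustrates why an R of 95% suffices for a falling-object experiment but not for re-running a universe: in a chaotic system, any R short of exactly 100% leaves a residual difference that compounds with every step.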
