TED Conversations

The goal: to revolutionize both math and physics through the grassroots popularization of a new quantitative tool, which is introduced below.

We start with the Law of the Excluded Middle. Consider its complement and call this new law the Law of the Exclusive Middle. Let these two laws be equivalent. You now have something similar to Fuzzy Math, but it is very different: here the two laws are connected by equivalence, which is what sets this apart from Fuzzy Math.

Next consider Descartes' "I think, therefore I am". We reverse-engineer this into a statement that reflects the foundation above, deriving "Maybe I think, therefore maybe I am". We keep both of these statements and set them as equivalent.

We then proceed to all of the standard tools of mathematics used for the analytic quantification of magnitudes. In mathematics, things are said to exist; in our new system, things are regarded as "maybe existing". We keep both sets of tools and regard them as equivalent. One we will call Mathematics, and the other should be called something like Conjectural Modeling, to reflect that it is based entirely on absolute indeterminacy.

We now have a quantitative tool that is split down the middle, essentially a kind of mirror image. On one side, absolute determinacy; on the other, absolute indeterminacy. Both sides are held together by equivalence.

This gives us a tool capable of addressing both the equivalence inherent in relativity and the indeterminacy inherent in Quantum Mechanics.

We can write correct and accurate quantitative models using either system. In fact, for every possible question there should be two solutions: one based on determinacy, the other based on randomness. These two answers are equivalent. As an example, knowing with absolute certainty that I have 10 dollars is quantitatively identical to not knowing, but "expecting", that I have 10 dollars, where 10 is an expected value instead of a value known with absolute certainty.
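As a rough illustration of that claim (my own sketch, using a made-up gamble of 5 or 15 dollars chosen only so that its expected value is 10): the certain amount and the expected amount of the uncertain gamble are the same number.

```python
import random

# Deterministic description: I know with absolute certainty that I have 10 dollars.
certain_dollars = 10

# Stochastic description: I do not know the amount, but I "expect" 10 dollars.
# The distribution (5 or 15 with equal probability) is hypothetical, picked
# only so that its expected value works out to 10.
outcomes = [5, 15]
expected_dollars = sum(outcomes) / len(outcomes)  # (5 + 15) / 2 = 10.0

# Monte Carlo check: the long-run average of the uncertain amount
# approaches the certain amount.
samples = [random.choice(outcomes) for _ in range(100_000)]
average = sum(samples) / len(samples)

print(certain_dollars)    # 10
print(expected_dollars)   # 10.0
print(round(average, 1))  # approximately 10.0
```

The two descriptions differ qualitatively (one is certain, one is not), but the magnitude they assign is identical.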

I have many examples and a lot of math to reinforce these views. I am convinced that this solution is extremely important.

  • Sep 29 2012: Is this different from the distinction between deterministic and stochastic models?
    • Sep 29 2012: Deterministic and stochastic models are different. They will always be different. But this difference is qualitative.

      The reason we can make these equivalent is that, quantitatively, they are the same. In other words, whether you have a stochastic model or a deterministic model, you get the same numbers; you always get the same answers. Every possible problem gets two answers: one is certain, and the other is uncertain, an expected value. If we say that these are equivalent, then we have a kind of duality in everything that we are doing.
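      As a sketch of what "the same numbers" could mean here (my own example, with a made-up growth process): a deterministic model and a stochastic model with the same average behavior agree on the expected answer.

```python
import random

# Deterministic model: a quantity grows by exactly 5% per step.
def deterministic(x0, steps):
    x = x0
    for _ in range(steps):
        x *= 1.05
    return x

# Stochastic model: growth per step is random (hypothetically +0% or +10%
# with equal probability), but it averages 5% per step.
def stochastic(x0, steps):
    x = x0
    for _ in range(steps):
        x *= random.choice([1.00, 1.10])
    return x

steps = 10
exact = deterministic(100.0, steps)

# Averaging many stochastic runs recovers the deterministic answer.
runs = [stochastic(100.0, steps) for _ in range(200_000)]
mean = sum(runs) / len(runs)

print(round(exact, 2))  # 162.89
print(round(mean, 2))   # close to 162.89
```

      Individual stochastic runs wander away from the deterministic answer; it is the expected value that matches it.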

      You can probably see that this would give a fresh approach to resolving paradoxes regarding the wave-particle duality of light. It also provides a framework for an easy-to-understand justification for the occurrence of which-way information in QM. There are many benefits to this approach; this is just the tip of the iceberg. My purpose here is to popularize this view, to share it with as many people as possible, because I am confident that it is correct.
      • Sep 29 2012: So you are proposing modeling any problem as if it could be either a deterministic or a stochastic process, or both, rather than sorting phenomena into deterministic and stochastic and using the sorts of models that fit best for each?
        • Sep 29 2012: That is precisely correct. Every problem that is solvable using mathematics or logic should have two solutions: one deterministic, and the other essentially uncertain or stochastic.

          For example, take the number 5. It can be regarded as a known value, known with an implied (but unstated) inherent certainty. We could, however, regard this quantity as an expected value, in which case it is precisely the same magnitude, but its "qualitative" properties have changed: it is now inherently uncertain.
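          To make that concrete (a sketch in standard probability notation, not taken from the discussion itself): treating the known 5 as a random variable X whose entire probability mass sits at 5 changes nothing quantitative.

```latex
\[
  P(X = 5) = 1
  \quad\Longrightarrow\quad
  \mathrm{E}[X] = 5 \cdot 1 = 5 ,
  \qquad
  \mathrm{Var}(X) = \mathrm{E}[X^2] - \mathrm{E}[X]^2 = 25 - 25 = 0 .
\]
```

          The magnitude is still 5; only the interpretation has changed, from "known" to "expected".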

          The magnitudes 5 (known) and 5 (expected) are identical, but their qualities are different. For an empiricist, this means there must be two equivalent models of the entire universe: one deterministic, the other essentially uncertain or stochastic. Both models would produce exactly the same numbers. One universe is deterministic and the other is stochastic, yet they must be equivalent. The connections to Relativity and QM are obvious.

          This should also be true for the entirety of mathematics. I have come close to a proof of the general case for "all of mathematics", but that proof is not complete, and I don't want this line of research to be forgotten. That is why I engage in online debate.
      • Sep 29 2012: So one might conceptualize this as all stochastic, with the practice of always taking as one case the most likely underlying error distribution, and as the second case the degenerate case (the deterministic version).
        • Sep 29 2012: In my view neither case is degenerate. You may have the emergence of order from a disordered system, or the emergence of disorder from one that is deterministic, but neither case is really degenerate. Together they form a duality, and the duality is held together by the assumption of equivalence. In fact we can come very close to proving the equivalence of these two structures, but I am more comfortable simply saying that the duality forms a consistent system.

          By embracing a quantitative tool that has duality built into it, we can look at wave-particle duality with this new tool and understand it in a whole new way, perhaps for the first time.

          I want to apply this work to Bell's Inequality and the various works of Alain Aspect and others. That is my goal: either to do it myself, or to provide a tool for others to follow. I want to create a new tradition within science that embraces the duality of random and nonrandom, acknowledges that they are equivalent, and proceeds from there.
