TED Conversations


To revolutionize both math and physics by the grassroots popularization of a new quantitative tool which is introduced below.

We start with the Law of the Excluded Middle. Consider its complement and call this new law the Law of the Exclusive Middle. Let these two laws be equivalent. You now have something similar to Fuzzy Math, but it is very different: here the two laws are connected by equivalence, which Fuzzy Math does not do.

Next consider Descartes' "I think therefore I am". We reverse engineer this into a statement which reflects our foundation above, deriving "Maybe I think, therefore maybe I am". We keep both of these statements and set them as equivalent.

We then proceed to all of the standard tools of mathematics which are used for the analytic quantification of magnitudes. In math, things are said to exist. In our new system things are regarded as "maybe existing". We keep both of these tools and regard them as being equivalent. We will call one of them Mathematics, and the other should be called something like Conjectural Modeling to reflect that it is based entirely on absolute indeterminacy.

We now have a quantitative tool which is split down the middle, essentially a kind of mirror image. On one side, absolute determinacy. On the other side, absolute indeterminacy. Both sides held together by equivalence.

We now have a tool which is capable of addressing both the equivalence inherent to relativity, and the indeterminacy which is inherent to Quantum Mechanics.

We can write correct and accurate quantitative models using either system. In fact, for every possible question there should be two solutions: one based on determinacy, the other based on randomness. These two answers are equivalent. As an example, knowing with absolute certainty that I have 10 dollars is quantitatively identical to not knowing, but "expecting", that I have 10 dollars, where 10 is an expected value instead of a value known with absolute certainty.
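The dollar example can be sketched in a few lines of code. This is an illustrative sketch, not a method from the post: a "known" 10 and an "expected" 10, drawn from a distribution that puts all of its probability on 10, produce the same number.

```python
import random

# Illustrative sketch: a "known" value versus an "expected" value. A
# degenerate random variable that equals 10 with probability 1 has
# expected value exactly 10, so the two descriptions agree quantitatively.
def sample_wallet():
    """Draw from a distribution that puts all of its mass on 10."""
    return random.choices([10], weights=[1.0])[0]

known_value = 10                      # certain: I know I have 10 dollars
samples = [sample_wallet() for _ in range(1000)]
expected_value = sum(samples) / len(samples)  # uncertain: an expectation

print(known_value == expected_value)  # True: the magnitudes coincide
```

The two descriptions differ only in whether the 10 is asserted with certainty or as an expectation.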

I have many examples and a lot of math to reinforce these views. I am convinced that this solution is extremely important.

  • Sep 30 2012: Looks like Gödel used modal logic to make some proofs of things.

    But yeah, it seems like a tool for studying logic through semantics. I'm quite shocked to find the use of equivalence here, instead of in so many other places where it should be used, for example in math and physics.

    If we say that the excluded middle is equivalent to the exclusive middle, it's easy to derive that random and non-random are equivalent. This equivalence would be present in the mathematical models, and would also be assumed to apply to physical reality. I have many good examples to support this, and the more I search for counterexamples, the more compelling reasons I find to support it.

    To say that "random and nonrandom are equivalent" is a statement which calls for a lot of justification. I have that justification in the form of highly credible examples and some very solid mathematics. It is all really very simple, nothing complex whatsoever. A child could understand my argument.

    And, it almost sounds absurd to say that random and nonrandom are equivalent. But the numbers don't lie. Think of how absurd it seems to say that different frames of reference are equivalent. Yet we know that Relativity is indeed correct.

    If you start with equivalence at LEM, Law of Identity and Law of Noncontradiction, then all of these results follow and I believe that Relativity itself can be explained in terms of these first principles. So, if true, this would be a very big deal. I simply cannot do it alone. I need to share the idea wherever possible and elicit as much high quality criticism as possible, and if it is wrong then so be it, but if it's right then mankind gets a great big birthday gift.
    • Sep 30 2012: I'm not quite following you but perhaps other people can.
      I will remain sceptical for now.

      On the topic of random numbers being not random... perhaps read into deterministic automata (theory behind computers). It basically shows that 'random' doesn't exist within computers.
      This doesn't really claim anything about the real world, though, unless you believe that people do not have free will etc. (aka everything in the universe is deterministic, aka everything was already determined at the big bang or before... well, whenever it started, which could be infinitely long ago).

      But I see that someone else has already said things about this...
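      The point about deterministic automata can be illustrated with any seeded pseudorandom generator (a minimal Python sketch, not tied to any particular automaton):

```python
import random

# Sketch of the deterministic-automata point: a computer's "random" numbers
# come from a deterministic algorithm, so the same seed always reproduces
# exactly the same sequence.
random.seed(2012)
first_run = [random.randint(0, 9) for _ in range(5)]

random.seed(2012)
second_run = [random.randint(0, 9) for _ in range(5)]

print(first_run == second_run)  # True: the "randomness" is reproducible
```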
      • Sep 30 2012: Law of Excluded Middle says you can only have truth values 0 and 1.

        Law of Exclusive Middle says there is no 0, there is no 1; all truth values must fall in the open interval (0, 1), endpoints excluded.

        Consider that 0.999... = 1. Consider 0.999... as a magnitude of truth, or an expected magnitude.

        For anything which is true, we can [a] be certain of its truth with a certainty of 1, or [b] we can be uncertain of this truth with magnitude 0.999... . Quantitatively they are indistinguishable because .999... = 1. The only difference is qualitative: in one case we are certain, in the other we are uncertain.

        Certainty and uncertainty are equivalent in this example. We can easily generalize this and make lots of examples.
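        The claim that .999... = 1 can be checked with exact rational arithmetic (an illustrative sketch): the gap between 1 and the n-digit partial sum of 0.999... is exactly 10^-n, which vanishes in the limit.

```python
from fractions import Fraction

# 0.999... is the geometric series 9/10 + 9/100 + 9/1000 + ... . With exact
# rational arithmetic, the gap between 1 and the n-term partial sum is
# exactly 10**-n, so the gap vanishes in the limit and the series sums to 1.
def partial_sum(n):
    """Sum of the first n digits of 0.999... as an exact fraction."""
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

for n in (1, 5, 20):
    gap = 1 - partial_sum(n)
    print(n, gap == Fraction(1, 10 ** n))  # the gap is exactly 10**-n
```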
  • Sep 30 2012: Thank you for that suggestion!! I actually did find something which sounds a lot like what I've been claiming.

    Quoting Wikipedia's article on Modal Logic
    "Likewise, a prefixed "diamond" (◇) denotes "possibly p". Regardless of notation, each of these operators is definable in terms of the other:
    [square p] (necessarily p) is equivalent to [not diamond not p] ("not possible that not-p")
    [diamond p] (possibly p) is equivalent to [not square not p] ("not necessarily not-p")"

    This is strikingly similar to what I have been proposing. The only difference I can see (at this point) is that modal logic is typically used in epistemology, and I don't know whether it has been applied to physics as of yet. It is possible, but I need to search for that in the literature.

    The other main difference I can see is that I am creating a very robust analogy to probability theory. There may be a bridge from modal logic to probability theory, but I'm not aware of it. More research needed on my end. Additionally, in my view the Law of Excluded Middle is just a model of existence. While modal logic seems to be more concerned with semantics or epistemology, my approach explicitly looks at tangible things. The Law of Excluded Middle is one model, the Exclusive Middle is another model. Fuzzy logic combines them into a cohesive theory, while I would let them stand apart and simply say that they are equivalent.

    Thanks for that excellent comment, and I'll have to research whether this has ever been applied in physics or math. I think that it should be. It seems very near to what I've been arguing; there is an application of equivalence, but did they ever connect the dots to relativity? I kind of doubt that. I think it's worth a look.
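    The quoted box/diamond duality can be checked mechanically on a small Kripke model (a sketch in which the worlds, accessibility relation, and valuation are all made up for illustration):

```python
# A minimal Kripke model for the quoted duality: box p <=> not diamond
# not-p. Worlds, accessibility, and valuation are illustrative choices.
worlds = {"w1", "w2", "w3"}
access = {"w1": {"w2", "w3"}, "w2": {"w3"}, "w3": set()}
p_holds_at = {"w2", "w3"}  # the proposition p is true at w2 and w3

def box_p(w):
    """Necessarily p: p holds at every world accessible from w."""
    return all(v in p_holds_at for v in access[w])

def diamond_p(w):
    """Possibly p: p holds at some world accessible from w."""
    return any(v in p_holds_at for v in access[w])

def diamond_not_p(w):
    """Possibly not-p: not-p holds at some world accessible from w."""
    return any(v not in p_holds_at for v in access[w])

for w in sorted(worlds):
    print(w, box_p(w) == (not diamond_not_p(w)))  # True at every world
```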
    • Sep 30 2012: Hi there William,

      I've had a course in this at my university, but I was pretty bad at it. The field of modal logic is used a bit in artificial intelligence by computer scientists. I'll try to help you out by saying what I know about it.

      There are lots of proofs based upon these axioms. I've had to go through them all at some point in time. The language itself is "sound" and "complete" (although I don't quite remember what they mean heheh... there are proofs of it though).

      I know that there have been some (not that many) papers about introducing probability theory but I don't believe they resulted in a lot. Basically there are 2 ways of adding probability.
      1) just give chances to the diamond and then use math from there (aka you add in rules for multiply, add, subtract etc)
      2) you make a model where 'worlds' are 'reachable' where each 'world' represents a combination of facts (aka p, q, x where in world 1 (p v q) is true while in world 2 this is not the case). Then you can add facts of "the real world" which then falsifies a whole bunch of theoretical ones and you simply remove them. Then probability would be "from all current worlds which I deem possible, how many different worlds can I reach".
      This is the most used way I think... it's kinda hard to explain in just 1 post ;)
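      That second, possible-worlds approach can be sketched in a few lines (the atoms and the observed fact here are made up for illustration):

```python
from itertools import product

# Sketch of the possible-worlds approach described above: worlds are truth
# assignments to atomic propositions, observed facts eliminate worlds, and
# a probability is the fraction of surviving worlds satisfying a query.
atoms = ["p", "q"]
worlds = [dict(zip(atoms, vals))
          for vals in product([True, False], repeat=len(atoms))]

# Observed fact about "the real world": (p or q) is true. Remove the
# theoretical worlds that this falsifies.
possible = [w for w in worlds if w["p"] or w["q"]]

# Probability that p holds, over the worlds still deemed possible.
prob_p = sum(w["p"] for w in possible) / len(possible)
print(len(possible), prob_p)  # 3 worlds survive; p holds in 2 of them
```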

      I'm not gonna explain everything about modal logic here though as I'm not a great teacher nor do I know everything of it myself. But it's nice that it sounds very close to your idea.

      As for its application in physics... I don't know that it has been applied. As far as I know it's a purely mathematical/logical field.

      I have a book of it called "Epistemic Logic for AI and Computer Science" by J.J.Ch. Meyer and W. van der Hoek. It contains all proofs and quite a few theoretical examples of other types of logics.
  • Sep 30 2012: Your first paragraph is not completely clear to me. Do you want to have
    |- (p v not(p))
    and also
    |- not( p v not(p))
    being the same thing? Because that is contradictory right off the bat.

    What I do however think you want to say (reading the other paragraphs) is that you want to introduce the logical box and logical diamond from Modal Logic.
    Perhaps you should take a look at modal logic and tell me if that is what you mean or how your idea differs from it.
  • Sep 29 2012: 1 minus x does not equal 1 (1-x =/= 1). It equals 1-x, no matter what number x represents. There are no special exceptions. I read the Wikipedia fallacy. It states that because 9*.111... = .999..., therefore x equals both 1 and .999..., which is not rational. Math must be rational to be accurate. Approximations don't count.
  • Sep 29 2012: The fact that .999... = 1 is accepted as fact universally throughout the math and science community. I won't debate that here; please refer to this Wikipedia article or check with anyone in the math community http://en.wikipedia.org/wiki/0.999...

    Another example might be helpful. Bob has an apple, and Mary has an apple. Bob is 100% certain that his apple exists and he assigns a truth value T=1 to this magnitude of existence. Mary is not certain that her apple exists at all; she is permanently uncertain whether it exists or not. The evidence that it does is very, very convincing, but she remains uncertain of it. She says that her apple exists with truth value T=.999... to reflect the inherent existential uncertainty of this apple.

    Bob is certain, Mary is uncertain. However, quantitatively these are identical views to hold because .999... = 1. Qualitatively they are different. But quantitatively, they are in fact quite the same.

    We have proven that these two approaches to the existence of apple are quantitatively equivalent.

    This has very little to do with epistemology, or the study of words and language. I am doing philosophy of math & physics, definitely not epistemology.
    • Sep 29 2012: .999... (to infinity) does not equal 1. It approximates 1. There is a HUGE difference between the words "equal" and "approximate", and no philosophy of math or physics should conflate the two. If you do, you are skewing the results, thus making things harder for yourself.

      1 minus .000000000000000000001 = .999999999999999999999.
      One cannot equal .999999999999999999999

      I can be 99.99999% certain that something is right and still be wrong. 99.99999% =/= 100%. The probability that I am wrong can be slim, but it still exists. Therefore, knowledge =/= faith.
      • Sep 29 2012: That is in fact incorrect. Please check those facts with an external source. As I stated above, .999... = 1 identically. It is not an approximation. They are exactly equal. Further, this is universally regarded as fact throughout the math community worldwide.

        I cannot debate this topic here; it is off topic. This thread is about equivalence and applying it to the Law of the Excluded Middle. Please read the Wikipedia article I cited or check with a mathematician. I can't use this space to debate something which is considered to be an elementary fact of mathematics (this is why calculus works), so I won't debate that here. Again, it is universally accepted that 0.999... = 1 identically; it is not an approximation.

        Thanks for the feedback, but I'm trying to stay on topic.
        • Sep 29 2012: OK. I will leave this thread. But if you believe that .999... = 1, then you do not believe in the law of the excluded middle

          The law of excluded middle can be expressed by the propositional formula x=x. It means that a statement is either true or false. Think of it as claiming that there is NO middle ground between being true and being false. Every statement has to be one or the other. That’s why it’s called the law of excluded middle, because it excludes ANY middle ground between truth and falsity. So while the law of non-contradiction tells us that no statement can be both true and false, the law of excluded middle tells us that they must ALL be one or the other.

          If x = something other than x, then it is necessary to declare the conclusion false.

          The Law of the Excluded Middle does not take infinity into consideration. It deals with finite things only, therefore, .999 does not equal 1. 1 only = .999 if you take it into the realm of infinity.

          If you do not take it into the realm of infinity, then if you say 1=.999, you are violating the law of non-contradiction.

          Or do I misunderstand the law of the excluded middle?

          If I can't speak of the math, how can I speak of the law of the excluded middle? It's founded on mathematical principles.
  • Sep 29 2012: "I think, therefore I am" can be expressed as Pi (I think = diameter, and I am = circumference). "I" (or Pi) is the relationship of your reality to your "perceived" identity just as Pi is the relationship of a circle's circumference to its diameter.

    Apply this concept to a sphere with infinite tangent points and you can then say, I think this, therefore that is the most probable consequence of my actions. That would be a valuable tool. It's called "critical thinking", and most people avoid it like the plague.

    I "know" that I have $10.00 is not equivalent to I "expect" (have faith) that I have $10.00. Knowledge and faith are two different things.

    You can have faith that something that violates scientifically known principles is true, but that does not make it true nor will all the thinking in the world make it true. You can have faith in something that you do not know if it does not violate inviolate principles, and you can manifest it into being.

    But no matter how many ways you frame the debate, and no matter how many identities you establish to do so, faith will never equal knowledge. They are two entirely different concepts. Faith = absence of knowledge. So in this context, all faith-based ideas would be expressed as negatives in your mathematical construct.
    • Sep 29 2012: If you are a mathematician, logician or physicist, I can change your mind with an example as follows.

      Consider two universes. Call them U1 and U2. In U1 things either exist, or they do not. In U2 everything is partially existent, and we say that things "might exist" instead of saying that they "do exist". Let U1 and U2 be considered equivalent, just like in relativity.

      Now we go out and apply our theory. We see a tree. An observer claims that he is certain that it exists and gives its existence a truth value = 1. A second observer is never sure of anything, ever, but the existence of the tree is pretty convincing and he assigns its existence a truth value of .999... .

      Since .999... = 1, precisely, it is clear that in terms of quantifying these truth values that these two situations are indeed equivalent. The same thing would hold for nonexistent objects.

      What I am saying is that whether something is certain or uncertain, this "certainty / uncertainty" is a quality which we ascribe to things. Clearly these two qualities are quite opposite each other. However, they also have a quantitative aspect, and these quantities can be equivalent.

      If I exist with truth value = 1, then I exist with absolute certainty. There is no question about it.

      But if I say that I "might" exist, I am making a statement which is based on uncertainty. Consider that this is expressed as a potential, and that in this case the truth value = .999...

      Clearly, .999... = 1 and the only difference is whether we are certain or uncertain. Quantitatively, these two models are equivalent.

      Random and nonrandom are equivalent.
      • Sep 29 2012: You begin suggesting the split universe theory, then you move away from it.

        In U1, things either exist or do not exist, and in U2, you are dealing with probabilities, which are real things, whether or not they can be seen by you who are here and now. What exists in U2, and will become part of U1, is relative to your trajectory and velocity. Some probabilities are therefore more probable than others.

        But when you move to say that .999....=1, you are mistaken. .999... = .999... and 1=1.

        .999 = 1.000 minus .001, thus PROVING that 1 =/= .999. It only approximates it. Approximations are not equalities.

        I hear you struggling with the mathematical nature of reality, but I don't think that it can be explained using your model.

        Quantum physics already recognizes your U2, but it's not just one universe. They are very close to announcing that there is sufficient agreement for declaring that we live in a multiverse that relative to us, is filled with probabilities waiting to be realized relative to your position as observer. (Your frame of reference).
  • Sep 29 2012: Yes, it is a bit different. Currently, in mathematics it is believed that stochastic and non-stochastic processes are two very different things. Indeed they are. It is difficult to call them opposites, but they may be considered complementary by some.

    In my view these would be considered equivalent processes. Equivalent in the same sense that two completely different frames of reference can be regarded as being equivalent in relativity.

    I have several worked examples which illustrate why we can say that random and non-random processes can be regarded as being equivalent. This applies to physical as well as mathematical processes.

    The best way to explain these examples is with a short video. I have two videos on YouTube and I will try to link to them here:

    The first video should play automatically at this link. It is called "Two Physicists Walk Into A Bar"
    http://www.youtube.com/user/HeliumXenonKrypton?feature=mhee

    The second video concerns the Sierpinski Triangle, a very famous fractal:
    http://www.youtube.com/watch?v=aM95mNEQsY0&list=UUpxczJY6eD7lNBzfOATQLyg&index=3&feature=plcp

    This general approach is a valid application and extension of the methods of Relativity. Because it is so closely related to randomness, it should be clear that it provides a way to connect General Relativity with Quantum Mechanics.
    • Sep 29 2012: The Sierpinski Triangle - when viewed through the randomness method - does not provide a duplicate of the Sierpinski Triangle. You will have triangle points left over. Thus the Sierpinski approximation of the Sierpinski triangle is not equal. The equivalence that is mentioned in the video you offered is based on a provable fallacy.
      • Sep 29 2012: Incorrect. When generated by the random algorithm, you let the process go through infinitely many iterations to generate the triangle. The two fractals, generated by their respective algorithms, are identical with the possible exception of a few points needed to start the random process, but that can be eliminated by selecting a starting point properly.
        • Sep 29 2012: If your starting point is not random, then you have violated the entire randomness method. Those exceptions (the extra points) are important if you are putting a theory together. Random =/= deliberate just as approximate =/= exact.
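          For reference, the random algorithm under discussion is usually called the chaos game; a minimal sketch (the vertex coordinates, starting point, and iteration counts are illustrative choices):

```python
import random

# The chaos game: repeatedly jump halfway toward a randomly chosen vertex
# of a triangle. The visited points settle onto the Sierpinski triangle; a
# short transient from the arbitrary starting point is discarded, which is
# the role of the "few points" mentioned above.
vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
x, y = 0.25, 0.25  # arbitrary starting point
points = []
for i in range(50_000):
    vx, vy = random.choice(vertices)
    x, y = (x + vx) / 2, (y + vy) / 2  # midpoint toward the chosen vertex
    if i >= 20:  # drop the transient near the start
        points.append((x, y))

# Every retained point lies inside the unit box bounding the triangle.
print(all(0.0 <= px <= 1.0 and 0.0 <= py <= 1.0 for px, py in points))
```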
  • Sep 29 2012: Is this different from the distinction between deterministic and stochastic models?
    • Sep 29 2012: Deterministic and stochastic models are different. They will always be different. But this difference is qualitative.

      The reason we can make these things equivalent is because quantitatively they are the same. In other words, whether you have a stochastic model, or a deterministic model, you get the same numbers. You always get the same answers. You get two answers to every possible problem. One is certain, and the other is uncertain or an expected value. If we say that these are equivalent, then we have a kind of duality in everything that we are doing.

      You can probably see that this would give a fresh new approach to resolving paradoxes regarding the wave-particle duality of light. It also provides a framework for offering an easy to understand justification for the occurrence of which-way information in QM. There are many benefits of this approach. This is just the tip of the iceberg. My purpose here is to popularize this view, or share it with as many people as possible because I am confident that it is correct.
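      The "two answers to every problem" idea can be illustrated in miniature (a made-up toy problem, not a claim from the thread itself): the area under y = x^2 on [0, 1], computed once deterministically and once as a stochastic expected value.

```python
import random

# Toy version of the "two solutions" idea: the area under y = x**2 on
# [0, 1]. The deterministic answer is the exact integral, 1/3. The
# stochastic answer is the expected value of U**2 for uniform U, estimated
# by Monte Carlo. The two answers agree quantitatively.
deterministic_answer = 1 / 3

random.seed(0)
n = 200_000
stochastic_answer = sum(random.random() ** 2 for _ in range(n)) / n

print(abs(stochastic_answer - deterministic_answer) < 0.01)  # True
```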
      • Sep 29 2012: So you are proposing modeling any problem as if it could be either a deterministic or a stochastic process, or both, rather than sorting phenomena into deterministic and stochastic and using the sorts of models that fit best for each?
        • Sep 29 2012: That is precisely correct. Every problem that is solvable using mathematics or logic should have 2 solutions. One deterministic, and the other essentially uncertain or stochastic.

          For example, the number 5. It can be regarded as a known value. It is known with an implied (but unstated) inherent certainty. We could however regard this quantity as an expected value, in which case it is precisely the same magnitude but its "qualitative" properties have changed; it is now inherently uncertain.

          The magnitudes 5 (known) and 5 (expected) are identical, but their qualities are different. For an empiricist, what this means is that there must be two equivalent models of the entire universe. One deterministic, the other essentially uncertain or stochastic. Both models would produce exactly the same numbers. One universe is deterministic, and the other is stochastic. They must be equivalent. The connections to Relativity and QM are obvious.

          This should also be true for the entirety of all mathematics. I have come close to a proof for the general case for "all of mathematics", but I do not have that proof completed. I have come close and don't want this line of research to be forgotten. That is why I engage in online debate.
      • Sep 29 2012: So one might conceptualize this as all stochastic, with a practice of always assuming as one case the most likely underlying error distribution and as the second case the degenerate case (the deterministic version).
        • Sep 29 2012: In my view neither case is degenerate. You may have the emergence of order from a disordered system, or the emergence of disorder from one which is deterministic. But neither case is really degenerate. Together they form a duality. The duality is held together by an assumption of equivalence. And in fact we can come very close to proving the equivalence of these two structures, but I am more comfortable simply saying that the duality forms a consistent system.

          By embracing a quantitative tool which has duality built into it, we can look at the wave particle duality with this new tool and understand it in a whole new way. Perhaps for the first time.

          I want to apply this work to Bell's Inequality, various works of Alain Aspect and others. That is my goal. Either to do it myself, or provide a tool for others to follow. I want to create a new tradition within science which embraces the duality of random and nonrandom, that acknowledges that they are equivalent and proceeds from there.