TED Conversations

Sophie Rand

Engineering Student, The Cooper Union for the Advancement of Science and Art



Can we ever know how another person "senses" the world?

In my Bioelectricity class this week, we learned about the cells
in our body that help us sense our environment: chemosensors in our
tongue that help us sense taste, for example, the photoreceptors in
our eye that sense light, and the hair cells in our ears that sense
the mechanical vibrations of sound, to name a few.

As a result, I recently revisited my answer to the age-old question of
"how do I know that the blue I see is the same blue you see?" that was
so startling and exciting to most third graders playing baby Kierkegaard,
though this time from a slightly different angle. One answer is that we
simply have to trust that perception is guided by biology, and that
humans are biologically identical across roughly 80% of our biological
systems.

This answer, of course, raises new questions: even if you and I may
perceive the same blue, is that blue "real?" Where does sensation
leave off and perception begin, and how may we trust ourselves as we
try to compare them? Can we ever know how another person "senses" the
world? Would love to hear your thoughts!



  • Feb 17 2012: With the example of seeing blue, if everything else is considered equal, we can agree that the light hitting our eyes includes wavelengths between 440 and 490 nanometres, assuming that all of those terms I used are unambiguous—and at least as I type this, I trust that they are.

    In addition to considering everything else equal, such as our inertial frames of reference, we are ignoring normally-relevant factors like culture, mood, genetics, and individual mechanisms for discerning variation (e.g., I may be able to see when a blue is closer to 440nm than 445nm or roughly how many other wavelengths are present and what they may be).

    If we're wired in a similar-enough way, we can also agree that a given blue stimulus from one context corresponds to some blue stimulus in another one. For example, we can both agree that a pair of denim pants is blue in the same sense that the sky is blue, and if we only had a pack of 8 crayons, we would use the crayon labelled "Blue" to color a picture of a pair of jeans and the sky even if that label were removed.

    We can even go as far as showing that the perception of blue involves analogous neural circuitry between me and you—we could even have identical neurons that fire when we see blue!

    As for the subjective experience of perceiving what we sense, I really don't know if that question even makes sense beyond sounding syntactically and semantically valid, although at the time of typing this I trust that it's a sensible question to *you*. The potential problem here is that we get pretty close to the limit of what can be expressed through communication as opposed to direct observation. I *hope* that you can see this as a paragraph written in English instead of the mess of red, green, and blue blotches that it's made of (i.e., as pixels of standard video output), and that the gist of the paragraph has been conveyed as pertaining to whatever discussion I imagined you wanted to kick off.

    I can't even explain to myself how I see blue!
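The "objective" part of the agreement above can be sketched as a trivial rule that both of us could apply to the same measurement and get the same answer. This is only an illustration: the function name is invented, and the 440-490 nm band is simply the range quoted in the comment.

```python
def is_blue(wavelength_nm: float) -> bool:
    """Return True if the wavelength falls in the 440-490 nm band
    mentioned above (a rough working definition of 'blue' light)."""
    return 440.0 <= wavelength_nm <= 490.0

print(is_blue(460.0))  # a wavelength inside the band -> True
print(is_blue(650.0))  # a typical 'red' wavelength -> False
```

The point of the sketch is that disagreement can't arise at this level: given the same number and the same rule, we must report the same label. Whatever differs between us lies downstream of the measurement.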
    • Feb 17 2012: I think your answer expresses exactly why this is such an interesting issue. We can measure the world in near-infinite detail, but the measurements never seem to communicate what the world 'is like.' I often hike in the woods with my dog, and I always wonder what the experience of the woods is like with her powers of scent. She experiences a much different walk, and even if she could speak, there would be no way for her to communicate the subjective experience.

      The example of color is such a good one precisely because we DO assume we have similar experiences, but nevertheless can't communicate them in language. For example, there is no way for me to explain to a person with protanopia-type color blindness (no red receptors) what purple is like. The color-blind person can learn everything science and observation have to tell them about purple: color theory, wavelengths, light, the biology of the eye, the firing of neurons, and still make no progress in understanding what purple looks like. It suggests a real limitation in our ability to translate experience into language.

      And I think it does make sense to talk about it, or at least to think about it: I'm experiencing something that a color-blind person isn't, but I can't communicate it. It's truly fascinating because it makes you wonder what other subjective experiences are like that: how differently I would experience life if I had my dog's nose, or eyes like the mantis shrimp's, which can see the direction of polarized light. It leaves room in a world dominated by positivism for poetry.
      • Feb 17 2012: If you were able to explain to yourself what it is like to see whatever colors you see, then you could use that explanation to tell others; but it doesn't seem that you can, any better than I can say how blue appears to me. Having experienced what you see at least once, you can now recognize the phenomenon for yourself on subsequent occurrences, but how can you convey what I would consider a purple-seeing deficiency if you don't know how I tell the difference between purple and almost-purple?

        One's particular flavour of sense-perception is a unique and random situation, tailor-made just for one's self; in fact, that's what it is to BE one's self. If I became blind just now, a large part of who I am would fade away and give rise to a new me that saw the world differently, and that new me would still be very different from someone blind from birth, since the emergent conditions would be comparably different.

        We have the capacity to expand and reconfigure our senses and how they are rendered. The BrainPort is a device that blocks optical input and converts camera images into tongue-based electrotactile stimuli; with training, the brain learns to interpret these signals visually, so that one can, in a sense, see with one's tongue. Cochlear implants work on a related principle. Theoretically, you can map any mechanical sensory input to a perceptual, subjective output (which can then be fed back into the system).
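The core of a sensory-substitution device like the one described above is a simple mapping: downsample an image to a coarse grid of stimulus intensities, one per electrode. The sketch below shows that idea only; the function name, grid size, and averaging scheme are invented for illustration and are not BrainPort's actual algorithm.

```python
def image_to_tactile_grid(image, grid_rows=20, grid_cols=20):
    """Average a grayscale image (list of rows, values 0-1) into a
    grid_rows x grid_cols array of stimulus intensities, the way a
    tongue-display device might drive its electrode array."""
    rows, cols = len(image), len(image[0])
    grid = [[0.0] * grid_cols for _ in range(grid_rows)]
    for r in range(grid_rows):
        for c in range(grid_cols):
            # Pixel block that feeds this electrode.
            r0, r1 = r * rows // grid_rows, (r + 1) * rows // grid_rows
            c0, c1 = c * cols // grid_cols, (c + 1) * cols // grid_cols
            block = [image[i][j] for i in range(r0, r1)
                                 for j in range(c0, c1)]
            grid[r][c] = sum(block) / len(block)
    return grid

# Usage: a uniform 40x40 image maps to a 20x20 grid of the same level.
demo = [[0.5] * 40 for _ in range(40)]
grid = image_to_tactile_grid(demo)
print(len(grid), len(grid[0]))  # 20 20
```

The interesting part, of course, is not this mapping but what the brain does with its output: the hardware only translates one physical signal into another, and the perceptual, subjective experience emerges on the other side.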
