TED Conversations

Student B.S. Engineering, Cooper Union for the Advancement of Science and Art


This conversation is closed.

How can computer models help us build intuition?

The use of visual diagrams to explain and understand difficult concepts is as old as history itself, but in the twentieth century, for the first time, engineers and scientists were able to enlist the help of computational tools to represent systems with greater clarity and detail. While computers, with the right peripherals, are able to present data to all the senses, in two or three dimensions and through time, perhaps their greatest pedagogical virtue is their interactivity. People learn by doing: young children internalize Newton's laws long before their first formal physics class by manipulating the world around them. Computers offer the promise of similar interactivity for systems which are less readily accessible, or even entirely esoteric. In my Bioelectricity class, for example, we have been using computer simulations of the complicated Hodgkin-Huxley membrane equations to gain insight into neural responses to various experimental stimuli. How can computer models be used to learn, understand, and ultimately build intuition about systems in nature and science?
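For illustration, the kind of classroom simulation described above can be sketched in a few lines of Python. This is a minimal forward-Euler integration of the Hodgkin-Huxley equations using the standard squid-axon parameter set; the function name and the particular stimulus value are my own choices, not anything prescribed by the class.

```python
import math

def hh_simulate(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the Hodgkin-Huxley equations.
    Standard squid-axon parameters; voltage in mV, time in ms,
    current in uA/cm^2. Returns the membrane-voltage trace."""
    # Maximal conductances (mS/cm^2), reversal potentials (mV), capacitance (uF/cm^2)
    g_na, g_k, g_l = 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.4
    c_m = 1.0

    # Voltage-dependent rate functions (1/ms) for the gating variables m, h, n
    a_m = lambda v: 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    b_m = lambda v: 4.0 * math.exp(-(v + 65.0) / 18.0)
    a_h = lambda v: 0.07 * math.exp(-(v + 65.0) / 20.0)
    b_h = lambda v: 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    a_n = lambda v: 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    b_n = lambda v: 0.125 * math.exp(-(v + 65.0) / 80.0)

    v = -65.0  # resting potential
    # Initialize gating variables at their steady-state values for v
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))

    trace = []
    for _ in range(int(t_max / dt)):
        i_na = g_na * m**3 * h * (v - e_na)  # sodium current
        i_k = g_k * n**4 * (v - e_k)         # potassium current
        i_l = g_l * (v - e_l)                # leak current
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        trace.append(v)
    return trace

trace = hh_simulate(i_ext=10.0)
print(round(max(trace), 1))  # peak voltage well above 0 mV: the stimulus evoked spikes
```

Running it with different values of `i_ext` is exactly the kind of interactive experiment the question describes: below roughly 6 uA/cm^2 the membrane stays near rest, and above it the model fires repetitively.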



  • Feb 17 2012: On the notion of intuition: this is troubling in that it implies a layer of reasoning hidden from consciousness, and it often carries metaphysical connotations.

    By definition, intuition is knowledge without inference or reason, yet from a neuroscience perspective it follows that such knowledge does not spontaneously manifest but rather is the product of cognitive processes, which are themselves a product of neural circuit function. I would think that rather than propagating and relying on "intuition" and its perceived benefits, we should aim to understand the underlying cognitive processes that yield superior understanding compared with traditional reasoning approaches.
    • Feb 17 2012: I tend to agree, although I think that having a part of the mind you're not conscious of is probably necessary to reduce overhead in thinking - like the concept of data hiding in computer science. I think when people talk about intuition they're really talking about things like inferring from pattern recognition or by analogy, as well as rules which they've learned, internalized, and incorporated into their perceptions and judgements; they're no longer aware of incorporating these into their decisions, but the rules ineffably seem to make their decisions "better" than someone else's on a subject. I've heard people talk about "developing your intuition on a topic," and a lot of the time they bring up analogies to think with (like the water-pump analogy for electricity). I also remember a study in which a complex formula predicted the next position of a dot on a screen: you'd get $10 if you could reliably predict where the dot would land next after a while, and $100 if you could explain how you were able to predict it. Participants were able to learn where the dot would land next, and so could a neural net, but they were unable to put their understanding into words - they didn't have privileged enough access to their own brains to trace their thought processes. Being conscious of your thoughts seems to be a higher-level mode, and I think a lot of the details are hidden from us because it was simply never to our adaptive advantage to be aware of them.
    • Feb 17 2012: I would definitely agree with November's assessment, but specifically to address your concern over becoming overly reliant on intuition to understand complex processes, I would say that an intuitive grasp of a concept can help immensely in the creative process. For example, I am currently working on a device to harness energy from ocean waves. While equations exist describing hydrostatic and hydrodynamic situations of all imaginable kinds, I've found it much more useful to rely on my intuitive grasp of fluid physics as a jumping-off point and then to go back and quantitatively analyze any new improvement. While I do know some people who prefer to fiddle with equations as their means of understanding a given subject, I would wager that they are in the minority when it comes to their chosen method of learning.

      The brain is an inherently stochastic organ -- in fact, computer models show that without a healthy dose of noise, neurons would only be able to respond in a nearly on-off fashion to graded stimuli. That noise, in combination with population-averaging effects, allows for the generation of a graded, smooth output in response to varying stimuli. While it might be uncomfortable to accept any level of randomness in our thinking, I would argue that this is necessary for any real innovation to take place. As Albert Einstein said: "Imagination is more important than knowledge. Knowledge is limited. Imagination encircles the world."
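      The noise-plus-population-averaging effect described above can be demonstrated with a toy model - this is my own minimal sketch using simple threshold units, not the specific computer models the comment refers to. Without noise, every unit in the population crosses its threshold at the same stimulus level, so the aggregate response is all-or-nothing; with independent Gaussian noise added to each unit's input, the fraction of the population that fires becomes a smooth, graded function of the stimulus.

```python
import random

def population_response(stimulus, noise_sd, n_neurons=5000, threshold=1.0):
    """Fraction of identical threshold units that fire for a given stimulus,
    with independent Gaussian noise (std. dev. noise_sd) added to each
    unit's input. noise_sd = 0 recovers the deterministic, on-off case."""
    fired = sum(
        1 for _ in range(n_neurons)
        if stimulus + random.gauss(0.0, noise_sd) > threshold
    )
    return fired / n_neurons

random.seed(0)
for s in [0.5, 0.75, 1.0, 1.25, 1.5]:
    silent = population_response(s, noise_sd=0.0)  # steps abruptly from 0 to 1
    noisy = population_response(s, noise_sd=0.5)   # rises smoothly with s
    print(f"stimulus={s:.2f}  no-noise={silent:.2f}  with-noise={noisy:.2f}")
```

      The noiseless column jumps from 0 to 1 at the threshold, while the noisy column traces out a sigmoid - the population average converts per-unit randomness into a graded code.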
