Pessimists, mark your calendars: the Global Catastrophic Risks Conference takes place July 17-20 at the University of Oxford. The conference aims to open a dialogue about the greatest threats to human survival, now and into the future. It is curated by the Future of Humanity Institute, whose director is TEDster Nick Bostrom. Among the discussion topics: […]
Why you should listen
Philosopher Nick Bostrom envisioned a future full of human enhancement, nanotechnology and machine intelligence long before they became mainstream concerns. From his famous simulation argument -- which identified some striking implications of rejecting the Matrix-like idea that humans are living in a computer simulation -- to his work on existential risk, Bostrom approaches both the inevitable and the speculative using the tools of philosophy, probability theory, and scientific analysis.
Since 2005, Professor Bostrom has led a research group of mathematicians, philosophers, and scientists at Oxford University tasked with investigating the big picture for the human condition and its future. He has been referred to as one of the most important thinkers of our age.
His recent book Superintelligence advances the ominous idea that “the first ultraintelligent machine is the last invention that man need ever make.”
What others say
“Bostrom cogently argues that the prospect of superintelligent machines is ‘the most important and most daunting challenge humanity has ever faced.’ If we fail to meet this challenge, he concludes, malevolent or indifferent artificial intelligence (AI) will likely destroy us all.” — Reason, September 12, 2014
Nick Bostrom’s TED talk
Nick Bostrom on the TED Blog
Many TEDTalks speakers have answered the 2008 Edge Foundation question: What have you changed your mind about? Why? Among the more than 160 essays from leading thinkers — scientists, philosophers, artists — look for Wired's Chris Anderson, Nick Bostrom, Stewart Brand, Richard Dawkins, Aubrey de Grey, Juan Enriquez, Helen Fisher, Neil Gershenfeld, Daniel Gilbert, Daniel […]