Sep 29, 2014
Prof. Nick Bostrom is widely respected as a premier academic thinker on topics related to strong artificial intelligence, transhumanism, and existential risk. His talks, books, and articles cover all of these topics, and his vocation involves bringing attention and critical thought to these pressing human issues.
He is the Founder and Director of the Future of Humanity Institute at Oxford, and author of the new book "Superintelligence: Paths, Dangers, Strategies." In this episode, Nick and I explore how to identify "existential" human risks (those that could wipe out humanity forever), and how individuals and groups might mitigate these risks on a grand scale to better secure the flourishing of humanity in the coming decades and centuries.
For more information, visit the HUB of Startups / Business in Emerging Technology, covering everything from robotic limbs to angel investment, from biotech to intellectual property:
For more information on Nick Bostrom himself, please visit his personal website: