Amnon H. Eden, Eric Steinhart, David Pearce and James H. Moor
Questions
In a widely read but controversial article, Bill Joy claimed that the most powerful 21st-century technologies are threatening to make humans an endangered species (Joy 2000). Indeed, a growing number of scientists, philosophers and forecasters insist that accelerating progress in disruptive technologies such as artificial intelligence, robotics, genetic engineering, and nanotechnology may lead to what they refer to as the technological singularity: an event or phase that will radically change human civilization, and perhaps even human nature itself, before the middle of the 21st century (Paul & Cox 1996; Broderick 2001; Garreau 2005; Kurzweil 2005).
Singularity hypotheses refer to one of two distinct scenarios.
The first (Vinge 1993; Bostrom to appear) postulates the emergence of artificial superintelligent agents—software-based synthetic minds—as the ‘singular’ outcome of accelerating progress in computing technology. This singularity results from an ‘intelligence explosion’ (Good 1965): a process in which software-based intelligent minds enter a ‘runaway reaction’ of self-improvement cycles, with each new and more intelligent generation appearing faster than its predecessor. Part I of this volume is dedicated to essays that argue that progress in artificial intelligence and machine learning may indeed increase machine intelligence beyond that of any human being. As Alan Turing (1951) observed, “at some stage therefore we should have to expect the machines to take control, in the way that is mentioned in Samuel Butler's ‘Erewhon’”: the consequences of such greater-than-human intelligence will be profound, and conceivably dire for humanity as we know it. Essays in Part II of this volume are concerned with this scenario.
A radically different scenario is explored by transhumanists who expect progress in enhancement technologies, most notably the amplification of human cognitive capabilities, to lead to the emergence of a posthuman race. Posthumans will overcome all existing human limitations, both physical and mental, and conquer aging, death and disease (Kurzweil 2005). The nature of such a singularity, a ‘biointelligence explosion’, is analyzed in essays in Part III of this volume. Some authors (Pearce, this volume) argue that transhumans and posthumans will retain a fundamental biological core. Other authors argue that fully functioning, autonomous whole-brain emulations or ‘uploads’ (Chalmers 2010; Koene this volume; Brey this volume) may soon be constructed by ‘reverse-engineering’ the brain of any human. If fully functional or even conscious, uploads may usher in an era where the notion of personhood needs to be radically revised (Hanson 1994).