James Miller, Department of Economics, Smith College
Expectations of the Singularity
Several of the possible paths to the singularity have signposts indicating their destination. For example, suppose that within fifteen years someone makes a computer simulation of a chimpanzee's brain, puts this brain in a robot's body, and the artificial chimpanzee acts just as a real chimp does. If this occurs, many technology opinion leaders will understand that if we can create a computer chimp we can create a computer man. And once we have a simulation of a human brain we should eventually be able to increase the speed of this simulation a millionfold, make a million copies of the simulation, and usher in the Singularity.
Or perhaps within fifteen years it will be apparent to all the technologically literate that a computer will eventually pass the Turing test. Once this test is passed we could eventually speed up the computer a millionfold, make a million copies of it, and get a singularity. Or maybe within twenty years brain/computer interfaces will be improving at such a rate that an intelligence explosion seems inevitable.
If any of these scenarios comes to pass, technology opinion leaders will expect that the Singularity is near, and this expectation will likely disseminate to the masses. The world will change greatly when many come to expect the Singularity.
Life Expectancy—Surviving to a friendly singularity might yield a life span of hundreds, thousands, or even millions of years, which radically increases the loss from dying before the Singularity. Given the vastly increased quality of life a positive singularity would bring, many of those who think they have a decent shot at living to the Singularity will take great care to avoid death. Increased fear of death will raise the premium many workers require to labor at dangerous jobs and will boost demand for medical care and safety products. Voters in doubt as to whether they will live to the Singularity might put enormous pressure on governments to speed the Singularity's arrival, perhaps even if this acceleration increases the risk of a dystopian singularity.
Savings—If a squirrel were uplifted to human-level intelligence, assimilated into human society, and employed at a job paying $50,000 a year, the squirrel wouldn't gain much benefit from the nuts it had stored for the winter. The uplifting would have essentially destroyed the value of the squirrel's investments by giving it so much that its stored nuts became worthless post-uplift. Similarly, a utopian singularity might give humans enough resources that anything anyone owned pre-singularity would contribute only trivially to his wealth. And by killing us, a dystopian singularity would destroy our wealth as well. Expecting a singularity therefore probably means expecting less future value from your savings than you otherwise would. Consequently, for a given interest rate, savings will drop, which will harm the economy, possibly postponing the singularity. Expectations of a brain-emulation Malthusian future, however, would cause savings to rise.1
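The logic here is an expected-value calculation: a pre-singularity dollar only pays off in worlds where it still matters. A toy sketch makes this concrete (all probabilities and payoffs below are made-up illustrations, not figures from the essay):

```python
# Toy illustration: the expected future value of $1 of savings, given
# hypothetical singularity scenarios. All numbers are invented for
# illustration only.

scenarios = {
    # scenario: (probability, value of a pre-singularity dollar in that world)
    "no singularity": (0.50, 1.00),   # savings keep their ordinary value
    "utopian":        (0.30, 0.01),   # abundance makes old wealth nearly worthless
    "dystopian":      (0.20, 0.00),   # we don't survive to spend it
}

expected_value = sum(p * v for p, v in scenarios.values())
print(f"Expected future value per dollar saved: ${expected_value:.2f}")
```

With these invented numbers a saver expects to recover only about half of each dollar's value, which is why, holding the interest rate fixed, the essay predicts savings would fall.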
Long-Term Investments—Many businesses make investments on which they don't expect to break even for at least fifteen years. Singularity expectations would reduce spending on these investments. Even expectations of extremely rapid technological advancement short of a singularity would kill many investment projects. For example, believing that nanotechnology will revolutionize the development of drugs and office buildings would shrink many pharmaceutical and construction ventures.
Education—Expectations of rapid technological improvement will cause many students to doubt the marketability of skills that artificial intelligence would render obsolete. Why, for example, spend six years earning a PhD when you expect AI teachers to decimate the demand for human professors? Students who expect a singularity will have less motivation to study, and parents who believe a singularity will occur before their children hit thirty will likely put less pressure on their kids to study. The possibility of renting out multiple copies of your downloaded, digitized brain might cut against all of this, however.
Government Spending—If politicians and voters believe a singularity would wipe out future deficits, they will be more willing to have their governments borrow. Consequently, for any given interest rate, government spending will likely rise when many come to expect a singularity. Singularity expectations could also dissipate demographic-driven fear of rising Social Security and Medicare expenditures.
Interest Rates—Some of the forces unleashed by singularity expectations will raise interest rates while others will cause rates to fall, but on balance we can be reasonably confident that rates will go up. Reduced savings shrink the supply of loanable funds while greater government borrowing raises the demand for them, and both shifts push rates upward.
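This claim can be illustrated with a minimal loanable-funds sketch: linear supply (savings) and demand (borrowing) curves with hypothetical coefficients, where singularity expectations shift supply left and demand right. The specific numbers below are invented purely for illustration:

```python
# Minimal loanable-funds model with linear curves (hypothetical numbers).
# Supply of savings:  S(r) = s0 + s1 * r
# Demand for loans:   D(r) = d0 - d1 * r
# Equilibrium where S(r) = D(r):  r* = (d0 - s0) / (s1 + d1)

def equilibrium_rate(s0, s1, d0, d1):
    return (d0 - s0) / (s1 + d1)

# Before singularity expectations take hold:
r_before = equilibrium_rate(s0=100, s1=20, d0=200, d1=20)

# After: households save less (s0 falls) and governments borrow more (d0 rises):
r_after = equilibrium_rate(s0=80, s1=20, d0=220, d1=20)

print(r_before, r_after)  # the equilibrium rate rises
```

Either shift alone raises the equilibrium rate in this setup; together they reinforce each other, which is the "on balance" intuition above.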
Reproductive Timing—Belief that future improvements in genetic technology will increase the "quality" of one's children will cause some women to postpone reproduction. Also, if singularity expectations significantly raise the variance of people's estimates of the quality of the future, many might wait to have a baby until they learn how the Singularity turns out. By reducing the non-working share of the population, child postponement would boost economic activity, perhaps even speeding the arrival of the Singularity.
1 See Robin Hanson, "Economic Growth Given Machine Intelligence."
Singularity Hypotheses: A Scientific and Philosophical Assessment contains authoritative essays and critical commentaries on central questions relating to accelerating technological progress and the notion of technological singularity, focusing on conjectures about the intelligence explosion, transhumanism, and whole brain emulation.
Interesting read!
I can well imagine a split into two groups: those who see the positive impact and support the idea of transhumanism, and those who want to preserve "true humanity." This could have far-reaching consequences. One could imagine, for example, a country or region of people who have consciously chosen to reject future technology.
Best regards