Bruce Sterling Says The Singularity Has No Business Model

Saturday, January 19, 2013

Bruce Sterling

Science fiction author Bruce Sterling has created a minor firestorm after posting an anti-Singularity piece on Edge.org. In it, he writes that the Singularity will essentially not occur because it does not make sense financially to pursue it, and because we have no tangible evidence of the technology yet.
It has been 20 years since Vernor Vinge wrote in Omni about the Singularity. That piece, along with follow-up writing by Vinge and others, inspired Ray Kurzweil to take up the phrase as a central theme for what will happen as the Law of Accelerating Returns progresses into the future.

Bruce Sterling is a long-time science fiction author and one of the creators of the cyberpunk movement. His books, such as Islands in the Net and Schismatrix, are very influential in science fiction circles. Sterling also writes for Wired at Beyond The Beyond.

Taking up the Rapture of the Nerds banner, Sterling has now posted a critical analysis of the Singularity in response to the question, "What should we be worried about?"
On Edge.org, Sterling writes:

This aging sci-fi notion has lost its conceptual teeth. Plus, its chief evangelist, visionary Ray Kurzweil, just got a straight engineering job with Google. Despite its weird fondness for AR goggles and self-driving cars, Google is not going to finance any eschatological cataclysm in which superhuman intelligence abruptly ends the human era. Google is a firmly commercial enterprise.
It's just not happening. All the symptoms are absent. Computer hardware is not accelerating on any exponential runway beyond all hope of control. We're no closer to "self-aware" machines than we were in the remote 1960s. Modern wireless devices in a modern Cloud are an entirely different cyber-paradigm than imaginary 1990s "minds on nonbiological substrates" that might allegedly have the "computational power of a human brain." A Singularity has no business model, no major power group in our society is interested in provoking one, nobody who matters sees any reason to create one, there's no there there. 
So, as a Pope once remarked, "Be not afraid." We're getting what Vinge predicted would happen without a Singularity, which is "a glut of technical riches never properly absorbed." There's all kinds of mayhem in that junkyard, but the AI Rapture isn't lurking in there. It's no more to be fretted about than a landing of Martian tripods.
George Dvorsky at io9 has rounded up various commentators who have reacted to Sterling's comments, including Tyler Cowen of Marginal Revolution, who re-posted Sterling's article, prompting a heated discussion.


Author David Brin commented on io9:

While Bruce S is right to growl and snap at the cyber-transcendentalists (I do it plenty) I think he is too blithe about ignoring how AI is likely to come together. If you knew anything about the area where the most money is being spent on advanced software, in total secrecy, programming the new entities to be utterly ruthless, parasitical and predatory... you would shudder and know fear. 
I am not talking the military... but high frequency stock trading program systems. THAT is where "skynet" may emerge, suddenly and silently. Learn more... and be afraid!

User 'jb' also commented at Marginal Revolution, "We are watching the singularity happen, brick by brick. It's just that it will take 20-30 years before the 'wall' is high enough that someone says 'Hey, that's a pretty impressive wall you've built there.'"

At the New Yorker, Gary Marcus noted that Sterling's "optimism has little to do with reality."  Kevin Drum of Mother Jones wrote, "I'm genuinely stonkered by this. If we never achieve true AI, it will be because it's technologically beyond our reach for some reason. It sure won't be because nobody's interested and nobody sees any way to make money out of it."

From our perspective, Sterling's take on the Singularity, and on how it will come about, does not account for the nature of exponential technology, and as such it misses the boat. As countless comments, and the economy itself, attest, there is massive business potential in the development of artificial intelligence, and if we simply track Moore's Law against the computational capacity of the human brain, we will hit the potential for greater-than-human computational abilities in the periods Kurzweil outlines in his book, The Singularity Is Near.
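The argument above is a simple compounding calculation, and it can be sketched in a few lines. The figures below are illustrative assumptions only (a Kurzweil-style estimate of roughly 10^16 operations per second for the brain, an assumed 2013 starting point of 10^13 ops/sec, and an assumed two-year doubling period), not measurements or claims from the article:

```python
# Back-of-the-envelope: when does compute, doubling on a
# Moore's-Law-style schedule, pass an assumed estimate of the
# human brain's processing rate? All constants are assumptions.

BRAIN_OPS_PER_SEC = 1e16   # assumed Kurzweil-style brain estimate
START_YEAR = 2013
START_OPS_PER_SEC = 1e13   # assumed high-end system circa 2013
DOUBLING_YEARS = 2.0       # assumed doubling period

def crossover_year(start_ops, target_ops, start_year, doubling_years):
    """Return the year compute first meets or exceeds target_ops."""
    year, ops = start_year, start_ops
    while ops < target_ops:
        year += doubling_years
        ops *= 2
    return year

print(crossover_year(START_OPS_PER_SEC, BRAIN_OPS_PER_SEC,
                     START_YEAR, DOUBLING_YEARS))
# A thousandfold gap takes about ten doublings (2^10 = 1024),
# i.e. roughly twenty years under these assumptions.
```

Under these particular assumptions the crossover lands in the early 2030s, broadly in line with the timeframes Kurzweil discusses; different starting estimates shift the date, but exponential doubling makes the crossover relatively insensitive to them.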

Google's hiring of Kurzweil represents the company's understanding of the tremendous upside in developing the AI technologies that will precede a Singularity.

As Dvorsky points out, one leading artificial intelligence researcher has estimated that there is roughly a trillion dollars to be made just in moving from keyword search to genuine AI question-answering on the web.

Dharmendra Modha's SYNAPSE Project

Even if the business model does not itself lead to smarter-than-human AI, projects working on reverse-engineering the human brain, such as Dharmendra Modha's SYNAPSE project at IBM and Henry Markram's Blue Brain Project, along with subsequent efforts, have great potential for creating a substrate-independent intelligence. These projects are further informed by progress in neuroscience and even genetics research.

In the areas of pure research, applied research, business, the military and more, the push is on to create better artificial intelligence algorithms and hardware. Certainly, the promise of Artificial General Intelligence (AGI) has always been that it is right around the corner, or '50 years away.' What Kurzweil outlines in his books, and what Sterling seems not to accept, is that the convergence of various technologies is now, more than ever, leading us headlong toward a Singularity, business model or not.

Responding to the debate, Sterling has posted a lengthy and heated rebuttal to Dvorsky's io9 piece on Beyond The Beyond:

It’s been twenty years. Two long decades. Is any of this speculative stuff actually occurring in our real lives today? Read what Vinge wrote, it’s all in the record, and here it is for you below, nice and handy. “Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.” That’s what Vinge wrote. “Singularity” doesn’t mean neat-o Google cars or stock-buying algorithms, Singularity means what Vinge said it means.
Don’t soft-shoe around in the tall weeds while you mutter the shibboleths of the 1990s. We’re supposed to be two-thirds of the way there by now. The AI Rapture’s supposed to be all over us here in the 2010s, annihilating humanity with superhuman, conscious intelligence. At the latest, when you’re ten years older than today.
You truly expect that to happen? You wrote that in your will, you’re all prepped for that with the canned water and AI-killing survival EMP guns? You’re good to go with the post-human era in this decade? A kid who’s ten now, he’ll never see twenty-one? You’re all righteously upset when skeptics say that’s hokum?
Be honest. Vinge was honest. He was wrong in his dark suspicions 20 years ago, but at least he was confronting evidence and trying to say some useful, provocative things that were falsifiable. We’re lucky we’ve got the “glut of technical riches, never properly absorbed.” If it had been otherwise, we wouldn’t be here.
What do you think? Let us know in the comments!

TOP IMAGE: Wikipedia Commons

By 33rd Square