Singularity Summit 2011: Ray Kurzweil Main Presentation

  • Extraett
  • uploaded: Oct 27, 2011
  • Hits: 359

Description:

The Singularity represents an "event horizon" in the predictability of human technological development: following the creation of strong AI or the enhancement of human intelligence, present models of the future may cease to give reliable answers.

A number of noted scientists and technologists have predicted that after the Singularity, humans as we exist today will no longer drive technological progress, and models of change based on past trends in human behavior will become obsolete.

In the 1950s, the mathematician John von Neumann was paraphrased by fellow mathematician Stanislaw Ulam as saying, "The ever-accelerating progress of technology...gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."

In 1965, statistician I.J. Good described a concept similar to today's meaning of the Singularity, in "Speculations Concerning the First Ultraintelligent Machine":

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.

The concept was solidified by mathematician and computer scientist Vernor Vinge, who coined the term "technological singularity" in an article for Omni magazine in 1983, followed by a science fiction novel, Marooned in Realtime, in 1986. Seven years later, Vinge presented his seminal paper, "The Coming Technological Singularity," at a NASA-organized symposium. Vinge wrote:

What are the consequences of this event? When greater-than-human intelligence drives progress, that progress will be much more rapid. In fact, there seems no reason why progress itself would not involve the creation of still more intelligent entities – on a still-shorter time scale.

In 2000, AI researcher Eliezer Yudkowsky and entrepreneur Brian Atkins founded the Singularity Institute to work toward smarter-than-human intelligence by engaging in Artificial Intelligence and machine ethics research. On the Institute's site, Yudkowsky states:

The Singularity is beyond huge, but it can begin with something small. If one smarter-than-human intelligence exists, that mind will find it easier to create still smarter minds. In this respect the dynamic of the Singularity resembles other cases where small causes can have large effects; toppling the first domino in a chain, starting an avalanche with a pebble, perturbing an upright object balanced on its tip. All it takes is one technology – Artificial Intelligence, brain-computer interfaces, or perhaps something unforeseen – that advances to the point of creating smarter-than-human minds. That one technological advance is the equivalent of the first self-replicating chemical that gave rise to life on Earth.

In 2005, inventor Ray Kurzweil released The Singularity is Near, in which he presented the Singularity as the culmination of an overall exponential (doubling) growth trend in technological development, illustrated in the short sketch after the quote below:

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one's view of life in general and one's own particular life.
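As a rough illustration of the doubling trend Kurzweil describes, the following minimal Python sketch models a capability that doubles every fixed interval. The function name capability and the two-year doubling period are illustrative assumptions for this sketch, not figures taken from the presentation.

# Minimal sketch of exponential (doubling) growth, assuming an
# illustrative doubling period of 2 years (not a figure from the talk).

def capability(years: float, doubling_period_years: float = 2.0, start: float = 1.0) -> float:
    """Capability after `years`, doubling once per `doubling_period_years`."""
    return start * 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    for year in (0, 10, 20, 40):
        # Each added decade multiplies capability by 2**5 = 32 under these assumptions.
        print(f"year {year:2d}: {capability(year):,.0f}x the starting capability")

The point of the sketch is simply that a constant doubling interval compounds: growth over any fixed span looks modest early on, then dwarfs everything that came before.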

While some regard the Singularity as a positive event and work to hasten its arrival, others view it as dangerous, undesirable, or unlikely. The most practical means of initiating the Singularity are debated, as is whether, and how, it could be influenced or avoided if it proves dangerous.

The Singularity Summit is the world's leading dialogue on the Singularity, bringing together scientists, technologists, skeptics, and enthusiasts alike. It was created to provide a much-needed forum to discuss the risks and opportunities presented by our expanding relationship with technology.


