the transition to posthumanity
There wasn’t even a theoretical basis for faster-than-light travel. We never did invent hyperdrive, if you’ll recall. We’d never have discovered it by accident, either, because we’d never have thought to do our experiments out beyond the singularity.
We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between—and progress enough so that the world remains intelligible.
Have you ever heard of Singularity?… Singularity is a time in the future as envisioned by Vernor and Vinge [sic]. It'll occur when the rate of change of technology is very great—so great that the effort to keep up with the change will overwhelm us.
Burnt out, a giant star collapses into a form so dense, infinitely dense at the core singularity, that light itself can no longer escape its grip.
Not even the singularity was pure enough to typify true blackness.
You’ve all seen the effects. Those affected by this illness are often found talking excitedly about the ‘Singularity’ and how pointless it is to work in ‘doomed’ industries like microelectronics (‘B-o-o-o-ring!’) and genetic engineering (‘Obsolete’).
We’ll make a run for the singularity shear and hope their phaser targeting is thrown out of calibration.
We must work towards being able to control, or at least contain, their development. The same goes for any form of artificial intelligence capable of improving itself. We will do it. The day will come when we control the Singularity, as we've learned to control the flame on the hearth, the lightning of the sky, and the nuclear fire of the stars!
Around 2050, or maybe even 2030, is when an upheaval unprecedented in human history—a technological singularity, as it’s been termed—is expected to erupt.
You find out it was just a subspace echo that got bounced through a singularity from one side of the quadrant to the other.
I was thinking a nearby singularity, or perhaps the fading wavefront from a long-ago supernova.
In 1982, on a scientific panel, it occurred to me that with all these ideas I had been talking about and others had been talking about earlier, basically if we ever did get a human-equivalent intelligence, very soon things would be very different. Since the creative impetus would not be from us, it would be in some sense unknowable… This panel was the first time I used the term ‘technological singularity’. I did a 900-word piece about that in ’83 in Omni, and almost all my science fiction has been dealing with it. In ’92 or ’93, NASA asked me to come and give a talk on it, and I did an essay where I also tried to look at precursors. In John von Neumann’s obituary written by Stan Ulam, he relates a conversation he had with von Neumann in which he even uses the term ‘essential singularity’. To me, von Neumann’s notion was not quite the same thing. He was saying that technological progress would become so advanced that the situation would be unknowable—that much is like what I was saying. But to me, the fundamental reason for the technological singularity is, technology creates something that is smarter than human.
in an Omni article by Vernor Vinge
Last modified 2021-01-12 01:51:10
In the compilation of some entries, HDSF has drawn extensively on corresponding entries in OED.