the transition to posthumanity
There wasn’t even a theoretical basis for faster-than-light travel. We never did invent hyperdrive, if you'll recall. We'd never have discovered it by accident, either, because we'd never have thought to do our experiments out beyond the singularity.
We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between—and progress enough so that the world remains intelligible.
in Omni Jan. 10/2
Have you ever heard of Singularity?… Singularity is a time in the future as envisioned by Vernor and Vinge [sic]. It'll occur when the rate of change of technology is very great—so great that the effort to keep up with the change will overwhelm us.
in Analog Science Fiction/Science Fact Apr. 12/1
Burnt out, a giant star collapses into a form so dense, infinitely dense at the core singularity, that light itself can no longer escape its grip.
Inconstant Star in L. Niven et al. Man-Kzin Wars III (1992) 241
Not even the singularity was pure enough to typify true blackness.
What Continues… & What Fails… in D. Brin Otherness (1994) 331
You’ve all seen the effects. Those affected by this illness are often found talking excitedly about the ‘Singularity’ and how pointless it is to work in ‘doomed’ industries like microelectronics (‘B-o-o-o-ring!’) and genetic engineering (‘Obsolete’).
in Science Fiction Eye (1994) Spring 36/1
We’ll make a run for the singularity shear and hope their phaser targeting is thrown out of calibration.
We must work towards being able to control, or at least contain, their development. The same goes for any form of artificial intelligence capable of improving itself. We will do it. The day will come when we control the Singularity, as we've learned to control the flame on the hearth, the lightning of the sky, and the nuclear fire of the stars!
Stone Canal (1997) 300
Around 2050, or maybe even 2030, is when an upheaval unprecedented in human history—a technological singularity, as it’s been termed—is expected to erupt.
You find out it was just a subspace echo that got bounced through a singularity from one side of the quadrant to the other.
War Dragons ii. 30
I was thinking a nearby singularity, or perhaps the fading wavefront from a long-ago supernova.
War Dragons xiii. 182
In 1982, on a scientific panel, it occurred to me that with all these ideas I had been talking about and others had been talking about earlier, basically if we ever did get a human-equivalent intelligence, very soon things would be very different. Since the creative impetus would not be from us, it would be in some sense unknowable… This panel was the first time I used the term ‘technological singularity’. I did a 900-word piece about that in ’83 in Omni, and almost all my science fiction has been dealing with it. In ’92 or ’93, NASA asked me to come and give a talk on it, and I did an essay where I also tried to look at precursors. In John von Neumann’s obituary written by Stan Ulam, he relates a conversation he had with von Neumann in which he even uses the term ‘essential singularity’. To me, von Neumann’s notion was not quite the same thing. He was saying that technological progress would become so advanced that the situation would be unknowable—that much is like what I was saying. But to me, the fundamental reason for the technological singularity is, technology creates something that is smarter than human.
in Locus Jan. 69/1
Earliest cite: in an Omni article by Vernor Vinge
Research History
Treesong submitted a 2001 cite from a Locus interview with Vernor Vinge.
Treesong submitted a 1983 cite from an Omni article also by Vinge.
Mikael Johansson submitted a 1997 cite from Damien Broderick's "The Spike".
Malcolm Farmer submitted a cite from an anonymous author in the Spring 1994 SF Eye.
Malcolm Farmer suggested and Mike Christie located a cite from a 1997 reprint of Ken MacLeod's 1996 "The Stone Canal".
Malcolm Farmer submitted a 1989 cite from Mark Stiegler's "The Gentle Seduction".
Malcolm Farmer submitted a cite from a 2000 reprint of Stewart Brand's 1999 "The Clock of the Long Now".
Last modified 2021-01-12 01:51:10
In the compilation of some entries, HDSF has drawn extensively on corresponding entries in OED.