1988 V. Vinge Threats & Other Promises 254
Ham’s eyes were drawn to the stone carving (now bluish green) that sat on Larry’s desk. The old man nodded. ‘It’s an ansible.’ ‘Surely they don’t call it that!’ ‘No. But that’s what it is.’
1986 V. Vinge Marooned in Realtime 28
‘We can’t reproduce the most advanced of our own devices. When those finally break, we’ll be as helpless as you.’ ‘I thought your autons were good for hundreds of years.’
Ibid. 29
If we can’t, if we fall to a premachine age when our autons fail…we’ll be just too primitive and too few to survive.
1992 V. Vinge Fire upon Deep (1993) 68
At three-tenths lightspeed, Pham spent decades in coldsleep getting from star to star, then a year or two at each port trying to make a profit with products and information that might be lethally out-of-date.
2010 V. Vinge Preliminary Assessment of the Drake Equation in Year’s Best SF 16 (2011)
‘Somebody scare up a definition of the Drake equation and let’s supply some answers.[’] Now if we’d been back on Earth, I’m sure everyone would’ve had that definition instantly. Groundhogs don’t appreciate the solitude of deep space. In deep space, you don’t have an instant link to the Internet. It can take hours or days to get home. I take considerable satisfaction from this fact.
1981 V. Vinge True Names in Binary Star No. 5 179
As they fell deeper into the humid air of the lowlands, Mr. Slippery dipped into the news channels: word was already coming over the LA Times of the fluke accident in which the Hokkaido aerospace launching laser had somehow shone on MT3’s optics.
1983 V. Vinge First Word in Omni Jan. 10/2
We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between—and progress enough so that the world remains intelligible.
2001 V. Vinge in Locus Jan. 69/1
In 1982, on a scientific panel, it occurred to me that with all these ideas I had been talking about and others had been talking about earlier, basically if we ever did get a human-equivalent intelligence, very soon things would be very different. Since the creative impetus would not be from us, it would be in some sense unknowable… This panel was the first time I used the term ‘technological singularity’. I did a 900-word piece about that in ’83 in Omni, and almost all my science fiction has been dealing with it. In ’92 or ’93, NASA asked me to come and give a talk on it, and I did an essay where I also tried to look at precursors. In John von Neumann’s obituary written by Stan Ulam, he relates a conversation he had with von Neumann in which he even uses the term ‘essential singularity’. To me, von Neumann’s notion was not quite the same thing. He was saying that technological progress would become so advanced that the situation would be unknowable—that much is like what I was saying. But to me, the fundamental reason for the technological singularity is, technology creates something that is smarter than human.