“Wow," the empty air finally said. "Wow. That puts a pretty different perspective on things, I have to say. I'm going to remember this the next time I feel an impulse to blame myself for something. Neville, the term in the literature for this is 'egocentric bias', it means that you experience everything about your own life but you don't get to experience everything else that happens in the world. There was way, way more going on than you running in front of me. You're going to spend weeks remembering that thing you did there for six seconds, I can tell, but nobody else is going to bother thinking about it. Other people spend a lot less time thinking about your past mistakes than you do, just because you're not the center of their worlds. I guarantee to you that nobody except you has even considered blaming Neville Longbottom for what happened to Hermione. Not for a fraction of a second. You are being, if you will pardon the phrase, a silly-dilly. Now shut up and say goodbye.”
Eliezer Yudkowsky“The purpose of a moral philosophy is not to look delightfully strange and counterintuitive or to provide employment to bioethicists. The purpose is to guide our choices toward life, health, beauty, happiness, fun, laughter, challenge, and learning.”
Eliezer Yudkowsky

“I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.”
Eliezer Yudkowsky

“We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.”
Eliezer Yudkowsky

“I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.”
Eliezer Yudkowsky

“When you think of intelligence, don't think of a college professor; think of human beings as opposed to chimpanzees. If you don't have human intelligence, you're not even in the game.”
Eliezer Yudkowsky

“Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.”
Eliezer Yudkowsky

“The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.”
Eliezer Yudkowsky

“Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.”
Eliezer Yudkowsky

“There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.”
Eliezer Yudkowsky

“Anything that could give rise to smarter-than-human intelligence - in the form of Artificial Intelligence, brain-computer interfaces, or neuroscience-based human intelligence enhancement - wins hands down beyond contest as doing the most to change the world. Nothing else is even in the same league.”
Eliezer Yudkowsky