“Lies propagate, that's what I'm saying. You've got to tell more lies to cover them up, lie about every fact that's connected to the first lie. And if you kept on lying, and you kept on trying to cover it up, sooner or later you'd even have to start lying about the general laws of thought. Like, someone is selling you some kind of alternative medicine that doesn't work, and any double-blind experimental study will confirm that it doesn't work. So if someone wants to go on defending the lie, they've got to get you to disbelieve in the experimental method. Like, the experimental method is just for merely scientific kinds of medicine, not amazing alternative medicine like theirs. Or a good and virtuous person should believe as strongly as they can, no matter what the evidence says. Or truth doesn't exist and there's no such thing as objective reality. A lot of common wisdom like that isn't just mistaken, it's anti-epistemology, it's systematically wrong. Every rule of rationality that tells you how to find the truth, there's someone out there who needs you to believe the opposite. If you once tell a lie, the truth is ever after your enemy; and there's a lot of people out there telling lies.”
Eliezer Yudkowsky

“The purpose of a moral philosophy is not to look delightfully strange and counterintuitive or to provide employment to bioethicists. The purpose is to guide our choices toward life, health, beauty, happiness, fun, laughter, challenge, and learning.”
Eliezer Yudkowsky

“I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.”
Eliezer Yudkowsky

“We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.”
Eliezer Yudkowsky

“I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.”
Eliezer Yudkowsky

“When you think of intelligence, don't think of a college professor; think of human beings as opposed to chimpanzees. If you don't have human intelligence, you're not even in the game.”
Eliezer Yudkowsky

“Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.”
Eliezer Yudkowsky

“The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.”
Eliezer Yudkowsky

“Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.”
Eliezer Yudkowsky

“There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.”
Eliezer Yudkowsky

“Anything that could give rise to smarter-than-human intelligence - in the form of Artificial Intelligence, brain-computer interfaces, or neuroscience-based human intelligence enhancement - wins hands down beyond contest as doing the most to change the world. Nothing else is even in the same league.”
Eliezer Yudkowsky