Eliezer S. Yudkowsky (/ˌɛliˈɛzər ˌjʌdˈkaʊski/ EH-lee-EH-zər YUD-KOW-skee; born September 11, 1979) is an American artificial intelligence researcher and writer on decision theory and ethics, best known for popularizing ideas related to friendly artificial intelligence, including the idea of a "fire alarm" for AI. He is a co-founder and research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion influenced philosopher Nick Bostrom's 2014 book Superintelligence: Paths, Dangers, Strategies.
According to the PR model, yudkowsky.net is ranked 117,433rd among sources cited across all language editions of Wikipedia; in English Wikipedia in particular, it is ranked 81,572nd.
In the BestRef global ranking of the most important sources cited on Wikipedia, the website is placed just ahead of eroski.es and just behind 1book.co.jp.