If a triangle could speak, it would say... that God is eminently triangular, while a circle would say that the divine nature is eminently circular.

Baruch Spinoza

Eliezer Yudkowsky - Biography

Eliezer Shlomo Yudkowsky (born September 11, 1979) is an American artificial intelligence researcher concerned with the singularity and an advocate of friendly artificial intelligence, living in Redwood City, California.

Biography

Yudkowsky did not attend high school and is an autodidact with no formal education in artificial intelligence. He co-founded the nonprofit Singularity Institute for Artificial Intelligence (SIAI) in 2000 and continues to be employed as a full-time Research Fellow there.

Work

Yudkowsky's research focuses on Artificial Intelligence theory for self-understanding, self-modification, and recursive self-improvement (seed AI); and also on artificial-intelligence architectures and decision theories for stably benevolent motivational structures (Friendly AI, and Coherent Extrapolated Volition in particular). Apart from his research work, Yudkowsky has written explanations of various philosophical topics in non-academic language, particularly on rationality, such as "An Intuitive Explanation of Bayes' Theorem".

Publications

Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias, sponsored by the Future of Humanity Institute of Oxford University. In early 2009, he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality". The Sequences on Less Wrong, comprising over two years of blog posts on epistemology, artificial intelligence, and metaethics, form the largest body of Yudkowsky's writing.

He contributed two chapters to Oxford philosopher Nick Bostrom's and Milan Ćirković's edited volume Global Catastrophic Risks.

Yudkowsky is the author of the Singularity Institute publications "Creating Friendly AI" (2001), "Levels of Organization in General Intelligence" (2002), "Coherent Extrapolated Volition" (2004), and "Timeless Decision Theory" (2010).

Yudkowsky has also written several works of science fiction and other fiction. His Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality; The New Yorker described it as "a thousand-page online 'fanfic' text called 'Harry Potter and the Methods of Rationality', which recasts the original story in an attempt to explain Harry's wizardry through the scientific method". The story has been favorably reviewed both by author David Brin and by FLOSS programmer Eric S. Raymond.


Further reading

  • Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
  • The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265-272, 289, 321, 324, 326, 337-339, 345, 353, 370.

This information is published under GNU Free Document License (GFDL).