25 Comments
YourBonusMom:

Thank you, Émile, for liberating me from ever taking any of these wankers seriously again. These are deeply disturbed people who should not be in charge of anything.

James Ray:

I personally think that Yudkowsky is a deluded idiot with a few systematic deficiencies which skew his thinking on almost every issue; he's done untold indirect harm to many of my personal acquaintances by creating a cult of autism-for-autistics.

However, most of your complaints here reiterate some flavor of "You're not allowed to have opinions on [big important idea] unless you get enough good boy points from [institution that happens to agree with me on a given issue] or [general public]"

It seems silly and a little intellectually dishonest to take this view these days, in an era when the failures of academia are so dire that the general public elect politicians who make its overthrow a part of their platform. The replication crisis is not solely a statistical phenomenon or piece of trivia!

So you saying "These other people whom I trust don't like Yudkowsky" isn't much of a value-add on your part, and reads more as character assassination or sour grapes. Against your hyperlinks Yudkowsky could stack up the millions of dollars he's moved around by winning the trust of industry leaders, and the professional philosophers like Will MacAskill and Steven Pinker who have at the very least associated with him.

If I were to criticize Yudkowsky myself, I'd root the discussion on those systematic flaws in his reasoning. Fundamentally, he's an autistic man who denies that autism is a disability. His self-help program (Rationality) provides a series of very useful tools for lightly autistic people to ameliorate their deficiency of social intuition, but by no means fully compensates for it. One of the main examples in your post (his statements about animals and children) betrays the under-developed theory of mind characteristic of autism. This is a complaint that could be engaged with on intellectual grounds, rather than defaulting to an institutional dick-measuring contest.

Evan Wayne Miller:

Well Émile I read this whole article as well as your post on Truthdig, and I can definitely say I enjoyed both of them! I was definitely looking forward to something like this since Yudkowsky and Soares’s book came out.

After reading both pieces, it actually kind of disturbs me how much people listen to the drivel put forth by Yud. Not only did I watch that clip from the Lex Fridman interview, but I also went to the YT video itself to read the comments, and man was I disappointed! Just praise after praise, with people calling him a “genius”, or saying “He trying to dumb down this stuff for us, what an amazing guy”, or, more egregiously, “Nobody is listening to this guy and we’re gonna be sorry”. One of the top comments was a guy saying that “nobody knows anything about the current state of AI and my wife is pregnant with our son and I don’t know if he’ll have a future.” And of course, thanks to people like Fridman and Ezra Klein (whose interview with Yud in the Times has equally stupid comments), Yud is now being brought into the mainstream, and people blindly listen to him without understanding who he is, what he stands for, or even that many of the current AI people, from Kokotajlo to Sutskever to Altman to everyone in Silicon Valley, were influenced by him. If anything, Yud is most responsible for starting this stupid AGI/ASI race!

It truly shocks me how much people love him and his ideas, most of which are just stolen from the OGs of Sci-Fi. His whole “why have a regular person when you can have an enhanced person?” thing is lifted straight from Gattaca. But the truly troubling thing about him is how hard he pushes this whole doomer line: telling people they shouldn’t have happy thoughts because nothing matters since we’re definitely building an ASI soon. That’s awful. What if, and I hate to say this, a young person, maybe 14-15, decides to end their life because of what Yud has said? I think that’s something we need to ask and something Yud needs to acknowledge. But the worst thing is I don’t think he, the TESCREAL movement, or anyone on those stupid AI Reddit subs would even care. And that’s a problem.

Also, Émile, I hope you don’t mind me asking this, but do I comment on your stuff too much? I’m that guy who sent you a DM a little while back, and I’m worried I overstepped a little. I just find your work fascinating and love trying to interact with like-minded people. I think it’s nice that I live in a time where I can contact a philosopher I like and they’ll respond to me. Anyway, sorry for rambling, and good work as always.

paradox:

I sincerely expected something serious but all of it seems unnecessarily dunking on EY. We know what he meant by Loss Function, and he's obviously not advocating for murdering children (explicitly said this in the comment also).

Émile P. Torres:

What do you think he meant by "loss function"? The example of murdering children illustrates that he seriously entertains views that are *way outside* the bounds of what, say, virtually every moral philosopher would describe as moral acceptability. Such examples show, I think, that Yudkowsky is not a good thinker -- and certainly not a paragon of rationality. Much of his thinking is muddled, sophomoric, and at times outrageously absurd. That's my take, at least!! :-)

Evan Wayne Miller:

We don’t know what he meant by “loss function,” though, and his idea of it is not what the term actually means in comp-sci. And sure, he’s not outright saying we should murder children, but saying that children between certain ages matter morally about as much as the food you eat shows that, by his own logic, he wouldn’t be opposed to it.
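(For context on the comp-sci usage being disputed here: in machine learning, a “loss function” has a mundane, precise meaning: a number measuring prediction error, which training tries to minimize. A minimal sketch; the function name and example numbers are illustrative, not from this thread:)

```python
def mse_loss(predictions, targets):
    """Mean squared error: the average of squared prediction errors.

    Lower loss means the model's outputs are closer to the desired outputs;
    training algorithms adjust the model's parameters to minimize this number.
    """
    assert len(predictions) == len(targets)
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

# A model predicting [1.0, 2.0] when the true values are [1.0, 4.0]
# incurs a loss of (0^2 + 2^2) / 2 = 2.0.
print(mse_loss([1.0, 2.0], [1.0, 4.0]))  # 2.0
```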

Rocket Cat:

Bullshit stinks no matter how you cook it.

Émile P. Torres:

Correct. :-) :-)

cowboykiller:

You can immediately tell that Yud is full of it by how he dresses - the average guy on the street has figured out that how you dress matters. So he is either too dumb to realize that the fedora is actively working against his noble quest to save humanity or, more likely, he realizes a certain kind of overly online dork will see themselves in a fedora-wearing misunderstood supergenius and that's a monetizable opportunity.

Yvan:

How should he dress in order for you not to think that he "is full of it"?

T Kamal:

Actually, the first time I encountered “diamondoid” (and similar terms, like “corundumoid”) was this shared worlds/creative writing/future history thing that I found online called Orion's Arm, which had folks like Anders Sandberg as its early contributors. I wonder how much of that influenced or was influenced by Yudkowsky.

Also, XML-like language for AI development? Yeah, it's called Lisp, which was more concise (even if you needed to deal with parentheses more than people were comfortable with).

Émile P. Torres:

Very interesting! (I hadn't seen the word "corundumoid" before.) Thanks for sharing. :-)

Rainbow Roxy:

Spot on. His unwavering belief that he's an "absolute genius" and the only one who can "save the world" really highlights the terrifying ego at the core of these ideas, which you've brilliantly identified as a major bug in the system.

Steffan:

Really appreciate this, I needed this breakdown. I've been in a constant state of confused revulsion after hearing him claim on Sam Harris's podcast that we can know for certain that the galaxy is empty of all life, because we are still alive without an alien AI having destroyed the world already. You say he has an ego the size of Jupiter - that may be a huge underestimate; his ego is the size of the galaxy at least.

Thomas Hutt:

Hey Émile, I linked to this piece in a book review I just posted of Paul Kingsnorth's "Against The Machine". https://egghutt.substack.com/p/blessed-are-the-barbarians

I appreciated Paul's critique of technological society and his recommendations for resistance, but he went off track when he started quoting doomers like Yudkowsky. I'm glad I had this piece of yours to reference in my response!

Émile P. Torres:

Interesting ... will read Kingsnorth's piece later today! Thanks so much!

Émile P. Torres:

For whatever reason, I'm reminded (without having read the piece you linked to) that Yudkowsky's arguments are all strikingly similar to those made by, as I recall, Ted Kaczynski in his "Ship of Fools" article. Maybe I'll add that Kaczynski piece to my reading list for the day ... might be worth exploring in another newsletter article!

FionnM:

If two parents were conceiving a child using embryonic trait selection, do you think it would be sensible of them to stipulate that they wanted their child to be paralysed from the waist downwards?

FionnM:

I don't consider RationalWiki a reputable source, and I don't know why you do.

Hein de Haan:

"Yudkowsky calls himself a “decision theorist,” and yet has not made a single contribution to the field of decision theory — academics overwhelmingly think Yudkowsky’s own decision theory is fundamentally flawed (to the extent that most don’t even think it’s worth the time to “refute” it)."

What is your strongest argument against his decision theory? (I'm asking because the article you link, and the article IT links, rest on demonstrable errors.)

Sam Waters:

Why even bother calling your article for TruthDig a book review when most of it is not really about the book's contents, and instead focuses on comments the authors have made that are not directly connected to the arguments in the book and, in some cases, aren't even really on the topic of AI?

FionnM:

While you may have quoted Scott Alexander in order to mock Eliezer, Alexander has praised "Harry Potter and the Methods of Rationality" as a moving, powerful piece of literature on many occasions, even explicitly arguing that he thinks it's a superior work of literature to "The Great Gatsby" (https://slatestarcodex.com/2015/03/12/ot16-avada-threadavra/#comment-190053). (I can't comment myself, having never read "the Methods of Rationality".) I can't imagine Alexander would appreciate being quoted out of context.

FionnM:

The fact that Eliezer Yudkowsky wrote Harry Potter fanfiction has no bearing on whether or not his ideas have merit or deserve serious consideration. In addition to being a groundbreaking physicist and mathematician, Isaac Newton was an alchemist and occultist. That he was an alchemist and occultist does not invalidate Newtonian mechanics.
