Two days ago, Truthdig published my review of Eliezer Yudkowsky and Nate Soares’ book If Anyone Builds It, Everyone Dies. Please read and share it, if you have the time.
Thank you Émile for liberating me from ever taking any of these wankers seriously ever again. These are deeply disturbed people who should not be in charge of anything ever.
I sincerely expected something serious, but all of it seems like unnecessary dunking on EY. We know what he meant by Loss Function, and he's obviously not advocating for murdering children (he explicitly said so in the comment, too).
We don’t know what he meant by “Loss Function,” though, and his idea of it is not what it actually is in computer science. And sure, he’s not outright saying we should murder children, but claiming that children between certain ages matter no more than what you eat shows that, morally speaking, he wouldn’t be opposed to it based on his own logic.
What do you think he meant by "loss function"? The example of murdering children illustrates that he seriously entertains views that are *way outside* the bounds of what, say, virtually every moral philosopher would describe as moral acceptability. Such examples show, I think, that Yudkowsky is not a good thinker -- and certainly not a paragon of rationality. Much of his thinking is muddled, sophomoric, and at times outrageously absurd. That's my take, at least!! :-)
I personally think that Yudkowsky is a deluded idiot with a few systematic deficiencies which skew his thinking on almost every issue; he's done untold indirect harm to many of my personal acquaintances by creating a cult of autism-for-autistics.
However, most of your complaints here reiterate some flavor of "You're not allowed to have opinions on [big important idea] unless you get enough good boy points from [institution that happens to agree with me on a given issue] or [general public]."
It seems silly and a little intellectually dishonest to take this view these days, in an era when the failures of academia are so dire that the general public elects politicians who make its overthrow part of their platform. The replication crisis is not solely a statistical phenomenon or piece of trivia!
So your saying "These other people whom I trust don't like Yudkowsky" isn't much of a value-add on your part, and reads more as character assassination or sour grapes. Against your hyperlinks Yudkowsky could stack up the millions of dollars he's moved around by winning the trust of industry leaders, and the professional philosophers like Will MacAskill and Steven Pinker who have at the very least associated with him.
If I were to criticize Yudkowsky myself, I'd root the discussion in those systematic flaws in his reasoning. Fundamentally, he's an autistic man who denies that autism is a disability. His self-help program (Rationality) provides a series of very useful tools for lightly autistic people to ameliorate their deficiency of social intuition, but by no means fully compensates for it. One of the main examples in your post (his statements about animals and children) betrays the under-developed theory of mind characteristic of autism. This is a complaint that could be engaged with on intellectual grounds, rather than defaulting to an institutional dick-measuring contest.
Well Émile I read this whole article as well as your post on Truthdig, and I can definitely say I enjoyed both of them! I was definitely looking forward to something like this since Yudkowsky and Soares’s book came out.
After reading both pieces, it actually kind of disturbs me how much people listen to the drivel put forth by Yud. Not only did I watch that clip from the Lex Fridman interview but I also looked at the YT video itself to read the comments, and man was I disappointed! Just praise after praise, with people calling him a “genius” or “He trying to dumb down this stuff for us, what an amazing guy” or, more egregiously, “Nobody is listening to this guy and we’re gonna be sorry”. One of the top comments was a guy saying that “nobody knows anything about the current state of AI and my wife is pregnant with our son and I don’t know if he’ll have a future.” And of course, thanks to people like Fridman and Ezra Klein (whose interview with Yud in the Times has equally stupid comments), Yud is now being brought into the mainstream, and people blindly listen to him without understanding who he is, what he stands for, or even that many of the current AI people, from Kokotajlo to Sutskever to Altman to everyone in Silicon Valley, were influenced by him. If anything, Yud is most responsible for starting this stupid AGI/ASI race!
It truly shocks me how much people love him and his ideas, most of which are just stolen from the OGs of Sci-Fi. His whole thing of “why have a regular person when you can have an enhanced person?” is lifted straight from Gattaca. But the truly troubling thing about him is how much he pushes this whole doomer thing: telling people they shouldn’t have happy thoughts because nothing matters since we’re definitely building an ASI soon. What if, and I hate to say this, a young person, maybe 14-15, decides to end their life because of what Yud has said? I think that’s something we need to ask and something Yud needs to acknowledge. But the worst thing is, I don’t think he, the TESCREAL movement, or anyone on those stupid AI Reddit subs would even care. And that’s a problem.
Also Émile, I hope you don’t mind me asking this, but do I comment on your stuff too much? I’m that guy who sent you a DM a little bit back, and I’m worried I overstepped a little. I just find your work fascinating and love trying to interact with like-minded people. I think it’s nice that I live in a time where I can contact a philosopher I like and they’ll respond to me. Anyway, sorry for rambling, and good work as always.
Bullshit stinks no matter how you cook it.
Correct. :-) :-)
Actually, the first time I encountered “diamondoid” (and similar terms, like “corundumoid”) was in a shared worlds/creative writing/future history thing I found online called Orion's Arm, which had folks like Anders Sandberg as its early contributors. I wonder how much of that influenced or was influenced by Yudkowsky.
Also, an XML-like language for AI development? Yeah, it's called Lisp, which was more concise (even if you had to deal with more parentheses than people were comfortable with).
Very interesting! (I hadn't seen the word "corundumoid" before.) Thanks for sharing. :-)
Spot on. His unwavering belief that he's an "absolute genius" and the only one who can "save the world" really highlights the terrifying ego at the core of these ideas, which you've brilliantly identified as a major bug in the system.
Really appreciate this, I needed this breakdown. I've been in a constant state of confused revulsion after hearing him claim on Sam Harris's podcast that we can know for certain that the galaxy is empty of all life, because we are still alive and no alien AI has destroyed the world already. You say he has an ego the size of Jupiter - that may be a huge underestimate; his ego is the size of the galaxy at least.
You can immediately tell that Yud is full of it by how he dresses - the average guy on the street has figured out that how you dress matters. So he is either too dumb to realize that the fedora is actively working against his noble quest to save humanity or, more likely, he realizes a certain kind of overly online dork will see themselves in a fedora-wearing misunderstood supergenius and that's a monetizable opportunity.
How should he dress in order for you not to think that he "is full of it"?
Hey Émile, I linked to this piece in a book review I just posted of Paul Kingsnorth's "Against The Machine". https://egghutt.substack.com/p/blessed-are-the-barbarians
I appreciated Paul's critique of technological society and his recommendations for resistance, but he went off track when he started quoting doomers like Yudkowsky. I'm glad I had this piece of yours to reference in my response!