The world's end has been imminent since at least the time of Jesus (himself an apocalypticist). Here's how AGI true believers are preparing for the techno-rapture.
Firstly, Émile, hope you had a great Christmas/whatever holiday you personally choose to celebrate!
Secondly, this post is exactly why I don’t read posts/tweets/YT comments/news pieces or watch YT videos hyping up AI tech (or being Doomers like these guys): I can’t stand the people who actually think AGI/ASI is coming. I’ve said it before, but thinking about ASI as an “imminent problem that will kill us all” is such a stupid thing.
“Nah, I dropped out of college and spent all my money because the gooner bots created by racists and techno-fascists are definitely gonna morph into a God-like being somehow (never mind how much power it needs) and kill me and everyone else. That’s definitely more likely than Russia, the US, North Korea or Israel launching nukes and killing us all. Definitely.”
See how stupid that sounds.
The TESCREAL crowd sound like cultists when they speak and they're clearly having a harsh impact on some receptive minds.
I intend to keep pointing to useful pieces about the broader war against democracy and its fronts (e.g. the Russian invasion of Ukraine, the Trump regime attempting to solidify control in the US, tech CEOs influencing much of anything).
One thought building on your Millerite comparison: the “people stopped plowing their fields” moment is real, but it’s only the micro-phase. What feels missing is the institutional afterlife. The Millerite disappointment didn’t just dissolve—it re-encoded into Seventh-day Adventism, which went on to have outsized, often invisible influence in medicine, nutrition, education, and public health under the banner of “health” rather than prophecy. In that sense, the risk signal isn’t just short-term behavioral distortion when an end feels near, but what happens when apocalyptic expectation hardens into durable institutions that outlive the belief itself. That longer arc feels especially relevant to contemporary AI discourse.
Fantastic point! Yes, I totally agree. Thanks so much for sharing this insight!
My prediction is humans will make predictions that fail to materialise. Events are something of a shaken up snow globe these days.
Rosebud.
Reading these quotes, and similar ones in the tech-bro sphere, I cannot help but be reminded of Alan Watts’ astute declaration: “No valid plans for the future can be made by those who have no capacity for living now.”
Almost all of us have lost this capacity because of a cultural addiction to words and symbols (mistaking the menu for the meal—another one of Watts’ apt aphorisms), but pair this mental habit with engineer’s disease, and you find yourself in a world of hurt.
It doesn’t have to be this way. There’s always time to smell some roses—or the air after a summer rain.
I do appreciate your cataloging the various forms of psychic disturbance oriented around something for which no one can quite give a reasoned account of why we would assume it possible.
Lol!
In this article I really like the parallel between the Millerites not plowing fields and tech bros dropping out of college. It is typically bad to structure your life around speculative future events.
Although for this instance specifically I think that the possibility of AGI does FEEL more tangible given some of the recent thresholds being crossed.
The continual improvement of AI images over the last few years (albeit through stolen data) has been appalling. I find it incredibly hard to parse some images/videos from real ones, while the situation we find ourselves in was unthinkable 5 years ago. If another more substantial image generator is produced, I am not sure I would be able to discern it at all. And this may also be happening to music: https://arxiv.org/html/2506.19085
I also think this is especially worrying given the rapidly increasing amount of AI-generated internet content: https://graphite.io/five-percent/more-articles-are-now-created-by-ai-than-humans
I wonder how much of what I read is just ChatGPT; I was not asking that in 2023.
Although the TESCREAL posters screencapped in this post are far too certain in their views and timelines.
I don’t know how representative this is of Gen Z but if it is, I now understand what’s ailing them.
Timely! Faggela's vision: what's the actual human cost? Insightful, author.
This is your ultimate tech apocalypse. I didn't intend to write this article now, but it forced itself to the surface. By the pricking of my thumbs, something evil this way comes.
https://systemshaywire.substack.com/p/the-advent-of-the-machine-god