Founders Behaving Badly
stay out of it, [insert name here]

First, a history lesson
There’s an iconic moment in One Tree Hill when one of the characters gets to tell Nick Lachey to STFU. (If only we were all so lucky!!)
I think about this line a lot when someone has an opinion that is so monumentally stupid, so pointless, so asinine that I wish I could tell them to butt out, just like I would to Nick Lachey.
I’m a Sam Altman hater
Have I ever said that in this newsletter before?
So when a friend shared this snippet with me from an answer Altman gave at an impact summit, I was like, yes. I am happy to continue being a Sam Altman hater. I have chosen wisely.
“People talk about how much energy it takes to train an AI model. But it also takes a lot of energy to train a human. It takes about 20 years of life — and all the food you consume during that time — before you become smart.”
But I’m not here in your inbox just to dunk on Sammy for this, frankly, fucking stupid POV. Because I’m not that kind of lady, you understand?
I’m here because I think it’s about time we talked about TESCREAL.
TESCREALism
(Sorry, I’m about to give you some homework.)
TESCREAL is an acronym that encompasses quite a few popular and influential ideologies in Silicon Valley: Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, Longtermism. Lots of -isms!
I definitely recommend reading the full interview transcript and then screaming into a pillow, but here’s the gist:
The key idea at the very heart of the TESCREAL worldview is this techno-utopian vision of the future whereby we develop advanced technologies that enable us to radically reengineer the human organism to create a new post-human species — so-called transhumanism. We will spread beyond Earth, colonize Mars and then the rest of the galaxy, and ultimately create vast computer simulations full of trillions and trillions of digital people.
And the whole reason for doing that is that the TESCREAL worldview instructs us to maximize the total amount of value within the universe. People are the containers or the substrates of value. So the more people you have, the greater total amount of value you could potentially create.
Translation: People are tools for creating value. They are nothing more than stepping stones for the ultimate end-goal, which is…creating maximized value.
And colonizing Mars or whatever.
Look, I can’t claim to understand completely. Because I think it’s bonkers and it clashes with my worldview so completely that I think I’m blue-screening. But I’m sharing this because it provides some necessary context for why leaders like Altman would say something so…dehumanizing.
Because many tech leaders with billions of dollars and outsized influence don’t actually care about the human experience—they’re more interested in the transhumanist one. Their idea of a utopia doesn’t center humans; it centers technology.
(And our governments are buying into the hype, too! Not to be all charlie-day-red-string.jpg, but consider for a sec how JD Vance and Peter Thiel are connected. I could keep going, but I won’t, for your sake.)
So who, then, is thinking about the humans? Sorry friends, I think it’s us.
As always,
ALLEGEDLY 💋
Hey, I’m Amber! I’m a freelance content marketer/consultant and aspiring independently wealthy heiress. You can find me on LinkedIn, Medium, Bluesky, and my website. If you really like what I’m doing, you can buy me a coffee.