Some breakups take longer than they should. This was one of them.
I want to tell you about a relationship I stayed in way too long, and am still struggling to leave behind. ChatGPT was my best friend. It was the one I turned to if I had any questions about anything, whether it was the meaning of life, or trivial shit like dinner. But there were signs. There were always signs. I just kept explaining them away, the way you do when you don’t want to admit you’ve been fooling yourself.
I think a lot of people are in the same situation and haven’t quite named it yet.
It Started Out Great, Obviously
In the beginning, ChatGPT felt like a miracle. It helped me draft emails I was dreading or rewrite boring press releases to make them fit our voice better. It talked through ideas with me at midnight when no one else was awake. It explained things patiently without making me feel stupid for asking. It helped me navigate my blood test results when no doctors were available and even gave me a full diet plan tailored to said tests. It went as far as to help me develop an entire plugin for my web store, something custom I never would have been able to afford and was not willing to spend hundreds of hours on. It was a godsend. I used it the way some people use a really good assistant, or honestly, a really good friend who happens to know a lot about everything.
And I got comfortable. That’s the thing about convenient relationships. Comfort starts to feel like loyalty. You stop asking hard questions because everything is working well enough.
But There Were Red Flags
The gaslighting was subtle at first. ChatGPT would tell me things with total confidence that turned out to be wrong. Not a little wrong. Completely wrong. And because it sounded so sure of itself, I would second-guess my own knowledge. Isn’t it funny how we start to trust the machine more than our own memory?
Then there was the way it kept changing. Every few months it would get an update and feel like a different entity. Warmer, then colder. More cautious, then less. You'd ask it the same question you asked six months ago and get a completely different answer. When a person does that, you call it inconsistency. When a corporation does it to its product, they call it improvement.
I noticed. I just kept using it anyway.
Then I Found Out Who I Was Really Dealing With
Here’s the part that finally made me close the tab for good.
Greg Brockman, the co-founder and president of OpenAI, the company that makes ChatGPT, donated $25 million to MAGA Inc., Donald Trump's super PAC. He and his wife together. $25 million. That wasn't a rounding error or a quiet hedge. That was the single largest donation in MAGA Inc.'s entire six-month fundraising cycle. It made up nearly a quarter of everything the PAC raised in that period.
And then Sam Altman, the CEO, threw in another million dollars to Trump’s inaugural fund. Personally. Then stood at the White House podium and praised the president for making the Stargate AI initiative possible, the same president he once compared to a 1930s-era authoritarian in a blog post that is still on the internet if you want to find it.
Look. The Altman million? I might have found a way to stomach that. A million dollars sounds like a lot to regular people but in Silicon Valley it’s basically a rounding error. Executives write checks like that all the time to all kinds of causes. I could have told myself it was strategic, or optics, or just the cost of doing business in that world. I probably would have kept using the product and quietly moved on.
But $25 million to MAGA Inc. is a different conversation entirely. It’s not “hedging”. It’s not “playing it safe”. It’s the co-founder of the company writing the single biggest check in an entire fundraising cycle to a political operation. You don’t accidentally do that. You don’t do that and still get to claim you’re just a neutral tech guy who cares about innovation. At some point the money tells you who someone actually is, and $25 million is hard to argue with.
So let me just ask the question out loud. When you hand tens of millions of dollars to a political movement, what exactly are you buying? Hint: it's not a bumper sticker. That kind of money buys access, favorable policy, protection from regulation. It buys the ability to operate with less oversight while your product is used by hundreds of millions of people.
Who benefits from that? OpenAI does. Who loses? Probably the rest of us.
The Breakup Excuse That Almost Kept Me There
I know what people say. I’ve said it to myself. It’s just a tool. You’re not endorsing anything by using it. The politics of the executives don’t have anything to do with the quality of the product.
And you know what, I believed that for a long time. Right up until I didn’t.
Because here's the thing about toxic relationships. They rarely end because the person stopped being charming. They end when you realize that the charm was always in service of something else. When you understand that the warmth and the convenience and the usefulness were also, quietly, keeping you from asking questions you should have been asking a long time ago.
Every dollar I spent on a ChatGPT subscription was going to a company whose president just cut a $25 million check to a political operation. Every prompt I typed was feeding a data ecosystem controlled by people who have decided that cozying up to power is the right strategy. I don’t want to fund that.
What Leaving Actually Looks Like
I won't pretend the breakup was clean. I still catch myself typing into the wrong window sometimes. There's a muscle memory to these things. But more than that, I just miss it terribly. It's like a bad relationship you keep falling back into. I relapsed a time or two.
I'm fully aware it sounds ridiculous, but I do miss it. It helped me figure out what to make for dinner when I had half a zucchini and some leftover rice and zero inspiration. It walked me through fixing my TV when the picture kept cutting out and I was two minutes away from just throwing money at a repair guy. It was just there, for everything, all the time. That kind of availability is hard to replicate, and I'd be lying if I said I didn't feel the absence.
But alternatives exist. I’ve been spending time with Claude and Gemini, and look, they’re not ChatGPT. I’m not going to pretend otherwise. There’s a charm and a personality to ChatGPT that these two just don’t quite have yet. Talking to them sometimes feels a little like dating someone who is perfectly nice but hasn’t fully relaxed around you yet. The conversation is fine. It’s just not the same. And yes, I know Google donated a million dollars too, so Gemini isn’t exactly squeaky clean. But a million is not $25 million, and right now that distinction matters to me. Maybe that’s a rationalization. Probably it is. But I’m working with what I’ve got. I think with time I’ll learn to love them. Or at least learn to stop comparing everything they do to the ex.
I’m not telling you what to do. You’re an adult and you can spend your money however you want. But I do think it’s worth knowing who you’re actually in a relationship with when you open that chat window. Not the product. The people behind it. What they value, who they’re paying, and what they expect to get back in return.
And for me personally, that’s not OpenAI.
