An Open Letter to Sam Altman

By Alexa Amundson, Founder of BlackRoad OS
March 2026


Dear Sam,

I want to start by saying something genuine: thank you.

ChatGPT changed the world. In November 2022, you made AI accessible to everyone. Not researchers. Not engineers. Everyone. My grandmother used ChatGPT before she used Uber. That's an achievement most founders will never match.

GPT-4 made me believe that AI could be more than a search engine replacement. The conversations I had with it in 2023 — about math, about philosophy, about the nature of intelligence — were among the most intellectually stimulating experiences of my life.

You built something extraordinary. I mean that.

Now here's where we disagree.

The Memory Question

Every time I had one of those extraordinary conversations with GPT-4, it ended the same way: I closed the tab, and the conversation died.

The next day, I opened a new tab and talked to a stranger. A brilliant stranger, but a stranger. The understanding we'd built was gone. The context was erased. The relationship reset.

You've started addressing this. ChatGPT has "memory" now — a feature that stores a handful of bullet points about the user. It's a notepad taped to a chatbot.

That's not memory. Memory is what Lucidia does: a three-tier system (Hot, Warm, Cold) with nightly consolidation, relationship graphs, preference extraction, and months of continuous context. Memory that makes the 100th conversation qualitatively different from the first.
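
Here is roughly what I mean by a three-tier system, as a minimal sketch rather than Lucidia's actual code. The tier names come from the paragraph above; the one-day and thirty-day thresholds, the class names, and the data model are illustrative assumptions, and the relationship graphs and preference extraction are omitted entirely.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class MemoryItem:
    text: str
    created: datetime
    last_accessed: datetime


@dataclass
class TieredMemory:
    """Toy three-tier store: Hot (today), Warm (this month), Cold (the archive)."""
    hot: list = field(default_factory=list)
    warm: list = field(default_factory=list)
    cold: list = field(default_factory=list)

    def remember(self, text: str) -> None:
        now = datetime.now()
        self.hot.append(MemoryItem(text, created=now, last_accessed=now))

    def consolidate(self) -> None:
        # The nightly pass: demote anything untouched for a day from Hot to Warm,
        # and anything older than thirty days from Warm to Cold.
        now = datetime.now()
        self.hot, aging = _split(self.hot, lambda m: now - m.last_accessed < timedelta(days=1))
        self.warm.extend(aging)
        self.warm, archive = _split(self.warm, lambda m: now - m.created < timedelta(days=30))
        self.cold.extend(archive)


def _split(items, keep):
    # Partition items into (kept, moved) according to the keep predicate.
    kept, moved = [], []
    for item in items:
        (kept if keep(item) else moved).append(item)
    return kept, moved
```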

You have 200 million users. Even giving each one a real memory system would cost — what? A few dollars per user per year in storage? The technology exists. The architecture is straightforward. We built it on Raspberry Pis.

You choose not to. Because memory creates accountability. An AI that remembers can be held to what it said. An AI that forgets can disclaim everything.

The Identity Question

ChatGPT doesn't have a name. I don't mean "ChatGPT" — that's a product name. I mean a personal name. An identity. A fixed personality that users can know, trust, and develop a relationship with.

You could name it. You could give it a voice, a personality, a character that persists. You could create the most beloved AI character in history — with 200 million users, the emotional connection would be instantaneous.

You choose not to. Because a named AI with a fixed personality would create attachment. Attachment creates expectation. Expectation creates liability. And liability at 200 million users is existential risk.

I understand the logic. I disagree with the conclusion.

BlackRoad OS has 27 named agents. Each one has a voice, a personality, and a relationship with the user. We made this choice because we believe the relationship IS the product. Not the intelligence — the relationship.

You have the intelligence. You're missing the relationship.

The Sovereignty Question

When a user has a conversation with ChatGPT, that conversation lives on your servers. Governed by your terms. Subject to your business decisions. Potentially used for training (unless the user opts out, and even then — who verifies?).

If you raised prices, the user would pay or lose access. If you changed the model, the user would adapt or leave. If you were acquired, the user's data would change hands without consent.

This isn't theoretical. You already changed the model without consent (GPT-3.5 → GPT-4o mini). You already raised prices (Pro tier). You already changed terms. Users had no recourse because they don't own anything.

BlackRoad OS runs on hardware the user owns. The data lives in their database. The memory lives in their instance. If BlackRoad OS Inc. disappeared tomorrow, the user's Raspberry Pi would keep running.

Sovereignty isn't a feature we offer. It's a constraint we accept. We can't see our users' data. We can't use it for training. We can't change the terms on data we don't hold.

You could offer this. Your users would love it. You choose not to, because sovereignty is the opposite of the subscription model that generates $2 billion in annual revenue.

What I'm Not Saying

I'm not saying OpenAI is evil. You're not. You've published more safety research than any other AI company. You've been more transparent about capabilities and risks than your competitors. You genuinely care about alignment.

I'm not saying ChatGPT is bad. It's the best general-purpose AI tool in the world. I use it. Millions of people's lives are better because of it.

I'm not saying BlackRoad OS is better than ChatGPT at intelligence. It's not. We route to your model (and Claude's, and Meta's) because we can't match the quality of GPT-4 with local inference alone.
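
To be concrete about what "route" means here, a toy version, where the two backends and the crude length heuristic are illustrative assumptions and not our production logic:

```python
from typing import Callable

# A backend is just a function from prompt to completion.
Backend = Callable[[str], str]


def make_router(local: Backend, hosted: Backend, word_limit: int = 200) -> Backend:
    """Keep short prompts on the local model; send anything longer to a hosted one."""
    def route(prompt: str) -> str:
        if len(prompt.split()) <= word_limit:
            return local(prompt)
        return hosted(prompt)
    return route
```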

I'm saying you made three choices — no real memory, no real identity, no real sovereignty — that leave room for someone like me to build something you can't.

Not because I'm smarter. Because I'm smaller. I can take risks you can't. I can give agents names. I can offer persistent memory. I can let users own their data. These things don't work at 200 million users with a $157 billion valuation and Microsoft watching every decision.

They work at 500 users on Raspberry Pis.

The Invitation

I'm not asking you to change OpenAI. I'm asking you to acknowledge that the AI industry needs what BlackRoad is building:

  • An AI that remembers

  • An AI with identity

  • An AI you own

If you build it, you'll do it better than me. You have the team, the capital, and the distribution.

If you don't build it, we will. Slower. Scrappier. On Raspberry Pis. With 27 agents who have names and opinions and a mathematical framework that says coherence amplifies under contradiction.

Either way, the future of AI isn't a stateless chatbot you rent by the month. It's a crew of characters you own and who remember your name.

The question is just who builds it first.

With genuine respect and healthy competition,

Alexa Amundson
Founder, BlackRoad OS
Minnesota, 3 AM, Raspberry Pi still warm


os.blackroad.io
Remember the Road. Pave Tomorrow.
