I'm going to make a claim that sounds dramatic: if you're an indie founder using ChatGPT or Notion AI to plan your business, the documents that matter most to you are sitting in someone else's hands. Not in a "they're spying on you" sense. In the literal "you do not own a copy you can move" sense.
I'm not writing this to scare anyone. I'm writing it because I built a tool, Projelli, that's premised on the opposite arrangement, and I want to explain why that arrangement is the right one for the specific group of people I built it for.
The term "local-first" gets thrown around loosely. Some people use it to mean "works offline." Some use it to mean "the cache is local but the source of truth is the cloud." The definition I'm using here is the strict one from the original Ink & Switch paper: your data lives on your device, in a format you control, and the network is optional.
Concretely, for a tool to be local-first: the primary copy of your data has to live on your device, it has to be stored in an open format you can read without the app, and the app has to work fully without a network connection.
By that definition, almost no AI tool today is local-first. Notion AI isn't. ChatGPT isn't. Claude.ai isn't. Reflect, Tana, and Mem.ai aren't either. Cursor is local-first for the code files but not for the chat history. Obsidian is local-first for the notes but its AI features are community plugins that vary in design.
That's not a coincidence. Local-first is harder to build than cloud-first. The economics push everyone toward subscription SaaS in someone else's data center.
A regular knowledge worker losing access to their notes is annoying. A founder losing access to their business plan, financial model, customer interviews, and pricing strategy is a different kind of problem.
Here's the founder-specific case for local-first, in five points.
If you lose your meeting notes from last Tuesday, you can mostly reconstruct them. If you lose the customer interview transcript from your first 5 design partners, the one that contains the exact phrases they used to describe their pain, you can't rebuild that. Those words, captured in that moment, are the source material for everything downstream: your positioning, your copy, your roadmap, your sales script.
The same is true for your financial model assumptions, your pricing experiments, your pitch deck iterations, and your competitor analysis notes. Each of these documents represents thinking you did once and shouldn't have to do again. And every one of them lives in some tool's cloud database in 2026.
Notion changed its pricing in 2023, adding Notion AI at $10/user/month on top of the base plan. Reflect raised prices. Tana adjusted its tiers. ChatGPT moved features behind Plus and Pro. Every cloud SaaS eventually does this; it's the model. As an indie founder, you either absorb the price change quietly or spend a weekend exporting and migrating to the next thing.
When your data lives on your hard drive in plain Markdown, the vendor's pricing decisions don't reach you. You can stay on the version you bought, or migrate at your own pace, or fork your workflow into two tools without losing anything.
I work at a health-tech company. I take security and privacy seriously because my day job demands it. But even setting aside healthcare-specific concerns, I'm uncomfortable with the idea that my early-stage business plan, which contains the actual idea, the actual market gap, the actual customer pain point I think nobody has noticed, sits in an LLM provider's training-data-eligible content store.
I know providers say they don't train on customer data. I believe most of them. The exact moment I stop believing them is the moment I read the post-mortem on the breach that happened anyway. Every cloud vendor has a non-zero probability of an incident in the next 5 years. Local-first is the only architecture where that risk to your documents is structurally zero, because the data isn't on anyone's servers to leak.
I do my best work on planes, in cabins, and at coffee shops with bad Wi-Fi. The number of times I've opened a cloud-only app and gotten a "no connection" screen, or hit a feature that depends on a server round-trip and just... waited, is more than zero. Every one of those moments is a small tax on my creative work.
A local-first app is fast all the time, because the data is right there. The only network round-trip in Projelli is the AI request itself, which goes directly from your machine to your chosen provider. Everything else is local.
Backing up a Notion workspace is a nightmare. Backing up a Tana graph is a nightmare. Backing up a ChatGPT conversation history barely exists as a concept. You can export, but exports are lossy and one-way, and you have to remember to run them.
A folder of Markdown files on your hard drive backs up automatically. iCloud, Dropbox, OneDrive, rsync, Time Machine, Backblaze, a USB drive, a git repo, all of these work transparently because they're designed to handle files in folders. Local-first puts your data in a format that the entire 50-year tradition of computing knows how to back up.
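Because a project is just a folder of files, "backup" can be as simple as "copy." A minimal sketch in Python of what every tool in that list is doing under the hood (the folder names here are hypothetical; any sync tool does the same job without a line of code):

```python
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def backup_markdown_folder(source: Path, backup_root: Path) -> Path:
    """Copy a folder of Markdown files into a timestamped snapshot folder."""
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    destination = backup_root / f"{source.name}-{stamp}"
    shutil.copytree(source, destination)  # plain files need no export step
    return destination

# Demo against a throwaway folder so the sketch is runnable anywhere.
workspace = Path(tempfile.mkdtemp())
project = workspace / "my-business-plan"
project.mkdir()
(project / "pricing-strategy.md").write_text("# Pricing\n- option A\n")

snapshot = backup_markdown_folder(project, workspace / "backups")
print(sorted(p.name for p in snapshot.iterdir()))  # the same readable files
```

The point isn't the script; it's that no export, no API, and no vendor cooperation is involved. Files in folders are the universal interchange format.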
I'd be lying if I pretended local-first had no downsides. The honest tradeoffs: real-time collaboration is harder, keeping multiple devices in sync becomes your problem instead of the vendor's, there's setup friction, and you carry the responsibility for your own files.
For a team of 50 collaborating in real time on shared documents, none of that is worth it. For a solo indie founder running 3 side projects in their evenings, every one of those tradeoffs is either inverted (the "downside" is actually a feature) or trivial.
Here's the exact use case I built Projelli for. See if it sounds familiar.
It's a Saturday morning. You have 3 hours before your kids wake up. You want to make progress on the new business idea you've been thinking about for a month. You open ChatGPT, start a fresh conversation, and ask for help structuring the market opportunity. The AI gives you a great response. You ask follow-up questions. Two hours later, you have a 4,000-word conversation that contains the actual nucleus of a business: assumptions, hypotheses, the customer profile, the competitive frame, three pricing options.
The next Saturday, you open ChatGPT and you can't find that conversation. You search. You scroll. Eventually you find it, but it's wedged between two unrelated threads. You start a new conversation referencing what you remember, but the AI doesn't have the context, so the new conversation is shallower than the old one. You try copy-pasting the relevant parts of the old conversation into the new one, but the formatting breaks and you lose the threading.
By week three, you've started keeping notes in a separate Notion doc. By week five, the Notion doc and the ChatGPT history have diverged. By week eight, you can't find any of it.
That's the loop I lived in for a year before I built Projelli. Every founder I've described it to has nodded immediately. It's not a problem with ChatGPT; ChatGPT is a great chat tool. It's a problem with the SHAPE of the workflow: the document and the conversation that created it should live in the same place, on your machine, in a format that doesn't depend on any one company being around in five years.
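To make that shape concrete, here's one way to keep the document and its source conversation in a single Markdown file. The layout (a "Conversation log" appendix, the filename scheme, the `save_session` helper) is my own illustration, not necessarily how any particular tool, Projelli included, stores things:

```python
import tempfile
from pathlib import Path

def save_session(folder: Path, title: str, document: str,
                 turns: list[tuple[str, str]]) -> Path:
    """Write a working document and the conversation behind it into one Markdown file."""
    lines = [f"# {title}", "", document.strip(), "", "## Conversation log", ""]
    for role, text in turns:
        lines.append(f"**{role}:** {text}")
        lines.append("")
    path = folder / f"{title.lower().replace(' ', '-')}.md"
    path.write_text("\n".join(lines))
    return path

folder = Path(tempfile.mkdtemp())
path = save_session(
    folder,
    "Market Opportunity",
    "Three pricing options, one customer profile, two open questions.",
    [("you", "Help me structure the market opportunity."),
     ("assistant", "Start with the customer profile...")],
)
print(path.read_text())
```

Next Saturday, finding the thread is `ls` and opening a file; the context the AI needs is sitting right next to the document it produced.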
Local-first solves the data ownership problem. BYOK (bring your own key) solves the data flow problem. Together they make a stronger claim.
BYOK means: I have an Anthropic account with my own API key. I paste the key into the local app. The app stores it in my OS keychain (Keychain on Mac, Credential Manager on Windows). When I make an AI request, the request goes from my computer directly to Anthropic's API endpoint. The vendor of the local app is never in the request path.
This matters because it eliminates an entire category of vendor risk. The local-first vendor cannot leak your AI conversations because the local-first vendor never sees them. The only thing my server (the Projelli license validator) ever sees is your license key on activation. Beyond that, my server has zero visibility into anything you do in the app.
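The request path is easy to show concretely. A minimal sketch of a BYOK call using only Python's standard library: the key comes from an environment variable here to keep the sketch portable (a real desktop app would pull it from the OS keychain), the model name is illustrative, and the endpoint and headers are Anthropic's public Messages API. Note the only URL in sight:

```python
import json
import os
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Compose a request that goes straight to the AI provider; no middleman server."""
    body = {
        "model": "claude-sonnet-4-20250514",  # illustrative model name
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={
            "x-api-key": api_key,  # your key, from your own account
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

# In a real app the key comes from the OS keychain; an env var stands in here.
req = build_request(os.environ.get("ANTHROPIC_API_KEY", "sk-placeholder"),
                    "Outline a pricing page.")
print(req.full_url)  # the only host this request can reach
```

There is structurally no place in that path for the app vendor to observe the conversation, which is the whole point.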
The cost of BYOK is setup friction: creating an API account is a 5-minute hurdle that some people won't cross. The benefit is a privacy story that's structurally airtight rather than merely promised.
One last thing, because the takes I've seen on local-first sometimes lean luddite. I'm not arguing against AI. I love AI. I use Claude every single day for both my day job and my side projects. AI is the most interesting thing that's happened to my workflow in a decade.
The argument is about where the data lives, not about whether AI is involved. AI can be just as useful when it's reading and writing files on your hard drive as when it's reading and writing to a cloud database. The user experience is identical. The only thing that changes is who has a copy of your business plan.
That's the case for local-first AI for indie founders. It's not about being old-fashioned. It's about being the kind of founder whose business plan, financial model, and customer research live somewhere you can find them in five years, regardless of whose servers are still running.
Projelli is a local-first AI workspace for indie founders. Jameson Daines builds it on weekends and evenings around a Senior Product Designer day job. Read about the 8-week launch or get Projelli at projelli.com.