The real AI war is over your memory
Google launched tools to import your ChatGPT and Claude memories and chat histories. Apple is turning Siri into a marketplace where every AI assistant plugs in — for a 30% cut. OpenAI is wiring Codex into every work tool you touch. Shopify made every AI conversation a storefront. The model quality race is giving way to something more consequential: a fight over who accumulates the deepest context about how you work, shop, and think — because whoever owns that context makes switching nearly impossible.
9to5Google
Google Gemini launches tools to import your ChatGPT and Claude memories and chat histories
Google introduced 'Import Memory' and 'Import Chat History' tools that let users migrate their preferences, saved context, and up to 5 GB of conversation archives from ChatGPT and Claude directly into Gemini.

The benchmark wars are over. Nobody switched AI providers last week because one model scored two points higher on MMLU. But this week, four companies made moves that will actually determine where people land and stay. The new contest is over context: who knows how you work, what you buy, and how you think. Once an AI has that, you are not leaving.
Google made the most explicit play. 9to5Google reported that Gemini now lets you import your ChatGPT and Claude memories and full chat histories, up to 5 GB per upload. The feature exists because Google knows that months of accumulated preferences and project context are the real lock-in, and it wants to break someone else's. The industry has quietly acknowledged that portable context is a competitive weapon. You don't poach users by being smarter. You poach them by promising they won't lose what the other model already knows about them.
Apple is taking a different approach. MacRumors reported that iOS 27 will let third-party AI assistants plug directly into Siri. Claude, Gemini, Grok — all routed through a switchboard, with Apple collecting 30% on subscriptions. Apple doesn't need to win the model race. It needs to own the surface through which every model reaches a billion iPhones, accumulating usage patterns regardless of which provider the user picks.
OpenAI is betting on breadth. The Decoder reported that Codex now has a plugin marketplace connecting it to Slack, Figma, Notion, Gmail, and more. With 1.6 million weekly active users, every plugin installed is another thread of context, from design files to email drafts. The more Codex touches, the harder it becomes to rip out.
Then there's commerce. Shopify announced that every eligible merchant's products are now discoverable inside ChatGPT, Gemini, and Copilot, with AI-attributed orders up 11x year-over-year. An AI that remembers your sizing and purchase history becomes the obvious place to buy things.
The real lock-in
I think most people are still framing this as a model quality race. It isn't. The model is the commodity; the context is the moat. Google's memory import tool makes the point: if the model alone were the product, there would be no need to bring your history along. What matters is the accumulated understanding of your preferences and work patterns. It takes months to build and seconds to lose if you switch.
The parallel to social media holds. Nobody stayed on Facebook because it had the best algorithm. They stayed because their photos, friends, and memories were there. The AI version is more personal. It's your cognitive graph: how you reason, what you care about, what you've been working on for six months. That's harder to export than a photo album.
For builders, the strategic question is no longer "which model should I integrate?" It's "who is accumulating context about my users, and how do I keep that from becoming someone else's moat?"
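One way to act on that question is to keep user context in a store you control and hand assistants a provider-neutral export rather than letting each one accumulate its own private copy. Below is a minimal sketch of that idea in Python; the record fields and function names are hypothetical illustrations, not any real provider's export format.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical "portable context" record. The field names are
# illustrative assumptions, not a real export schema: the point is
# that context lives in your own store and can be serialized for
# (or withheld from) any assistant.
@dataclass
class ContextRecord:
    user_id: str
    kind: str            # e.g. "preference", "project", "purchase"
    content: str
    source: str = "app"  # which surface produced this memory

def export_context(records):
    """Serialize a user's context to provider-neutral JSON."""
    return json.dumps([asdict(r) for r in records], indent=2)

records = [
    ContextRecord("u1", "preference", "prefers concise answers"),
    ContextRecord("u1", "project", "migrating billing service to Go"),
]
blob = export_context(records)      # what you'd hand to any model
restored = json.loads(blob)         # round-trips without loss
print(len(restored), restored[0]["kind"])  # 2 preference
```

The design choice worth noting: because the export is plain JSON owned by the application, no single assistant's memory of the user becomes a moat against the others.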
Read the original on 9to5Google