Iris Coleman
Apr 11, 2026 15:21
LangChain argues closed AI agent harnesses create harmful vendor lock-in through proprietary memory systems, pushing developers toward open-source alternatives.
LangChain is sounding the alarm about a growing problem in AI development: companies building agents on closed platforms risk losing control of their most valuable asset, user memory data.
The AI infrastructure company published a detailed analysis on April 11, 2026, arguing that “agent harnesses” (the scaffolding systems that manage how AI agents interact with tools and data) are becoming inseparable from memory storage. When developers choose proprietary harnesses, they are effectively handing their users’ interaction history to third parties.
Why This Matters for Developers
Agent harnesses have become the standard architecture for building AI systems. Claude Code alone reportedly contains 512,000 lines of harness code, according to leaked documentation referenced by LangChain. Even the model providers with the most advanced AI are investing heavily in these orchestration layers.
The problem? Memory is not a plugin you can swap out. As Letta CTO Sarah Wooders put it in a post cited by LangChain: “Asking to plug memory into an agent harness is like asking to plug driving into a car.”
Short-term memory (conversation history, tool outputs) and long-term memory (cross-session preferences, learned behaviors) both flow through the harness. If that harness sits behind a proprietary API, the data stays locked in.
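To make the "memory flows through the harness" point concrete, here is a minimal toy sketch (all class and method names are hypothetical, not any real framework's API) showing why the component that routes every interaction necessarily ends up holding both kinds of memory:

```python
from dataclasses import dataclass, field


@dataclass
class ShortTermMemory:
    # Per-session state: conversation turns and tool outputs.
    messages: list = field(default_factory=list)
    tool_outputs: list = field(default_factory=list)


@dataclass
class LongTermMemory:
    # Cross-session state: preferences and learned behaviors.
    preferences: dict = field(default_factory=dict)


class Harness:
    """Toy harness: every interaction is routed through its memory,
    which is why controlling the harness means controlling the data."""

    def __init__(self) -> None:
        self.short_term = ShortTermMemory()
        self.long_term = LongTermMemory()

    def step(self, user_msg: str) -> str:
        self.short_term.messages.append(("user", user_msg))
        # A real harness would call the model and tools here, then
        # distill durable preferences into long-term memory.
        reply = f"ack: {user_msg}"
        self.short_term.messages.append(("assistant", reply))
        return reply


harness = Harness()
harness.step("hello")
print(len(harness.short_term.messages))  # 2
```

If the `Harness` object lives behind someone else's API instead of in your process, both memory stores go with it.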
The Lock-In Spectrum
LangChain outlined three levels of risk:
Mild: Stateful APIs like OpenAI’s Responses API or Anthropic’s server-side compaction store state on the provider’s servers. Want to switch models mid-conversation? Tough luck.
Risky: Closed harnesses like the Claude Agent SDK interact with memory in undocumented ways. Even when artifacts exist client-side, their format remains proprietary and non-transferable.
Worst: Full harness-as-a-service offerings like Anthropic’s Claude Managed Agents put everything, including long-term memory, behind an API. Zero visibility, zero ownership.
OpenAI’s Codex generates encrypted compaction summaries that are unusable outside its ecosystem, the analysis noted. Model providers are incentivized to move more functionality behind APIs precisely because memory creates stickiness that raw model access does not.
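The mildest tier can be illustrated with a small simulation (this is not real provider code; the class and ids are invented for the sketch). Stateful APIs hand the client only an opaque response id while the provider keeps the actual history, so the id is worthless to any other provider:

```python
class StatefulProviderAPI:
    """Simulated provider that keeps conversation history server-side;
    the client only ever holds an opaque response id."""

    def __init__(self) -> None:
        self._store: dict = {}  # response_id -> full history
        self._counter = 0

    def create(self, user_input: str, previous_response_id=None) -> str:
        # Resume server-held history, but only if the id is known
        # to *this* provider.
        history = list(self._store.get(previous_response_id, []))
        history.append(user_input)
        response_id = f"resp_{self._counter}"
        self._counter += 1
        self._store[response_id] = history
        return response_id


provider_a = StatefulProviderAPI()
rid = provider_a.create("hi")
rid = provider_a.create("book the flight", previous_response_id=rid)

# Switching providers mid-conversation: provider B cannot resolve
# provider A's opaque id, so the earlier turns are stranded.
provider_b = StatefulProviderAPI()
rid_b = provider_b.create("book the flight", previous_response_id=rid)
print(provider_b._store[rid_b])  # ['book the flight']
```

Holding the full message list client-side and resending it each turn avoids this tier of lock-in, at the cost of larger requests.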
The Stickiness Factor
LangChain’s Harrison Chase shared a personal example: an internal email assistant built on their Fleet platform accumulated months of learned preferences. When it was accidentally deleted, recreating it from the same template produced a noticeably worse experience. All those learned behaviors (tone, preferences, patterns) were gone.
“Without memory, your agents are easily replicable by anyone who has access to the same tools,” the post stated. Memory transforms a generic AI into a personalized system that improves over time.
The Open Alternative
LangChain is positioning its Deep Agents framework as the solution: open source, model-agnostic, with plugins for MongoDB, Postgres, and Redis for memory storage. The framework uses open standards like agents.md and supports deployment through LangSmith or standard web hosting.
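The portability claim behind pluggable backends can be sketched as follows (the interface is hypothetical and deliberately tiny; it is not Deep Agents’ actual API). The agent code depends only on an abstract store, so the backend can change without touching it:

```python
from typing import Optional, Protocol


class MemoryStore(Protocol):
    """Backend-agnostic memory interface (hypothetical sketch)."""

    def put(self, key: str, value: str) -> None: ...
    def get(self, key: str) -> Optional[str]: ...


class InMemoryStore:
    """Simplest possible backend, used here so the sketch runs standalone."""

    def __init__(self) -> None:
        self._data: dict = {}

    def put(self, key: str, value: str) -> None:
        self._data[key] = value

    def get(self, key: str) -> Optional[str]:
        return self._data.get(key)


# A Redis- or Postgres-backed class exposing the same two methods would
# drop in here without touching agent code: that is the portability claim.
def remember_preference(store: MemoryStore, user: str, pref: str) -> None:
    store.put(f"pref:{user}", pref)


store = InMemoryStore()
remember_preference(store, "alice", "concise replies")
print(store.get("pref:alice"))  # concise replies
```

Because the learned data lives in a database you operate, deleting or migrating the agent does not delete its memory, which is exactly the failure mode in the email-assistant anecdote above.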
Whether the industry follows remains uncertain. Model providers have strong incentives to capture users through proprietary memory systems, and many developers prioritize getting agents working before worrying about data portability.
But for teams building production AI systems, the question deserves attention now: who actually owns the data your agent learns from users? The answer may determine whether you can ever switch providers, or whether your AI’s accumulated intelligence belongs to someone else entirely.
Image source: Shutterstock

