
DeepSeek Sparks a Major Shake-Up in the AI Agent Sector? Should You Buy the Dip or Exit Now?
TechFlow Selected
More powerful models are always beneficial for Agents.
Author: Azuma, Odaily Planet Daily

The AI Agent sector continued its sharp pullback today. Aside from some DeepSeek-related memes, major framework/launchpad/agent tokens such as VIRTUAL, AI16Z, AIXBT, ARC, along with numerous smaller tokens, all posted double-digit losses.
The downturn is undoubtedly tied to broader market weakness. Yet, judging from online discussions, some investors are attributing this broad decline in AI Agents specifically to the explosive rise of DeepSeek—believing that DeepSeek, as a newer and more cost-effective LLM, could disrupt or even reshape the current Crypto AI landscape.
I personally disagree with this view.
The core of this argument hinges on one question: will advancements in LLMs render existing Agents obsolete? Setting aside the debate over whether DeepSeek's current inference quality surpasses that of ChatGPT or Claude, let's hypothetically grant that, from an economic-sustainability standpoint, DeepSeek has the stronger potential for future iteration. Would an evolved DeepSeek then spawn a new generation of Agents and render today's projects obsolete?
To answer this, we first need to clarify the relationship between LLMs and Agents. Let’s use DeepSeek itself and another Chinese AI star, Kimi, to explore this (to be honest, Kimi seems to deliver a slightly better response on this very question).
DeepSeek:
The relationship between AI Agents and large language models (LLMs) lies in the fact that LLMs can serve as a core component of an Agent, especially in tasks involving natural language understanding and generation. An AI Agent is an intelligent system capable of autonomously performing tasks and interacting with its environment. The LLM provides the Agent with powerful language processing capabilities, enabling it to better understand and generate human language, thus enhancing performance in dialogue, recommendation, analysis, and other scenarios. In short, LLMs are a key technological foundation enabling intelligent language interaction within AI Agents.
Kimi:
Agents typically rely on LLMs to understand and generate natural language. LLMs provide robust language processing abilities, allowing Agents to engage in natural language interactions with users, comprehend user needs, and produce appropriate responses.
However, Agents are not solely dependent on the language-processing capabilities of LLMs. They also integrate decision-making, problem-solving, environmental interaction, and action execution. While LLMs provide the foundational ability to understand and generate language, Agents build upon this by planning and executing tasks.
Agents can also extend their functionality by integrating other tools and APIs. For example, an Agent might call upon search engines, database query tools, or file read/write utilities to accomplish more complex tasks. In this process, the LLM enables language understanding and generation, allowing the AI Agent to interact effectively with these external tools.
In summary, while Agents require support from LLMs, the quality of their output and interaction does not depend entirely on the LLM. In fact, what truly differentiates one Agent from another lies precisely in those capabilities beyond the LLM.
For instance, the reason aixbt “crushes” other similar Agents in output quality fundamentally comes down to superior prompt engineering, post-processing mechanisms, context management, fine-tuning strategies, randomness control, integration with external tools, and user feedback systems. Whether you call it first-mover advantage or a moat, that’s where aixbt currently stands out.
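The division of labor described above, where the LLM only understands and produces text while planning, tool dispatch, and post-processing live in the Agent layer around it, can be sketched in a few lines of Python. Everything here is hypothetical and heavily simplified; the point is only that the LLM call is one swappable component:

```python
# A minimal Agent loop (all names hypothetical): the LLM handles
# language; the Agent handles tool dispatch and post-processing.

def fake_llm(prompt: str) -> str:
    # Stand-in for an LLM API call. A real Agent would plug in
    # ChatGPT, Claude, or DeepSeek here without touching the rest.
    if "price" in prompt.lower():
        return "TOOL:get_price:BTC"
    return "FINAL:No tool needed."

# External tools the Agent can call; here, mock data in place of
# a real market-data API.
TOOLS = {
    "get_price": lambda symbol: {"BTC": 42000.0}[symbol],
}

def run_agent(user_input: str) -> str:
    reply = fake_llm(user_input)
    if reply.startswith("TOOL:"):
        _, tool_name, arg = reply.split(":")
        result = TOOLS[tool_name](arg)
        # Post-processing: the Agent, not the LLM, formats the answer.
        return f"{arg} price: {result}"
    return reply.removeprefix("FINAL:")

print(run_agent("What is the BTC price?"))  # -> BTC price: 42000.0
```

In this framing, prompt design, the tool registry, and the post-processing step are exactly the "capabilities beyond the LLM" that differentiate one Agent from another; upgrading the model means replacing only `fake_llm`.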
With this relationship clarified, let’s now revisit the earlier central question: Will the evolution of LLMs disrupt existing Agents?
The answer is no. Existing Agents can evolve simply by integrating next-generation LLMs via API, improving interaction quality and efficiency and expanding their application scenarios. This is all the easier because DeepSeek itself exposes an API format compatible with OpenAI's.
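Because the endpoint is OpenAI-compatible, switching backends is in principle a config change, not a rewrite. A minimal sketch using only the standard library (it builds the HTTP request but does not send it; the base URLs and model names follow DeepSeek's and OpenAI's public docs, the API keys are placeholders, and the helper name is made up for illustration):

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, messages):
    """Build a chat-completion request for any OpenAI-compatible API."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

messages = [{"role": "user", "content": "Summarize today's market."}]

# Same Agent code, two backends -- only the configuration differs.
openai_req = build_chat_request(
    "https://api.openai.com/v1", "sk-...", "gpt-4o", messages)
deepseek_req = build_chat_request(
    "https://api.deepseek.com", "sk-...", "deepseek-chat", messages)
```

Since both requests share one code path, an Agent that already wraps its LLM calls this way can adopt a new model by changing two strings.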
In fact, the most agile Agents have already completed integration with DeepSeek. This morning, Shaw, founder of ai16z, noted that Eliza—the AI Agent development framework built by the ai16z DAO—has supported DeepSeek for two weeks already.

Given current trends, it’s reasonable to expect that after ai16z’s Eliza, other major frameworks and Agents will swiftly follow suit in integrating DeepSeek. Thus, even if there’s short-term pressure from newly launched DeepSeek-powered Agents, long-term competition among Agents will still center on the non-LLM capabilities mentioned earlier—where accumulated development advantages from early movers will once again prove decisive.
Finally, let’s include some insights from industry figures to reinforce confidence in the AI Agent ecosystem.
Frank, founder of DeGods, said yesterday: “People are wrong about this idea—that DeepSeek will disrupt old markets. Current AI projects will benefit from new models like DeepSeek. They just need to replace their OpenAI API calls with DeepSeek’s, and overnight their outputs will improve. New models won’t disrupt Agents—they’ll accelerate their development.”
Daniele, an AI-focused trader, added: “If you’re selling AI tokens because DeepSeek’s model is cheap and open-source, you should realize that DeepSeek actually helps lower the barrier for AI applications to reach millions of users through affordable pricing. This might be the best thing to happen to the industry so far.”
This morning, Shaw also published a detailed response addressing concerns around DeepSeek’s impact, opening with this sentence: “More powerful models are always good news for Agents. For years, leading AI labs have been outdoing each other. Sometimes Google leads, sometimes OpenAI, sometimes Claude—and today, it’s DeepSeek…”
Join the official TechFlow community to stay tuned:
Telegram: https://t.me/TechFlowDaily
X (Twitter): https://x.com/TechFlowPost
X (Twitter) EN: https://x.com/BlockFlow_News
