
Artificial Intelligence in DeFi
What are the applications and potential impacts of AI/LLMs in the cryptocurrency space?
Author: DEFI EDUCATION
Translation: Baihua Blockchain

As you may have seen on Twitter, we’re very interested in the current AI/LLM space. While it still leaves much to be desired as a research accelerator, we see its potential.
The emergence of large language models (LLMs) in crypto is transforming how non-technical participants interact with, understand, and contribute to the industry.
Previously, if you couldn’t code, you’d feel completely lost. Now, LLMs like ChatGPT bridge the gap between complex programming languages and everyday speech. This is significant because crypto has largely been dominated by those with technical expertise.
If you encounter something you don’t understand or suspect a project is intentionally obscuring the truth about its underlying system, you can ask ChatGPT and get a fast, nearly free answer.
DeFi is democratizing access to finance, while large language models are democratizing access to DeFi.
In today’s article, we’ll explore some ideas on how LLMs might impact DeFi.
1. DeFi Security
As we’ve noted, DeFi is transforming financial services by reducing friction and overhead costs and replacing large teams with efficient code.
We’ve detailed where DeFi is headed. DeFi:
- Reduces friction costs (gas fees will eventually come down)
- Cuts overhead costs, since there are no physical locations, only code
- Lowers labor costs: thousands of bankers are replaced by a hundred programmers
- Allows anyone to offer financial services (such as lending and market making)
- Represents a leaner operational model that doesn’t rely on intermediaries for execution
In DeFi, "counterparty risk" is replaced by software security risk. The code and mechanisms protecting your assets and facilitating your transactions are constantly exposed to external threats attempting to steal and exploit funds.
AI, particularly LLMs, plays a crucial role in automating the development and auditing of smart contracts. By analyzing codebases and identifying patterns, AI (over time) can detect vulnerabilities and optimize smart contract performance, reducing human error and increasing the reliability of DeFi protocols. By comparing contracts against databases of known vulnerabilities and attack vectors, LLMs can highlight areas of risk.
One area where LLMs are already a viable and accepted solution for software security is assisting in writing test suites. Writing unit tests can be tedious but is a critical part of software quality assurance, often neglected due to rushed time-to-market pressures.
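As a rough illustration of what that workflow can look like, here’s a minimal Python sketch that asks an LLM to draft Foundry-style tests for a toy vault contract. The model name, prompt, and contract are placeholders we made up for the example, not a recommendation of any particular setup, and the generated tests still need human review.

```python
# Illustrative sketch: asking an LLM to draft unit tests for a toy contract.
# Model name, prompt, and contract are placeholders, not an endorsement.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

contract_source = """
pragma solidity ^0.8.0;

contract Vault {
    mapping(address => uint256) public balances;

    function deposit() external payable {
        balances[msg.sender] += msg.value;
    }

    function withdraw(uint256 amount) external {
        require(balances[msg.sender] >= amount, "insufficient balance");
        balances[msg.sender] -= amount;
        payable(msg.sender).transfer(amount);
    }
}
"""

prompt = (
    "You are a smart-contract test engineer. Write Foundry (forge-std) unit "
    "tests for the contract below. Cover deposits, withdrawals, and the "
    "revert path when a user withdraws more than their balance.\n\n"
    + contract_source
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# Treat the output as a starting point; a developer still reviews and runs it.
print(response.choices[0].message.content)
```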
However, there's also a "dark side." If LLMs can help you audit code, they can also help hackers find ways to exploit code in the open-source, transparent world of crypto.
Fortunately, the crypto community is full of white hats and has bounty systems that help mitigate some of these risks.
Cybersecurity professionals do not advocate "security through obscurity." Instead, they assume attackers are already familiar with the system’s code and vulnerabilities. AI and LLMs can help automatically detect insecure code at scale—especially for non-programmers. The number of smart contracts deployed daily exceeds what humans can audit. Sometimes, to seize economic opportunities (like yield farming), users must interact with new and trending contracts without waiting for extended testing periods.
This is where platforms like Rug.AI come into play, offering automated assessments of new projects against known code vulnerabilities.
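We don’t know how Rug.AI works under the hood, but a naive version of this kind of automated screen is easy to sketch: feed each new contract’s verified source to an LLM together with a checklist of known issue patterns and ask for a flag-by-flag verdict. In the sketch below, the fetch_verified_source helper, the checklist, and the model name are hypothetical placeholders; the output is a prompt for human review, not an audit.

```python
# Sketch of an automated "known issues" screen for newly deployed contracts.
# fetch_verified_source() is a hypothetical helper (e.g. wrapping a block
# explorer's verified-source endpoint); the checklist is illustrative only.
from openai import OpenAI

client = OpenAI()

KNOWN_ISSUES = [
    "reentrancy via external calls before state updates",
    "unchecked return values from low-level calls",
    "owner-only functions that can mint or drain user funds",
    "missing access control on upgrade or pause functions",
]

def screen_contract(source_code: str) -> str:
    """Ask the model which known issue patterns look plausible in this source."""
    prompt = (
        "Review this Solidity source. For each checklist item, answer "
        "'possible', 'unlikely', or 'not enough information', with one line "
        "of justification.\n\nChecklist:\n- "
        + "\n- ".join(KNOWN_ISSUES)
        + "\n\nSource:\n" + source_code
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Hypothetical batch loop over a feed of new deployments:
# for address in newly_deployed:
#     report = screen_contract(fetch_verified_source(address))
```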
Perhaps the most revolutionary aspect is LLMs’ ability to help write code. As long as users have a basic understanding of their needs, they can describe what they want in natural language, and LLMs can convert those descriptions into functional code.
This lowers the barrier to creating blockchain-based applications, enabling a broader range of innovators to contribute to the ecosystem.
We’re still in the early stages. We personally find LLMs better suited for refactoring code or explaining what code does to beginners rather than building entirely new projects from scratch. Providing context and clear specifications to your model is crucial—otherwise, it’s garbage in, garbage out.
LLMs can also help non-programmers by translating smart contract code into natural language. Maybe you don’t want to learn programming, but you do want to verify that the code of a protocol you’re using aligns with its promises.
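A minimal sketch of that “explain it to me” use case, again assuming an OpenAI-style API (the prompt and model name are illustrative):

```python
# Minimal sketch: turning a contract snippet into a plain-English summary
# for a non-programmer. Prompt and model name are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

def explain_for_non_programmers(solidity_snippet: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder
        messages=[{
            "role": "user",
            "content": "Explain in plain English, for someone who cannot read "
                       "code, what this Solidity function does and what powers "
                       "it gives the contract owner:\n\n" + solidity_snippet,
        }],
    )
    return response.choices[0].message.content
```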
While we doubt LLMs will replace high-quality developers anytime soon, developers can use them as an additional sanity check on their own work.
Conclusion? Crypto becomes simpler and safer for all of us. Just be careful not to over-rely on LLMs. They sometimes confidently give wrong answers. LLMs’ ability to fully understand and predict code behavior is still evolving.
2. Data Analysis and Insights
When gathering data in crypto, you’ll eventually come across Dune Analytics. If you haven’t heard of it, Dune Analytics is a platform allowing users to create and share data analytics visualizations, primarily focused on the Ethereum blockchain and other related blockchains. It’s a useful and user-friendly tool for tracking DeFi metrics.
Dune Analytics already has GPT-4 functionality that explains queries in natural language.
If you're confused by a query or want to create or edit one, you can turn to ChatGPT. Note that it performs better if you provide example queries within the same conversation, and you should still learn the basics yourself so you can verify ChatGPT’s output. Nevertheless, it’s a great way to learn by asking, treating ChatGPT like a mentor.
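One way to apply that “example queries in the same conversation” advice: paste a query you already understand, then ask for a variation plus an explanation of each clause. The sketch below assumes Dune’s dex.trades spellbook table and an OpenAI-style API; swap in whatever schema and model you actually use.

```python
# Sketch of the "give the model a worked example first" pattern for Dune SQL.
# The example query assumes Dune's dex.trades table; adjust to your schema.
from openai import OpenAI

client = OpenAI()

example_query = """
SELECT
    date_trunc('day', block_time) AS day,
    SUM(amount_usd) AS volume_usd
FROM dex.trades
WHERE block_time > now() - interval '30' day
GROUP BY 1
ORDER BY 1
"""

messages = [
    {"role": "user", "content": "Here is a Dune query I already understand:\n" + example_query},
    {"role": "user", "content": "Using the same style, draft a query that shows "
                                "weekly unique trader counts instead, and explain "
                                "each clause in plain language."},
]

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder
    messages=messages,
)
print(response.choices[0].message.content)
```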

LLMs significantly lower the entry barrier for non-technical crypto participants.
However, when it comes to insights, LLMs are disappointing in delivering unique perspectives. In complex, rational financial markets, don’t expect LLMs to give you the correct answer. If you’re someone who operates on intuition, you’ll likely find LLMs fall far short of expectations.
Still, we’ve found one effective use case—checking whether you’ve missed something obvious. You’re unlikely to discover non-obvious or contrarian insights—the kind that actually generate returns. This isn’t surprising (if someone developed AI capable of generating outsized market returns, they wouldn’t release it publicly).
3. “The Disappearance of Discord Moderators?”
In crypto, managing a group of enthusiastic but demanding users around a hot project is one of the most underappreciated and painful jobs. The same common questions get asked repeatedly, sometimes nonstop. This seems like a clear pain point that LLMs could easily solve.
LLMs have also shown decent accuracy in detecting self-promotional messages (spam). We expect this capability could extend to identifying malicious links (or other hacking attempts). Managing an active Discord server with thousands of members regularly posting information is genuinely difficult, so we look forward to LLM-powered Discord bots stepping in to help.
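Here’s a rough sketch of what such a bot might look like using discord.py plus an LLM call. The bot token, FAQ text, model name, and prompts are placeholders, and a real deployment would need rate limiting and human moderators in the loop.

```python
# Rough sketch of an LLM-assisted Discord helper: flag likely spam and answer
# repetitive FAQ-style questions. Token, FAQ text, and prompts are placeholders.
import discord
from openai import AsyncOpenAI

llm = AsyncOpenAI()  # assumes OPENAI_API_KEY is set

intents = discord.Intents.default()
intents.message_content = True  # must also be enabled in the developer portal
bot = discord.Client(intents=intents)

FAQ_CONTEXT = (
    "Project FAQ (placeholder): staking opens next month; there is no airdrop; "
    "the team will never DM you first."
)

async def ask_llm(prompt: str) -> str:
    response = await llm.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

@bot.event
async def on_message(message: discord.Message):
    if message.author == bot.user:
        return

    # Crude spam screen: ask the model for a one-word verdict.
    verdict = await ask_llm(
        "Answer only SPAM or OK. Is the following Discord message "
        f"self-promotion, a suspicious link, or other spam?\n\n{message.content}"
    )
    if verdict.strip().upper().startswith("SPAM"):
        await message.channel.send("This message was flagged for moderator review.")
        return

    # Answer questions against the FAQ; anything unclear still goes to a human.
    if message.content.endswith("?"):
        answer = await ask_llm(f"{FAQ_CONTEXT}\n\nAnswer briefly: {message.content}")
        await message.channel.send(answer)

# bot.run("YOUR_DISCORD_BOT_TOKEN")  # placeholder
```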
4. “Whimsical Ideas”
A recurring meme in crypto is launching currencies based on trending memes. These range from enduring memes like DOGE, SHIB, and PEPE to random coins based on the day's trending search terms that vanish within an hour (mostly scams; we avoid participating).
If you have access to the Twitter firehose API, you could track crypto sentiment in real time and train an LLM to flag trends, then use humans to interpret the nuances. A simple application example: when a viral moment emerges, you could launch a meme coin based on sentiment analysis.
Perhaps there are ways to build a low-cost version of a sentiment scraper that monitors a subset of popular crypto influencers across multiple social media channels, avoiding the cost and bandwidth of firehose-style API data sources.
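A minimal sketch of that low-cost version, as we imagine it: watch a short list of accounts, ask an LLM for a one-word sentiment label per post, and tally the results. The fetch_recent_posts helper, the watchlist, and the labels are hypothetical, and the output is an input for human judgment, not a trading signal on its own.

```python
# Sketch of a "low-cost sentiment scraper": monitor a handful of accounts
# instead of a firehose feed. fetch_recent_posts() is a hypothetical stand-in
# for whatever social-media API access you actually have.
from collections import Counter
from openai import OpenAI

client = OpenAI()

WATCHLIST = ["example_influencer_1", "example_influencer_2"]  # placeholders

def classify_sentiment(post_text: str) -> str:
    """Ask the model for a one-word label; it handles sarcasm better than a
    keyword match, though it will still make mistakes."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder
        messages=[{
            "role": "user",
            "content": "Label this crypto post as BULLISH, BEARISH, or NEUTRAL "
                       "(account for sarcasm). Reply with one word only:\n\n"
                       + post_text,
        }],
    )
    return response.choices[0].message.content.strip().upper()

def daily_snapshot(fetch_recent_posts) -> Counter:
    """fetch_recent_posts(handle) -> list[str] is assumed, not implemented."""
    counts = Counter()
    for handle in WATCHLIST:
        for post in fetch_recent_posts(handle):
            counts[classify_sentiment(post)] += 1
    return counts  # a human still interprets the nuance, as noted above
```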
LLMs are well-suited here because they can grasp context (interpreting online sarcasm and parody to derive genuine insights). This LLM partner would evolve and learn alongside the crypto industry, where much of the discourse happens on crypto Twitter. With its public debate forums and open-source technology, the crypto industry offers a unique environment for LLMs to capture market opportunities.
But to avoid being fooled by intentional social media manipulation, this technology would need to become more sophisticated—addressing orchestrated grassroots campaigns, undisclosed sponsorships, and bot networks. In another article, we covered an interesting third-party research report suggesting some entities may consciously manipulate social media to inflate the value of crypto projects linked to FTX/Alameda.
- NCRI analysis showed that bot-like accounts made up a significant portion (about 20%) of online discussions mentioning FTX-listed tokens.
- This bot-like activity preceded price movements for many FTX tokens in the dataset.
- Following FTX’s promotion, activity around these tokens became increasingly artificial over time: the proportion of fake, bot-like comments steadily increased, eventually accounting for roughly 50% of total discussion volume.














