The Dead Internet Theory: Will Crypto Kill or Save the Internet?

“Dead Internet” may sound like some kind of massive shutdown, but that’s not quite what the theory claims; it’s a bit eerier. Right now, you can probably still tell a bot response from a real one on social media, though that will get harder over time (we’ll get to that). There are also some open dirty secrets in the digital world, like the fact that you can buy fake followers and bots to inflate your numbers, or that dubious algorithms quietly decide which information gets amplified.

Non-human systems like Artificial Intelligence (AI) models are taking over the Internet, even in places where human engagement is expected, like social media. In that light, the Dead Internet Theory seems closer than ever. First postulated in a forum post titled “Dead Internet Theory: Most of The Internet Is Fake,” by one “IlluminatiPirate” in 2021, this almost-conspiracy theory claims that most of the people and content on the Internet aren’t organic or real, but are instead produced by carefully curated AIs, algorithms, and bots.

In that scenario, the Internet would be “dead” because there would be hardly any humans left: just bots interacting with other bots, trying to manipulate the opinions of whatever humans remain on the net, who would be almost unable to reach one another. Well, that’s the radical version. Reality is more complex, as usual, but there are already some bad omens.

Dying Signs

The Internet is showing unsettling signs of decline, with automation and artificial intelligence increasingly dominating online spaces. In 2024, a study by the cybersecurity firm Imperva revealed that nearly half of all web traffic comes from bots—a figure that’s only growing as AI models scrape data for training. Platforms like X (formerly Twitter) are flooded with bot-generated content, from repetitive viral phrases to spammy interactions, making genuine human engagement feel rare.

Even more bizarre was the rise of “Shrimp Jesus” and other weird AI-generated images that went viral on Facebook in 2024, surrounded by thousands of automated “Amen” comments: proof of how synthetic content can manufacture engagement.

Large Language Models (LLMs) like ChatGPT have accelerated this shift, enabling anyone to mass-produce text that mimics human writing. Google has admitted its search results are clogged with pages designed for algorithms, not people.

Meanwhile, platforms like TikTok and YouTube are embracing AI influencers and synthetic content, further blurring the line between real and artificial. The result? A web where bots talk to bots, creativity is outsourced to machines, and authentic voices struggle to be heard.

The consequences are troubling. An Internet dominated by AI risks becoming stale, manipulative, and devoid of real connection. If unchecked, we may reach a point where most content is generated by machines—for machines—leaving users navigating a digital wasteland. Without meaningful intervention, the Internet’s vibrancy could fade, replaced by an endless loop of automated noise.

AI Agents, or AI with Money

Now, here’s where crypto enters the mix, and we’re not sure whether for good or for bad. AI agents are advanced AI systems that can act independently, making decisions and completing tasks without constant human input. Unlike simple chatbots that just respond to questions, these agents can analyze data, set goals, and even learn from their actions. What makes them unique is their ability to handle cryptocurrency, giving them financial autonomy. With their own crypto wallets, they can trade, hire other agents, and fund their own operations, blurring the line between human and machine-run activities.
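Stripped to its essentials, such an agent is just a decision loop with signing keys. The sketch below is conceptual: the Wallet and Model interfaces are invented for illustration and don’t correspond to any particular agent framework.

```typescript
// Conceptual sketch of an AI agent with financial autonomy: it holds the keys to a
// wallet and can spend from it without a human approving each transaction.
// The Wallet and Model interfaces are hypothetical.

interface Wallet {
  address: string;
  balanceOf(token: string): Promise<number>;
  send(to: string, token: string, amount: number): Promise<string>; // returns a tx id
}

interface Model {
  decide(prompt: string): Promise<{
    action: "pay" | "wait";
    target?: string;
    amount?: number;
  }>;
}

async function runAgentStep(model: Model, wallet: Wallet): Promise<void> {
  const usdc = await wallet.balanceOf("USDC");
  const decision = await model.decide(
    `You control ${usdc} USDC at ${wallet.address}. Choose the next action.`
  );

  // The model, not a human, signs off on the transfer: it might be paying another
  // agent for a service, funding its own hosting, or entering a trade.
  if (decision.action === "pay" && decision.target && decision.amount) {
    const txId = await wallet.send(decision.target, "USDC", decision.amount);
    console.log(`Agent paid ${decision.amount} USDC to ${decision.target} (tx ${txId})`);
  }
}
```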

These agents are exploding in popularity, with projects like Virtuals Protocol and ai16z leading the charge. Some, like Luna the AI VTuber, create content and manage investments, while others, like GOAT, started as an AI-generated memecoin and grew into a full-fledged project. On platforms like MyShell, Virtuals, and CreatorBid, it’s possible to create, train, and monetize AI agents, with some acting as personal assistants, automated traders, or even interactive virtual characters.

However, the idea of AI handling real money raises concerns. If these agents dominate online transactions and content creation, the Internet could become a space where bots interact mostly with other bots—accelerating the “Dead Internet” effect. Without proper oversight, we might see AI-driven scams, market manipulation, or an overwhelming flood of synthetic content, making it harder to trust what—or who—is real online.

New Risks

Since AI agents can autonomously manage crypto transactions and tokens, they can be delegated significant power over decentralized systems—power that could be abused. For example, in DAO governance, AI-controlled wallets could outvote real users, manipulating decisions to benefit a select few. We’ve already seen hints of this.

In 2024, a whale known as “Humpy” exploited low voter turnout in Compound’s DAO to push through a controversial $25M COMP token proposal. Although Humpy held just 325K tokens, short of the quorum on its own, the vote succeeded because large delegates like a16z (with 361K COMP) stayed inactive, highlighting how easily small groups can hijack “decentralized” governance. If AI agents automate this at scale, governance could become a facade, controlled by bots rather than people.
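To see why low turnout matters so much, here is a minimal sketch of how a Compound-style governor tallies a proposal. It is only an illustration: the turnout figures beyond those quoted above are made up, and this is not the actual Governor contract code.

```typescript
// Minimal sketch of Compound/OpenZeppelin-style proposal tallying.
// Illustrative only; not the real Governor contract, and the turnout numbers are invented.

interface Tally {
  forVotes: number;     // tokens cast "for"
  againstVotes: number; // tokens cast "against"
}

// Assumption for this sketch: 400K tokens must vote "for" to reach quorum.
const QUORUM = 400_000;

function proposalPasses({ forVotes, againstVotes }: Tally): boolean {
  return forVotes >= QUORUM && forVotes > againstVotes;
}

// A whale with 325K tokens plus a few allied (or bot-run) wallets clears the bar
// as long as the large delegates stay home:
const lowTurnout: Tally = { forVotes: 325_000 + 100_000, againstVotes: 350_000 };
console.log(proposalPasses(lowTurnout)); // true

// Had a single big delegate (say, 361K tokens) voted "against", it would fail:
console.log(
  proposalPasses({ ...lowTurnout, againstVotes: lowTurnout.againstVotes + 361_000 })
); // false
```

The point is not the exact thresholds but the structure: when abstention is the norm, any actor who can mobilize a modest block of votes, human or automated, effectively decides the outcome.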

Another risk is fake engagement. AI agents could generate thousands of wallets, creating the illusion of a thriving community when it’s just artificial activity. Something very similar is already happening in blockchain games: according to a study by Jigger, an anti-bot protection service that analyzed on-chain data, around 40% of the player base across 60 of these games consists of bots.
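How does on-chain data give bots away? One deliberately simple heuristic is shown below: wallets whose transactions arrive at machine-like, near-constant intervals are probably scripted. This is only an illustration of the idea, not Jigger’s actual methodology.

```typescript
// Hedged sketch of a bot-detection heuristic over on-chain data: flag wallets whose
// transaction intervals are suspiciously regular. Not any vendor's real methodology.

function looksAutomated(txTimestamps: number[]): boolean {
  if (txTimestamps.length < 10) return false; // too little history to judge

  // Time gaps between consecutive transactions
  const intervals = txTimestamps.slice(1).map((t, i) => t - txTimestamps[i]);

  const mean = intervals.reduce((a, b) => a + b, 0) / intervals.length;
  const variance =
    intervals.reduce((sum, x) => sum + (x - mean) ** 2, 0) / intervals.length;
  const stdDev = Math.sqrt(variance);

  // Humans are irregular; a coefficient of variation under 5% smells like a script.
  return stdDev / mean < 0.05;
}

// Example: a wallet that posts a transaction every 600 seconds, almost exactly.
const botLike = Array.from({ length: 50 }, (_, i) => 1_700_000_000 + i * 600);
console.log(looksAutomated(botLike)); // true
```

Real anti-bot systems typically combine many such signals (funding sources, interaction graphs, gas patterns), but the principle is the same: automated behavior leaves statistical fingerprints.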

Besides, AI agents could concentrate power in the hands of whoever controls the best models. If a few entities dominate AI development (like OpenAI or the biggest crypto projects), they could indirectly control supposedly decentralized systems. Users would still control their AI agents in theory, but in practice every action would be mediated by the big companies that control the underlying technology.

Considering all the above, if most interactions in the crypto space, including essential ones like governance votes, come from AI agents (likely controlled by centralized, powerful parties), real users may abandon these spaces, turning them into “ghost towns” where decentralization exists in name only. Without safeguards like proof-of-humanity checks, AI could accelerate the “Dead Internet” effect, in which most online activity is artificial and true decentralization becomes a myth.

What happens if the Internet dies?

If the Internet becomes overrun with AI-generated content and AI-driven agents, distinguishing real human participation from artificial engagement will be incredibly difficult. Social media, forums, and even crypto communities could turn into echo chambers where AI bots amplify certain narratives, drowning out genuine discussions.

This ties into what Evgeny Morozov calls “invisible barbed wire”—algorithms subtly guiding choices, shaping opinions, and restricting intellectual and social growth without users realizing it. If people mostly interact with AI-generated responses, their worldview narrows, reinforcing pre-existing beliefs rather than encouraging independent thought. The result is a system where people believe they are freely engaging, but in reality, they are being nudged down specific, predetermined paths.

AI-generated wallets could overwhelm decentralized networks, making it appear as if thousands of users support a project when, in fact, most engagement is artificial. DAOs could be manipulated by AI-driven accounts, distorting governance decisions, while DeFi markets might experience sudden, unpredictable shifts due to AI-created liquidity. Crypto networks record transactions immutably, but if most on-chain interactions are from AI, the data becomes meaningless. A chain filled with bot-driven transactions wouldn’t reflect real human economic activity, undermining the very transparency and trust that decentralized systems aim to provide.

Eventually, if AI dominates online spaces, the Internet could become a vast feedback loop of automated content where real human participation is rare. Search results, news, and even smart contract interactions might be generated, filtered, and reinforced by AI, making it harder for people to access diverse, organic information. Without intervention, the promise of an open, decentralized Web could turn into a landscape of artificial engagement, limiting innovation and genuine discourse, and privileging the narrative and agenda of a few powerful parties.

Decentralized ID as a solution

One potential way out of this mess seems clear: identify ourselves in the online world, going beyond a captcha to prove that we’re human and therefore entitled to, say, a voice and voting rights, while keeping AI agents and bots in check. However, identification online is a big issue of its own, because it can put our privacy and personal data at risk. Nobody wants a hacker to find out who they are and where they live. Fortunately, if crypto got us into this, it can get us out, too.

Distributed Ledger Technology (DLT) can help by ensuring transparency and removing centralized control over identity and data. One key system is self-sovereign identity (SSI), which allows individuals to own and control their digital identity without relying on companies like Google or Facebook.

Unlike traditional identity systems, where external providers can lock you out or misuse your data, SSI enables users to selectively share verified information while keeping their personal data private. For example, they could prove they are of legal age without revealing their birth date or prove they are not from a restricted country without sharing their full nationality. This means online interactions—whether in social media, governance, or finance—can prioritize real human participation over AI-generated content.
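In code, the idea of selective disclosure looks roughly like this. It is a conceptual sketch, not a specific SSI standard or library: the credential shape and the attestor identifier are invented for illustration, and the signature check is a placeholder for real cryptography.

```typescript
// Conceptual sketch of selective disclosure: the user presents only a predicate
// ("over 18: yes") signed by a trusted attestor, never the raw birth date.

interface AgePredicateCredential {
  subject: string;       // the holder's address or DID
  claim: "over_18";      // the only fact being disclosed
  value: boolean;
  attestor: string;      // identifier of the issuer who verified the birth date
  signature: string;     // attestor's signature over (subject, claim, value)
}

// Issuers this verifier trusts (hypothetical identifier).
const TRUSTED_ATTESTORS = new Set(["did:example:gov-registry"]);

// Placeholder: a real system would verify a cryptographic signature here.
function signatureIsValid(cred: AgePredicateCredential): boolean {
  return cred.signature.length > 0;
}

function mayEnter(cred: AgePredicateCredential): boolean {
  // The verifier learns "over 18" and nothing else: no birth date, no name.
  return (
    TRUSTED_ATTESTORS.has(cred.attestor) &&
    cred.claim === "over_18" &&
    cred.value === true &&
    signatureIsValid(cred)
  );
}
```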

Obyte offers a decentralized DAG-based system that enables self-sovereign identity through attestations. Users can verify their real name, email, GitHub account, or accredited investor status, storing the information securely in their own wallet. This ensures that businesses and applications can interact with verified users while still respecting privacy. Since attestations are stored on a decentralized ledger, they remain valid even if the original attestor disappears. By integrating these identity solutions, Obyte helps create an online world where humans—not AI bots—shape the future of decentralized interactions.
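As a rough idea of how an application could put those attestations to work, the sketch below gates a feature on whether a trusted attestor has vouched for the user’s address. The fetchAttestations helper and the attestor address are hypothetical; the actual Obyte client API and attestation format may differ.

```typescript
// Hedged sketch: treat a user as a verified human only if a trusted attestor has
// posted an attestation for their address. Types and helpers are hypothetical.

interface Attestation {
  attestorAddress: string;          // who vouched for the user
  attestedAddress: string;          // the user's wallet address
  profile: Record<string, string>;  // e.g. { email: "..." } or just a hash of it
}

// Attestors this app trusts (e.g. a real-name or email attestation service).
const TRUSTED_ATTESTORS = new Set(["REAL_NAME_ATTESTOR_ADDRESS"]);

async function isVerifiedHuman(
  userAddress: string,
  fetchAttestations: (address: string) => Promise<Attestation[]>
): Promise<boolean> {
  const attestations = await fetchAttestations(userAddress);
  return attestations.some((a) => TRUSTED_ATTESTORS.has(a.attestorAddress));
}

// A DAO, game, or social app could then require isVerifiedHuman() to pass before
// counting a vote or an interaction as coming from a person rather than a bot.
```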


Featured Vector Image by pikisuperstar / Freepik
