Who is creating these ugly memes?

Original title: The ugly memes driving crypto sales
Original author: Adam Aleksic, Financial Times
Compiled by: Peggy, BlockBeats

Editor's Note: When AI, algorithmic recommendations, and crypto speculation are combined, online memes are being systematically "manufactured" to harvest attention and money.

This article examines a series of offensive memes that went viral on social media, revealing how these seemingly absurd trends serve the propagation logic of crypto scams. It warns that when trends no longer arise naturally but are engineered for profit, the internet becomes more chaotic and more dangerous.

The following is the original text:

The author of this article is known online as Etymology Nerd and wrote the book "Algospeak: How Social Media Is Changing the Future of Language".

This year, a dark and unsettling trend has emerged on Instagram Reels: offensive memes are being systematically created to promote cryptocurrency scams, and almost no one is seriously trying to remove them.

Since January of this year, a cast of bizarre, twisted characters has been spreading across the platform. The phenomenon is closely tied to the widespread availability of AI tools and to the loosening of hate-speech moderation on Meta's platforms.

These include "George Droyd," a cyborg "reincarnation" modeled after George Floyd, created in April of this year to promote a cryptocurrency called $FLOYDAI; and "Kirkinator," created in September, shortly after the death of political commentator Charlie Kirk, to hype up the $KIRKINATOR token. In addition, there are a series of recurring "supporting characters," such as "Epstron" and "Diddytron," respectively alluding to Jeffrey Epstein and rapper Sean Combs (also known as Diddy).

These accounts share the same narrative universe and often gain traffic by catering to racist and antisemitic stereotypes, accumulating millions of views. The short videos frequently feature discriminatory language and return again and again to themes of so-called "racial purification."

This horrifying content serves only one purpose: to generate interaction and engagement. The ultimate goal is to draw attention to so-called "meme coins," cryptocurrencies whose value theoretically rises as the underlying meme spreads. Early meme coins (such as $DOGE) largely capitalized on existing online culture, whereas figures like George Droyd and his counterparts are wholly artificial creations of crypto speculators.

The scheme typically begins on pump.fun, a platform that lets users easily create and trade digital tokens. Once a developer launches a token, they share it in trusted Telegram groups or X communities, where investors coordinate to manufacture attention for the meme and build what is known as "mindshare." They then use AI to generate provocative videos in the hope of making the meme go viral and pulling in "ordinary people": retail investors unfamiliar with meme culture. Once the price rises, the original insiders execute a "rug pull," dumping their holdings and cashing out.

In reality, only a few thousand people actually buy these tokens. But because the barriers to creating a cryptocurrency and churning out AI-generated junk content are so low, coin creators can repeat the process over and over, profiting by "creating cultural phenomena."

Meanwhile, these memes often take on a life of their own. When other creators recognize their viral potential, they imitate and remix them for money or online clout. The "Kirkinator" and "George Droyd" characters have been picked up repeatedly by influencers with no connection to the original token creators.

But with each iteration, the crypto speculators keep profiting. In October, for example, a tweet about Kirkinator garnered 8 million views, sending the price of $KIRKINATOR up fivefold before it fell back within days. For investors who sold at the peak, that profit was built on millions of X users watching a video depicting "George Droyd being killed by Kirkinator after stealing the Epstein Files."

Unfortunately, the more sensational a video is, the more easily it spreads. Violent and offensive imagery draws more comments and longer watch times, both of which algorithms reward, and cryptocurrency creators have learned to exploit this mechanism for personal gain. Even Instagram or X users who know nothing about these cryptocurrencies may be repeatedly bombarded with this deeply disturbing clickbait.

We are caught in a vortex in which loosely regulated cryptocurrency websites, readily available AI tools, and social platforms that let offensive memes proliferate all reinforce one another.

As a scholar studying the evolution of internet language, I find this deeply disturbing: internet trends are being artificially created for the sole purpose of manipulating us. We can no longer assume that a meme arose organically; at any moment it may be part of a profit-driven chain.

Even when a meme is not created directly by crypto speculators, it is almost certain to be appropriated by them immediately: every new cultural term is near-instantly registered as a token on pump.fun and artificially pumped so that a few people can profit from it.

The end result is that we are all becoming more detached from reality. More and more memes will be invented or amplified, forcing internet users to constantly question what they can still trust, and sustained exposure to this repulsive discourse will make it seem increasingly acceptable. The only way out is to fight to take back the internet and stop those who are trying to poison it.
