Pump.fun Suspends Livestream Indefinitely
On 25 November, Pump.fun's livestreaming board featured explicit and racially offensive content, exposing serious flaws in its content moderation.
The livestream feature, initially intended for token promotion, devolved into a platform for extreme shock tactics, raising concerns of potential civil or criminal liability, according to legal experts.
Content ranged from suicide threats and animal abuse to pornography, with few repercussions beyond content removal.
Industry leaders called for immediate action, prompting Pump.fun to suspend the feature indefinitely until adequate moderation measures are in place.
Some creators used the platform to stage provocative challenges to boost token prices.
One pledged to remain in a toilet until his token hit a $50 million valuation, while another fired a gun with each price increase.
Disturbingly, a creator threatened self-harm if their token failed to reach a specific market cap—a situation reported by Beau, a safety programme manager for Pudgy Penguins, who urged the platform to disable livestreaming.
Active for less than six months since its May launch, Pump.fun's livestreaming feature will remain offline until the platform can effectively manage the surge in harmful activity.
Pump.fun Quick to Respond
Pump.fun co-founder "Alon" addressed the platform's moderation shortcomings in a public statement on X (formerly known as Twitter), acknowledging gaps in oversight.
However, in a follow-up post, he defended the company's efforts, emphasizing that Pump.fun actively moderates all content, including images, videos, livestreams, and comments.
The platform noted recent steps to improve moderation, including doubling the number of human moderators and deploying automated content filters.
Despite these measures, Pump.fun admitted its efforts were insufficient to keep pace with the volume of harmful content.
To enhance accountability, the company announced plans to release detailed guidelines for user behavior in the near future.
The company stated in a blog post:
“We're navigating challenges that many other social platforms have faced during their growth phases and are taking similar steps to address the core issues.”
Will Pump.fun's Luck Run Out with the Regulatory Authorities?
Pump.fun has emerged as one of the year's standout success stories, fuelling Solana's meme coin surge by simplifying token creation with its code-free launch services.
Yet, despite its popularity, the platform has so far escaped regulatory scrutiny, even as it has become the origin of numerous scam tokens and rug pulls.
Recent controversies involving provocative and harmful livestream content, however, have prompted market observers to call for stricter oversight.
Yuriy Brisov, a partner at Digital and Analogue Partners, noted:
“It's a legitimate reason for a criminal investigation and civil lawsuits.”
Mads Eberhardt, a senior crypto analyst at Steno Research, added:
“This is not a good look for the industry. I believe there's no stopping the snowball effect of growing institutional adoption, but the industry would be in a better place if this were not happening simultaneously.”
Brisov noted that existing laws already address fraudulent activities involving cryptocurrencies.
Token creators leveraging Pump.fun's platform could potentially face charges such as wire fraud—a white-collar crime conducted through electronic communication.
He warned:
“For reasons that are not understandable to me, entrepreneurs who deal with memecoins, they think that laws do not apply to them. If you are wash trading, if you are rug pulling on Pump.fun, the same laws will apply to you, and the [United States] Department of Justice will come to you, and you will be prosecuted and put in jail for many years.”
Compounding these concerns, Pump.fun lacks prominently displayed terms, conditions, or disclaimers on its website, a potential liability in the event of investigations or lawsuits.
Brisov warned that the Department of Justice may not be the only agency Pump.fun should be concerned about as regulatory scrutiny intensifies. He said:
“Any token has the potential to be seen as a security, meaning there is also a high risk of unregistered security offerings, which is also a violation of the Securities Act and the relevant European Union regulations.”
The platform's rapid rise now faces a pivotal challenge: ensuring compliance while maintaining its innovative edge.
Should Platforms Be Liable for Their Content?
Pump.fun's struggles with livestream content have reignited a broader debate about the accountability of platforms hosting user-generated content, such as X or YouTube.
Despite advances in content moderation technology, illicit material often bypasses safeguards, raising questions about platform responsibility.
Mikko Ohtamaa, co-founder of Trading Strategy, weighed in on X, arguing that Pump.fun faces two stark outcomes: immediate closure by law enforcement for opting out of moderation, or an eventual shutdown once regulators catch up.
Alon, Pump.fun's co-founder, acknowledged the platform's shortcomings, admitting that moderation efforts are "not perfect."
He pointed to an NSFW toggle designed to hide extreme videos and highlighted a "large" team of moderators working around the clock.
Additionally, Pump.fun has invited its community to report unmoderated content through support channels.
However, critics argue these efforts fall short.
Livestreams often go unchecked until flagged, automated systems fail to scale with demand, and human moderators are overwhelmed.
In the US, Section 230 of the Communications Decency Act offers platforms immunity for user-generated content, provided they moderate in good faith.
But platforms that knowingly allow harmful content risk legal consequences.
As calls for transparency grow, Pump.fun's ability to address its moderation failures will determine whether it can fulfill its promises—or remain a cautionary tale for the crypto space.