Why Meta’s New Content Moderation Policy Could Pose Risks to Its Stock

The firm’s content moderation approach raises concerns about regulation and advertisers.

Leslie Norton 18 March, 2025 | 9:40AM

The Meta logo is seen at the Vivatech show in Paris, France, June 14, 2023.

Key Takeaways

After a massive two-year rally, Meta stock faces new risks from the firm’s content moderation policy.

The spread of false information may expose Meta to additional regulations and could reduce user engagement and advertising spending.

New rules could add to uncertainty for the stock, which has fallen sharply in recent weeks over macro concerns and uneasiness about Trump administration initiatives.

While Meta Platforms’ META stock has taken a beating over the past month, it had staged a massive rally over the prior two years, as the firm’s vast scale made it a dominant force in social media. However, a significant change in how Meta runs its feeds could pose fresh risks to the company and its stock.

Meta will begin testing its new content moderation policy on March 18. The company is moving away from third-party fact-checking of user-posted content toward a user-led community notes approach, like the one employed at X. “Facebook, Instagram, and Threads have billions of users worldwide. A change in content moderation policies stands to have a material impact on the content available,” explains Morningstar analyst Malik Ahmed Khan.

Jennifer Vieno, director of environmental, social, and governance research for Morningstar Sustainalytics, says these changes elevate Meta’s risk. She writes that the company’s social media platforms “disseminate large amounts of content, including misinformation, disinformation, and hate speech. Recent content moderation changes are likely to increase the latter’s volume and spread.”

This new policy comes as Meta’s stock has been sliding along with the other mega-cap stocks that led the bull market in 2023 and 2024. After hitting a seven-year low of $93.16 per share in October 2022, Meta’s stock roared higher on the back of a revived business strategy to an all-time high of $736.67 on Feb. 16 of this year. Since then, the stock has fallen back by 16%.

Khan sets a fair value estimate of $770 per share for Meta, meaning the stock is currently undervalued. He thinks the questions raised by the new content moderation strategy may not be reflected in the stock’s current price: “The downdraft in the stock over the last couple of weeks has been primarily a result of some of these macro-level changes and uncertainty caused by the current administration. I don’t know if investors have focused on this one angle.”
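The arithmetic behind that undervaluation claim can be sketched from the figures above. This is illustrative only, not investment guidance; the implied price is derived from the article's cited peak and 16% decline, not from a live quote.

```python
# Illustrative arithmetic using the figures quoted in the article:
# the implied share price after the described pullback, and its
# discount to Morningstar's fair value estimate.
peak = 736.67        # all-time high hit on Feb. 16, per the article
drawdown = 0.16      # subsequent decline cited in the article
fair_value = 770.00  # Khan's fair value estimate

implied_price = peak * (1 - drawdown)
discount_to_fve = 1 - implied_price / fair_value

print(f"Implied price: ${implied_price:,.2f}")            # ~$618.80
print(f"Discount to fair value: {discount_to_fve:.0%}")   # ~20%
```

On these numbers, the stock would trade at roughly a 20% discount to Khan's estimate of its fair value.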

Meta's Stock Price Over Five Years


The Possible Risks to Meta From the Spread of False Information

One of the foundational problems Vieno sees is the societal risk of spreading false information. For the second consecutive year, the World Economic Forum’s 2025 Global Risks Report cited misinformation as the highest short-term global risk. If societal risk rises, pressure on governments to act could intensify, exposing Meta to “additional regulations, compliance costs and fines or sanctions for violations,” Vieno writes. That’s particularly true if Meta exports its content moderation strategy to Europe.

Could European Regulators Target Meta Again?

The top regulation for investors to consider is the EU Digital Services Act. Vieno explains how the European Commission is looking into whether Facebook and Instagram have breached the regulation. The DSA imposes its toughest rules on large online platforms like Facebook and Instagram. These include obligations to mitigate “disinformation or election manipulation, cyber violence against women, or harms to minors online,” making platforms more accountable for their actions and any systemic risks they pose. Noncompliance can lead to fines of up to 6% of global turnover. Meta’s revenue in 2024 was $164.5 billion. The European Commission could request “temporary suspension of services as a measure of last resort,” says Vieno.

Financial Risks From the New Content Moderation Policy

Additional financial risks rise if content moderation changes “lead to reduced user engagement, brand safety, and advertising spending,” Vieno writes. Advertising accounts for 97% of Meta’s revenue. Vieno observes that several companies, citing brand safety concerns, paused or pulled their ads from X after their content appeared next to antisemitic, pro-Nazi, and other harmful content. “Some have since resumed ad spending, however at much lower levels.” Meanwhile, Vieno notes that a study by market research firm Kantar indicates about 25% of advertisers plan to cut spending on X in 2025, and only 4% of the marketers surveyed believe that X provides brand safety.

Is Meta More Insulated From Content Risks?

Khan notes that, unlike X, Meta draws most of its advertising from small and medium-size businesses. That makes the firm less vulnerable to pullbacks by large advertisers. On the other hand, he says, an uncertain economy could squeeze smaller businesses and their advertising spending.

In the United States, prospects for additional regulation are "quite unlikely, especially with the current administration and political climate," says Khan. However, the EU has fined Meta many times; last year, it imposed a $263 million fine over a 2018 data breach. In 2019, the FTC fined Meta $5 billion following investigations into privacy concerns. Separately, Meta faces antitrust scrutiny in the US, including a trial scheduled for April 25.

Before Meta changes its content moderation in Europe, Khan expects the firm to work with the US government and others in Big Tech to clear any regulatory hurdles. President Donald Trump has threatened to impose tariffs on countries that fine US companies. “If Meta had to pay the fine stipulated under a violation of the DSA, that would be disastrous,” says Khan. He expects “a movement to broader agreement” between the current administration, Big Tech, and the EU this year that would govern content moderation and other issues. “Meta is not going to put itself in a position where it would be liable for massive fines before implementing content moderation changes more globally.”


The author or authors do not own shares in any securities mentioned in this article. Find out about Morningstar's editorial policies.

The information contained within is for educational and informational purposes ONLY. It is not intended nor should it be considered an invitation or inducement to buy or sell a security or securities noted within nor should it be viewed as a communication intended to persuade or incite you to buy or sell security or securities noted within. Any commentary provided is the opinion of the author and should not be considered a personalised recommendation. The information contained within should not be a person's sole basis for making an investment decision. Please contact your financial professional before making an investment decision.


About Author

Leslie Norton is Editorial Director of Sustainability at Morningstar.

© Copyright 2025 Morningstar, Inc. All rights reserved.
