Meta's moderation of self-harm content on Instagram is inadequate: in one study, none of the 85 self-harm images posted were removed. Instagram's recommendation algorithm may expand self-harm networks rather than limit them. EU regulations require platforms to assess and mitigate risks to users' mental wellbeing, and experts warn of rising suicide rates linked to unmoderated content.
Inadequate content moderation invites regulatory scrutiny, which damages Meta's reputation and can weigh on META's stock. Sustained scrutiny and potential fines threaten long-term profitability and may force changes to operational practices. Because public trust is central to the platform's business, moderation failures translate directly into operational and financial risk.