Meta content moderation shortcomings risk Bangladesh violence

Amnesty International has today warned that Bangladesh could face more incidents of serious human rights abuses unless Meta takes timely and effective action to address harmful online content on its Facebook platform. 

In the lead-up to the country’s 12 February parliamentary elections, Amnesty International and others observed a rise in harmful online content, some of which came from outside Bangladesh. This included the spread of misleading and inflammatory content in relation to political parties and minority communities, and the amplification of sectarian narratives or beliefs that exaggerate divisions between religious or community groups. Most of the content from outside Bangladesh came from India, according to media reports. Together, this content could heighten the risk of sectarian tensions, discrimination and violence, particularly against minority communities.   

Events in the lead-up to the election, including attacks on some media outlets in Bangladesh, mirror a dangerous trajectory seen before in multiple countries. In these cases, online incitement, misinformation, disinformation, and coordinated harassment campaigns can quickly spill offline into discrimination, violence and other human rights abuses, especially when amplified by platforms’ algorithms. 

“Bangladesh is not yet in a human rights crisis, but the warning signs are visible. The combination of cross-border harmful content, political tension, sectarian narratives, and algorithmic amplification creates a volatile environment that could put freedom of expression and the rights of minority communities at risk,”

Alia Al Ghussain, Head of Big Tech Accountability at Amnesty International

Violence and online content  

On 18 December 2025, the offices of The Daily Star and Prothom Alo, two leading media outlets, were attacked by violent mobs. According to investigations led by The Daily Star and Dismislab, a local fact-checking organization, threats against both outlets had been circulating on social media for months before the attacks. Both outlets were portrayed by many social media users as “Indian agents” and “anti-national forces”, reflecting a broader online narrative accusing the outlets of serving Indian interests and undermining Bangladesh, alongside calls to burn and attack their offices. According to The Daily Star and Dismislab investigations, there was a direct link between online incitement of violence and the mob attacks. Bangladeshi authorities reportedly warned Meta about delays in addressing posts calling for violence and expressed concern about the impact on public security and minority communities.  

Amnesty International is concerned that such incidents are not isolated. Previous reports by international organizations and media outlets have highlighted the divisive role of online disinformation involving misleading and exaggerated narratives about sectarian violence in Bangladesh, including content originating from India. This online content is reported to have contributed to fear and heightened tensions among minority communities, according to Al Jazeera.  

“The risk is clear that online harms do not remain in the digital space. They can shape public perception, inflame tensions and enable real-world violence and unrest,” said Alia Al Ghussain. “This is a moment for prevention and taking responsibility for the power that social media companies wield in this space. The world has seen too often how harmful online content can evolve into real-world violence. There is still an opportunity to stop that trajectory in Bangladesh and it is up to Meta to take action now.” 

Amnesty International has previously documented how Facebook was used to promote violence against the Rohingya in Myanmar and contributed to abuses during the Tigray conflict in Ethiopia, and believes Bangladesh is at an important juncture where timely preventative action from Meta could reduce the risk of escalation. 

Surveillance-based business model can amplify harm 

Meta’s surveillance-based business model, built on maximizing engagement, can incentivize the amplification of sensational, polarizing and harmful content. While not all harmful content is unlawful, even lawful material can pose human rights risks when amplified. When inflammatory content receives more interaction, recommendation systems may promote it further, increasing its reach and potential real-world impact.  

Amnesty International and others have previously called for the adoption of emergency mitigation measures in conflict and high-risk contexts. Meta itself has acknowledged that heightened safeguards, sometimes referred to as ‘break the glass’ measures, may be necessary in such situations. The warning signs currently visible in Bangladesh underscore why such measures warrant urgent consideration. 

Amnesty International wrote to Meta on 10 February ahead of the elections asking the company to explain what measures it would take to ensure Facebook did not pose a human rights risk, such as how it assesses risks to groups in vulnerable situations, including minorities, and whether it had identified cross-border content affecting Bangladeshi users. Meta replied that it would not be able to respond within the two-week timeframe provided. 

Companies have a responsibility to respect human rights under international standards to ensure they are not involved in any human rights abuses. This includes taking proactive measures to prevent and mitigate human rights harms linked to their operations. This responsibility exists independently of state regulation and requires continuous risk assessment, transparency and effective mitigation measures.  

Amnesty International has also requested data from Meta on reports of harmful content targeting minority communities, enforcement actions taken, staffing capacity in Bangla-language moderation and the provision of emergency mitigation measures ahead of elections.  

Background 

Mass student-led protests in July 2024 forced former Prime Minister Sheikh Hasina to step down and flee to India. A close ally of India, Hasina remains there despite requests for her extradition to Bangladesh to face accountability for the deadly crackdown which led to at least 1,400 deaths. She has since been tried in absentia and received the death sentence for crimes against humanity. India’s refusal to extradite Hasina has strained relations between the two countries.  
