After failing to stem the hate speech and disinformation that fueled genocide in Myanmar, Facebook is now announcing proactive content moderation measures in response to the military coup unfolding in the country.
In an internal message posted Monday evening and viewed by BuzzFeed News, Rafael Frankel, Asia Pacific director of public policy, told employees that the social network is monitoring the “volatile situation” in Myanmar “with grave concern” and described a series of measures to crack down on those who used it to spread disinformation or threaten violence.
As part of the measures, Facebook has designated Myanmar a “temporary high-risk location” for two weeks, allowing the company to remove content and events in the country that include “any call to bring weapons.” The social network previously applied this designation to Washington, DC, following the January 6 insurrection at the US Capitol.
The social network, which touted its efforts to protect the integrity of Myanmar’s national elections in November, also said it would protect posts that criticized the military and its coup, and would monitor reports of pages and accounts hacked or taken over by the military.
“Myanmar’s November elections were an important moment in the country’s transition to democracy, although they were not without challenges, as international human rights groups have pointed out,” Frankel wrote. “This turn of events recalls days we had hoped were in Myanmar’s past, and reminds us of basic rights that should never be taken for granted.”
Facebook’s measures come after Burmese army chief Gen. Min Aung Hlaing seized control of the country’s government on Monday and arrested its elected leader, Aung San Suu Kyi, along with other members of her National League for Democracy (NLD) party. Following the election, in which the NLD won the majority of seats in Myanmar’s parliament, military-backed opposition groups called the results fraudulent and demanded a new vote.
On Tuesday, the US State Department officially designated the army’s takeover in Myanmar as a coup, triggering financial sanctions.
“After a review of all the facts, we assessed that the actions of the Burmese military on February 1, in removing the duly elected head of government, constituted a military coup,” a State Department official said in a briefing, using the name the US government uses to refer to the country.
In a statement to BuzzFeed News, Facebook confirmed the actions described in Frankel’s post and said it would remove content praising or supporting the coup.
“We are prioritizing the safety of people in Myanmar and removing content that violates our rules on violence, hate speech and harmful disinformation,” Frankel said. “This includes removing misinformation that delegitimizes the November election result.”
Facebook is taking action in a country where it has already been condemned by the international community for its handling of the displacement and genocide of Rohingya Muslims that began in 2016. In 2018, United Nations investigators found that senior military officials in Myanmar had used Facebook, which at the time had no content moderators in the country, to instigate fear and spread hate speech.
The “extent to which Facebook’s posts and messages have led to real-world discrimination must be independently and thoroughly investigated,” UN investigators concluded in their report.
In Monday’s post, Frankel said Facebook was using “a number of product interventions that have been used in the past in Myanmar and in the US elections, to ensure the platform is not used to spread disinformation, incite violence or coordinate harm.”
The company is working to secure the accounts of activists and journalists “who are in danger or have been arrested” and to remove content that threatens or calls for violence against them, Frankel wrote. It will also protect “critical information about what is happening on the ground,” given the restrictions placed on the country’s news organizations.
Facebook’s work is an ongoing effort. On Monday evening, the company deleted a page belonging to the Burmese military’s television channel following inquiries by the Wall Street Journal. While Facebook banned a page for the Myawaddy TV network in 2018 during a crackdown on hundreds of accounts linked to the Burmese army, a new page had reappeared and collected 33,000 likes.
Facebook has often been criticized for facilitating the growth of violent and extremist groups and for its ineffectiveness in stemming disinformation. More recently, a tech watchdog group accused the company of fueling the unrest that led to the deadly insurrection in the United States.
“[Facebook] spent the last year failing to eliminate extremist activity and President Trump–fueled election-related conspiracy theories that have radicalized a large section of the population and led many people down a dangerous path,” the Tech Transparency Project (TTP) said in a report.
The report revealed specific threats made by pro-Trump and activist groups on Facebook before and after Joe Biden’s election victory in November.