About three weeks ago, Facebook announced it would step up its efforts against hate speech and misinformation in Myanmar ahead of the country's general election on November 8, 2020. Today, the company shared more details about how it plans to curb the spread of hate speech and misinformation, including adding Burmese-language warning screens to flag content rated false by third-party fact-checkers.
In November 2018, Facebook admitted it didn’t do enough to prevent its platform from being used to “foment division and incite offline violence” in Myanmar.
This is an understatement, considering that Facebook has been accused by human rights groups, including the United Nations Human Rights Council, of enabling the spread of hate speech in Myanmar against Rohingya Muslims, the target of a brutally violent ethnic cleansing campaign. A 2018 investigation by The New York Times found that members of the military in Myanmar, a predominantly Buddhist country, instigated genocide against Rohingya, and used Facebook, one of the country’s most widely used online services, as a tool to conduct a “systematic campaign” of hate speech against the minority group.
In its announcement several weeks ago, Facebook said it would expand its misinformation policy and remove content intended to "lead to voter suppression or damage the integrity of the electoral process," working with three fact-checking partners in Myanmar — BOOM, AFP Fact Check and Fact Crescendo. It also said it would flag potentially misleading images and apply the message-forwarding limit it introduced in Sri Lanka in June 2019.
Facebook also shared that in the second quarter of 2020, it had taken action against 280,000 pieces of content in Myanmar that violated its Community Standards against hate speech, with 97.8% detected by its systems before being reported, up from the 51,000 pieces of content it took action against in the first quarter.
But, as TechCrunch’s Natasha Lomas noted, “without greater visibility into the content Facebook’s platform is amplifying, including country specific factors such as whether hate speech posting is increasing in Myanmar as the election gets closer, it’s not possible to understand what volume of hate speech is passing under the radar of Facebook’s detection systems and reaching local eyeballs.”
Facebook's latest announcement, posted today on its Newsroom, doesn't answer those questions. Instead, the company gave more information about its preparations for the Myanmar general election.
The company said it would use technology to identify "new words and phrases associated with hate speech" in the country, and either remove posts containing those words or "reduce their distribution."
It will also introduce Burmese-language warning screens for misinformation identified as false by its third-party fact-checkers, make reliable information about the election and voting more visible, and promote "digital literacy training" in Myanmar through programs like an ongoing monthly television talk show called "Tea Talks," and by introducing its social media analytics tool, CrowdTangle, to newsrooms.