Meta in Myanmar, Part II: The Crisis
This is the second post in a series on what Meta did in Myanmar and what the broader technology community can learn from it. It will make a lot more sense if you read the first post—these first two are especially tightly linked and best understood as a single story. There’s also a meta-post with things like terminology notes, sourcing information, and a corrections changelog.
But in case you haven’t read Part I, or in case you don’t remember all billion words of it…
Let’s recap
In the years leading up to the worst violence against the Rohingya people, a surge of explicit calls for the violent annihilation of the Rohingya ethnic minority flares up across Myanmar—in speeches by military officers and political party members, in Buddhist temples, in YouTube videos, through anonymous Bluetooth-transmitted messages in cafes, and, of course, on Facebook.
What makes Facebook special, though, is that it’s everywhere. It’s on every phone, which is in just about every home. Under ultra-rigid military control, the Burmese have long relied on unofficial information—rumors—to get by. And now the country’s come online extremely quickly, even in farming villages that aren’t yet wired for electricity.
And into all the phones held in all the hands of all these people who are absolutely delighted to connect and learn and better understand the world around them, Facebook is distributing and accelerating professional-grade hatred and disinformation whipped up in part by the extremist wing of Myanmar’s widely beloved Buddhist religious establishment.
It’s a very bad setup.
The dangers rising in Myanmar in the mid-2010s aren’t only clear in hindsight: For years, Burmese and western civil society experts, digital rights advocates, tech folks—even Myanmar’s own government—have been warning Meta that Facebook is fueling a slide toward genocide. In 2012 and 2014, waves of—sometimes state-supported—communal violence occur; the Burmese government even directly connects unchecked incitement on Facebook to one of the riots and blocks the site to stop the violence.
Meta has responded by getting local Burmese groups to help it translate its rules and reporting flow, but there’s no one to deal with the reports. For years, Meta employs a total of one Burmese-speaking moderator for this country of 50M+ people—a number it increases to four by the end of 2015.
This brings us to 2016, when Meta doubles down on connection.
The next billion
In 2013, Mark Zuckerberg announces the launch of Facebook’s new global internet-expansion initiative, Internet.org. Facebook will lead the program with six other for-profit technology companies: two semiconductor companies, two handset makers, a telecom, and Opera. There’s a launch video, too, with lots of very global humans doing celebratory human things set to pensive piano notes with a JFK speech about world peace playing over it.1
Alongside the big announcement, Zuckerberg posts a memo about his plans, titled “Is Connectivity a Human Right?” Facebook’s whole deal, he writes, is to make the world more open and connected:
But as we started thinking about connecting the next 5 billion people, we realized something important: the vast majority of people in the world don’t have any access to the internet.
The problem, according to Zuckerberg, is that data plans are too costly—which he attributes to missing infrastructure. His memo then makes a brief detour through economics, explaining that internet access == no more zero-sum resources == global prosperity and happiness:
Before the internet and the knowledge economy, our economy was primarily industrial and resource-based. Many dynamics of resource-based economies are zero sum. For example, if you own an oil field, then I can’t also own that same oil field. This incentivizes those with resources to hoard rather than share them. But a knowledge economy is different and encourages worldwide prosperity. It’s not zero sum. If you know something, that doesn’t stop me from knowing it too. In fact, the more things we all know, the better ideas, products and services we can all offer and the better all of our lives will be.
And in Zuckerberg’s account, Facebook is really doing the work, putting in the resources required to open all of these benefits to everyone:
Since the internet is so fundamental, we believe everyone should have access and we’re investing a significant amount of our energy and resources into making this happen. Facebook has already invested more than $1 billion to connect people in the developing world over the past few years, and we plan to do more.2
As various boondoggles have recently demonstrated, social media executives are not necessarily brilliant people, but neither is Mark Zuckerberg a hayseed. What his new “Next Billion” initiative to “connect the world” will do is build and reinforce monopolistic structures that give underdeveloped countries not real “internet access” but…mostly just Facebook, stripped down and zero-rated so that using it doesn’t rack up data charges.
The Internet.org initiative debuts to enthusiastic coverage in the US tech press and in many mainstream outlets.3 The New York Times contributes a more skeptical perspective:
[Social media] companies have little choice but to look overseas for growth. More than half of Americans already use Facebook at least once a month, for instance, and usage in the rest of the developed world is similarly heavy. There is nearly one active cellphone for every person on earth, making expansion a challenge for carriers and phone makers.
Poorer countries in Asia, Africa and Latin America present the biggest opportunity to reach new customers—if companies can figure out how to get people there online at low cost.4
In June of 2013, Facebook had 1.1 billion monthly active users, only 198 million of whom were in the US. As I write this post in 2023, the number of monthly active users is up to 3 billion, only 270 million of whom are in the US. So usage numbers in the US have only risen 36% in ten years, while monthly active users everywhere else went up 188%.5 By 2022, 55% of all social media use was in Asia.6
Whenever you read about Meta’s work “connecting the world,” I think it’s good to keep those figures in mind.
But just because the growth was happening globally didn’t mean that Meta was attending to what its subsidized access was doing outside the US and Western Europe.
In An Ugly Truth, their 2021 book about Meta’s inner workings, New York Times reporters Sheera Frenkel and Cecilia Kang write that no one at Meta was responsible for assessing cultural and political dynamics as new communities came online, or even tracking whether they had linguistically and culturally competent moderators to support each new country.
A Meta employee who worked on the Next One Billion initiative couldn’t remember anyone “directly questioning Mark or Sheryl about whether there were safeguards in place or raising something that would qualify as a concern or warning for how Facebook would integrate into non-American cultures.”7
In 2015, Internet.org rebrands as Free Basics after the initiative attracts broad criticism for working against net neutrality—it’s a PR move that foreshadows the big rebrand from Facebook to Meta shortly after Frances Haugen delivers her trove of internal documents to the SEC in 2021.8
In 2016, it’s time to roll out Free Basics in Myanmar, alongside a stripped-down version of Facebook called Facebook Flex that lets people view text for free and then pay for image and video data.9 Facebook is already super-popular in Myanmar for reasons covered in the previous post, but when Myanmar’s largest telecom, MPT, launches Free Basics and Facebook Flex, Facebook’s Myanmar monthly active user count more than doubles from a little over 7 million users in 2015 to at least 15 million in 2017. (Several US media sources say 30 million, though I don’t think I believe them.)10
But I want to be clear—for a ton of people across Myanmar, getting even a barebones internet was life-changingly great.
“Before, I just had to watch the clouds”
In early 2017, journalist Doug Bock Clark interviewed people in Myanmar—including MIDO cofounder Nay Phone Latt—about the internet for Wired.
Clark quotes a farmer who cultivates the tea plantation his family has worked for generations in Shan State:
I have always lived in the same town with about 900 people, which is in a very beautiful forest but also very isolated. When I was a child, we lived in wooden houses and used candles at night, and the mountain footpaths were too small even for oxcarts. For a long time, life didn’t change.
In 2014, the tea farmer’s town got a cell tower, and in 2016 a local NGO demonstrated an app that offered weather forecasts, market prices, and more. That really changed things:
Being able to know the weather in advance is amazing—before, I just had to watch the clouds! And the market information is very important. Before, we would sell our products to the brokers for very low prices, because we had no idea they sold them for higher prices in the city. But in the app I can see what the prices are in the big towns, so I don’t get cheated…
This brings me back to Craig Mod’s essay about his ethnographic work in rural Myanmar that I quoted from a lot in Part I of this series. Here, Mod is talking about internet use with a group of farmers: “The lead farmer mentions Facebook and the others fall in. Facebook! Yes yes! They use Facebook every day. They feel that spending data on Facebook is a worthwhile investment.”
One of the farmers wants to show Mod a post, and Mod and his colleagues speculate while the post loads:
Earlier, he said to us, lelthamar asit—Like any real farmer, I know the land. And so we wonder: What will he show us? A new farming technique? News about the upcoming election? Analysis on its impact on farmers? He shows us: A cow with five legs. He laughs. Amazing, no? Have you ever seen such a thing?11
It’s a charming story. But it’s hard not to feel a little ill, reading back from the perspective of 2023.
In the middle of a video podcast interview, Frances Haugen relates a story about Meta’s attempts to build tooling for reporting misinformation:
And one of our researchers said, you know, that sounds really obvious. Like that sounds like it would be a thing that would work. Except for when we went in and did interviews in India, people are coming online so fast that when we talk to people with master’s degrees… They say things like, why would someone put something fake on the Internet? That sounds like a lot of work.12
This anecdote is meant to point to the relative naiveté of Indian Facebook users, but honestly I recognize the near-universal humanity of the idea—that all of that manufacturing would just be too much work for regular people to do! It’s the argument against conspiracies in general. For those of us whose brains haven’t been ruined by the internet, it’s reasonable to think that regular people just wouldn’t go to all that trouble.
As it happens, in Myanmar and lots of other places, it’s not only regular people doing the work of disinformation and incitement, and we’ll get to that later. But regular people across Myanmar are reading all these anti-Rohingya messages and looking at the images and watching the videos, and…a lot of them are buying it.
“Everyone knows they’re terrorists”
This brings me back to Faine Greenwood’s essay that I also quoted from a lot in the previous post, and specifically to Greenwood’s “honest-to-god Thomas Friedman moment” in a Burmese cab back in 2013:
The driver was a charming young Burmese man who spoke good English, and we chatted about the usual things for a bit: the weather (sticky), how I liked Yangon (quite a bit, hungry dogs aside), and my opinion on Burmese food (I’m a fan).
Then he asked me what I was in town for, and I told him that I’d come to write about the Internet. “Oh, yes, I’ve got a Facebook account now,” he said, with great enthusiasm. “It is very interesting. Learning a lot. I didn’t know about all the bad things the Bengalis had been doing.”
“Bad things?” I asked, though I knew what he was going to say next.
“Killing Buddhists, stealing their land. There’s pictures on Facebook. Everyone knows they’re terrorists,” he replied.
“Oh, fuck,” I thought.13
Greenwood’s story closely parallels one Matt Schissler tells reporters Sheera Frenkel and Cecilia Kang for An Ugly Truth. (Schissler is one of the people delivering dire warnings to Meta in Part I of this series.)
In Schissler’s story, it’s also 2013, and he’s starting to see some really hair-raising stuff. His Buddhist friends start relating their conspiracy theories about the Rohingya and showing him “grainy cell phone photos of bodies they said were of Buddhist monks killed by Muslims.” They’re telling him ISIS fighters are on their way to Myanmar.
This narrative is even coming from a journalist friend, who calls to warn Schissler of a Muslim plot to attack the country. The journalist shows him a video as proof:
Schissler could tell that the video was obviously edited, dubbed over in Burmese with threatening language. “He was a person who should have known better, and he was just falling for, believing, all this stuff.”14
It’s miserably hot in Myanmar when Craig Mod is there in 2015—steam-broiling even in the shade—and the heat shows up a lot in Mod’s notes. His piece ends with a grace note about a weather forecast:
Farmer Number Fifteen loves the famous Myanmar weatherman U Tun Lwin, now follows him on Facebook. I hunt U Tun Lwin down, follow him too, in solidarity, although I’m pretty sure I know what tomorrow’s weather will be.15
When I reread Mod’s essay about halfway through my research for this series, my eye caught on that name: U Tun Lwin. I’d just seen it somewhere.
It was in the findings report of the United Nations Human Rights Council’s Independent International Fact-Finding Mission on Myanmar (just “the UN Mission” in the rest of this post).
It was there because in the fall of 2016, about a year after Craig was in Myanmar and as a wave of extreme state violence against the Rohingya is kicking off, there’s this Facebook post. The UN Mission reports that “Dr. Tun Lwin, a well-known meteorologist with over 1.5 million followers on Facebook, called on the Myanmar people to be united to secure the ‘west gate.’” (The “west gate” is the border with Bangladesh, and this is a reference to the idea that the Rohingya are actually all illegal “Bengali” immigrants.)
Myanmar, Tun Lwin continued in his post, “does not tolerate invaders,” and its people must be alert “now that there is a common enemy.” As of August 2018, when the UN Mission published their report, Tun Lwin’s post was still up on Facebook. It had 47,000 reactions, over 830 comments, and nearly 10,000 shares. In the comments, people called the existence of the Rohingya in Rakhine State a “Muslim invasion” and demanded that the Rohingya be uprooted and eradicated.16
The longest civil war
I need to say a little bit about the Tatmadaw, for reasons that will almost immediately become clear.
Tatmadaw (literally “grand army”) is the umbrella term for Myanmar’s armed forces—it includes the army, navy, and air force, but a Tatmadaw officer also oversees the national police force. There’s a ton of history I have to elide, but the two crucial things to know are that Tatmadaw generals have been running Myanmar (or heavily influencing its government) since the country gained independence, and the military’s been at war with multiple ethnic armed groups throughout Myanmar since just after the end of WWII.17
These conflicts—by some accountings, the longest-running civil war in the world—have been marked by the Tatmadaw’s intense violence against civilians. The UN Mission findings report that I cite throughout this series includes detailed accounts of Tatmadaw atrocities targeting civilian members of ethnic minorities in Kachin and Shan States. Human Rights Watch and many other organizations have detailed Tatmadaw brutalities focusing on ethnic minorities in Karen State and elsewhere in Myanmar.18
Information about these conflicts and atrocities was readily available in English throughout Meta’s expansion into the region. I include this brief and inadequate history to explain that it was not difficult, in this period, to learn what the Tatmadaw really was, and what they were capable of doing to civilians.
Which brings us, finally, to what happened to the Rohingya in 2016 and 2017.
Clearance operations
Content warning for these next two sections: I’m going to be brief and avoid graphic descriptions, but these are atrocities, including the torture, rape, and murder of adults and children.
2016 was supposed to be the first year in Myanmar’s new story. In the country’s landmark 2015 general elections, Aung San Suu Kyi’s party wins a supermajority and takes office in the spring of 2016. This is a huge deal—obviously most of all within Myanmar, but also internationally, because it looks like Myanmar’s moving closer to operating as a true democracy. But the Rohingya are excluded from the vote, and from a national peace conference held that summer to try to establish a ceasefire between the Tatmadaw and armed ethnic minority groups.19
The approximately 140,000 Rohingya people displaced in the 2012 violence are at this point largely still living in IDP (internally displaced person) camps and deprived of the necessities of life, and the Myanmar government has continued tightening—or eliminating—the already nearly impossible paths to citizenship and a more normal life for the Rohingya as a whole.20
The violence has continued, as well. According to a 2016 US State Department “Atrocities Prevention Report,” the Rohingya also continued to experience extremist mob attacks, alongside governmental abuses including “torture, unlawful arrest and detention, restricted movement, restrictions on religious practice, and discrimination in employment and access to social services.”21
This is all background for what happens next.
I give this accounting not to be shocking or emotionally manipulative, but because I don’t think we can assess and rationally discuss Meta’s responsibilities—in Myanmar and elsewhere—unless we allow ourselves to understand what happened to the human beings who took the damage.
In October of 2016, a Rohingya insurgent group, the Arakan Rohingya Salvation Army (ARSA), attacks Burmese posts on the Myanmar-Bangladesh border, killing nine border officers and four Burmese soldiers. The Tatmadaw respond with what they called “clearance operations,” nominally aimed at the insurgents but in fact broadly targeting all Rohingya people.22
A 2016 report from Amnesty International and, later, the UN Mission’s findings document the Tatmadaw’s actions, including the indiscriminate rape and murder of Rohingya civilians, the arbitrary arrests of hundreds of Rohingya men including the elderly, forced starvation, and the destruction of Rohingya villages.23 Tens of thousands of Rohingya flee over the border to Bangladesh.24
Through the winter of 2016 and into 2017, bursts of violence continue—Tatmadaw officers beating Rohingya civilians, Buddhist mobs in Rakhine State attacking Rohingya people, Rohingya militants killing people they saw as betrayers. Uneasy times.
Then, on the morning of August 25th, 2017, ARSA fighters mount crude, largely unsuccessful attacks on about 30 Burmese security posts.25 Simultaneously, according to an Amnesty investigation, ARSA fighters murder at least 99 Hindu civilians, including women and children, in two villages in Northern Rakhine State.26 (Despite the mass-scale horrors that would follow, this act was, by any measure, an atrocity.)
And after that, everything really goes to hell.
In response to the ARSA attacks, the Tatmadaw begins its second wave of clearance operations and begins, in Amnesty International’s words, systematically attacking “the entire Rohingya population in villages across northern Rakhine State.”27
Accelerating genocide
I’ve worked with atrocity documentation before. I still don’t know a right way to approach what comes next. I do know that the people who document incidents of communal and state violence for organizations like Médecins Sans Frontières and the UN Human Rights Council use precise, economical language. Spend enough time with their meticulous tables and figures and the precision itself begins to feel like rage.
Based on their extensive and intimate survey work with refugees who escaped to Bangladesh, Médecins Sans Frontières estimates that in a single month between August 25th and September 24th of 2017, about 11,000 Rohingya die in Myanmar, including 1,700 children. Of these, about 8,000 people are violently killed, including about 1,200 children under the age of five.28
The UN Mission’s report notes that in attacks on Rohingya villages, women and children, including infants, are “specifically targeted.”29 According to MSF, most of the murdered children under five are shot or burned, but they note that about 7% are beaten to death.30
In what Amnesty International calls “a relentless and systematic campaign,” the Tatmadaw publicly rape hundreds—almost certainly thousands—of Rohingya women and girls, many of whom they also mutilate. They indiscriminately arrest and torture Rohingya men and boys as “terrorists.” They push whole communities into starvation by burning their markets and blocking access to their farms. They burn hundreds of Rohingya villages to the ground.31
Over the ensuing weeks, more than 700,000 people (“more than 702,000 people,” Amnesty writes, “including children”) flee to squalid, overcrowded, climate-vulnerable refugee camps in Bangladesh.32 That’s more than 80% of the Rohingya previously living in Rakhine State.
The UN Mission’s findings report comes out about a year later.
I’ve cited it a lot already in this post and the previous one. The document runs to 444 pages, opens with a detailed background for the 2017 crisis, and then becomes a catalog of thousands of collective and individual incidents of the Tatmadaw’s systematic torture, rape, and murder of the Rohingya—and, to a lesser but still horrific extent, of other ethnic minorities across Myanmar. The scale and level of detail are beyond anything else I’ve encountered; accounts of mutilations, violations, and the murder of children in front of their parents go on page after page after page. My honest advice is that you don’t read it.33
Classifying incidents of violence as genocide is a lengthy, fraught, and uneven process. The UN High Commissioner for Human Rights calls the events in Myanmar “a textbook example of ethnic cleansing.”34 The International Court of Justice is currently hearing a case against Myanmar brought under the international Genocide Convention.35 The US State Department officially classifies the events in Myanmar as a genocide, as do many genocide scholars and institutions. In this series, I follow the usage of the United States Holocaust Memorial Museum in Washington, DC, in whose work I have complete confidence.36
But…Facebook?
If you’ve read this far, then first, thank you. Maybe get a drink of water or something.
Second, I think you may be—probably should be—wondering how any of the things I’ve just related can be connected to something as relatively inconsequential as Facebook posts.
I want to do a tiny summary and then preview some arguments that I won’t really be able to dig into until the end of this post and especially in the next one, when I finally get into the documents and investigations that show what was happening under the hood of Meta’s content recommendation engines.
The escalation from relatively isolated incidents of anti-Rohingya violence pre-2012 into the two big waves of attacks that year, the semi-communal semi-state violence in 2016, and the full-on Tatmadaw-led genocide in 2017 was accompanied by an overwhelming rise in Facebook-mediated disinformation and violence-inciting messages.
And as I’ve tried to show and will keep illustrating with examples, these messages built intense anti-Rohingya beliefs and fears throughout Myanmar’s mainstream Buddhist culture. Those beliefs and fears quite clearly led to direct incidents of communal (non-state) violence.
Determining whether those beliefs also amounted to even a partially manufactured mainstream consent to the Tatmadaw’s actions in 2016 and 2017 is both out of my lane and, honestly, maybe unknowable, given the impossibility of untangling what was known by whom, and when. What I think I can say is that they ran in exact parallel to the Tatmadaw’s genocidal operations.
The overwhelming volume and velocity of this hate campaign would not have been possible without Meta, which did four main things to enable it:
- Meta bought and maneuvered its way into the center of Myanmar’s online life and then inhabited that position with a recklessness that was impervious to warnings by western technologists, journalists, and people at every level of Burmese society. (This is most of Part I.)
- After the 2012 violence, Meta mounted a content moderation response so inadequate that it would be laughable if it hadn’t been deadly. (Discussed in Part I and also below.)
- With its recommendation algorithms and financial incentive programs, Meta devastated Myanmar’s new and fragile online information sphere and turned thousands of carefully laid sparks into flamethrowers. (Discussed below and in Part III.)
- Despite its awareness of similar covert influence campaigns based on “inauthentic behavior”—aka fake likes, comments, and Pages—Meta allowed an enormous and highly influential covert operation to thrive on Burmese-language Facebook throughout the run-up to the peak of the 2016 and 2017 “ethnic cleansing,” and beyond. (Part III.)
The lines of this argument have all been drawn by better informed people than me. Amnesty International’s 2022 report, “The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” directly implicates Meta in the genocidal propaganda campaigns and furor that led up to the Tatmadaw’s atrocities in Rakhine State. The viral acceleration of dehumanizing and violent posts in 2017, Amnesty writes, made those messages “appear ubiquitous on Burmese-language Facebook, creating a sense that everyone in Myanmar shared these views, helping to build a shared sense of urgency in finding a ‘solution’ to the ‘Bengali problem’ and ultimately building support for the military’s 2017 ‘clearance operations’.”37
And as I noted in the intro to the first post in this series, the UN Mission’s own lead investigator stated that Facebook played “a determining role” in the violence.38
But again, I think it’s reasonable and important to ask whether that can really be possible, and to look carefully at the evidence.
On one hand it seems obvious that Meta was indeed negligent about expanding content moderation, and deeply misguided in continuing to expand into Myanmar without stemming the tide of genocidal messages that experts had been warning them about since at least 2012. Meta’s behavior, after all those years of warnings, is hard to describe as anything but callous.
But does any of that make them responsible for what the Tatmadaw did?
Let’s start with the content moderation problem. Which means that we have to look at some of the actual content Meta allowed to circulate on Burmese-language Facebook during the waves of violence in 2016 and 2017.
Rumors and lies
Content warning: Hate speech, ethnic slurs.
On September 12, 2017, during the peak of the Tatmadaw’s genocidal attacks on the Rohingya, the Institute for War and Peace Reporting released an update on their two-year project in Myanmar with a dozen-odd local journalists and monitors who tracked and reported on hate speech and incitement to violence.
The post is called “How Social Media Spurred Myanmar’s Latest Violence,” and it’s written by IWPR’s regional director, Alan Davis. It’s both cringey—Davis starts with a dig at how backward and superstitious the Buddhist establishment is—and obviously rooted in real moral anguish at having failed to prevent the disaster. Much of the meat of the post is focused on Facebook, and Davis’s observations are sharp (emphasis mine):
The vast majority of hate speech was on social media, particularly Facebook.… while not all hate speech was anti-Muslim or anti-Rohingya, the overwhelming majority certainly was. Much was juvenile and just plain nasty, while a good deal was insidious and seemed to be increasingly organised. A lot of it was also smart and it was clear a great deal of time and energy had gone into some of the postings.
Over time, we saw the hate speech becoming more targeted and militaristic. Wild allegations spread, including claims of Islamic State (IS) flags flying over mosques in Yangon where munitions were being stored, of thwarted plots to blow up the 2,500 year-old Shwedagon Pagoda in Yangon and supposed cases of Islamic agents smuggling themselves across the border.
…we felt a clear sense that in the absence of any kind of political leadership that a darkening and deepening vacuum that would ultimately result in a violent reckoning.… Most importantly, we warned that rumours and lies peddled and left unchecked might end up creating their own reality. 39
On October 30, 2017, two months after the full-scale ethnic cleansing began, Sitagu Sayadaw, a Buddhist monk and one of the most respected religious leaders in Myanmar, gave a sermon to an audience of soldiers—and to the rest of the country, via a Facebook livestream. His sermon featured a passage from the Mahavamsa in which monks comfort a Buddhist king consumed by guilt after leading a war in which millions died:
“Don’t worry, your Highness. Not a single one of those you killed was Buddhist. They didn’t follow the Buddhist teachings and therefore they did not know what was good or bad. Not knowing good or bad is the nature of animals. Out of over five hundred thousand you killed, only one and a half were worth to be humans. Therefore it is a small sin and does not deserve your worry.”40
The UN Mission’s report includes many other examples of religious, governmental, and military figures comparing Rohingya people to fleas, weeds, and animals—and in some cases, making explicit reference to the necessity of emulating both the Holocaust and the United States’ bombing of Hiroshima and Nagasaki.41
The report also includes specific examples of the kinds of dehumanizing and inciting posts and comments going around Facebook in 2017. I’m only going to include a few, but I think it’s important to be clear about what Meta let circulate, months into a full-on ethnic cleansing operation:
- In early 2017, a Burmese “patriot” posts a graphic video of police beating citizens in another country, with the comment: “Watch this video. The kicks and the beatings are very brutal. I watch the video and feel that it is not enough. In the future […] Bengali disgusting race of Kalar terrorists who sneaked into our country by boat, need to be beaten like that. We need to beat them until we are satisfied.” (The post was still up on Facebook in July 2018.)
- A widely shared August 2017 post: “…the international community all condemned the actions of the Khoe Win Bengali [“Bengali that sneaked in”] terrorists. So, in this just war, to avenge the deaths of the ethnic people who got beheaded, and the policemen who got hacked into pieces, we are asking the Tatmadaw to turn these terrorists into powder and not leave any piece of them behind.”
- Another post: “Accusations of genocide are unfounded, because those that the Myanmar army is killing are not people, but animals. We won’t go to hell for killing these creatures that are not worth to be humans.”
- Another post: “If the (Myanmar) army is killing them, we Myanmar people can accept that… current killing of the Kalar is not enough, we need to kill more!”42
Let’s look at some quantifiable data on the volume of extremist posts during the period—we don’t have much, because only Meta really knows, but we do have a couple of windows into the way things escalated.
By 2016, data analyst Raymond Serrato, who eventually goes to work for the Office of the United Nations High Commissioner for Human Rights, has been studying social media in Myanmar for a couple of years. So when the Tatmadaw’s clearance operations swing into action in 2016, he’s already watching what’s happening in a big (55k member) Facebook group run by Ma Ba Tha supporters—a “hangout for Buddhist patriots,” as Serrato describes it.43
What Serrato sees in this group is a posting-volume curve that rises through the late summer of 2017, before the Arakan Rohingya Salvation Army attacks, and then spikes hard immediately after them, as the Tatmadaw begins its concentrated genocidal operation against the Rohingya.
Visualization by Raymond Serrato.
Serrato’s research is limited in scope—he’s only using the Groups API—but his snapshot of how hardline nationalist post volume went through the roof in 2017 clearly runs alongside the qualitative reports from Burmese and western observers—and victims.
What Meta did about it
Across the first-person narratives from Burmese and western tech and civil society people, there’s a thread of increasingly intense frustration—bleeding into desperation—among the people who tried, over and over, to get individual pieces of dehumanizing propaganda, graphic disinformation, and calls to violence removed from Facebook by reporting them to Facebook.
They report posts and never hear anything. They report posts that clearly call for violence and eventually hear back that they’re not against Facebook’s Community Standards. This is also true of the Rohingya refugees Amnesty International interviews in Bangladesh—they were also reporting posts demonizing and threatening their communities, and it didn’t help.44
Writing on behalf of the Burmese and western people in the private Facebook group with Facebook employees, Htaike Htaike Aung and Victoire Rio summarize the situation in 2016, during the first wave of “clearance operations”:
…Facebook was unequipped to proactively address risk concerns. They relied nearly exclusively on us, as local partners, to point them to problematic content. Upon receiving our escalations…they would typically address the copy we escalated but take no further steps to remove duplicate copies or address the systemic policy or enforcement gaps that these escalations brought to light.… We kept asking for more points of contact, better escalation protocols, and interlocutors with knowledge of the language and context who could make decisions on the violations without requiring the need for translators and further delays. We got none of that.45
And as we now know, Meta’s fleet of Burmese-speaking contractors had grown to a total of four at the end of 2015. According to Reuters, in 2018, Meta had about 60 people reviewing reported content from Myanmar via the Accenture-run “Project Honey Badger” contract operation in Kuala Lumpur, plus three more in Dublin, to monitor Myanmar’s approximately 18 million Facebook users.46 So in 2016 and 2017, Meta has somewhere between 4 and 63-ish Burmese speakers monitoring hate speech and violence-inciting messages in Myanmar. And zero of them, incidentally, in Myanmar itself.
I don’t know how many content reviewers Meta employed globally in 2016 and 2017, so we have to skip ahead to get an estimate. In his 2018 appearance before the US House Energy and Commerce Committee, Mark Zuckerberg is asked by Representative Pete Olson of Texas whether Meta employs about 27,000 people. Zuckerberg says yes.
OLSON: I’ve also been told that about 20,000 of those people, including contractors, do work on data security. Is that correct?
ZUCKERBERG: Yes. The 27,000 number is full time employees. And the security and content review includes contractors, of which there are tens of thousands. Or will be. Will be by the time that we hire those.47
There are several remarkable things about this exchange, including that when Rep. Olson afterward sums up, incorrectly, that this means more than half of Meta’s employees deal with “security practices,” Zuckerberg doesn’t correct him. But I’ll just emphasize that Meta is claiming to have (or be hiring!) tens of thousands of contractors to work on security and content review, in 2018. And for Myanmar, where by 2018 the genocide of the Rohingya has already peaked, they’ve managed to assemble about 63.
As it turns out, even the United Nations’ own Mission, acting in an official capacity, can’t get Facebook to remove posts explicitly calling for the murder of a human rights defender.
“Don’t leave him alive”
Both the UN Mission’s findings and Amnesty International’s big report tell the story of this person—an international aid worker targeted for his alleged cooperation with the Mission. He’s unnamed in the UN report; Amnesty calls him “Michael.”
Here’s how it happens: “Michael” does an interview with a local journalist in Myanmar about the situation he’d observed in Rakhine State, and the interview goes viral on Facebook.
The response by anti-Rohingya extremists is immediate and intense: The most dangerous Facebook post made about Michael features a picture of his opened passport and describes him as a “Muslim” and “national traitor.” The comments on the Facebook post call for Michael’s murder: “If this animal is still around, find him and kill him. There needs to be government officials in NGOs.” “He is a Muslim. Muslims are dogs and need to be shot.” “Don’t leave him alive. Remove his whole race. Time is ticking.”48
Strangers start recognizing Michael from the viral posts, and warning him that he’s in danger. The threats expand to include his family.
The UN Mission team investigating the attacks on the Rohingya knows Michael. They get involved, reporting the post with the photo of Michael’s passport in it to Facebook four times. Each time, they get the same response: the post had been reviewed and “doesn’t go against one of [Facebook’s] specific Community Standards.”49
By this point, the post has been shared more than 1,000 times, and many others have appeared. Michael’s friends and colleagues in Myanmar and in the US are reporting everything they can find—some posts get deleted, but hundreds more appear, “like a game of whack-a-mole.”50
The UN team escalates and emails an official Facebook email account; no one responds. At this point, the team tells Michael that it’s time to get out of Myanmar—it’s too dangerous to stay.
Several weeks later, the UN Mission is finally able to get Facebook to take down the original post—but only with the help of a contact inside the company. And copies of the post keep circulating on Facebook.
The Mission team write that they encountered “many similar cases where individuals, usually human rights defenders or journalists, become the target of an online hate campaign that incites or threatens violence.”
In their briefing document about the many attempts to get Facebook to stop fueling the violence in Myanmar, Htaike Htaike Aung and Victoire Rio write:
Despite the escalating risks, we did not see much progress over that period, and Facebook was just as unequipped to deal with the escalation of anti-Rohingya rhetoric and violence in August 2017 as they had been in 2016.… Ultimately, it was still down to us, as local partners, to warn them. We simply couldn’t cope with the scale.51
Meta’s active harms: the incentives
In a 2016 interview, Burmese civil-society and digital-literacy activists Htaike Htaike Aung and Phyu Phyu Thi speak about the work their organization, MIDO, was doing to counter hate speech and misinformation. Which was a lot: They’re doing digital literacy and media literacy training, they’ve built more than 60 digital literacy centers throughout Myanmar, they monitor online hate speech, and they run a “Real or Not” fact-checking page for Burmese users.52
Even so, Myanmar’s civil society organizations and under-resourced activists simply can’t keep pace with what’s happening online—not without action on Meta’s part to sharply reduce and deviralize genocidal content at the product-design level.
There were—and are—ways for Meta to change its inner machinery to reduce or eliminate the harms it does. But in 2016, the company actually does something that makes the situation much worse.
In addition to continuing to algorithmically accelerate extremist messages, Meta introduces a new program that takes a wrecking ball to Myanmar’s online media landscape: Instant Articles.
If you’re from North America or Europe, you probably know Instant Articles as one of the ways Meta persuaded media organizations to publish their work directly on Facebook, ostensibly in exchange for fast loading and shared ad revenue.
Instant Articles was kind of a bust for actual media organizations, but in many places, including in Myanmar, it became a way for clickfarms to make a lot of money—ten times the average Burmese salary—by producing and propagating super-sensationalist fake news.
“In a country where Facebook is synonymous with the internet,” the MIT Technology Review’s Karen Hao writes, “the low-grade content overwhelmed other information sources.”53
The result for Myanmar’s millions of Facebook users is an explosive decompression of its online information sphere. In 2015, before Instant Articles expands to Myanmar, 6 out of 10 websites getting the most engagement on Facebook in Myanmar are “legitimate” media organizations. A year after Instant Articles hits the country, legitimate publishers make up only 2 of the top 10 publishers on Facebook. By 2018, the number of legit publishers on the list is zero—all 10 are fake news.54
This is the online landscape in place in 2016 and 2017.
Then there are the algorithms.
“People saw the vilest content the most”
When he speaks to Amnesty International about his experience being targeted on Facebook, Michael (who was in Myanmar 2013–2018) also talks about what Facebook’s News Feed looked like in Myanmar in more general terms:
“The vitriol against the Rohingya was unbelievable online—the amount of it, the violence of it. It was overwhelming. There was just so much. That spilled over into everyday life…
The news feed in general [was significant]—seeing a mountain of hatred and disinformation being levelled [against the Rohingya], as a Burmese person seeing that, I mean, that’s all that was on people’s news feeds in Myanmar at the time. It reinforced the idea that these people were all terrorists not deserving of rights. This mountain of misinformation definitely contributed [to the outbreak of violence].”
And elsewhere in the same interview:
“The fact that the comments with the most reactions got priority in terms of what you saw first was big—if someone posted something hate-filled or inflammatory it would be promoted the most—people saw the vilest content the most. I remember the angry reactions seemed to get the highest engagement. Nobody who was promoting peace or calm was getting seen in the news feed at all.”55
So let’s remember: by 2016, active observers of social media—and Facebook in particular—have a pretty good sense of what makes things go viral. And clearly there are organized groups in Myanmar—MaBaTha’s hardline monks, for one—who are super skilled at getting a lot of eyes on their anti-Rohingya Facebook posts.
But the big, super-frustrating problem with trying to understand Facebook’s effects through accounts like these is that they only describe what can be deduced from the network’s exterior surfaces—what people see, what they report, what happens afterward. I believe these accounts—I especially trust the statements from Burmese people working on the ground—but they’re all coming from outside Facebook’s machinery.
Which is why we’re incredibly lucky to get, just a few years later, an inside view of what was really happening—and what Meta knew about it as it happened.
Next up: Part III: The Inside View.
“Technology Leaders Launch Partnership to Make Internet Access Available to All,” Facebook.com August 20, 2013, archived at Archive.org. The promotional video is “Every one of us,” Internet.org, August 20, 2013.↩︎
“Is Connectivity A Human Right?” Mark Zuckerberg, Facebook.com, August 20, 2013 (the memo is undated, so I’m taking the date from contemporary reports and other launch documents).↩︎
“Facebook And 6 Phone Companies Launch Internet.org To Bring Affordable Access To Everyone,” Josh Constine, TechCrunch, August 20, 2013; “Facebook’s internet.org initiative aims to connect ‘the next 5 billion people’,” Stuart Dredge, The Guardian, August 21, 2013; “Facebook project aims to connect global poor,” AlJazeera America, August 21, 2013.↩︎
“Facebook Leads an Effort to Lower Barriers to Internet Access,” Vindu Goel, The New York Times, August 20, 2013.↩︎
Meta Earnings Presentation, Q2 2023, July 26, 2023 (date from associated press release); “Facebook’s Q2: Monthly Users Up 21% YOY To 1.15B, Dailies Up 27% To 699M, Mobile Monthlies Up 51% To 819M,” TechCrunch, July 24, 2013. (The actual earnings presentation deck from the 2013 call doesn’t seem to be online except as a few screencaps here and there, which is irritating.)↩︎
“Distribution of Worldwide Social Media Users in 2022, by Region,” Statista, 2022.↩︎
An Ugly Truth: Inside Facebook’s Battle for Domination, Sheera Frenkel and Cecilia Kang, HarperCollins, July 13, 2021 (Chapter Nine: “Think Before You Share”).↩︎
“What Happened to Facebook’s Grand Plan to Wire the World?” Jessi Hempel, Wired, May 17, 2018; “Facebook is changing its name to Meta as it focuses on the virtual world,” Elizabeth Dwoskin, The Washington Post, October 28, 2021. (That should be a paywall-free WaPo link, but it doesn’t always work.)↩︎
“Myanmar’s MPT launches Facebook’s Free Basics,” Joseph Waring, Mobile World Live, June 7, 2016.↩︎
“Hatebook: Why Facebook is losing the war on hate speech in Myanmar,” Reuters, August 15, 2018. (You may see bigger numbers elsewhere—in a 2017 New York Times article, Kevin Roose claims that Facebook had 30 million users in Myanmar in 2017. Roose doesn’t cite his sources, but the same range he uses, from two million to more than 30 million, shows up in The Atlantic and CBS News. I don’t think there’s any way this number can be right, but Meta doesn’t disclose this information.)↩︎
“The Facebook-Loving Farmers of Myanmar,” Craig Mod, The Atlantic, January 21, 2016.↩︎
“Facebook is Worse than You Think: Whistleblower Reveals All | Frances Haugen x Rich Roll,” The Rich Roll Podcast, September 7, 2023. This is a little outside my usual sourcing zone—Roll is a vegan athlete…influencer, I gather?—but Haugen does a lot of interviews, and sometimes the least formal ones turn up the most interesting statements. The context for the bit I quote comes in around 8:30 and the quote is at 9:19.↩︎
“Facebook Destroys Everything: Part 1,” Faine Greenwood, August 8, 2023.↩︎
An Ugly Truth: Inside Facebook’s Battle for Domination, Sheera Frenkel and Cecilia Kang, HarperCollins, July 13, 2021.↩︎
“The Facebook-Loving Farmers of Myanmar,” Craig Mod, The Atlantic, January 21, 2016.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018—the report landing page includes summaries, metadata, and infographics. Content warnings apply throughout; this is atrocity material.↩︎
“Ethnic Insurgencies and Peacemaking in Myanmar,” Tin Maung Maung Than, The Newsletter of the International Institute for Asian Studies, No.66, Winter 2013.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018; “They Came and Destroyed Our Village Again: The Plight of Internally Displaced Persons in Karen State,” Human Rights Watch, June 9, 2005.↩︎
“Civil War in Myanmar,” the Center for Preventive Action at the Council on Foreign Relations, publish date not provided; updated April 25, 2023.↩︎
The Burmese Labyrinth: A History of the Rohingya Tragedy, Carlos Sardiña Galache, Verso, 2020. The “140,000” figure is drawn from “One year on: Displacement in Rakhine state, Myanmar,” a briefing note from the UN Human Rights Council published June 7, 2013.↩︎
“Atrocities Prevention Report: Targeting of and Attacks on Members of Religious Groups in the Middle East and Burma,” US Department of State, March 17, 2016.↩︎
“Myanmar: Security Forces Target Rohingya During Vicious Rakhine Scorched-Earth Campaign,” Amnesty International, December 19, 2016.↩︎
“Myanmar: Security Forces Target Rohingya During Vicious Rakhine Scorched-Earth Campaign,” Amnesty International, December 19, 2016; “Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018.↩︎
“21,000 Rohingya Muslims Flee to Bangladesh to Escape Persecution in Myanmar,” Ludovica Iaccino, The International Business Times, December 6, 2016.↩︎
“Rohingya Crisis: Finding out the Truth about Arsa Militants,” Jonathan Head, BBC, October 11, 2017.↩︎
“Myanmar: New evidence reveals Rohingya armed group massacred scores in Rakhine State,” Amnesty International, May 22, 2018.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“Rohingya Crisis—A Summary of Findings from Six Pooled Surveys,” Médecins Sans Frontières, December 9, 2017.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018.↩︎
“Rohingya Crisis—A Summary of Findings from Six Pooled Surveys,” Médecins Sans Frontières, December 9, 2017.↩︎
“Crimes Against Humanity in Myanmar,” Amnesty International, May 15, 2019 (dated PDF version).↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018.↩︎
“UN Human Rights Chief Points to ‘Textbook Example of Ethnic Cleansing’ in Myanmar,” UN News, September 11, 2017.↩︎
“World Court Rejects Myanmar Objections to Genocide Case,” Human Rights Watch, July 22, 2022.↩︎
“Genocide, Crimes Against Humanity and Ethnic Cleansing of Rohingya in Burma,” Antony Blinken, US Department of State, March 21, 2022; “Country Case Studies: Burma,” United States Holocaust Memorial Museum, undated resource.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“U.N. investigators cite Facebook role in Myanmar crisis,” Reuters, March 12, 2018.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018.↩︎
“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar,” United Nations Human Rights Council, September 17, 2018.↩︎
“Revealed: Facebook hate speech exploded in Myanmar during Rohingya crisis,” Michael Safi, The Guardian, April 2, 2018.↩︎
An Ugly Truth: Inside Facebook’s Battle for Domination, Sheera Frenkel and Cecilia Kang, HarperCollins, July 13, 2021; “The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“Rohingya and Facebook,” Htaike Htaike Aung, Victoire Rio, possibly others, August 2022.↩︎
“Hatebook: Why Facebook is losing the war on hate speech in Myanmar,” Reuters, August 15, 2018.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎
“Rohingya and Facebook,” Htaike Htaike Aung, Victoire Rio, possibly others, August 2022.↩︎
“‘If It’s on the Internet It Must Be Right’: an Interview With Myanmar ICT for Development Organisation on the Use of the Internet and Social Media in Myanmar,” Rainer Einzenberger, Advances in Southeast Asian Studies (ASEAS), formerly the Austrian Journal of South-East Asian Studies, December 30, 2016.↩︎
“How Facebook and Google Fund Global Misinformation,” Karen Hao, The MIT Technology Review, November 20, 2021. Karen Hao is so good on all of this, btw. One of the best.↩︎
“Revealed: Facebook hate speech exploded in Myanmar during Rohingya crisis,” Michael Safi, The Guardian, April 2, 2018.↩︎
“The Social Atrocity: Meta and the Right to Remedy for the Rohingya,” Amnesty International, September 29, 2022.↩︎