The Facebook Dilemma | Interview Of David Madden: Tech Entrepreneur

David Madden is the founder of Phandeeyar, an innovation lab that invests in local technology startups and trains entrepreneurs in Myanmar. He is also an entrepreneur-in-residence at Omidyar Network, an impact investment firm. This is the transcript of an interview with Frontline’s James Jacoby conducted on June 19, 2018. It has been edited in parts for clarity and length.

Why Were You In Myanmar?

My wife works in international development. She had previously worked in Myanmar after Cyclone Nargis, the big cyclone that devastated the delta area, the low-lying area in the south of Myanmar, in 2008. When Myanmar began to open up, she received her dream job offer, so we packed up our little family and moved to Myanmar in 2012 for her job.

What Did You Do In Myanmar Starting In 2012? What Were You Up To?

Well, it was a very interesting time. I initially began doing some marketing and consulting work, but I was very interested in the telecoms liberalization process that was just starting when I arrived in 2012. I have a background in technology. I’ve spent most of my life working in technology and startups, and that’s what I was really excited about. I was excited about what was going to happen when Myanmar got connected. It wasn’t connected at that time, but it was going to get connected, and it was pretty clear that if the government made some key decisions correctly, connectivity was going to explode.

Myanmar’s Connectivity Revolution

Set the scene for me a little bit about that explosion that happens and what your view of it was at the time.

Until very recently, Myanmar was almost completely disconnected. At the time that I arrived in Myanmar in 2012, the country had rates of mobile and Internet penetration lower than even North Korea. It was largely disconnected. I remember when I bought my first SIM card, which was in July of 2012, it cost $250, and it wasn’t because I got some black-market SIM; that was just the cost. There was a state telecoms company, it was a monopoly, and only a few million people had these SIM cards.

The country was largely disconnected. It was hard to load your Gmail or watch a YouTube video or any of the things that people in places like the U.S. are used to doing. The transitional government, the quasi-civilian government that was running Myanmar at that time, had recognized that this was a real problem for Myanmar’s development, and it had made the decision to open up its telecommunications market and to issue at least two new licenses for companies to start mobile operators. That process takes place between 2012 and 2014.

What happens in 2014 is that two international telecoms companies, Telenor from Norway and Ooredoo from Qatar, are given mobile operating licenses, and they begin to roll out their services in the second half of 2014. Suddenly, those $250 SIM cards now cost just $1.50. That’s the beginning of Myanmar’s connectivity revolution.

Politically, it’s kind of a fragile time at that point as well. This is a fledgling democratic state. How would you describe, in very basic terms, the political climate as it’s opening up to connectivity?

Well, it’s a very interesting time. There’s really a political opening happening at the same time as this connectivity revolution. When the 2010 elections were conducted in Myanmar, most people, most international observers, regarded them as neither free nor fair. What happened after the ruling party won in a landslide in 2010 is that the generals took off their uniforms, and you had this quasi-civilian administration. And Daw Aung San Suu Kyi was released from house arrest. Lots of other political prisoners were released as well, and this political transition begins.

There’s a by-election that happens in 2012 to fill some vacancies in parliament, and Aung San Suu Kyi’s party agrees to participate and is able to run. She, in fact, is elected to parliament in 2012, so it’s a period of political transition. Myanmar has flagged that they will have an election at the end of 2015; the National League for Democracy, which is Daw Aung San Suu Kyi’s party, is participating in the political process; and there is a sense that the country is opening up.

Facebook In Myanmar

… When is it that you start to see Facebook coming into Myanmar?

So there’s a lot of different dynamics at play obviously in this transition period in Myanmar. You definitely have a political transition going on, and as part of the political transition, you start to see an opening up of the political space in Myanmar. Part of what comes with that opening up is we start to see some underlying tensions in Myanmar society coming to the surface.

One of the dynamics that you see is the rise of a kind of extremist Buddhist nationalist movement. …There is a sense that this is a tough time for Buddhism in Myanmar and in the region. Some Buddhist leaders look around the region, and they see countries like Indonesia and Malaysia, which are predominantly Muslim, and places like neighboring Thailand, where Buddhism doesn’t hold as much sway as it once did.

And so this sort of extremist Buddhist nationalist movement becomes very, very active. That’s one of the things at play. And in this transitional period, you see some moments of real violence – a few notable ones in 2012, when there’s conflict between the ethnic Rakhine in Rakhine State, which is in the western part of Myanmar, and some of the Muslim groups, predominantly the Rohingya Muslim groups in Rakhine State. And you end up with about 140,000 Rohingya displaced in Rakhine State, and they end up predominantly in IDP [internally displaced persons’] camps. There’s also a wave of anti-Muslim violence through the country in 2013, and some of that happens in 2014 as well.

Within this political transition, there are – some of the tensions in Myanmar society start to come to the fore. Now, Facebook obviously has a very small footprint in Myanmar in these early, early days. The country is largely disconnected. Most people don’t have a phone; most people don’t have a SIM card; and most people don’t have access to the Internet. But there are already a few hundred thousand Facebook users in Myanmar around about this time, and hate speech is already a problem.

I remember in 2012, there was an article on an online publication run by the Australian National University called New Mandala, which documented efforts by local civil society groups to try to tackle the hate speech problem on Facebook. So as early as 2012, when there were relatively few Facebook users in Myanmar, the hate speech problem was significant enough that academics or quasi-academics were writing articles about the efforts of civil society to push back on hate speech on Facebook. And that was from 2012.

When you say hate speech, what kind of hate speech are we talking about at that point in time?

We’re really talking about hate speech that predominantly targets the Muslim minority.

And that was on Facebook as early as 2012?

Mm-hmm.

And so tell me then, at that point in time, Facebook is a small player in the game. Bring me through the trajectory of how it goes from there.

Yeah. As I mentioned, there was some violence. There was some anti-Muslim violence during the course of this period, and one of those incidents happens in Mandalay in July of 2014. There were reports, later proved to be false, that some Muslim men had raped a Buddhist woman in Mandalay, which is one of the main cities in Myanmar. This was originally posted on a blog, but it was shared by [Ashin] Wirathu, who is one of the prominent leaders of the Buddhist nationalist movement. It was shared by him on Facebook.

And what happens is that Mandalay erupts. There are riots for several days, at the end of which I think two people were killed, a Muslim person and a Buddhist person, and I think maybe 20 people are injured. But this riot had been sparked by the posting of this rumor of this alleged rape. And so, it was a very, very scary moment, and the government at the time really didn’t know how to respond. They could see how important Facebook was to this situation that had erupted in Mandalay, but they didn’t know how to contact Facebook.

There are some efforts made to get in touch with Facebook, and when they don’t work, the government actually blocks access to Facebook. So as part of the government’s response to the situation on the ground in Myanmar, they implement a curfew, and they also block access to Facebook to coincide with the curfew.

Because it was clear that a rumor had in some ways incited this riot, a rumor on Facebook?

I think it was clear to everybody that what sparked this riot was this posting on Facebook of the alleged rape.

So the government shuts down Facebook. What happens right after that basically? How long do they shut it down? Does Facebook respond? What happens next there?

Well, the government manages to get control of the situation in Mandalay after a few days. I can’t say exactly what happened back at Facebook headquarters, but I’m pretty sure that when Facebook gets turned off or gets blocked, that must sound some kind of alarm back at Menlo Park. So what happened was that the director of policy for the Australia-Pacific region gets on a plane and comes to Myanmar to try to address the situation. That happened later in July 2014.

This is obviously Mia Garlick, right, who comes?

Yes.

Tell me who she is. Who is Mia Garlick?

Mia Garlick is an Australian policy expert. She spent time working for an Australian minister, has a strong background in telecommunications, and she’s based at the Sydney office of Facebook. She has responsibility for the whole region.

She comes to Myanmar at that point, and what did she do? What did she say?

Well, I wasn’t at that meeting on the 20th of July, 2014, or the other meetings that she held in country at the time. But she makes a number of commitments about what Facebook is going to do in Myanmar, and part of that is to try to get better resources for Facebook users in Myanmar. Facebook has something called its Community Standards. Facebook’s Community Standards are the rules that you are required to follow to be part of the Facebook community. Content that breaches those standards can be removed, and if you do it enough, you yourself, the individual user, can be banned or removed.

The Facebook Community Standards include things like you can’t spread hate speech or incite violence. You also can’t sell drugs or prohibited things like weapons. There’s a whole bunch of things in the Facebook Community Standards. Now, at the time, the Community Standards weren’t available in the Myanmar language, so all these early users of Facebook in Myanmar could only have read them if they could read English. So one of the things that Mia commits to do is to get the Community Standards translated into Burmese, into the Myanmar language, and make them available to users.

Did they do that?

They do. They do.

How long did it take them to do that?

It takes them a while. It takes them a while. And the organization that I founded, an organization called Phandeeyar, we’re actually involved in some of that work, not in 2014 but in 2015. It takes a while for Facebook to get to this piece of work. Indeed, by March of 2015, it still hadn’t really happened….

Myanmar’s Connectivity Revolution

Myanmar’s opened up. Connectivity is started because these two telecoms come in. Tell me about just how rapid a change it is there and how Facebook plays into that.

At the time of these riots in Mandalay in July of 2014, the new telecoms companies haven’t actually started. So there are still relatively few mobile users in Myanmar at that time, probably just a few million, and the number of Facebook users is a subset of that, probably a few hundred thousand at that point – unlikely that it was more than a million.

What happens in the second half of 2014 is these two new mobile operators launch, and connectivity explodes. There’s incredible pent-up demand, and the telecoms companies are obliged to roll out rapidly. They’ve promised the government that they will get the country connected quickly, and they do. There’s this incredible flurry of activity to build towers all over the country and ensure that the country can be rapidly connected.

Suddenly SIM cards are only $1.50, and everyone wants one. Some people want two or three. And within the space of less than three years, you end up with a situation where Myanmar has more active SIM cards than people. So the country gets connected, and one of the operators rolls out a 3G network; the other operator follows suit quite closely. And then later there’s 4G networks rolled out.

One thing that’s very interesting about Myanmar’s connectivity revolution is that the country leapfrogs directly to smartphones and data. Most people never even had a landline in Myanmar. By the time the country gets connected, you can buy a smartphone in Myanmar for as little as $23. So even in a very poor country like Myanmar, that’s affordable to most people. And of course, there’s real value in having a smartphone, so most people try to get them. The situation now is that Myanmar has the highest smartphone penetration rate of any developing country in the world. More than 80 percent of phones in Myanmar are smartphones.

This is a very important thing to understand, is that when Myanmar gets connected, it does this leapfrog straight to smartphones and data. And of course, the two key services that people want when they get their smartphone: a messaging app called Viber – which is a little bit like WhatsApp or WeChat in China, but it’s the most popular one in Myanmar – and Facebook.

And so normally, the average person’s experience in the second half of 2014 or 2015 is that they would go to a mobile phone store and buy their phone, get their SIM card, and the mobile phone store owner would help them get connected, and as part of that getting-connected service, they would install certainly those two key apps for you – help get you set up on Facebook, get you set up on Viber.

And so when you look at the number of users of these services, they track very closely to the overall number of active SIM cards in the market, meaning that as the number of phone users increases in Myanmar, so, too, do the users of Facebook and this other service, Viber. Not quite to the same level – it’s not 100 percent; in fact it’s probably more like a quarter – but you see these two graphs tracking very closely. Is that enough detail?

Sure. But putting Viber aside, is Facebook essentially becoming the main – the internet there? Explain the importance of Facebook at that point in time to Myanmar in terms of where you get your information, in terms of what that actually is as a social utility there.

Most people in Myanmar haven’t been able to access the internet before, and Facebook basically becomes the default internet. It’s incredibly easy to use; it’s very intuitive; all your friends and family are on it, so it becomes the main way in which people get information and connect to the internet. Every organization from the local tea shop where people go for their morning cup of tea or their afternoon cup of tea through to all the local papers, this becomes the key tool that they use to connect with the community.

And so for the Myanmar user, Facebook is almost everything. It’s really the key information service. If you scroll through the average Myanmar user’s Facebook feed, you’ll see everything from all the major media posts through to what the local community organizations are doing. It is literally a news feed.

It’s astonishing. From your perspective, was Facebook making a big push at that time in Myanmar, seeing this as a tremendous opportunity for market growth? What were you detecting of what Facebook’s efforts were at that point in time?

I think Facebook’s work in the country was very low-key at this point. Most of this connectivity and this growth in its users was happening organically. Sure, they had someone who was responsible for the Myanmar market, and that person worked with the local telecoms operators to ensure that Facebook had lots of prominence in the offering, but this is a tremendous example of network effects.

Here was this service that was very easy to use, largely everyone who was already connected was already on it, and everybody wanted to be there. So much of this growth is happening organically. As I said, it’s just part of the process. You would go; you’d buy a phone; you’d get a SIM card; you’d get set up on Facebook.

Did Facebook have any employees or an office in Myanmar at that time?

… No, Facebook doesn’t have an office in Myanmar. It has a regional headquarters in Singapore. Mia is based in Sydney and periodically comes to Myanmar, sometimes with some of her teammates, sometimes by herself.

And why and when was it that you started to get concerned about the way Facebook was being used in Myanmar? What was the inflection point for you?

I had been in Myanmar since the middle of 2012, and I was deeply interested in this connectivity revolution, the way the country was opening up and the way in which technology was going to be used. So I had been following this very, very closely, and I had been reading, for example, those posts on New Mandala about hate speech. I had participated as a trainer at a tech camp that was held at the beginning of 2014 with lots of other civil society groups, in which people were already expressing real concern about hate speech as early as January 2014, so I was well aware of this situation and frankly just wasn’t surprised by what happened in Mandalay in July of 2014.

Indeed, at the time I thought that this was an example of how Facebook in Myanmar could become what radios had been in Rwanda. If this problem wasn’t addressed quickly, then you could easily see how the sort of thing that happened in Mandalay, where things posted on Facebook led to real-world violence, could happen again and again in Myanmar.

Hate Speech On Facebook

After the Mandalay incident, do you see more? Is there an accumulation of proof that you’re seeing that shows you that? Bring me through the evolution of where things go from there.

…I’m very interested in this connectivity revolution that’s happening in Myanmar, and I’m really interested in how technology could be used to help accelerate change and development in the country. And so in 2014, even before any of these new towers go up, I organized the country’s first-ever hackathon. In March of 2014, I bring the local community of technologists together with a group of civil society organizations to have a 48-hour hackathon.

And that was partly an experiment by me in seeing what the technology community in Myanmar could do together. It was really exciting to see the energy and the enthusiasm among the tech community and among civil society organizations for how technology could advance their work. Out of these hackathons, I was able to raise some money to create a physical space, and at the end of 2014, I opened this space called Phandeeyar.

Phandeeyar means “creation place” in the Myanmar language. Phandeeyar is a community technology hub. Later we had a startup accelerator and all kinds of other things, but it starts as a community technology hub. It’s a physical space where the community can come together and figure out how to use these tools to advance Myanmar.

One of the first big events that we have – in fact, the first big event that we have is in January of 2015. It’s a big three-day event, and the whole focus of the event is how technology could be used to address this hate speech problem. At this event, more than 100 civil society leaders come together with about two dozen technologists, some of them local, some of them international, and they spend these three days trying to figure out what could be done about this problem.

Moving through 2015, our organization, Phandeeyar, got incredible visibility on this problem of hate speech. I already had some sense of it, but working with these local organizations who were on a day-to-day basis confronted with the reality of hate speech and incitements to violence really gave us a lens into the depth of the problem.

What was it that you were seeing specifically? Hate speech is such a tough term because it’s not that specific, but when you say – what were the sorts of things that you had a lens into?

To be clear, there were a whole range of problems associated with Facebook. It was not just hate speech. So I mean really, you had an entire population of users who really had no experience of using the internet and internet services, so you can imagine the range of challenges that this posed. At the time there were all kinds of Facebook policies that proved challenging. An example of that would be Facebook’s real-name policy. The real-name policy sounds quite sensible, but when you take it to a place like Myanmar, and you start thinking about vulnerable groups – LGBTQ groups, for example; activists and others – you can be put in a very, very difficult situation as you’re outed through having to use your real name.

So there was a whole range of problems. But the most serious and significant was this hate speech problem, and what we were seeing was just this incredible volume of animosity and targeting predominantly of the Muslim community in Myanmar. There’s well-established literature on how these things escalate, but you would see the use of memes, of images, things that were degrading and dehumanizing and meant to marginalize certain parts of Myanmar’s people.

And incitements to violence as well?

Sure. You’d see that as well.

Tell me about those.

I think one thing that’s important to understand is that there was this sort of extremist Buddhist nationalist movement that saw itself as mounting a defense of Buddhism, so anything that was seen as a challenge to this majority-Buddhist country needed to be defended against. And so with people who were seen as advancing the interests of Muslim communities in ways that were a threat, you’d see posts targeting them, urging people to address the problem, to tackle these people. Those were the most concerning things that we saw, this explicit targeting of individuals and in some cases organizations.

Can you explain what you mean by targeting? When it comes to these individuals, were they naming them, were they telling people to go to their house?

Yeah. An example of this, and it comes a little bit later, but an example of this occurs in 2016, where a local journalist is targeted. So this local journalist, who happens to be a Muslim, is accused of being part of a small terrorist organization in Rakhine State, of advancing their interest, and his face is splashed on Facebook posts, and people are urged to go and kill him. And so that’s an example of a very explicit targeting of an individual.

Posts like that would go viral, rapidly go viral. Within a matter of hours, it would be shared hundreds of times, quickly thousands of times. That post was shared more than 5,000 times before it was actually taken down.

In terms of virality and hate speech, are you also talking about rumors and the ability for rumors to spread?

I think it’s important to understand that hate speech is just part of what’s going on on the platform. There’s lots of other bad things that are happening on the platform that maybe don’t qualify as hate speech. You certainly in Myanmar have a proliferation of mis- and disinformation. Rumors have always existed in Myanmar society, and now they travel at the speed of Facebook. So you would have not just hate speech but misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content, content that breached Facebook’s Community Standards.

It wasn’t just hate speech. Hate speech was where this immediate possibility of real-world harm was most obvious, but there was this broader context where, in many ways, a lot of what happened on Myanmar Facebook was terrible. It was just a sort of a cesspool.

Warning Facebook About Dangers

It’s interesting, because Facebook has taken the stance that with this tool, they’re almost bestowing a great thing on the world, a great tool for democratization. Was there any evidence at the time that Facebook saw themselves as a force for good in Myanmar, that their tool was a great gift to make the world more open and connected?

I just don’t think Facebook was that engaged with what was happening in Myanmar or in many of these other emerging markets. Here was this platform. It was a platform, so it could be rolled out anywhere, and it was growing incredibly quickly. There were some problems. They were doing a few things, but they never really wrapped their head around the scale and the seriousness of the problem.

When was Facebook made aware of these problems?

I can’t say when Facebook first became aware of these problems. I know that as early as 2013, people were directly communicating to Facebook that this was a serious problem in Myanmar. At the end of 2013, an international journalist who had won a Knight fellowship was in Silicon Valley, and she had spent a lot of time in Myanmar, had done a lot of reporting in Myanmar and was intimate with the hate speech problem. Her name is Aela Callan, and she’d done a lot of reporting for Al Jazeera at the time. And so Aela went and saw Elliot Schrage, who I believe at the time was the head of Facebook’s communications team, and she raised explicitly this hate speech problem in Myanmar and urged them to take it seriously. My understanding of that meeting – I wasn’t there – but my understanding is that Elliot wasn’t that interested in the hate speech problem. This was seen as a tremendous market opportunity, certainly in terms of user growth. It was unlikely that Facebook was going to make much money out of Myanmar for some time, but certainly in terms of user growth it was a great opportunity.

Aela didn’t really have much luck sounding the alarm in 2013. She actually goes back the following year. She goes back in 2014 and again tries to impart to Facebook the seriousness of this problem, and again, we don’t really see anything substantial coming out of that. Obviously, after these riots in Mandalay in July of 2014, people at Facebook knew, they clearly knew: here was an example of where content on Facebook had led to people being killed.

So Facebook absolutely knew by 2014, and Mia Garlick flies into the country in July of 2014. They do a few things after that, including the translation of the Community Standards into the Myanmar language. They work with a collection of local civil society groups to develop a digital sticker campaign called Panzagar, which means “flower speech.” This flower speech campaign, this idea that people would only speak nicely or say good things on Facebook, had been a grassroots community effort since at least 2014 by a broad collection of civil society organizations.

The sticker campaign was an attempt to take that online to this Facebook platform that was growing so rapidly. So there were a few things that Facebook had done, but it wasn’t clear even at that time that they really appreciated just the risks in Myanmar.

Bring me to your meeting and your presentation to them, and what was the impetus for doing that, and why was it so urgent for you?

I was concerned. After working with local organizations through my organization, Phandeeyar, in the course of 2015, I was really concerned that the seriousness of this was not understood. And so I made a presentation at Facebook headquarters in May of 2015, and the purpose of that presentation largely was to try to help people understand what was going on, help Facebook decision makers understand what was going on in Myanmar at the time, and just how dangerous the situation was.

…Who was your audience for this presentation?

Mia organized the presentation, pulled it all together. I can’t tell you exactly who was there, actually, because I made the presentation in one of the theaters at Facebook headquarters. There were a bunch of people physically in the room, and then there were a bunch of people who joined via video. Some of those people I believe were elsewhere on campus. Facebook headquarters is pretty large. It’s a sprawling, sprawling campus. And some, I think, were joining via video from London. I think that was the Facebook policy and security team; part of them were based in London.

There was a range of different people participating in this meeting, not all of whom were in the room. I couldn’t tell you who everyone was who participated in that meeting, but it was a good collection of people who were interested in this question of Myanmar’s connectivity revolution, and particularly the problem of hate speech and how it might be addressed.

Would you consider it a warning to them?

Yes. I was pretty explicit about the state of the problem.

I really tried to help people understand just how significant the risk was and how bad the consequences could be. At that presentation in 2015, I drew the analogy with what had happened in Rwanda. There had been genocide in Rwanda, and radios had played a really key role in the execution of this genocide in Rwanda.

And my concern was that Facebook would play a similar role in Myanmar, meaning it would be the platform through which hate speech was spread and incitements to violence were made. And so I said very clearly to them that Facebook runs the risk of being in Myanmar what radios were in Rwanda. I said that very clearly; I said it very explicitly. It wasn’t the last time that I said it. I said it on many occasions after that. But I think that was the first time that I had said it to them.

That essentially Facebook could facilitate a genocide?

I didn’t use those words, but clearly the implication was that this platform could be used to foment hate and to incite violence.

Facebook’s Reaction To Warnings

What was the reaction to that at Facebook?

Well, it was a fairly confronting thing to say, obviously. I got an email from Mia shortly after that meeting to say that what had been discussed at that meeting had been shared internally and discussed and debated and apparently taken very seriously. I can’t say exactly what the reaction internally was.

What would you have liked to have seen as a result of that meeting?

Facebook is a company that’s driven by its product. And so if you want to tackle hate speech or misinformation, disinformation, really, any of the problems that exist on the Facebook platform, you really need a product solution. And so it’s great to have policy people involved and the safety folks, but at the end of the day, what Facebook needed to be able to do was rapidly identify content that breached its Community Standards and get that down. And that’s what wasn’t happening in 2015, and it still wasn’t happening several years later.

That remains – to this day, that remains the central problem in Myanmar, is that there is a proliferation of content that clearly breaches its Community Standards, including hate speech and incitement to violence, and that content does not come down fast enough, if it comes down at all. The systems that they have to try to identify that content and to get it down quickly are woefully inadequate.

How do you know that?

…We’ve seen content that breaches Facebook’s own Community Standards persist on the platform even after it’s been reported many times.

Going back in time to your meeting at Facebook, at that time, do you have any sense as to how many content reviewers Facebook had that were Burmese-speaking?

One of the challenges that we had with Facebook, that we had throughout our time dealing with Facebook, has been [that] a lot of the process, a lot of the way they operate, is a black box, and we just haven’t been able to get good information on it. A key question for us was obviously: Who is reviewing this content? Who do you have on your team that is making the judgments about whether this content breaches your own Community Standards?

Even to this day, we still don’t really know how many Myanmar speakers that Facebook has on its team, so this has been a real challenge. This content review process, the content management system that Facebook uses to try to determine whether content should stay or be removed, it’s a black box.

… Did you ask at the time what their process was, how many people spoke the Myanmar language, where those people were, how they determine this?

Certainly that was part of the questions that we were asking of Facebook in this period, 2015 and 2016: How does the process work? Who’s involved in reviewing this? Who’s making these decisions? I don’t think we ever really got clarity on that. We never really got clarity on that.

… Did you get a sense that they were understaffed, or that they were under-resourced, or that they just hadn’t invested in this, or that it was too complicated? What was your sense of why this was such a problem of a system for them?

Our clear sense at the time was that Facebook didn’t have the resources. It hadn’t invested in the resources necessary to be able to make good judgments about what was problematic in Myanmar on Facebook. So when content was reported, perhaps the people reviewing it didn’t read the Myanmar language, or perhaps they read the Myanmar language but they’d been out of the country for so long that they didn’t really understand the context in which it would be interpreted, but for whatever reason, content that was clearly deeply problematic was persisting on the platform.

And it was not until content got sent for further review, after those of us who were involved in sounding the alarm about certain content flagged it, that people realized how problematic it was.

… Going back to your meeting for a second, here you’d been in Myanmar; you’ve been on the ground. You’re then going into the Silicon Valley bubble, in a way. Was there a weird cultural thing almost? I know you have a tech background, but coming from a place where you’re on the ground where there’s real-world consequences to some things that are playing out on Facebook, what was it like going into Silicon Valley, into Menlo Park and to the belly of Facebook at that point in time?

Well, Facebook headquarters at One Hacker Way is very different from downtown Yangon, and the sort of situation that existed in Myanmar at the time couldn’t be further from the valley. It is literally a whole other world. And that’s why in our dealings with Facebook we spent so much time trying to help them understand some of the broader political and social dynamics at play in Myanmar. It didn’t feel like people appreciated how fragile things were, and still are, in Myanmar. There are real cleavages and real tensions at play, and we had seen it erupt into violence on numerous occasions.

…What would the responsible thing for Facebook to have done when you left that building?

… I think the key thing that we were hoping to get out of that meeting in May of 2015 is a response from Facebook that dealt more with the specific circumstances in Myanmar. So there were all these kinds of different challenges that the community was experiencing, and it felt like it needed more attention; it needed more resources. Some things were already happening. There was this digital sticker campaign, the Panzagar sticker campaign. We were working with Facebook to develop a campaign to educate people about the Community Standards. There were some processes in place for people in the community to flag problematic content. So there were a few things happening, but it was fairly piecemeal, and it wasn’t really product-related.

Was it adequate?

No, it was inadequate.

What was the follow-up after that meeting? Did you present any prescriptions as to what, on the product side of things, they could possibly do at that point in time? I know you did in 2017, but was there anything you did at that point?

My recollection is that at the May 2015 meeting, we did flag some product challenges that were specific to Myanmar: the fact that people often had an outdated version of the app, so they didn’t necessarily have the latest reporting tools; that the reporting tools weren’t easy to understand; that the reporting tools, the app itself, the platform weren’t in Burmese. This is challenging. And some of those challenges became even more acute over time.

We have to understand, we’re rapidly getting millions of Facebook users, and these are new users, and they don’t necessarily have a good understanding either of the general rules of the platform, the Community Standards, or of the ways of responding when they see content that’s problematic.

… It’s important to understand that much of Facebook’s strategy to date for dealing with problematic content has been to rely on users. That’s been the key strategy. Mark Zuckerberg now talks extensively about using AI so that this stuff happens automatically. But to date, and frankly even today, it’s primarily put back on the users to flag, to signal problematic content.

And so if you have an old version of the app that doesn’t have those reporting tools, or if the reporting tools are not in your language, or you’re just a new user and following the reporting flow is difficult, then you’re not going to get that much reporting. And if that’s the way people back at headquarters are understanding the magnitude of the problem, then they’re going to be missing a lot of it. So this was definitely flagged at the time.

Problems In Myanmar Grow

…Coming out of May of 2015, what was the trajectory of things that rapidly escalated there?

After May of 2015, this connectivity revolution continues to happen, and within a period of about 18 months, certainly less than two years, you have most of the country connected – not all of the country, but you have most of the country connected, and you go from having a few million Facebook users to tens of millions of Facebook users. That all happens in less than two years.

The 2015 election happens largely peacefully, and it is widely regarded as a success. There were real concerns going into the 2015 election that that wouldn’t be the case. Lots of people were worried that there might be violence, that Facebook could be used as a platform to incite violence in the lead-up to 2015. That didn’t happen, fortunately. We do still continue to see this extremist Buddhist nationalist movement flexing its muscles and continuing to utilize the Facebook platform.

In what way?

Well, in the previous years, there had been all kinds of different media utilized. Speeches by Wirathu would be distributed on DVDs around the communities. There would be pamphlets; there would be all kinds of traditional media. And [now] you see increasing use of Facebook as the number of users grows.

As a tool for this movement?

Yes.

In order to do what?

I think the movement, this extremist Buddhist nationalist movement, is really trying to assert the primacy of Buddhism in Myanmar.

…You understand this – you’re a technologist – and Facebook has an algorithm, right? Its News Feed is an algorithm. I’m just curious: if you step back from it and analyze it, what was the role that that News Feed algorithm was playing in Myanmar?

It’s hard to assess exactly what the role of the News Feed algorithm in all of this was. I think what we know is that for a long time, certainly until very recently, the algorithm has prioritized popular content, content that people are really engaging with. So a consequence of this is that if you’re putting out hate speech or misinformation or disinformation and you are able to get high levels of engagement with that, then it will have this mutually reinforcing effect.

For people who put out bad content, if they can get that going in the community, then it will continue to be more visible in people’s News Feed, and that’s absolutely what we see happen in Myanmar. Exactly how the algorithm plays into that, I don’t know. I’m not intimately familiar with the Facebook algorithm obviously, but what we do know is that priority is given to content that is proving to be popular and that lots of people are engaging with.

And so if you’re very effective at crafting content and engaging a user base to engage with that content, if you’re very effective at getting people to engage with that content, then that content is going to be widely seen, increasingly widely seen. This is definitely what we see, a dynamic that we see at play on Facebook in Myanmar, where content that is deeply problematic will somehow miraculously go viral in the space of hours.

In early 2017, Daw Aung San Suu Kyi’s lawyer, U Ko Ni, is assassinated. Returning from a trip overseas, he’s shot dead outside Yangon Airport, and very quickly, we see a post on Facebook celebrating this assassin, claiming him as a hero for killing this Muslim lawyer. U Ko Ni was Muslim.

That post goes viral incredibly quickly. I have a background in digital marketing, and there’s a difference between when things go viral organically and when they’re carefully crafted to go viral. This is a key problem in Myanmar: content that breaches Facebook’s own Community Standards is allowed to persist on the platform, and it goes viral very quickly, and that’s where the real-world harm comes.

Is your suspicion that this content was designed to misinform and that there’s actually campaigns being waged gaming the Facebook algorithm in order to incite violence?

I think when you look at what happens on Facebook in Myanmar, it’s clear there are some pretty sophisticated users.

 – that might look to do harm to others and put out content that incites that?

It certainly looks like that’s the case.

Have you made Facebook aware of that?

I think we made it very clear to Facebook, especially in a meeting at Facebook headquarters in January of 2017, that there was deliberate, deliberate misuse of the platform; that there were systematic and deliberate attempts to spread hate speech, misinformation, disinformation, propaganda. We said that very, very clearly.

So bring me to that meeting and what the impetus was there, because it was obviously much more specific. What brought you there in January 2017?

Just to be clear, I wasn’t physically there.

Oh, you weren’t.

A colleague was physically there.

OK, got it.

I was on the –

On the blower [telephone].

On the blower, yeah.

Facebook’s Reaction To Warnings

… How did you end up having, getting a meeting with Facebook in January 2017?

I think it’s important to understand that we engage with Facebook on a variety of different levels and in a variety of different ways. Part of what we were doing was developing digital literacy materials that would educate people about Facebook’s Community Standards. They were supportive of the work that local civil society groups were doing to try to combat hate speech on Facebook, to promote a more positive narrative. They were supportive of that.

They were supportive of efforts that were happening to develop the start-up ecosystem, so there was engagement with Facebook on multiple different levels across a whole range of different things, not just the hate speech, just to be clear about that. Representatives of Facebook, usually Mia, would come periodically to Myanmar, and we would meet. We had communication channels around problematic content, not just for Phandeeyar but for other local organizations as well, that would enable the most problematic content or urgent situations to be flagged.

There was regular engagement. There was regular engagement with Facebook throughout the rest of 2015 and during the course of 2016. … At the end of 2016, we wanted to have a more comprehensive meeting with Facebook about the situation in Myanmar. That was organized for January 2017. One of my colleagues went to Facebook headquarters; myself and some other team members were on the phone from Myanmar. The objective of this meeting was really to be crystal clear about just how bad the problem was.

There were a number of things that were going on at the time. Firstly, the situation in Myanmar was really quite problematic in a number of different ways. There was active conflict in Kachin [State], in Shan State, in these two other states in Myanmar. The situation in Rakhine State – this western part of Myanmar where the Rohingya are – that situation had gotten much, much worse during the course of 2016. Essentially you had more than 100,000 people in these IDP camps, and that was a ticking time bomb. In late 2016 some parts of the Rohingya community are radicalized and actually take up arms and attack a Myanmar security post, and this provokes a very strong reaction by the Myanmar military.

And Myanmar Facebook quickly rallies behind the military, even though there are some awful things that happen in the process of what they refer to as a “clearance operation.” The situation in Myanmar was really concerning. We were genuinely worried about where things might go from there, and the situation on Facebook was even worse because what was really apparent by now was just how rife the hate speech problem was, and importantly, just how inadequate Facebook’s response was. So we were very clear at that meeting that their systems just didn’t work. The processes that they had in place to try to identify and pull down problematic content, they just weren’t working. And we were deeply concerned about where this was going to go, and the possibility that something even worse was going to happen imminently.

We were very plain, and we were very blunt in that meeting in January 2017. We went through all the problems as we saw them, and the essence of that was that their content management system was not effective. It was not good at identifying content that breached their own Community Standards. They made errors; content wouldn’t come down, or it would come down too slowly, and that was a huge issue.

The related issue was that the tools that they had in place to get more information about this problematic content, they didn’t work. So some of these problems that we flagged earlier in 2015, these really came to the fore. We see that users don’t have access to the reporting tools, or they don’t know how to use the reporting tools, so the systems that Facebook has in place to try to identify this content, they’re just not working.

And the backstop, this collection of local civil society organizations, who, when they see bad stuff are flagging it, this is completely inadequate. It’s not nearly enough to address the problem. We were frustrated because these were the trends and the dynamics that we had been seeing play out over the preceding 18 months, and it felt like we were at a key moment where we needed to lay this clearly at their feet and say: “More is needed here. More is needed.”

And at that point, had the violence become real?

Well, certainly there had been this incident in Rakhine State where this radicalized Rohingya group had attacked a security post of the Myanmar security apparatus, and they had responded with force. So there was definitely that – that was definitely happening on the ground. In just a couple of weeks from that meeting, U Ko Ni would be assassinated.

But there was a sense that things could easily get out of hand. It didn’t require much imagination at that point, so there was a fierce urgency for us in that meeting.

… This need to put in place product solutions that were going to try to address this central problem, which was that hate speech and other dangerous content that breaches Community Standards was persisting on the platform, and the systems in place to identify it and pull it down were inadequate.

We were quite specific about what we thought was needed, what needed to be done. Part of it was this black box, this process of identifying and pulling down content; we needed more clarity about that. We had really very limited idea about how that actually worked. Clearly they needed a better understanding of the local context. That was absolutely critical, and we made that point very strongly, that there needed to be a much deeper understanding of the context in which Facebook was operating in Myanmar.

We also flagged that significant work was needed in order to make this content identification process actually work. They needed to do a forced update of the apps so people had the latest version of the app. They needed to invest product time in making sure that those reporting tools actually worked in Myanmar, and they needed better emergency processes. You couldn’t be relying on a handful of local civil society organizations to kind of flag the worst stuff. That was just completely inadequate.

So we were very prescriptive and very clear at that meeting. At that meeting I reiterated this point that there was a real risk that Facebook would be in Myanmar what the radios had been in Rwanda, and I was really clear about that.

And basically bear some responsibility for violence. Is that what you’re saying?

Yeah, absolutely.

What was Facebook’s reaction at that meeting? I mean, this is your second meeting at that point.

We’ve had many meetings with Facebook. This was the second one where we had someone actually at headquarters. It was a sobering meeting. I think the main response from Facebook was, “We’ll need to go away and dig into this, and come back with something substantive,” which was exactly what we were trying to get at that point, which was a really substantive response.

The thing was, it never came.

It never came, meaning what?

It never came, meaning that the significant effort to make the system that identifies and pulls down content that breaches Facebook’s Community Standard[s], that system did not get improved.

How do you know that?

I mean, we can look at the evidence on the ground.

And what does the evidence suggest?

Well …

What happened?

The situation in Rakhine State in late 2017 gets much worse. We have hundreds of thousands of Rohingya displaced through a very aggressive security operation by the Myanmar military. The U.N., the United Nations conducted an assessment in early 2018, and the U.N. concluded that Facebook had played a determining role in what the U.S. government and others have described as a textbook case of ethnic cleansing.

… What has Facebook said publicly about how they’ve responded to the crisis in Myanmar?

I mean, I think we’d have to go back and pore over a range of public statements that Facebook has made. I think they’ve acknowledged that the situation is challenging. They’ve said that they’ve tried to do a number of things over the years, and they’re trying to do some more things now.

But –

Have they admitted some culpability for what’s taken place?

No, I don’t think they’ve done that. I don’t think they’ve really acknowledged the extent of the role that the platform has played in the situation in Myanmar over the years.

Do you think that they’ve acknowledged how much they knew about the role that the platform could have played in violence in Myanmar?

I don’t recall seeing anything publicly from Facebook that acknowledged that level of responsibility.

It seems to some degree like Facebook feels as though this was a rather new problem, that this had taken them a little bit by surprise, and that they’re now looking into it. Would that be true?

I think there’s a number of things that have happened in the last six to nine months that are pretty problematic. There’s definitely been the suggestion from some people speaking on behalf of Facebook that this kind of caught them unawares, by surprise. I don’t think Mia has said that. I’m not sure that any of the people we dealt with have said that; I’m not sure they would say that, because it’s just clearly not true. I mean, the evidence is there that organizations like Phandeeyar, like Meedo and others, made it very, very clear over a period of several years what a problem this was.

… The second thing is that in this interview that Ezra Klein from Vox Media conducted with Mark Zuckerberg earlier in 2018, we saw this idea that the systems were working. There was a very scary incident in late 2017 where incitements to violence were being passed around on Facebook’s Messenger platform, and in his interview with Ezra Klein, Mark Zuckerberg says that their systems detected this and they were able to take action. This is just clearly not true. It was another example of something that the local civil society groups in Myanmar had identified and flagged for Facebook to deal with.

In fact, it was a textbook case of exactly the problem that we had been talking about for years, which was that there was an overreliance on ad hoc arrangements with third parties to identify problems, and that Facebook’s own systems, its product, were not actually capable of identifying these problems. And the fact that Mark Zuckerberg claimed that they were, that is deeply, deeply concerning, because it suggests that at the top of the organization, there’s a lack of understanding about how well the company is actually able to deal with these challenges.

If the CEO thinks that the systems in place enable them to catch these things before they become a problem, that’s a real issue, because it’s just not true.

Was the content that the civil society groups flagged at that point, was that taken down immediately?

My recollection is that that took about three or four days. It was circulating for several days before it came down. I think it came down reasonably quickly after local organizations flagged it, but this is the thing, of course, is that if you’re just relying on some local organizations to spot this stuff and sound the alarm, then you can have material, incitements to violence, circulating for days or even longer, which was what was the case with this situation on Facebook Messenger in September of 2017.

Facebook’s Responsibility

What is Facebook’s responsibility here?

Facebook has created this platform that in many countries, not just Myanmar, has become the dominant information platform, and that comes with a lot of responsibility. It’s not good enough to turn around and say, “Well, who made it my job to figure out what’s hate speech?” This platform has been created, and it has an outsize influence in lots of countries. Many of those countries are wrestling with some pretty big challenges, tensions between groups within countries, and we have seen this explode into what Mark Zuckerberg would call real-world harm, what others would just call violence or death, in many other markets.

We’re seeing it right now in India through WhatsApp. We’ve seen examples of this in places like Sri Lanka.

So the idea that this maybe is just a problem that pertains to Myanmar is deeply, deeply misguided. And if anything the Myanmar example should be sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.

But you’ve been warning of that for years.

Not just me. Others as well.

To no effect?

To limited effect.

So in terms of responsibility, are you saying that Facebook bears some responsibility for what’s happened in Myanmar?

It’s hard to escape that conclusion. When you look at the facts on the ground, and you look at what was and wasn’t done, it’s hard to escape that conclusion.

… Do you actually think that Facebook can live up to this responsibility? Do you actually think that’s possible? Or is this basically an intractable problem?

I think that’s a very important question. It’s going to require a much, much bigger effort than there has been to date.

I mean, there’s trillions of things shared daily on Facebook. It’s 2.2 billion people on this thing. To expect this company to be able to monitor for all of these things – I mean, you’re a technologist. Do you actually think that that’s possible to do in any sort of effective way?

I think it’s absolutely right to expect them to do this. They built this, and they’re profiting from it, profiting enormously. Extremely rich company. They have some of the best engineering talent in the world, and it’s absolutely right to expect them to address this issue. There’s absolutely no reason why we should wave our hands and say, “Oh, this is really hard; we’ll take a pass on that.” I think it’s absolutely right to expect them to address this.

Going back to 2015, were you expecting them after that meeting to at least invest more in trying to solve this problem technologically or to create better systems?

Yes. I think we need to see much more investment in dealing with the challenges of operating in markets that look different from the U.S. The product was built for the U.S.; it’s where most of the engineers are; it’s how most of them sort of think about the product. The reality is that there are many places around the world that look very, very different from the U.S.

But was it your expectation that they would actually start to put money and talent into solving these problems at that point in time?

Yes, it seemed like that would be the responsible thing to do. Here is this very real problem. We’ve already seen through the Mandalay riots in 2014 that people can get killed because of this stuff. It would be the wise, prudent thing to do to invest in getting ahead of this problem. Here are these millions of people that are going to come online in the coming months and years, and this problem is not magically going to go away. The prudent thing to do at that point would be to get out ahead of the problem.

Because the system at that point was not working?

Correct. But that’s not been the Facebook way. Until very recently, Facebook has operated by this “Move fast and break things” mantra. If it’s just a bit of code that breaks, maybe that doesn’t matter so much. But when you’re seeing major cleavages in countries, that matters.