Naomi Gleit is Facebook’s vice president of social good and has been at the company since 2005. This is the transcript of an interview with Frontline’s James Jacoby conducted on September 5, 2018. It has been edited in parts for clarity and length.
So if you can think back and situate yourself back in November 2016 and what that moment was like here internally and what was sort of dawning on you about Facebook, about the election, and what was the vibe inside of here at that point in time, tell me about it.
I’ve been at Facebook, just as some background, for almost 13 years. I started when I was 21 in 2005, and there have been a lot of ups and downs and intense periods of my career here, but definitely the past few years have been the most intense.
And what’s been intense about it? Why is that?
I joined Facebook because I really believed in the mission to make the world more open and connected, and I’ve worked on many different projects, whether it’s growth, social good, more recently safety and security, so I’ve really been focused on amplifying all of the good things that happen on Facebook. I think what’s changed is we’ve definitely become more aware of how Facebook can be abused by bad actors, by fake accounts, and what our responsibility is in making sure that we minimize those experiences on the site.
What was the Growth team about? What did you do at Growth?
When I joined Facebook, it was just a site for college students. It’s even hard to remember. On the Growth team, the story of Growth has really been about making Facebook available to people that wanted it but couldn’t have access to it, whether it was because they were in high school – so one of my first projects was expanding Facebook to high school students – or whether it was because they weren’t English speakers.
People had friends and family that spoke other languages that they wanted to use Facebook with, so I worked on translating Facebook into over 100 languages.
Sometimes people had access to a phone but not a computer, so one of the Growth team projects was building a really rich Facebook experience for mobile phones, say, on low networks or low connectivity. More recently, people might have access to a phone but can’t afford a data plan, and that’s the goal of Internet.org. It really has been about removing barriers that are preventing people from accessing Facebook and allowing them to use it.
What were the metrics that you used in the Growth team about how you measured your success and how your team was incentivized?
One of the biggest things that I worked on as a leader on the Growth team has really been helping people find their friends on Facebook. When we were college students, it was really easy to connect to – I went to Stanford; all of my friends [were] at Stanford campus. But when I joined, there were 1 million users, and now there’s over 2 billion people using Facebook every month. We’ve had to build a new user experience, friend-finding tools.
Upload your profile picture, find people with shared interests or find people that went to your school. A lot of the work that we’ve been doing is really helping people find their friends on Facebook, because without friends – you know, the site’s all about your friends and family. If you don’t have any friends, I don’t think it can really provide its value.
Some of the problems that have reared their head with Facebook over the past couple of years seem to have been caused in some ways by this exponential growth, by growing too fast too quickly. Do you think that growth was in some ways responsible for where the company has found itself now?
So I think Mark – and Mark has said this, that we have been slow to really understand the ways in which Facebook might be used for bad things.
We’ve been really focused on the good things. I lead a team called the Social Good team, and their entire focus is finding great things that happen on Facebook all the time and building tools to make that even easier.
A good example is ALS Ice Bucket Challenge. We built the Donate button from that. In terms of have we been slow? Yes. But we’ve done so much to make up for that.
Let me ask you about the slow, though, because one of the things in our reporting and one of the things that I think a lot of outside critics would say is that there was plenty that was known about the potential downsides of social media and Facebook – you know, potential for disinformation, potential for filter bubbles or preference bubbles, potential for all sorts of bad actors and abuse.
Were these things that you just weren’t paying attention to, or were these things that were conscious choices to say, “All right, we’re going to abdicate responsibility from those things and just keep growing”?
I definitely think we’ve been paying attention to the things that we know. One of the biggest challenges here is that this is really an evolving set of threats and risks. One good example of this is we were really focused on cyberattacks for the November 2016 election.
We weren’t ahead of the curve on foreign interference in elections, and I think that’s true of us and a lot of other organizations and companies in the industry, so it’s been a process of learning. These aren’t things that we can get ahead of, but we’re really trying to.
Do you think that if you look back in terms of your years growing this business that there were moments that you knew about problems and you really should have taken them more seriously at the time?
I think we’ve made mistakes, but I think we’ve also really learned from our mistakes, and I think that one way that people have characterized Mark is as a “learn-it-all.” He’s not a know-it-all. We didn’t know everything from the beginning, but he is a learn-it-all, and I think that’s true of the company as well, is that once we know that this is a problem, we’ve shifted a lot of people and technology to focus on it.
How has Mark changed? You’ve known him for a very long time at this point, but how has he changed as a result of what’s happened over the past couple of years?
Mark has – I mean, I called him a learn-it-all. I think one really good example is his yearly challenge. In the past, his yearly challenge one year was to learn Chinese. Another year it was to wear a tie every day because it’s a serious year. That was like a serious year. His challenge this year was to really focus on improving safety and security on Facebook. It gives you a sense of how he’s changed. The gravity of these challenges and these responsibilities has really grown from, say, wearing a tie, you know, when we were just a college site, to making sure that Facebook is a safe and secure place.
And in terms of responsibility, because it’s a term we’ve heard a lot from Mark, you’ve mentioned responsibility, what about accountability, right? Did you see a difference between responsibility and accountability in terms of this enormously powerful company, huge global reach, tremendous impact on societies all around the globe? Who can hold Facebook accountable for what it does?
I think we all acknowledge and understand how much of a privilege it is to be such an important part in people’s lives, and that is why we feel incredibly responsible for making sure that Facebook is used for good and not bad. In terms of accountability, a lot of times we can’t do this on our own. We work with experts; we work with other companies; we work with partners; we work with organizations. It’s something that we need to do in concert.
I guess what I’m trying to get a sense of is, was there a reckoning inside, right? If you can bring me into it – something happens after the 2016 election, and you’ve spent a long time growing this business. What is it like to see all of these problems that have been known about all of a sudden rear their heads in 2016? What is that like inside of here?
Yeah. I think I just want to push back just on all of these problems that we’ve known about, because like I said earlier, a lot of these were new problems. So again, we were ready for cyberattacks; we had a team that was on top of it, and we were not aware or expecting foreign interference in the election, so I think a lot of the feeling around the office that you’re asking about was just of – a little bit of surprise. It’s been the most reflective sort of educational year. We’ve been learning a lot.
But in terms of, I mean, engagement-driven algorithms, right? Part of Growth and the Growth strategy was in part to engage people. Right? And get people on to Facebook and give an engaging product. There was a long history of criticism either by academics or by others of the potential harms of just optimizing for engagement or optimizing for growing something. You don’t think that that was known about before, that took people by surprise here?
From my vantage point on the Growth team, we definitely weren’t optimizing for engagement. The metrics or the stats that we follow are really around friending. I think I told you a lot of my focus on Growth was really about getting people to join Facebook, have a good experience, understand how to use the site, set up a profile picture and connect with their friends. So I’m not sure it’s entirely accurate to say that we were optimizing for engagement. At least for me and my team, we wanted to help people see value in using Facebook.
In terms of expanding internationally, which was also part of your purview –
– what about going into places … in Southeast Asia or elsewhere in the world that did not have civic institutions, strong histories of democracy, strong governmental institutions? Was there some risk to bringing this technology to these countries and really promoting it as it was?
Yeah. I think this sort of is a great point. When we were first starting on Facebook, we were college students building a site for college students, so other people like us. As we’ve grown into other languages and to other countries, the people that are using Facebook look very different, whether it’s because they’re in Southeast Asia or they speak a different language, so we’ve really had to build a diverse workforce, partner with people that are on the ground, work with academics, work with experts to understand the people and the communities that are using Facebook and how [they use it].
Facebook’s Response To The 2016 Election
We’re at the end of this process of reporting out this film, and we’re here at Facebook, and I think one of the things is that there is a sense outside of here that Facebook has had this veneer, this public image of a force for good, and that the company’s been very successful at putting forward a very positive image of itself – any company would do that – but that at the end of the day, there’s been quite a bit of harm caused by this company and what it’s built.
When Exxon spills oil, someone’s going to be held accountable, and there’s going to be people that are fired. What do you think that the accountability has been here in terms of what’s been caused or what Facebook has been a part of and what you’ve built, and can you understand why it may not be satisfying to a lot of people out there that there hasn’t been, you know, who’s been fired, for instance, or who’s lost their job as a result of what’s happened here at Facebook?
You know, I am still at Facebook because I firmly believe that Facebook is a force for good, but when you build a product that really is a reflection of society, you get the good and the bad. So I’m not sure that in terms of accountability someone has been fired, but we’ve definitely changed the way that we operate, and this is our number one priority.
And in terms of changing the way that you operate, is there a sense that we kind of have to take your word for that? I mean, obviously we’re here at Facebook to see and talk to the people that are changing the way that you operate, but at the end of the day, we have to take your word for that, right?
No. I think that one of the most important things is that we’re super-transparent about all of the things that we’ve been doing: how we’re taking down fake accounts; how we’re identifying misinformation; how we’re partnering with third-party fact-checkers. We recently released a report that shows all of the stats and metrics around our integrity teams, what they’re taking down, how fast it comes down. We’ve doubled from 10,000 people working on safety and security to 20,000, so all of these things are things that we need to be extremely transparent about so that we can welcome feedback and evaluation.
The other thing is just trying to be a lot more transparent in our product. Now you can go to any page and see what ads they’re running. Anyone who wants to run a political or issue ad needs to verify. These are all part of our effort to be more transparent about the company and our product.
What is the standard that the public should hold Facebook to in terms of solving some of these seemingly enormous problems? Whether it comes down to hate speech on the platform or influence campaigns, what standard should the public have for the company going forward in terms of how responsive you are to these problems?
I think the standard and the responsibility – what I’m focused on is amplifying the good and minimizing the bad.
We need to be transparent about what we’re doing on both sides, and I think this is an ongoing discussion.
What’s an ongoing discussion?
How we’re doing on minimizing the bad.
But we’re dealing with such consequential issues, right? We’re talking about integrity of our elections; we’re talking about –
– in some cases playing a role in the genocide. An ongoing conversation means what exactly … about a standard for success here?
This is the number one priority for the company. Mark has been out there; Sheryl [Sandberg] is out there. You’re talking to me and a bunch of the other leaders that are working on this, and I think that’s what we mean by having an ongoing conversation. This is something that we need to – as you said, this is serious; this is consequential. We take this extremely … We understand this responsibility, and it’s not going away tomorrow. We can only do better. We’ve done a lot, and there’s so much more that we need to do.
Legally, is there anything you’d propose, any way to kind of hold you to a standard or hold you to being responsible? Anything you’d propose on that level?
I think we are open to that. We don’t have an answer, and we are working with governments, companies, other people in our industry to figure out what is the right standard, what are the right forms of accountability.
Facebook’s Challenges For The Next Election
In terms of these midterms that are coming up, how confident are you that you’ve really turned things around enough to say confidently that Facebook is on top of the misinformation problem, the disinformation problem, any of the problems with election integrity?
You’re asking about midterm readiness, and Mark actually just published an op-ed in The Washington Post yesterday about our readiness for the midterms. We feel ready. Since we started this whole initiative, there’s been a ton of work that we’ve done here around fighting fake accounts, misinformation. We’ve been practicing and running our drills in all of the upcoming primaries leading up to the midterms. There have also been several elections in other countries that we’ve been working on defending, so I feel like we are ready.
And what happens if something goes wrong? It’s a hypothetical, but I’m curious. If the election doesn’t go well, if there’s information that’s found, what are we to expect from Facebook if something doesn’t go right? And in what way is there to have, again, accountability for something going wrong?
We don’t know what to expect for the midterm. I think that’s why we put in place a lot of processes to try to react quickly. Again, we can’t anticipate everything. We have a dedicated war room of people that are working on safety and security that will be there 24/7 leading up to the midterm and after just to see all of the things that are going on and hopefully to respond to it really quickly.
Going back for a second, did growing so quickly get us into this problem, into the problems that Facebook’s seeing these days – these seem to be pretty intractable problems?
I don’t know. I mean, I wouldn’t attribute it to growing so quickly. I do think that we were idealistic. We were really focused on the good and amplifying that, and we were slow to understand and really focus on the fake accounts and bad actors.
A skeptic would say that focused on the good is one thing, but you’re a for-profit company, right? For-profit companies optimize for different metrics, right, and they’re incentivized. People are incentivized in different ways. Do you not think that the profit motive in some way played a role in this? Do you not think that incentive structures for what people were incentivized to do played a motive or played a role here?
No, I definitely don’t think the profit motive played a role here. Even when we had our IPO, Mark wrote in his S-1 documentation that when he started Facebook, he set out to make not a company but to pursue a mission. We’ve been super-mission-oriented. Again, I have a team that is focused entirely not on any company metrics, not on dollars, not on engagement metrics, not on growth, but simply on positive real-world actions, whether that’s generating donations for nonprofits or facilitating blood donations. That really is the focus, and it’s on the mission.
Why the shift in your job? Why did you go from what you were doing to what you’re doing now?
I’ve always wanted to focus on what I thought was most important. In 2005, just expanding Facebook to people who wanted to use it – really, it was just other colleges. It was initially available at, like, five [colleges], and then we opened it to all the colleges in the U.S. and then beyond. I think it was in 2014 Mark approached me, and he was just saying, “I feel like the time, the space, the bandwidth has come to have a team that is focused on just driving positive real-world actions.”
It was just after the ALS Ice Bucket Challenge, which I mentioned earlier, and we were really inspired by that. We see so many good things happening on the platform, and Mark just felt, I felt, the team felt that we could do more. When we had the ALS Ice Bucket Challenge, the ALS website actually crashed. We can do more. We can actually process that donation on Facebook because we have a great payment system. The ALS website is not optimized as a global infrastructure for payments. Those are just examples of the responsibility that we felt, the opportunity that we saw to make this even easier on our platform.
Has there ever been a minute where you questioned the mission that this is actually – you know, in terms of what’s happened over the past couple of years? If you see what’s happened on the platform in a place like Myanmar or you see what happened during the elections here, there’s almost this evangelism around Mark’s mission, and I’m just wondering, internally for you personally or here, whether anyone has taken a second to step back and say: “All right, is this for real? Has this blinded us in some way?” Have you had a moment like that?
I still continue to firmly believe in the mission.
I wouldn’t be here if I didn’t. I don’t think there’s anywhere else I could go to have the kind of impact that I have here. But in terms of stepping back, in terms of reflecting, absolutely. I think I’ve done that; Mark’s done that; leadership team has done that. We’ve all done that here, but that isn’t on the mission. The reflection is really about how can we do a better job at minimizing bad experiences on Facebook.
And why wasn’t that part of the metric earlier in terms of how do you minimize the harm?
As I said, it’s possible that we could have done more sooner, and we haven’t been as fast as we needed to be, but we’re really focused on it now.
I know, I know, I know. But I’m just curious about going back, you know, an honest accounting, because the whole point of this is we’re going through a history of this company to some degree and trying to understand the choices that are made along the way, because it’s not just not paying attention to things; it’s not just naiveté. There’s something that’s happened, and we need to understand that. For instance, do you regret choices going backward and thinking about leading – you know, being a part of the Growth team and decisions that were made about not taking into account risks or not measuring risks and things like that?
Yeah, I definitely think we regret not having 20,000 people working on safety and security back in the day. Yes. So I regret that we were too slow, that it wasn’t our priority –
But were those things even considered at the time, to kind of amp up safety and security back in the day, but there was some reason not to, or it was going to be too difficult a problem to contend with, or – ?
Not really. I mean, we had a safety and security team. I think we just thought it was sufficient. It’s not that we were like, “Wow, we could do so much more here,” and decided not to. I think we – we just didn’t – again, we were just a bit idealistic. We didn’t see the need to go from 10,000 to 20,000 back then.
Facebook’s Response To The 2016 Election
… A lot of the problems that have reared their head recently and that seem to have taken the company by surprise, it’s not as if they’re very new. I mean, yes, in terms of Russian interference in our elections, sure, but the Russians were interfering with elections abroad using social media, using Facebook in some cases. It does seem naïve to say, “OK, all of a sudden we’re going to ramp up; we’re changing everything.” I guess why didn’t this happen sooner?
One thing I would say is that there were known problems, and I think we were focused on them. It’s just possible that they were different. We had a big effort around scams. We had a big effort around bullying and harassment. We had a big effort around nudity and porn on Facebook. So there were investments in that. This is an adversarial space. You know, as soon as you take one step forward, your adversary is evolving as well, so it’s never fixed; it’s always ongoing. Some of these threats and problems are new, and I think we’re grappling with that as a company with other companies in this space, with governments, with other organizations. I wouldn’t say that everything is new; it’s just different problems.
And strangely, does it put Facebook in kind of an untenably large position or too powerful a position now that you’re going to have to take responsibility for the enormity of these issues? I mean, is it something that the company didn’t want for a long time and now it kind of is saddled with to some degree?
Yeah, this is a responsibility, like I said, to make sure Facebook’s used for good and not bad. One thing that has really been highlighted for me, and that relates to the earlier question, is that we are working here in Menlo Park, in Palo Alto, California. To the extent that some of these issues and problems manifest in other countries around the world, we didn’t have sufficient information and a pulse on what was happening in Southeast Asia or what was happening in South America. So one change that we’ve made, along with hiring so many more people, is that a lot of these people are based internationally and can give us that insight that we may not get from being here at headquarters.