The Facebook Dilemma | Interview Of Antonio García Martínez: Former Facebook Product Manager

Antonio García Martínez was a product manager at Facebook from 2011-2013 and is author of the book Chaos Monkeys.

This is the transcript of an interview with FRONTLINE’s James Jacoby conducted on February 27, 2018. It has been edited in parts for clarity and length.

Your first days at Facebook, do you remember your first day there? What it was like?

Oh, absolutely. That’s when they do what’s called onboarding and it’s kind of the initiation into the cult of Facebook in which they trot out the heavies, the senior people who are in charge of the Facebook vision, to come and hype you up on the vision of the company and really convert you. And that first day is really important in many ways. At Facebook – it will sound strange, but I assure you it’s true – they have what are called Faceversaries. You don’t celebrate your birthday at Facebook. You celebrate the day you started to work at Facebook. And so on your first or second Faceversary, you might get to your desk and you’ll see one of those large Mylar balloons like you see at a child’s birthday party with a two or a three and a bunch of flowers. And again, not your birthday, your Faceversary. So yeah, that first day is an important thing in your career at Facebook.

Facebook’s Mission

And so bring me into the onboarding and what the mission and the evangelizing is like in that. I mean, as an outsider, what it’s like to get into that?

Right. I suspect it probably changes a little bit by time, by, you know, how the vision is going that moment. So this was early-ish 2011. For Facebook historians, that’s after Feed, our modern Feed, but not a lot of the shared content you typically see on Facebook as it is now. It was still largely a personal platform, no ads in Feed. And one of the more compelling visions was by a guy named Chris Cox, who I think I jokingly referred to as the Ryan Gosling of Facebook. He’s a dapper, young, handsome-looking guy, very, very charismatic. He had his pitch down to a T. And he convinced us, he asked the rhetorical question. Oh, and by the way, to set the scene, Facebook hires about 10 percent of its workforce as interns in the summer. And so I think it was a starting day for interns, and it was basically a roomful of 18-year-old-looking people. I felt very old.

And Chris Cox strolls in and says, rhetorical question, “What is Facebook?” And some jackass says, “Oh, it’s a social network.” And he might have been a shill. Who knows? And Chris goes, “No, wrong.” Somebody else says something along the lines of, “It’s your personal newspaper.” Right? And he says, “Exactly. It’s your personalized newspaper. It’s ‘The New York Times of You.’ Channel You. It is your customized, optimized vision of the world.”

And so, this vision of Facebook being the sort of news editor for the world – which is one of the things that we’re discussing now – was always part of the vision. I don’t think all the ramifications of that were part of the vision, but that was a vision that was, at the time, being foisted upon new hires at Facebook by the head of product.

And what did you think of that? At the time, what was your thought process about “The New York Times of You”? And this whole idea of [an] optimized, personal channel for you?

I was admittedly a little skeptical. Right? It seems like a very ambitious vision. Again, at the time – nowadays it seems obvious, but you have to remember back at the time – one didn’t consume most of one’s media through Facebook. It was still mostly a social thing. At best, it would be shared things like music on Spotify. I mean, that was still a year to go on that. So it seemed ambitious to me. It seemed like something that the company could potentially achieve. … So  I was a big believer in the company. I knew that it was going to be a paradigm-shifting thing, a generational company on the order of Google.

How it would get there exactly, what their vision was, I wasn’t quite sure. But I was definitely sold on it. And as time went on, to be honest, I speak of it now like I was cooler than you at the time, but I was totally suckered into it. …There is definitely this feeling of everything for the company, of this world-stirring vision – everyone more or less dressed with the same fleece and swag with logo on it, posters on the wall. It looked somewhat Orwellian but, of course, in an upbeat way, obviously. But yeah, I fell for it, as everyone else did. I was very pro-Facebook and, in many ways, I still sort of am, actually.

You said Orwellian posters. What posters were on the walls?

Some of the slogans were pretty well-known. “Move Fast and Break Things,” which was sort of one of the core slogans. I think Zuck [Facebook founder and CEO Mark Zuckerberg] probably disavowed that a couple of years ago, as the focus was more on building a stable platform. But still, the company has always been characterized by: Ship whatever new features are possible and worry about the details later.

What else? “Get in over your head.” “Fortune favors the bold.” “What would you do if you weren’t afraid?” You know, it was always this sort of rousing rhetoric that would push you to go further. Sometimes it would appear, actually, for new hires on their actual monitors. When they arrived, there’d be one like right on their monitor. Yeah, I mean it seems ridiculous in retrospect and if you weren’t caught up in it, like any movement, it will seem kind of silly, but I think at the time most people took it very seriously.

We hear all the time and we’ve heard about Mark Zuckerberg’s vision of “open and connected.” And at the time, is that something that you really bought into? Is that something that you think is a true vision, to make the world open and connected?

Yeah. I should start by saying that that mission is actually taken very seriously at Facebook. That is not some piece of corporate pap that’s foisted on the press to sort of paper over whatever Facebook does. I think people at Facebook are very mission-driven. And I think from upper management to even junior hires, people really do believe that Facebook is a force for good in the world. And getting to your question, yeah, I felt it was as well.

I worked on the ads side. Ads is a very different product. Right? You’re monetizing the beast, you’re not creating the beast, in a way. And so our goal was to somehow pay for all this. Right? It almost felt like at the time, probably less so now, the ads team was running around after the side that actually built the app you see as a user, almost like a nanny after some petulant child, picking up after it and somehow trying to keep the kid alive. Right? And that’s kind of what it felt like [in] ads. They’d ship new features and products, and somehow we’re supposed to turn them into money.

As time went on, of course, ads had more agency and just produced new products that just were good in and of themselves [but] had nothing to do with the core product. But yeah, I believed in the vision to a certain degree. I mean, I think it’s easier to believe the vision when you’re inside Facebook, when you’re on Facebook 16 hours a day. You use Facebook internally for work when you’re at Facebook. And so yeah, your entire life revolved around Facebook, your personal/professional divide just basically evaporated because you were never not working. Yeah. Yeah, it’s very easy to get sucked into that world.

And do you think that maybe there’s the benefit of hindsight now? But I mean, do you think that there was a pervasive naiveté at the company about the ramifications of what it does?

Yeah. So I think again, in the context of all the political stuff that’s been happening, there’s a lot of criticism of Facebook leveled. I don’t think they’re an evil company. They didn’t intentionally engineer any of this. I think if there’s any fault you can blame them for, I guess you can call it naivete, maybe shortsightedness, or maybe a one-sided vision; only seeing the positive of what they’re trying to create. Yeah, I think that’s probably safe to say.

We might be jumping ahead here, but when we created these ads technologies, right, they were supposed to sell you a pair of shoes. They weren’t supposed to get Trump elected or not elected or whatever. Right? And so, we didn’t. … Yeah, I don’t think we understood what that meant. When Chris Cox enunciated his vision of “The New York Times of You,” I don’t think anyone in that room or anyone in that company understood what that would really imply in terms of fake news, or what it means to actually be in some sense the arbiter, frankly, of what’s read and what’s considered true. I don’t think the full implications of that statement were really understood at the time.

Why not?

Well, it’s hard to imagine. I mean, I forget what wag said, “Things are hard to predict, particularly the future.” [Niels Bohr is often credited] Right? I mean, here we were driving for this vision and you have to embrace this vision and love it and imagine a world with it and only imagine the best for it. Almost like you imagine your child growing up or something. You don’t sit there and think, “Well, what if he gets strung out on meth and goes to jail?” Right? I mean, it’s not something that when the kid is age 1, you sort of really ponder what you’re going to do. Right? By the way, this is not just Facebook. Silicon Valley in general doesn’t understand the downsides of the technology that it creates. I mean, that’s the techno-optimist side – they don’t understand. If they sat there and actually thought about it too much, they probably wouldn’t pursue their sort of futurist fantasies with the same sort of vim and vigor that they do right now.

Engagement And Facebook

When you’re hearing, for instance, “The New York Times of You” – and obviously you’re on the ads side, we’ll get to that – but when you’re hearing this vision of “The New York Times of You”, describe in real terms of what it meant to try to create something that was tailor-made, and what effect that might have on a world view.

Well, again, if I’m understanding your question … I don’t think anyone was thinking about what it would do to the world view. That’s the thing. Right? You know, it’s easy to diagnose what went wrong there in hindsight. Right? I mean, it’s like, what is an editor, what is editorship, right? It’s someone saying, “Eat your veggies.” It’s someone saying, “Look, I know you want to read about Kim Kardashian or whatever, but you should read about this instead” – some complicated story on the front page of The New York Times. But what happens when that goes away and gets replaced by math? By an algorithm saying, “Well, we’re just going to show you whatever you tend to engage with,” which can mean anything. I mean, that’s a totally nonjudgmental statement, whether that’s some fake news thing about Hillary’s health or Obama’s birth certificate or whatever, or whether it be, who knows, some salacious photo or whatever. The system just dishes that out to you. There’s no expertise. There’s no editorial judgment. Right? Yeah, and I don’t think people understood what that really meant.

One of the things that’s of interest is that metric and what was the most important thing about the algorithm learning about you and engagement. I mean, describe what were the main goals inside there, the main metrics through which the company judged its own success.

I think the biggest high-level metric, of course, is usage – more users who use the app more. I mean, if there’s one dashboard or one number you’d have to look at, that user number is the high-level number. Growth was a very powerful [team]. What’s called Growth is the team that basically – to put it in a very negative and reductionistic way – turns the user into a Pavlovian dog who sits there and drools every time the phone buzzes. Right? Effectively, that’s the team responsible for doing that. The long-term goal, of course – in a slightly more polite way of saying it – is they’re supposed to get new users and keep them engaged and keep them from turning off the platform. And so the Growth team making Facebook’s user growth go up and to the right has always been a massive goal. And they consume a huge number of resources. Zuck lets them do whatever they want.

And so I think that’s the biggest high-level thing. Right? But I think in light of more recent events, in the past two years Facebook has changed a lot. Obviously, I’m looking from the outside perspective, but I can hopefully read between the lines a little bit better than most. And so the fact that Zuck kind of famously went and did his 50-state tour of the United States led many critics to allege that maybe he’d be running for president. The only people I’ve ever seen theorizing that Zuck might run for president [have] never met him. Anyone who has met him realizes he’s never going to run for president. But he was really, I think, just blindsided by what happened in the election and was just really trying to figure out what had gone wrong.

Well, how would his vision of a more open and connected world – which I really do think he and everyone around him believe in – how did that go off the rails and create a situation like we have today? And I think they didn’t totally grasp that, and they certainly didn’t see it coming. And so I think understanding that is a new thing. And the company announced, I guess it was a few weeks ago now  [January 11, 2018], that they’re going to replace this sort of metric by which they judge content in Feed not by strict engagement, but by some quality metric that’s related to your sociability. Right? The company published a study a couple months ago now, saying that some types of Facebook usage are actually not healthy for you, which is a huge admission on Facebook’s part. The internal mythology on Facebook is that Facebook is like vitamins. You’re just supposed to get some every day for general health and that’s it. The thought that there is “bad Facebook” is just, it is kind of a surprising statement, and so you can tell the internal culture is changing to a certain degree. So, yeah, I think they’re changing their metrics to be time well spent, time in which you’re engaging with people you know and not necessarily raging with a stranger over some piece you read in something.

But isn’t it kind of astonishing that even going back, that there weren’t questions and there weren’t concerns about “The only thing that matters is that people are engaged.” I mean, that quality, for instance, doesn’t matter?

So there is a quality metric. I don’t want to get into the details. But yeah, if you report content on Facebook in the upper right-hand drop-down, that actually gets used. That’s not there to entertain you like a door-close button on an elevator. That data actually works and content can get down-voted. And so there was thought given to negative experiences, and there are a lot of security teams that remove pornography and handle things like grooming of children, sexual harassment and stuff. So the company is not a bunch of Boy Scouts. They understand that the platform will be used for evil in some regards. But they saw it very much as personal evil: filter out porn, or if there is a conversation between a 50-year-old man and a 13-year-old girl, then something weird is going on. So they were very smart about that and they policed that very, very well. I think the macro social issues they don’t necessarily understand.

Just to pick an example out of the air, what’s going on in Burma [Myanmar] – which I barely understand, the Rohingya business – in which purportedly Facebook is actually being used in some sense to stoke this sort of genocidal rage. I can guarantee you nobody saw that coming. I’m sure none of them even knew who the Rohingya were. I didn’t before this happened. The other thing in their defense, if I’m putting back on my Facebook hat, I guess, is … there’s many fallacies in how people think about Facebook, and part of the reason why I wrote the book [Chaos Monkeys: Obscene Fortune and Random Failure in Silicon Valley] is to try to debunk some of those fallacies so that we can talk more intelligently about Facebook. Not that it shouldn’t be critiqued. It should, but we can do it more intelligently.

One of the fallacies is that, one, Facebook has infinite resources, which it doesn’t. It is actually much smaller than most tech companies you can name. When I left, it was 5,000 employees. I think now it’s just over 10,000 [more than 25,000 in December 2017]. You know Microsoft has 50,000 plus [131,300 in 2018]. Facebook actually is not that many people. And then, two, that they’re omniscient, that they know everything going on in the platform. In reality, it’s actually very difficult. When you’re sitting in Menlo Park staring at literally tens of billions of things being shared a day across 2-plus billion people, it’s very difficult to figure out, “Oh, by the way, in Burma this weird thing is going on. Maybe we should do something about that.” It’s very difficult to actually see that. And I could tell you stories about all the times that in the ad system we kind of missed things because we didn’t understand that that’s how Facebook was being used.

Ads Targeting

What was your job there?

Good question. What was my job? So I joined in 2011, through this weird deal acquisition thing with Twitter. I bailed on the deal and basically went to Facebook. And by sheer happenstance, I did have a certain ads background, but certainly I wasn’t an ads guru – not really a background in targeting at all. By whatever happenstance, I ended up as one of the initial raft of six or seven or so of what are called product managers. A product manager, in a very self-flattering description, is a sort of CEO of the product. It’s the person who in some sense guides the product to completion, establishes priorities, represents it to the outside world. In some sense, if that product were a company, the PM [product manager] would be the CEO. So I was the PM for what’s called Ads Targeting. And what targeting means is taking your data on Facebook and actually exposing that to the advertiser in a data-safe way. Again, no data sold, no data leaves Facebook. But in a data-safe way, expose those levers to the outside world so advertisers who have certain views about what sort of person buys or engages with their product can then express those views via the ad system. Basically, taking Facebook user data and turning it into money. That’s what I did.

So let’s slow it down a little, because that can be a lot for people to digest. But in essence, what does targeting mean?

Well, so again, it can mean lots of things.

So here is the dance that happens. It’s almost a yin-yang thing between what’s called targeting and optimization. So you see an ad. People say, “Oh, Facebook showed me a thing.” The reality is Facebook almost certainly didn’t show you that. An advertiser picked an ad and a bid, saying, “Oh, show this ad for five bucks.” And then picked the type of person they want to see it, and that type happened to include you. And then Facebook went, and of the 100 ads that may be targeted at you, it picked one. So that’s the level of Facebook’s discretion, or I guess, authority in that moment. And so targeting is – look at it this way; this is the simplest way to explain it. When you get an email or you get a phone call or you get even a postal letter, someone has addressed that to you and said, “Show it to this person.” Or even in the case of [a] direct mail ad, “Show it to this type of person or person in this zip code.” Right?

That’s what targeting is. It’s the address on the ad’s envelope saying, “Show it to this person.” That’s all it is, effectively. And you can think of ads – people don’t think about it this way – but what they are is really paid messages. So when you see an ad inside Facebook, I know the Facebook logo’s right there. But it’s not Facebook showing you an ad. It’s that advertiser showing you an ad and Facebook is the messenger showing you that ad. So that’s what it means.
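The targeting-and-optimization split described here can be sketched in a few lines. This is a toy illustration, not Facebook’s actual auction system – the names, the attribute sets, and the pick-the-highest-bid rule are all simplifying assumptions. The point it shows is the division of roles: the advertiser supplies the ad, the bid, and the audience definition, while the platform’s only discretion is choosing which eligible ad to show.

```python
from dataclasses import dataclass, field

@dataclass
class Ad:
    name: str
    bid: float                                  # advertiser-chosen price (e.g. dollars per thousand impressions)
    audience: set = field(default_factory=set)  # advertiser-chosen targeting: attributes a user must have

def eligible_ads(ads, user_attrs):
    """Targeting: keep only ads whose audience definition the user matches."""
    return [ad for ad in ads if ad.audience <= user_attrs]

def pick_ad(ads, user_attrs):
    """Optimization: of all eligible ads, the platform picks one (here, simply the highest bid)."""
    candidates = eligible_ads(ads, user_attrs)
    return max(candidates, key=lambda ad: ad.bid) if candidates else None

ads = [
    Ad("shoes", bid=5.0, audience={"male", "ohio"}),
    Ad("handbag", bid=8.0, audience={"female"}),
    Ad("generic", bid=1.0, audience=set()),     # empty audience = targets everyone
]
print(pick_ad(ads, {"male", "ohio", "age_25_45"}).name)  # shoes
```

The platform never writes the “address on the envelope”; it only delivers the best-paying message whose envelope matches the user.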

And so you come in at a point, and in your book you kind of describe the fact that you were in some way astonished by the fact that they didn’t have a monetization machine. Bring me into that a little bit.

So this is, again, 2011, and again I had just risked this kind of important to me, although small, deal to go to Facebook instead. And on day one or two, after the onboarding experience that we talked about, I go to the revenue dashboard that at the time, it was pre-IPO, so only the Ads people could even look at them because they were kind of super secret. Open the revenue dashboard and look at what’s called CPM [cost per thousand]. CPM is basically the average amount of money it costs to run 1,000 ads, sort of the price per square [inch] in the media world. And it was astonishingly low, like super low, like ads-running-on-a-blog low. And I just couldn’t understand, because like everyone else thought Facebook has all this data; it must be super high. Of course, there are subtleties here. That was an average over the entire world. CPMs vary widely over the world. They weren’t that low in the United States, etc., etc. But the reality was that at the time, the way Facebook was making so much money, and it was making a good amount of money even before the IPO, was that a billion times any number is still a big number. Right?

And so Facebook’s actual per-ad cost wasn’t very high but the number of ads that people were seeing and the number of people on the platform was actually quite large, and that’s how Facebook was making its numbers.
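The “a billion times any number is still a big number” arithmetic is simple to make concrete. CPM is the price per 1,000 ad impressions, so revenue is just impressions divided by 1,000 times the CPM. The figures below are invented for illustration; they are not Facebook’s actual 2011 numbers.

```python
def revenue(impressions, cpm_dollars):
    """Revenue from showing ads: CPM is the price per 1,000 impressions."""
    return impressions / 1000 * cpm_dollars

low_cpm = 0.25                       # a blog-level CPM, in dollars per thousand impressions
daily_impressions = 10_000_000_000   # hypothetical: billions of ads shown per day at massive scale

print(revenue(daily_impressions, low_cpm))  # 2500000.0 -> $2.5M/day even at a rock-bottom CPM
```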

And how good was Facebook at that point at targeting people?

It was very poor. What was good is they know your location very well. Location for some advertisers is a big deal. They know gender. They know some pages that you’ve liked, which may seem important but really it’s not. The fact that you like BMW doesn’t mean you’re about to spend $50,000 on a 3 Series. It’s just a lifestyle brand thing that you associate with and you like it on your profile. So yeah, the profile was actually very crude. Most of the people who bought ads at the time on Facebook were not what we call direct response advertisers. Direct response is someone who’s trying to sell you something right now. It’s when you browse for a pair of shoes and then you see those shoes inside Facebook and the temptation is to go and click on it to go buy it. That is direct response. Right? And that relies on very specific targeting that knows a lot of things about you: I want those shoes. I have this income level or this education level. I abandoned this item in my shopping cart on Amazon. Right? They know very specific things and want to sell you this one thing.

And the other sort of beast in the marketing zoo is what’s called brand advertisers. They want you to think that Burberry jackets are cool or they want you to think that some musician’s album that’s coming out is cool or whatever. That’s a different sort of buy. Facebook was pretty big on brand because it was seen as a platform for a brand to express itself. But it was really bad at direct response advertising, which, by the way, is also very important for politicians, particularly [those] looking at swing-state voters who have particular tastes and preferences. You need really good targeting for that. You can’t really just get away with targeting everyone from the age of 25 to 45 who’s a male in Ohio. Like, that’s just not good enough. So yeah, I mean, the targeting was actually really poor. My job for the first year was to try to improve it, and it was actually much harder than we would have thought.

This is a company that had huge vats of data about everybody though. So explain that to me and then bring me through this sort of evolution of how you then get to that Holy Grail of …what you’re having to buy from third parties or what you need to do to kind of triangulate and find out how to do it properly.

I mean, the way I would say it is slightly salty. Just because I have a nude photo of you on the internet doesn’t mean anyone wants to pay to see it. And so the reality is that, yeah, Facebook has a lot of personal data, your chat with your girlfriend or boyfriend, your drunk party photos from college, etc. The reality is that none of that is actually valuable to any marketer. Nobody wants to actually know that.

Here’s another fallacy that people have about thinking about Facebook. They think the data that they most want to keep private is what Facebook wants, and that’s actually not true. They want commercially interesting data, which is usually not your personal data. If you think about it, what do advertisers really want to know? Again, what car do you drive and how long have you driven it and how many miles are on it? Are you up to buy another one? You know, what products did you take off the shelf at Best Buy? What did you buy in your last grocery run? Did it include diapers? Do you have kids? Are you a head of household? Right? It’s things like that, things that exist in the outside world that just do not exist inside Facebook at all. And so the reality was that, again, Facebook had some data, and some of that data can be used in an interesting way that ended up being worth something. But to first order – I target all people who do X – the X’s that people wanted were not inside Facebook, by and large. And so that’s why circa 2012, when the IPO was obviously imminent, the call went out: This needs to improve in a serious way. And it’s funny – again, everyone thinks that Facebook is so money-focused or ads-focused. Facebook actually thinks, in general, [of] ads as being sort of a necessary evil. Zuck doesn’t care about ads and never has. And so, there was really kind of pressure not to upset the user with really aggressive ads, not to use data too aggressively, not really to take risks. Until the IPO, and then the tenor of that entirely changed. And you know, that’s when the company developed a lot of the products that it’s known for today.

Sheryl Sandberg

All right. So first of all, Sheryl Sandberg looms large there. Tell me, put her in context and how you were affiliated with her.

So Sheryl, obviously, was effectively the head of, broadly, ads and sales and operations inside Facebook. I think part of the reason Facebook has been so successful is because the leadership is so complementary. Zuck and her, I think, work very well together. They are also opposites in many ways. Again, you know, Mark Zuckerberg has no interest in ads.

And so he was perfectly happy to outsource the operation of all that to Sheryl and a few other lieutenants obviously. … So yeah, she loomed very large, she ran the entire sales organization. She’s not really a product person in the sense of coming up with new products. That wasn’t her forte. I think what I saw her do best at – I describe her, I feel, quite flatteringly in the book – is that she’s very good at getting a lot of people with big ideas and big egos in a room and having them kind of figure out what to do, and then working that message through the organization up to and including Mark. So I think that’s what she’s really good at, and kind of understanding the bigger picture of it and how it all kind of fits together. I mean she’s just a very adept senior management leadership person.

But the incentive to monetize and to make this a money-making venture really, and the incredible ad model that you ended up starting to build – was that all under Sheryl?

Yeah. I mean, I think a lot of the inducement – I mean, not the ideas per se, but the inducement – definitely came from her. In fact, there’s a scene in my book in which there was a meeting – I think it was in March of 2012 – in which it was everyone who built stuff inside Ads. So it was a roomful of maybe 12 to 15 people, myself among them.

And she basically recited the reality, which is, people aren’t going to buy Likes anymore. I mean, remember, people used to spend millions of dollars to buy Likes, for God knows what purpose. Right? And that game was over. People weren’t going to do that anymore. Games advertisers like Zynga – it was clear they were not going to be around forever either. Revenue was flattening. It wasn’t stalled, it wasn’t declining, but it wasn’t growing nearly as fast as investors would have guessed. Facebook’s revenue had been doubling almost every year for a while, and that was not going to be the case the year of the IPO on the current trajectory.

So she basically said, “We have to do something. You people have to do something.” It’s like, “Well, OK, I guess we do,” and off we went. So yeah, I think she sounded the alarm in many ways and, again, pushed us and was open and receptive to weird new ideas. Like, “Come up with the crazy ideas and together we’ll get them past Zuckerberg,” which was part of her job too, this kind of massaging the message and laying things up for whoever wants to approach Zuck for a certain product, or at least at the time. I’m not sure if it’s the case anymore.

Big picture-wise, what did that set in motion in terms of the ad thing?

So lots of things. I mean, I only was one part of this whole massive effort – it wasn’t massive, it actually wasn’t that many people if you think about it. It was probably, at the top level, maybe a dozen people, maybe 20 people. But yeah, there was a big effort to basically pull out all the stops and start experimenting way more aggressively. The company had spent a year pushing a product called “sponsored stories” that most people probably don’t remember. It was sort of these social ads that would appear on the right-hand side that you’re supposed to engage with. It was an utter failure. That was the big company bet that they had made that had failed, one of them. And so, the call went out for more risky bets. And this is what people ask me: “Well, why did Facebook succeed?” It’s not that they’re brilliant product visionaries. They’ve come up with all sorts of terrible products that have been abandoned and forgotten. It’s the fact they’re willing to experiment. The “Move Fast and Break Things” isn’t just a poster; that’s kind of true. They’re still willing to ship weird stuff. And even if it’s a little bit half-baked and even if it’s a little bit off, or maybe a little aggressive, they’ll ship it and see what happens. They’ve never lost that nerve. And so, I don’t know if you want me to run down the list of what [were] the products we tried to go in. Probably not, you probably don’t want to go into that.

Your Data On Facebook

I’m just kind of curious, what I imagine, and also what it feels like from the book, is that it sort of sets up this arms race to harness the power of the company. Help me get that.

Right. So I can speak specifically to me, because other products I was seeing as an insider, but as an outsider on the product. So in my case, it’s clear that the data that Facebook had – random pages you’ve Liked, conversations you’ve had with whoever – weren’t going to be worth anything. That’s very difficult to monetize. And everyone has their personal theories about how Facebook can monetize their data, including listening to you on your phone. I can assure you, it’s either technically impossible to do or it’s not going to be as valuable as you think it is.

Right.

And so, what do we do? And frankly, I, again, had had a relatively brief career in Ads and I had worked in what was called retargeting. So the shoe-following-you-around-the-internet trick, for lack of a better name, is called retargeting. And you know, there’s more sophisticated ways of doing it. It’s not just a shoe trick, but the idea being, you use very real-time data – browsing on the internet or potentially even offline in stores – to target things on Facebook. So bring that outside data in. Historically, think about Facebook as a walled garden. It was literally a fort. Seen from the point of view of every other advertising person, Facebook was a fort. It had its data that it would expose to you but, again, never leak out. It would never leave Facebook. But that data wasn’t particularly good. The fact that you Liked Burberry two years ago doesn’t mean you’re actually about to spend $3,000 on a coat today. Not at all. Right?

But a lot of the tools we built – Facebook Exchange, what later came to be called Custom Audiences – that was building the sort of tunnel, or whatever metaphor you want to use, underneath that walled garden. And that was the initial pitch to Zuck in, I guess, the early part of 2012. “Hey, we’ve got all these crazy ideas about joining the outside world’s data with the interior experience of Facebook like we’ve never done before.” That join never existed. And so then the question became: How do we build it, how do we get clients on there, how do we scale up spend quickly enough to make a difference for the IPO? Again, this is all happening very quickly. And you know, that was the real challenge.

So basically, you’re saying go outside the walled garden and figure out what people are actually doing online and offline.

Right.

All right. So but what does that mean in real terms?

…So one of the big fallacies about how people think about Facebook is that Facebook doesn’t really sell your data. It actually buys your data. And how does it do that? It has to go to people like Macy’s, for example, that know that you browsed a certain handbag and convince them to part with that data. Macy’s, by and large, is very unwilling to do so. Rightly so – they consider that data to be a core part of their business and will part with it very reluctantly, if at all. And so what does Facebook have to offer them for them to part with that data? Well, if they can go to Macy’s and say, “Of the, I don’t know, 3 million people, whatever it is, that came to your website yesterday, you can find 2.5 million of them on Facebook,” and at a CPM that makes sense, at a cost of advertising that makes sense for your revenue, then, well, that’s a very compelling offer. And then suddenly Macy’s, or any other e-commerce company, is willing to put a little bit of code on their website, which you don’t even notice when you go there, that basically looks and sees your Facebook login cookie and says, “Aha, this person is a Facebook user.” And so instantly, when you go back to Facebook, boom, there is the handbag you were checking out on Macy’s. Right?

That’s one of the joins that happens. I’m just describing one. That’s what I mean by buying their data. You have to construct an advertising product that convinces outsiders to give you their most important data for the purposes of Facebook advertising.
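The mechanism he describes – a snippet on the advertiser’s site that sees the visitor’s Facebook login cookie and adds that person to a retargeting list – can be sketched roughly like this. Everything here (the `fb_uid` cookie name, the function names, the data shapes) is illustrative, not Facebook’s actual implementation:

```python
# Toy sketch of the retargeting "join" described above: an invisible
# pixel on the advertiser's page fires a request carrying the visitor's
# Facebook login cookie, letting "this browser viewed handbag X on
# macys.com" be tied to a Facebook account. All names are illustrative.

from collections import defaultdict

# Facebook's side: retargeting lists keyed by Facebook user ID.
retargeting_lists = defaultdict(list)  # fb_user_id -> [(advertiser, product)]

def handle_pixel_request(cookies, advertiser, product_id):
    """Called when the pixel on the advertiser's page loads."""
    fb_user = cookies.get("fb_uid")  # the Facebook login cookie, if present
    if fb_user is None:
        return False  # not a logged-in Facebook user: no join possible
    retargeting_lists[fb_user].append((advertiser, product_id))
    return True

def ads_for_user(fb_user):
    """When the user is back on Facebook: retargeted ads to show."""
    return retargeting_lists.get(fb_user, [])

# A visitor browses a handbag on macys.com while logged in to Facebook...
handle_pixel_request({"fb_uid": "user123"}, "macys.com", "handbag-42")
# ...and back on Facebook, there is the handbag.
print(ads_for_user("user123"))  # [('macys.com', 'handbag-42')]
```

The key design point, as in the interview: the advertiser never hands over its browsing logs wholesale; the pixel only surfaces the match between one page view and one Facebook login.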

And then essentially, Facebook can track you wherever you go online.

Right. And this is part of the vision. Any machine that you’ve logged into and used Facebook on, effectively Facebook more than likely will be able to track you there, which is why, by the way, Facebook was such a compelling ads offering: because it’s real identity, it’s the real you. Usually mobile or desktop advertising stops with the browser. It’s that browser, it’s not that person. If you go and browse for shoes at work, but you don’t buy because you’re embarrassed, because you don’t want to take your credit card out, because your boss is there, and then you go and look them up again on the subway, and then finally, you’re at home in a moment of weakness because you’re bored or whatever, Facebook can actually link all those experiences together and show you the ad and say, “Boom,” and drive that conversion. And very few other people can actually do that.

So it is, to some degree, a massive surveillance system of what you’re doing online.

To sell you shoes, yes, correct. Yeah. It’s Big Brother, but he comes bearing shoes. Yes.

Then what about your offline life?

Right. So the offline thing is trickier. So I should stress Facebook didn’t invent these technologies. It often copied or co-opted existing business models. But yeah, the offline thing is interesting, how that world kind of works. You know, Facebook, again, was not – I was not very familiar with that world at all. And so we had to do a lot of deep digging there, got lots of demos from companies that would end up being effectively competitors later, to figure out how this world worked. So you go to Safeway, to cite an example of a grocery store, and you’ve got your discount card and oh, by the way, it’s registered to your phone number, which is how you remember it, and when you don’t have it, that’s how you enter it. It’s not accidental. Safeway can go and upload a spreadsheet with a phone number, and that will join to your Facebook experience via this product called Custom Audiences. And if you went and bought, I don’t know, diapers, then they could start showing you motherhood- or parenthood-related stuff based on that.

I’m not saying they are, but they could. And that’s how they joined the offline world to the online world. Here’s the key thing. This is why Facebook is so compelling as an advertising solution. Facebook is the one sort of immutable, accurate online ID for your capitalist, consumerist internet life. Right? Every machine you go to, what’s the first thing you do? You download and log in to Facebook or email or Instagram or whatever. They know where you are, effectively. And so all these outside sources of data – whether it be browsing behavior through the cookie, for example, whether it be a phone number for phone marketing or your Safeway card, whether it be an email address for email marketing, whether it be a physical address, for example, the entire direct mail world, those coupons and catalogs you get – Facebook enables those people to do what’s called onboarding into the online space. Merely knowing an address lets somebody find you on Facebook, and that address connects to reams of data about you. Most Americans don’t even realize it. It’s funny, Facebook is always painted as the bad guy in all these movies. You know, Facebook is just a middleman. …
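In practice, uploads like the Safeway spreadsheet are typically matched on hashed identifiers: the advertiser normalizes and hashes its phone numbers or emails, and the match happens hash-to-hash, so the raw list is never shared in the clear. A minimal sketch – the normalization rules and names here are illustrative:

```python
# Sketch of a Custom Audiences-style join on hashed identifiers.
# The advertiser uploads SHA-256 hashes of normalized phone numbers;
# Facebook matches them against hashes of its own users' numbers.
# Normalization details and the index layout are illustrative.

import hashlib

def normalize_phone(raw):
    """Keep digits only, so '(555) 123-4567' and '5551234567' match."""
    return "".join(ch for ch in raw if ch.isdigit())

def hash_id(identifier):
    return hashlib.sha256(identifier.encode("utf-8")).hexdigest()

def build_audience(advertiser_phones, facebook_index):
    """facebook_index maps hash -> Facebook user ID."""
    hashes = {hash_id(normalize_phone(p)) for p in advertiser_phones}
    return [facebook_index[h] for h in hashes if h in facebook_index]

# Facebook's side: hashed phone -> user, built from its own records.
fb_index = {hash_id("5551234567"): "user_a", hash_id("5559990000"): "user_b"}

# The advertiser's upload: loyalty-card numbers, variously formatted.
matched = build_audience(["(555) 123-4567", "555-867-5309"], fb_index)
print(matched)  # ['user_a'] -- one of the two uploaded numbers matched
```

This is also the concrete sense in which, as he says later, Facebook gets the match rather than the data: only hashes cross the boundary, and only overlapping hashes produce an audience.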

And so when you put all that together on Facebook then that’s the real targeting tool.

That’s right, yeah. What I’m describing, by whatever mechanism it happens, is the real targeting tool that Facebook has: that one unique online ID for everything you do online, and even some things that you’ve done offline. Yeah.

Was there any concern at the time when you’re creating… First of all, is this one of the most potent technological tools of this century to some degree?

Oh, come on. I don’t know about that.

No, I’m serious. You’re compiling all this data. You’ve got the processing power to actually do something with it.

Well, let’s be careful now. You said “compiling all the data.” To be clear, Facebook doesn’t actually get the data. Right? All Facebook gets is the match. Again, the advertisers, they want to use Facebook. And when I say they essentially turn over the data, they don’t actually hand over a data file saying, “By the way, here are the crown jewels.” No. They keep all the data that they know about you. But the personally identifiable information – PII in the trade – is what’s used to make the join. Your name, your phone number, your address or whatever. So they make that join and then they activate the data on Facebook. So Facebook never actually gets the data. In the case of browsing, it might. But in the case of offline data, it does not.

But you’re right. Yes, it’s a powerful solution. And again, to be clear, Facebook didn’t necessarily innovate [it]. The onboarding world existed before Facebook adopted it. It was basically cribbed from an existing thing. In the case of Facebook, it just works a lot better. What’s called the match rate – say I’ve got a list of a million emails of people who bought stuff on my site last month that I want to match. You know, I might go to a conventional data onboarder and I might get 20 or 30 percent of them online. I go to Facebook and I get 70 or 80 percent. It’s a huge difference.

The other thing is that, again, it’s a perennial match. Facebook, since you’re constantly logging in, that match constantly exists. In the browser world, you delete your cookies or change machines and that trail just dies. Yeah, it is powerful. Again, what makes it powerful is not that it’s so unique, [but] that it just works way better than any other industry version of it does.
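The match-rate comparison he cites is simple arithmetic – identifiers found online divided by identifiers uploaded – using the interview’s own illustrative figures:

```python
# Match rate = identifiers matched online / identifiers uploaded.
# The figures below are the illustrative ones from the interview.

def match_rate(matched, uploaded):
    return matched / uploaded

uploaded = 1_000_000  # "a list of a million emails"
print(match_rate(250_000, uploaded))  # 0.25 -- conventional onboarder (~20-30%)
print(match_rate(750_000, uploaded))  # 0.75 -- Facebook (~70-80%)
```

The difference compounds: a campaign that can only reach a quarter of a customer list is a fundamentally weaker offer than one that reaches three quarters of it.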

And in creating this, was there any hesitation or any kind of concern about building such a powerful targeting tool?

I mean, there were privacy concerns. I spent a lot of time with privacy lawyers when it came to all this stuff. And you know, Facebook takes that seriously. It doesn’t just charge in and not think about it. Yeah, I mean, again, it depends what you mean by “concerned.” There’s the privacy side of: Do people feel like [they’re] getting stalked? And what sort of opt-outs do you have to have, and all the rest of it? The problematic thing is, in the U.S. that’s almost completely unregulated and it’s considered to be a consumer choice issue. The only ones really setting the rules are various consumer industry bodies that really don’t have the force of law. At the end of the day, it’s really just Facebook looking at Google, looking at Apple, looking at Amazon, saying, “What did you do?” And someone moves a little bit, and someone moves a little bit more, and somehow, collectively, all kind of inch in one direction. There’s no real law regulating it. Of course, Europe is a whole different story.

I think, yeah, there were privacy thoughts about it. And again, you don’t want to weird out users. The weird thing about ads is that ads are either crappy or they’re creepy; there’s never an in-between. And so in the case of Facebook, they instantly went from pretty crappy – everyone complained about how bad Facebook ads were and how untargeted they were – to knowing what you bought from the store faster than the time it took you to get back from the store. And that can weird out some users. And so you have to have some mechanism for either opting out or saying “don’t show me this ad again,” all the rest of it. So no, I think that the user experience side, they definitely thought about it, because, again, the user experience always came first at Facebook. You know, the bigger notion, which I think is what you’re asking, is: Is this good for society? Is this like a big advertising NSA [National Security Agency] organization? Yeah, I don’t think they thought about it a lot, to be honest.

I mean, what are you going to do? Opt out? I mean, it’s basically like being a conscientious objector to the draft. I mean, well, then leave. No, I don’t think it ever really came up.

Political Advertising And Targeting

Let me see how to phrase this. At the time of creating it, to envision that it would be ever used for kind of political purposes or for political advertising or as a targeting tool for any host of people or manipulation of elections, what have you. I mean was there any…

No, not really.

Tell me.

So I was there during the last election, so I had some feeling of what it was like. There were political budgets. It wasn’t like nobody was spending money on it. That was the second Obama election, and if he had spent a lot of money on digital in 2008, the same was true for 2012, in some sense. But politics was – at the time obviously, I think now we’re in a different world – but at the time, politics was not that big a deal inside Facebook. There was a political sales team. I think the thought was that eventually politics would loom large at some point. But at the time, I recall, as a product manager, one of the roles you have is fielding product requests, usually through sales or just directly from the client saying, “Hey, it’d be really cool if we had this. We’d spend more money.” And you have to sort of say “yes” or “no” or “That’s cool, but right now we don’t have the time to do that” or whatever. You prioritize requests. Right?

And you know, the political advertisers often have unique requests because it’s a very unique advertising world that they live in. But I’d usually ignore almost all of them because their budgets just weren’t big enough to justify completely changing the road map for them. And you know what? The budget’s going to disappear in literally a few months. When the election’s over, that’s just gone for another four years probably. And so, no, it wasn’t a priority. There was another product called Facebook Exchange, which also was a retargeting tool, and there’s a lot of products that were shipped that were doing this sort of precision targeting.

There wasn’t a lot of thought given to the political side, because, again, it just wasn’t that big of a deal at the time. No one thought about that. The big goal was to move into spaces that Facebook originally had not done well in, which was e-commerce and other sort of direct response, like selling you something online. Again, if you don’t have good targeting, you just can’t do that. And so the goal is really to move into that space and do it well.

The Facebook Culture

…What do you think the motivation is [for Mark Zuckerberg] then?

With the proviso that I don’t claim to know the guy very well at all, I really do think he is ideologically driven. I think a more open and connected… Well, A, there’s his ego. He’s a total alpha male dominant guy who just wants to project this vision of the world and be in control of it. So there’s the ego side. But I do think that he does, or used to, think that Facebook is a force for good in the world, and that that really is what drives him. It’s not money, at least from the little that I know about his actual lifestyle. It’s way below the means of a multibillionaire. He doesn’t live some crazy splashy life, as far as I know. He’s been with the same woman for years. He doesn’t have that outsized hedonistic or whatever you want to call it type thing. He just is really focused on the vision and the mission. I really think that’s what drives him.

And were you an odd species or were you a kind of more endemic species in that you had come out of Wall Street and were on a trading desk there? I mean, were you a strange bird there or were you more typical?

I mean, I wasn’t really a Wall Street guy. I had spent three years after grad school on a trading desk at Goldman [Sachs] and had watched the whole world blow up with the whole credit crisis. And then I landed in tech after that for a couple reasons. You know, I think I was strange. I was slightly older. The average age of Facebook was really low, probably in the mid-20s or high-20s. And I was in my mid-30s, early-30s because I had spent time in grad school, Wall Street, whatever. I suppose I was fairly typical. You know the place did feel a little bit college-y, a little bit juvenile, people going around on skateboards and sleeping on couches and generally looking like s— basically, and just eating horrible junk food. Like that wasn’t really my thing. But I wasn’t, no, I don’t know if I was a rare duck. I mean I was a rare duck inside the Ads team, I think.

In terms of it being kind of a juvenile culture, is it not strange to think that a company that was growing so exponentially with such reach and such power over the flow of information in the world was run by a bunch of juvenile kids? I mean, seriously.

[Kids] sleeping on couches. Yeah. No, I mean, that’s how you get to that position. That’s the reality. By not knowing what you don’t know, and just applying your youthful enthusiasm to a really captivating vision with a lot of technical skill, and you know, never thinking that “Oh, no, this can’t happen. We can’t create the social intermediary layer for the world.” Yeah, sure, we can. Why not? No, I think it’s exactly what it takes.

But what about an understanding of history?

Yeah, right. So that’s the thing, I think. They don’t have the tragic sense of history and understand what happens when you allow someone to livestream video from their phone. I don’t think they quite thought that someone would instantly livestream a murder, right, which basically happened. Yeah, they just don’t understand that that’s kind of how the world really works. Yeah, there is this certain childish naiveté there, yeah, and not just in Facebook. I think Silicon Valley in general.

Not understanding that the real world works where there is a lot of bad actors that want to do bad things?

Right, but these are all 27-year-old males growing up in an affluent part of the United States. Of course, their horizons are limited.

Do you think that’s part of why things have gone so…

Probably. Yeah, probably. I think, yeah. I mean, human empathy. They have a lot of adult supervision now. There’s a lot of senior execs. Like this isn’t a bunch of kids running around anymore. That said, that culture is a little bit inescapable. Yeah, no, I think that’s definitely why. Yeah, what you’re getting at – what we’re seeing now is due to that, basically, yeah.

What’s a chaos monkey?

Yeah, a chaos monkey. So, a chaos monkey is a piece of software that was open-sourced by Netflix. And imagine a primate, like an ape of some sort, a monkey, running through a data center – these big, long air-conditioned data centers that run the internet world – running through, punching boxes, pulling on cables, just generally causing sheer havoc. So there’s an actual chaos monkey – a software version of that – and what it does is, it goes to a data center and literally starts killing random boxes. And the idea is you can test whether that data center can still stream “House of Cards” or whatever it is. It’s a very real, in-world way of basically trying to kill a system and seeing if it’s robust to critical failure.
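In code, the idea is as simple as it sounds: pick random boxes, kill them, and check whether the service still works. A toy simulation with invented names – the real Netflix Chaos Monkey terminates live cloud instances, not Python objects:

```python
# Toy chaos-monkey simulation: randomly "kill" servers in a cluster and
# check whether the service survives. Everything here is illustrative.

import random

class Cluster:
    def __init__(self, n_servers, replication=3):
        # Every server starts alive; the content lives on `replication`
        # randomly chosen servers (its replicas).
        self.alive = set(range(n_servers))
        self.replicas = set(random.sample(range(n_servers), replication))

    def kill_random_server(self):
        """The monkey punches one random live box."""
        victim = random.choice(sorted(self.alive))
        self.alive.discard(victim)
        return victim

    def can_stream(self):
        """The service survives if at least one replica is still alive."""
        return bool(self.replicas & self.alive)

random.seed(7)  # deterministic demo
cluster = Cluster(n_servers=10, replication=3)
for _ in range(4):  # the monkey kills four of ten boxes
    cluster.kill_random_server()
print("still streaming:", cluster.can_stream())
```

The point of the exercise is exactly what he describes: if the cluster stops streaming after a few random kills, the replication strategy was not robust enough, and you find that out in a controlled test rather than a real outage.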

I use it metaphorically in the book to say that entrepreneurs in Silicon Valley these days are sort of the chaos monkeys to society. Uber comes along and says, “You know what? We’re not going have taxis anymore. Forget that, taxi medallions aren’t worth anything, pull the plug.” Airbnb says, “You know, we’re not going to have hotels anymore. We’re just going to use all this, this underused asset called your spare bedroom and that’s going to be the biggest hotel company in the world.” Or Facebook comes and says, “You know what? I don’t know about this whole newspaper business. Let’s just all share it through a News Feed thing.”

They’re the sort of chaos monkeys that run through our society and knock over boxes, and in some sense, we as a society have to see if we’re robust to that monkey’s sort of depredations inside the server farm, effectively.

Is it a noble profession?

I don’t know. I mean, I think, yeah, it’s a stress test of society, which society needs occasionally. It’s how you make forward progress. But I also think, look, here’s the thing. Silicon Valley seems, you know, really BS-y and kind of full of it and really fake until, for example … I’ve lived in Barcelona and initially, I lived in a series of Airbnbs. And downtown Barcelona is basically being destroyed by Airbnb. All the apartments are basically getting turned into Airbnb hotels. It’s populated by drunk English tourists. Well, you can imagine it turning into a massive Airbnb resort. Right? It turns out these crazy kids in San Francisco, in their industrial lofts, wearing sandals, can actually have major real-world implications. They actually are a chaos monkey unleashed on the world. And again, I think that that can be a good thing. Like I use Airbnb. Who am I kidding, right? But I think the negative sides of it are often socialized and the government has to deal with it, while the gain and the upsides are often privatized to the shareholders and to themselves.

Facebook’s Response To The 2016 Election

Do you think that 2016 has been kind of the wake-up moment for that? Tell me what the revelation has been in 2016, about Facebook specifically.

You know, I think 2016 was definitely a wake-up moment for everybody. Everybody in the room got woken up. Zuck woke up, the company woke up, everyone’s like “Wow, this happened.” Yeah. No, I think it was a big wake-up call. I think Facebook really realized the real implications of what the “newspaper of you” actually means. I think society in general understood what it means to have everything we know about the world intermediated by one company that tends to serve you what you want to hear. I think politicians understand what the value of Facebook is now. Yeah, I think it’s a global wake-up moment around Facebook.

Facebook And Privacy

…You know, you mentioned earlier about privacy. Describe the sorts of interactions that you had internally where you’re coming up against privacy concerns. I mean, you describe in the book a few things, a few anecdotes about the lawyers and the privacy concerns, the guy in Ireland for instance. Right? Tell me what that was like, where you’re wanting to go whole hog on an idea and they’re like, well, not so fast.

Yeah. So privacy is a touchy thing. It means different things to different people in different countries. It’s rare for anyone to come out with a sort of privacy vision, like this is how the world should just work going forward. It’s not really how it works. Usually, companies try to get away with the thing. They sort of massage the marketing around it, they get away with it, like, “woo, yes” which is exactly what we did to ship a lot of these products. To do a lot of this retargeting stuff, Facebook’s Terms of Service had to change because the data policies we had didn’t allow us to actually do this.

And so it’s a particularly comical episode, because early on, before I had gotten there, Zuck in some public forum had said, “We’re never going to change the Terms of Service, and if we do, we’ll hold an election so that it goes past users and it’s actually a referendum,” which meant that we actually had to hold an election in order to change these things, which was comical. Yeah, and fortunately one of the stipulations in the election was that a certain fraction of users had to actually vote. And so obviously, getting 10 percent of close to a billion people to vote on anything at the time is very difficult. And so we didn’t meet the threshold. Everyone voted against the change, but the results were basically ignored. And we went ahead and changed the data use policy, and it was then legal to track you in all these various ways.

So let me get this straight. For tracking people, you needed to change the Terms of Service?

Right.

Mark Zuckerberg was uncomfortable with it.

Well, he wasn’t so deeply involved. A change of this size did have to be pitched to him. In fact, I, along with others, pitched a very early, embryonic version of what we were doing, which he did sign off on. And yeah, as I’ve written, you know that he objected to certain aspects of that plan. But by and large, he said yes to it. I’m not sure if he really understood the implications of it all, but [it] didn’t matter. We took the yes as a yes. And we went ahead in that direction. But yeah, he did have to sign off on it.

But why put it out for a vote?

Well, again, that was just to stay compliant with what he had said years ago, that if it ever changed, it would be subject to a vote. There was no real legal requirement to it, as far as I understood. It was just a really strange, awkward thing we had to do. It was almost a formality, really, because again we understood we would not get the required fraction of the user base – something like 10 percent – to vote on this one thing. Again, I think we calculated it – that’s almost the number of voters in an American presidential election, voting on a Facebook Terms of Service. It just wasn’t going to happen. So we weren’t too worried about the vote. But the privacy thing was a thing, you know, that had to be drafted and changed in critical ways to actually let us do it.

To some degree, is Facebook’s success and your success in creating targeting tools, is it contingent upon people abdicating more and more of their privacy?

Sort of. Again, as critical as I am about Facebook about certain things, the privacy thing is somehow a bee that’s never flown into my bonnet. I just don’t see what the big issue is with using not terribly personal data to show you an ad for a sweater and try to sell you a sweater, particularly when the alternative is showing you a generic ad that you certainly have no interest in whatsoever.

I think part of the problem is people just don’t understand that ads are what pay for the internet, at least the consumer internet of something like Facebook. And they’re going to be there, they’re never going to go away, and that’s just the end of it. The only question is: Do you have good ads or do you have bad ads? And so to me, it’s not really this big, burning concern. It’s on a government that obviously has laws to comply with and the Fourth Amendment, all the rest of it.

I know Europeans, for example, have a very different attitude on this. And I had to deal with what was called the Irish privacy audit of 2011, which was kind of a big deal. But I just don’t get that European concern. I mean, I understand where it comes from in terms of their history of fascism or government, whatever, but I mean it’s Macy’s trying to sell you a pair of shoes, man. I don’t know, I don’t see what the big deal is.

Well, isn’t it that the more data and the more private data that you give up, the more manipulable you are?

Well. But.

More targetable?

Here’s the other thing. The people who actually run these ads empires – and you’re seeing it now with the election, actually – don’t have quite as much faith in their own powers of persuasion as their critics do. Like, in the context of the election, for example, some of the numbers that came out – “Oh, this ad or this post was shown to 100 million people maybe once, maybe.” I look at that [and] I’m like, “Hmm, probably didn’t change anyone’s mind about anything.” Right? Everyone thinks, “Oh, my god, well, that’s it, they just turned you into a zombie. Now, that’s it – you’re just a bot in Zuckerberg’s army.” It’s like, advertising doesn’t really work that way. You have to see a lot of ads to be convinced to do anything. And with a small budget, like what the Russians had or whatever, you’re not going to convince anyone to do anything. And so I think ads people are a little bit less risk-averse around the ads thing, because they understand the limitations of ads, frankly. How do you manipulate someone with an ad? How do you get them to do a thing they don’t ultimately want to do? It’s very difficult to do, really.

Disinformation And Misinformation

But what about with misinformation or disinformation or propaganda?

So here’s the one thing where I think Facebook does have a problem. It’s not the ad side. The notion of the bigger story, which is we’ve given up on editorship and instead we have an algorithm. And Facebook is the owner of that algorithm and will feed you a truth that flatters your worldview. Humans have this thing called cognitive dissonance, which sounds very wonky but it’s very simple to understand. Which is, if I have a view of the world, I don’t like having it disproved. Humans undergo this really awkward discomfort whenever they confront facts that go against their worldview. Right? And Facebook, in a way, indulges that. Not intentionally, but indirectly, it actually does indulge that. And I think that’s gotten us into the pickle that we are today, in which fake news – as you want to call it – has become kind of the tribal folklore of whatever online tribe you consider yourself to be a member of. And I think Facebook definitely does amplify that.

And that to me is a big issue. The ad thing, I really think, is a red herring. Privacy – the reality is that most people actually don’t care about it, although they say they do. I mean, they care about it until you offer them a 30 percent coupon at Gap and then they’ll text you nude photos of themselves. They really don’t care. But the fake news thing, I think that really is almost an existential threat to society. And it’s threatening for two reasons: one, it’s dangerous in and of itself, and then two, the solutions to it are not obvious, at least to me or anyone I’ve talked to.

But aren’t you more susceptible to fake news and fake messaging and propaganda the more that the algorithm knows about you, the more data and privacy that it can crunch about you? It knows what you want, it knows what you’re looking for, it knows who you are friends with and where you go.

Potentially, yeah, potentially, that’s right.

So an abdication of privacy does leave you open to vulnerability?

Well, it depends what you mean by privacy. I thought when you meant privacy, you meant consumer data, like what did I buy at Best Buy, which, again, to me, I don’t think that would make you more susceptible to messaging. But if you mean location data, yeah, maybe. Yeah, I mean, privacy in the broad sense, I guess, yes, might leave you open to manipulation – not necessarily by Facebook, but by outside players who are basically drafting off Facebook’s ability to figure out what you really want to engage with even more than you do. Yeah, to some degree that might be right. Yeah.

So I mean, what’s the algorithm biased toward? What’s the bias involved here?

Right. So I should qualify this. I never worked on News Feed. I knew vaguely what it was about. Obviously, we had models inside ads that did very similar things, like: Which ad do we show you? So I’m not a News Feed expert by any stretch of the imagination. But I think it’s pretty common knowledge that what gets you shown a thing is a few things. Your relationship to the person posting it. Obviously, if it’s someone close to you, you are more likely to see it. If historically, you’ve clicked or Liked things like that before, then that’s part of the thing. What Facebook politely calls engagement and some might call clickbait or Likebait or comment-bait is definitely a part of the algorithm. And that’s no secret, that’s always been true.
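The factors he lists – closeness to the poster, your history with similar content, and the post’s raw engagement – can be sketched as a toy scoring function. The weights and feature names here are invented for illustration and are not Facebook’s actual model:

```python
# Toy News Feed ranking along the lines described: affinity to the
# poster, your history with that kind of content, and raw engagement
# signals on the post itself. All weights and features are invented.

def feed_score(post, viewer):
    affinity = viewer["affinity"].get(post["author"], 0.0)        # closeness to poster
    topic_history = viewer["topic_clicks"].get(post["topic"], 0)  # past clicks/Likes
    engagement = post["likes"] + 2 * post["comments"]             # "bait" rewarded
    return 3.0 * affinity + 0.5 * topic_history + 0.01 * engagement

viewer = {
    "affinity": {"close_friend": 1.0, "acquaintance": 0.1},
    "topic_clicks": {"celebrity": 9, "hard_news": 1},
}
posts = [
    {"author": "acquaintance", "topic": "celebrity", "likes": 900, "comments": 300},
    {"author": "close_friend", "topic": "hard_news", "likes": 3, "comments": 1},
]
ranked = sorted(posts, key=lambda p: feed_score(p, viewer), reverse=True)
print([p["topic"] for p in ranked])  # ['celebrity', 'hard_news']
```

Even in this crude form, the dynamic he criticizes is visible: heavily engaged-with clickbait from an acquaintance can outrank a close friend’s low-engagement post, with no term anywhere for truth or quality.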

And was quality a factor in that?

It is. I mean, quality is a factor both in ads and on the Feed side. Again, if you go to any post – whether it’s an ad or a post – right-hand corner, drop-down, it says, “Oh, I don’t want to see this,” for whatever reason. That feedback is actually used – definitely in the ad system – to potentially not show you an ad going forward.

But that’s if it’s something that feels irrelevant to you maybe, as opposed to something that feels completely relevant, but potentially untrue.

Right, right, that’s right. That’s the thing. The person who would down-vote, effectively, that piece of content is a person who thinks it’s false, who by definition is not the person looking at it because Facebook would shield you from that reality. That’s right. That’s right. And so in some sense, that feedback system is kind of fundamentally flawed, in a way. Yeah.

So you talk about, in one of your articles, the algorithmic pass.

Oh that’s great, yeah. I’m glad you resurrected that term, yes.

The Algorithmic Pass

So tell me, what’s the algorithmic pass?

Right. So I coined this thing. But for the past – I don’t know, as long as I’ve been paying attention to technology, 20 years or so, say since the early aughts – every company that’s been a consumer internet company that filters content for you has given itself what I call the algorithmic pass. And what that means is, they claim no responsibilities of editorship over what you see, but rather say, “No, we just optimize for some particular metric” – clicks in the case of Google, engagement in the case of Facebook – and, “We just show you what you want to see. We don’t make any moral judgments. We are not arbiters of truth or value or quality or the American way or anything. We just show you what you tend to engage with.” And that’s called the algorithmic pass. And whenever anyone points a finger at them and says, “Oh, no, you have some responsibility there,” they cry, “Oh, wait, wait, wait, we’re not a media company, we’re a tech company, we’re just middlemen. We don’t understand how editorship even works.” That’s the algorithmic pass that companies have had for, I don’t know, 20 years or whatever it is.

They say it’s the math that’s the problem. They say it’s the math.

Right. “It’s the math showing you a thing. It has nothing to do with us. We’re showing a thing.” I mean, at the end of the day, it’s original sin. You’re the one who’s flawed. You like this sort of content, right? And I think that’s kind of what blew up in 2016. The algorithmic pass, I think, is getting revoked by society, by and large, and they’re expecting Facebook and other companies to actually embrace some responsibility for what they show you there.

So what blew up in 2016 then?

So I think the algorithmic pass is what blew up in 2016, that people pointed to Facebook and said, “No, that’s it. You’ve got to take some responsibility about what we’re reading here.” And I should mention, that’s a job that Facebook doesn’t want. Mark Zuckerberg is now the sort of front-page editor of the world’s news. I don’t think that’s a job they really embrace. But again, I think that the world is pointing their fingers at them and saying, “That kind of is your job now.”

Is it a job you think they can do?

No, not really. Exhibit one in this strong, strong affirmation: I guess about a year ago now, they had a product called Trending Topics that most people probably know as the topics you see on the right-hand side of Facebook, which is more or less copied from Twitter. And that product was having real problems. At the time, it would surface clickbait-y content like, I don’t know, Kim Kardashian’s most recent semi-nude photo, whatever ridiculous piece of news is going on. Like, literally a major piece of news could be going on [and] instead you’d be staring at nonsense. Major problem.

Facebook decided in an unusual move, in my opinion, to hire journalists to actually curate it. So they hired relatively young interns-slash-contractor types and a small team, I believe in New York, to basically hand curate that news, to improve it, because obviously, you needed a human eye for it. And it blew up in their faces. The sort of playbooks they were handed about how to filter news, what were acceptable sources, leaked. Because, of course, it’s a bunch of journalists, young ambitious journalists, of course, they leaked it to other journalists.

They weren’t treated very well, supposedly. And so a lot of them quit or [were] fired or started leaking to journalists, as it turned out. They couldn’t even do what The Sacramento Bee does every summer, which is take in 10 or 12 journalists and kind of treat them OK and then give them a small job to do. They couldn’t even handle that. So to think that they’re suddenly going to become the editorial staff for the entire world internet, it’s a little bit hard to believe. And again, it is so foreign to their DNA. I mean, when they say, “We are a tech company. We’re not a media company,” it’s not just BS. It’s kind of true. The engineers run that company. They don’t understand media and content. It’s just a very different vibe. So, no, I don’t think they can actually do that job.

When did Facebook become the primary news platform? When did that happen? Do you know?

I don’t know. It happened after I left. I don’t know, to be honest.

OK.

I don’t even know how you’d measure that. It’s a function, actually, of when the news platforms declare it. When their amount of inbound traffic from Facebook crosses 50 percent, then I suspect you could say, “OK, they won.” But yeah, I don’t know when that point was, to be honest.

Was there a sense inside the company – I mean, I know you were on the ad side not on the News Feed side and it was before News Feed became basically a News Feed – but was there always this sense that they never wanted to take responsibility for what happened on the platform?

Well, I don’t know if that’s totally true. I don’t think that’s totally fair, either.

OK.

I mean they do take responsibility. They do filter the detritus of the internet, porn and bad behavior. Little known fact – they actually police users [for] criminal behavior. Try to sell drugs or guns or try to harass a minor, you might be getting paid a visit by the police and with active collaboration from Facebook. So I don’t think it’s a complete sociopathic indifference to the implications of it. They’re very loyal to the user, and the immediate concerns of the user. I think the bigger picture of “What is this doing to society for the next 20 years?” is maybe where they’re a little bit weaker. And they don’t see it coming.

Facebook’s Responsibility

What are they doing to society? Explain that to me. What’s the bigger issue that they can’t police?

Well, I mean, what obsesses me, obviously, is the filter bubble – this business of algorithmic cognitive dissonance at scale; you never actually encounter a fact about the world that you find displeasing. We often say that in a traditional democracy, every person has a right to an opinion. But on Facebook, you have a right to a reality, your entire own reality. And so I think that aspect of it is something they never quite understood.

I mean it’s strange, though, because in 2011, The Filter Bubble comes out, Eli Pariser’s book. How does that land inside the company?

I never heard it quoted – well, maybe once in passing. But it wasn’t some sort of deep, obsessional thing. Part of Facebook, at least at the time – [it] may not be the case now – was pretty impervious to outside influence. Again, when you’re inside a cult, you don’t exactly read outside books very often and have it change your internal doctrine. You don’t, right? So I think there was very little in the way of questioning the fundamental values of the company or the fundamental goals of the company. I don’t think I ever saw it, really.

But were you aware? I mean, you’re a well-read guy, you read a lot. Did you read The Filter Bubble at the time?

No, I didn’t. No, I didn’t actually. I don’t read business books a lot. I don’t read tech books either.

But this notion that you’d been indoctrinated with, this notion that you’re part of creating “The New York Times of You” – the mission is to create people’s filter bubbles.

In a way.

In a way. And really, people are impervious to any outside critique of that?

Again, in light of what’s happened recently, this seems like an incredible claim – like, how could you be so stupid? But circa 2011, yeah. None of these things had happened. Like I often say, if you had told me in 2012 – when I was [product manager] for what’s called ads quality, the team that actually polices ads – if you had told me that “Oh, by the way, in a few years, Russian agents in the employ of the Kremlin are going to use the ads system to undermine the American democratic experiment,” I would have laughed in your face. I mean, that didn’t occur to anybody. And so again, in hindsight, here we are in 2018, that’s the world we live in. But at the time, no, I don’t think that was really on the radar screen.

So no one was ever talking about the idea of the weaponization of the ad platform?

I mean, we would have been lucky if it had been weaponized. It was so useless at the time. We couldn’t convince anyone to do anything with it. How would you weaponize it? Targeting people who would like a BMW? I mean, it was horrible; the CPMs were terrible. No, there was no weaponization. Again, the platform just wasn’t that strong. There wasn’t that much you could do with it. We would be lucky to get click rates above 0.1 percent. That’s bad even by internet standards.

So you leave, but then the company actually ends up using that to some degree, not exactly your product, but you were part of a movement to build what was eventually Custom Audiences and Lookalike Audiences. Explain what happened in terms of actually building the potent tool.

Right. So again, this all came out of a vision that many people shared. It wasn’t just me. A lot of people cooked up this vision about joining to the outside world by various means. I had my baby called Facebook Exchange, which was this very real-time, very quick way of doing it; there was a thing called Custom Audiences; and another product, which I didn’t work on at all, called Lookalike Audiences, that’s also very powerful and used now.

And they kept building an edifice, they kept on making it better, which Facebook was very good at – iterating, and making it better because its engineering is very skilled and can ship code very quickly. And so they just made it bigger, more robust, easier to upload a spreadsheet with a million rows or whatever, which at the time would have been hard to do. It’s odd in a way because they had other ambitious products and some have done very well. Mobile News Feed ads did very well. That’s what saved the company actually.

And I don’t mean this in a way to ding the company. But you know, the vision that’s been realized is more or less what everyone had five years ago. This is exactly it. Right? Yeah, that’s right, this was the vision. Outside advertisers are supposed to effectively upload or join their very particular first-party data at scale quickly, reliably, and target people inside Facebook. That’s exactly the vision that’s been realized. It hasn’t really changed that much from when it was being cooked up in 2012.

Meaning what, though? Unpack that for me. That means what?

That Facebook is the one ring that rules them all. You show up with a mailing address from your direct mail campaign or an email or a phone number or a name or a cookie, a visit on your website, and Facebook will find that person, always, everywhere, on every device they touch. Facebook is the one unique ID for that person online for your ads experience. That was the vision and that’s what’s been realized.

And that’s what’s also been hijacked or perverted in some way? I’m serious.

Yeah, I mean, capitalism is an inherently immoral enterprise. At the time I didn’t really have value judgments about it. It was just a way to get the CPMs up and survive the IPO process.

But now seeing what that’s become, and that’s what you’ve been writing about most recently, it’s kind of like, “Look what we actually created here.” Right? And I mean, you are a student of history. I know you do read that. So what was it that you were actually creating? If you look at it with the benefit of hindsight and you put it into kind of a historical context, what was it?

Yeah, well, what did we create? To a certain degree it’s “1984” with iPhones. I mean, yeah, sure, I guess that’s one way of looking at it. But again, you’re not getting 100 grams of bad chocolate from Big Brother. You’re getting an offer to buy whatever.

To me, it’s funny having been inside it and thought about it. It’s often difficult to pull away from it and see the big, menacing ogre that many people see it as. To me, ultimately, it’s just a way of joining data across disparate data sources, although obviously, it’s very powerful. I mean, yeah, it’s the commercial side of personalized media. It’s the newspaper of you but it’s the ad of you. That’s the thing. Like, it’ll flatter your every worldview, your every taste, your every appetite in a way, and in this case, it just so happens, for money which is what the point of an ad is. Yeah, that’s what, I guess, it created in a way.

Facebook And Political Campaigns

And then kind of apply it to the political system.

Right. So yeah, it’s funny, politics. Yeah, this is where the rubber really hits the road, because politics is a whole different beast. It’s no longer selling you stuff. Yeah, it’s problematic. What was Jefferson’s famous quote? That he’d rather live in a country with newspapers but no government than a government with no newspapers, right? Well, we might not have newspapers. We’re going to have Facebook, is what it is, because most of the newspapers are dying. Yeah, do you want a government with Facebook? Does it still serve as a newspaper? I’m not sure. It’s a good question.

Again, I really do think one of the problems with Facebook is that it erodes a democracy’s ability to do anything in a democratic sort of way. You and I have to agree on certain ground truths and facts and values before we agree to collectively – particularly if we disagree about something – come up with a compromise and do a thing. And I think Facebook erodes the ability to do that across ideological barriers. I mean the reality is the fault lines of these online tribes that we’ve sort of Balkanized ourselves into run underneath the historical borders of the governments that we’re presumably sort of collaborating to manage. Right.

And the reality is that some American “alt-right” kid might have more in common with a Brexiteer than either does with someone right there in their community and not [on] Facebook. But the internet made that link possible. Right? And they both feed on each other’s media. They have very different worldviews. And so, yeah, I think the edifying job, or the educational job, of newspapers – which is to inform you and say, “Look, eat your veggies. Read this thing. I know you don’t want it. I know you want to just check out the Kardashian feed, but read this thing instead” – that’s gone. It’s like, nope, Kardashian all day, that’s all you want. So yeah. And again, I don’t think Facebook created this. And if Zuck dies tomorrow, Facebook keels over, there’ll be somebody else doing it. An optimized feed of news and content has now become a sort of default utility that we expect.

And also one that seems, both on the ad side and the content side, but that seems that there’s a structural advantage for something incendiary, isn’t there? Explain that to me, how there’s advantages to certain types of material or certain types of ads. It’s one of the things that you were talking on the phone about. Explain that.

So without getting too wonky about it, everything on Facebook – whether it be stuff in feed, which is, as we call it, organic, unpaid, or whether it be inside the ad system – for various reasons, depends on what’s called engagement. And engagement is a very broad and frankly euphemistic term to mean clicking on a thing or Liking a thing or commenting on a thing, or if it’s a video, playing the video. It’s engaging with that content, rather than passively scrolling, usually for some marketing goal. If you click through, they want you to go to a website and buy the thing or whatever it might be. And that is also tracked, by the way.

So engagement drives all of this. And what it means in the context of the auction – although as data increases, we’ll see, but in general it’s broadly true – [is] that if an ad is more engaging because it’s perceived to be higher quality, you, the user, are effectively voting for what you want to see every time you engage with a piece of content. Facebook says, “Well, effectively for advertisers that are providing engaging content, we will effectively charge less for that ad.” Because if you think about it, an ad is burdening a user’s time. If you don’t burden the user’s time or at least engage with them, we’ll charge you less. But if your ad kind of sucks, then we’re going to charge you more.

So if you think about it – and I know in the context of our heated political moment, everything seems very morally clear depending on what side of the discussion you are on – in general, it is the case that you want ads that are more engaging, and you should be giving your advertisers a discount if they come up with good ads versus bad ones. That’s the goal of it. And again, to be clear, Facebook didn’t invent this. Google’s auction works the exact same way. Most auctions that are not bid at the ad impression level probably have some version of this. So this is how engagement of various types is defined on Facebook.
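[The pricing mechanic described above – engaging ads effectively paying less – can be sketched as a toy engagement-weighted auction. This is a simplified illustration of the general idea, not Facebook’s or Google’s actual formula; every name and number below is invented.]

```python
# Toy sketch of an engagement-weighted ad auction: ads are ranked by
# bid x predicted engagement, and the winner pays only what's needed
# to beat the runner-up (a generalized second-price idea).

def auction(ads):
    """Return (winner name, price paid). Higher predicted engagement
    means a lower effective price for the same slot."""
    ranked = sorted(ads, key=lambda a: a["bid"] * a["engagement"], reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    # The more engaging the winner's ad, the less it pays to hold its rank.
    price = runner_up["bid"] * runner_up["engagement"] / winner["engagement"]
    return winner["name"], round(price, 2)

ads = [
    {"name": "engaging_ad", "bid": 1.00, "engagement": 0.04},
    {"name": "dull_ad",     "bid": 2.00, "engagement": 0.01},
]
print(auction(ads))  # the engaging ad wins and pays less than its $1.00 bid
```

[Here the lower bidder wins because its predicted engagement is four times higher, and it is charged a discount for it – the “good ads cost less” dynamic described above.]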

Of course, the company can change its definition of engagement. Right? It can decide that, you know what, Likes beyond a certain number aren’t going to count – just to invent an example. I’m not saying they’re doing that. But they can try to keep extremist content from basically gaming their engagement metric. Facebook deals with this all the time. Advertisers and content marketers try to game Facebook algorithms all the time. And so rather than have a race to the bottom of the most salacious and outrageous content, Facebook tries to counter it and nip it in the bud in various ways. One is user feedback, etc., etc.

But again, just because Facebook says “engagement is our metric” doesn’t mean they’re a slave to every type of engagement that exists in this world. They have very smart people. The optimization teams on both the Feed side and the ads side are full of brilliant PhDs, and they understand the problem. It’s not like they don’t get it. The question, as with everything at Facebook, is: How do you do it at scale? How do you solve this problem for, whatever it is, 2.4 billion users, hundreds of billions of times a day? That’s the hard part. I mean, it’s very easy to find one bad example as a Facebook user, and this is how all these bad screenshots and bad stories about Facebook get out.

One specific photo got through their filters, or Facebook banned one actually noble group that it shouldn’t have, but it did. There’s always going to be false positives and false negatives in any filter classification algorithm. So I think we do need to cut Facebook a little bit [of] slack, given the scale that they’re running at, which I think most people don’t understand.

But I mean in terms of the advantage …Is there an advantage given to incendiary content and ads?

It’s hard to say. I mean, there’s an advantage to posts that are engaging. It can be positive, too. Just to cite a counterexample, Obama ran on a campaign of hope and optimism, and that was also very engaging. He won by a large margin. Clearly, his marketing was pretty good. So I don’t know that it always has to be cast in the most negative light. Positive content also sells. I think it’s harder to produce. You have to be smarter about it. The positive is always harder to produce than the negative. That doesn’t mean the negative always has to win.

In playing to kind of people’s fears, I’m just wondering, one of the critiques right now that’s out there is that the algorithm kind of favors messaging of fear and anger and that it kind of plays into those base emotions because the computer knows that those are things that you’ll engage with.

In a fearful constituency, yeah, that’s true – which maybe is what we’re in right now. Yeah, among a certain constituency, that might be right.

There were a couple of lines that I really thought were interesting. One, you described sort of Orwellian tools, they were kind of Orwellian and diabolical tools, right? I mean those were two words that …Yeah, so explain that to me.

And again, I don’t want to give the impression that I developed these tools, that I created Lookalikes. I did not create Lookalikes at all. The only exposure I ever had to it was its first pitch meeting, by the guy who became the PM. I was in the meeting, but that was it. I had nothing to do with that product whatsoever.

But products that were created by the company that you worked for and helped to develop their ad platform …Yes, I mean, tell me what those things are and what’s kind of concerning or Orwellian about them.

Right. So the product here in question – which I didn’t work on directly but is a very powerful product and I think it’s one of the least known, but actually most powerful ads products – is called Lookalike Audiences. And as the name implies, it flags people who look like a certain thing. Here’s a little glimpse inside the ad tech trenches, okay? When you’re in a client meeting with a sophisticated client who has their data ducks in a row, they know what they’re doing, OK. “OK, we’ve got this million people, 100,000 people who spent more than $1,000 last year on our site or wherever. They’re great. We know we can reach them through you. We know that. What I want you to find me is 3 million people that are just like them. That’s the value you can bring to the table. Find me people like this, right, Facebook.”

And so how it works is, you upload a Custom Audience – the basic unit of very precise targeting – and you click Lookalikes in the UI [user interface], and Facebook goes out using a witch’s brew that I sort of vaguely knew the workings of at the time – now who knows what they use. But basically, one of the funny things is, I mentioned how a lot of Facebook data is actually not directly monetizable. But it is indirectly, in the sense that all that interaction with people – which in and of itself is not really commercial in nature – helps establish who is like other people.

The fact is Facebook, taking one atomic individual inside the social network, could figure out other people inside either that person’s social network or just in general who are very similar to that person because they’ve Liked the same things, they’ve interacted with the same content, whatever it might be. Certain observed behaviors mean that person A is very similar to person B. And so if person A has done some interesting thing that puts him in a targeting group, like bought a certain car or bought a certain thing online, whatever, it will show the ad to person B. And maybe we’re off a little bit but it’s certainly better than random chance.

It turns out that approach actually works really well, and advertisers love it, because, again, they give Facebook a million people or whatever it is and they get back 5 million people that are kind of like that 1 million. And so they expand that area of concern. That’s one of the tactics, and at least from public statements, the Trump campaign used it pretty heavily. You know, the metaphor, which is maybe a little bit creepy, in the piece is: imagine a petri dish. You’re one little element, one little square inside this petri dish. Custom Audiences allows an outside advertiser to put a little bacterium inside that nutrient-rich petri dish that you’re in. And then you might share it organically, because you engaged with it in some way, and that’ll share it [in the] little squares around you.

But then Lookalikes goes and expands it to the limits of the petri dish, to everyone who’s in that petri dish with you and is like you – people that maybe you don’t even know, whom the advertiser doesn’t even have data on. So that’s the key thing, right? You target and you expand.
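[The expansion described above – seed with a Custom Audience, then find similar users – can be sketched as a toy similarity search. This is an illustration only; the users, “Like” vectors, and the 0.9 threshold are all invented, and the real models are far more sophisticated.]

```python
# Toy sketch of the "Lookalike" idea: given a seed audience, find other
# users whose interaction vectors are most similar. All data is invented.
import math

def cosine(a, b):
    """Cosine similarity between two interaction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def lookalikes(seed, everyone, threshold=0.9):
    """Return non-seed users similar to at least one seed user."""
    out = []
    for name, vec in everyone.items():
        if name in seed:
            continue
        if any(cosine(vec, everyone[s]) >= threshold for s in seed):
            out.append(name)
    return out

# Each vector: toy counts of interactions with a few content categories.
users = {
    "alice":   [5, 0, 3, 1],   # seed user uploaded by the advertiser
    "bob":     [4, 0, 3, 1],   # behaves a lot like alice -> lookalike
    "charlie": [0, 7, 0, 2],   # very different -> excluded
}
print(lookalikes({"alice"}, users))
```

[The advertiser only supplied “alice”; the similarity search surfaces “bob”, whom the advertiser has no data on – the target-and-expand pattern described above.]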

…And so that was the approach evidently that the Trump campaign took. And I want to be clear, like everything else in my piece, right, it comes as news now because I think that people are starting to understand what it was that Facebook created and developed over the course of the past five years. But this has been out for years. And it is the case that this is basically common advertising practice in most of the commercial world. It was just Trump that evidently used it for the first time so effectively. That’s the key thing to understand, is it’s been around for years. My book came out almost two years ago warning about this. Sorry, I shouldn’t mention the book. But you know, that’s the thing. None of this is novel. It’s only now that it’s come to people’s attention, I think.

And do you have concerns about a technology or do you have concerns about a technological tool like that?

Inside the political space, I can see why it would be concerning. Outside of politics, not really. You know, commerce.

Why is it concerning inside the political space?

Privacy in the U.S. is usually a consumer rights thing. You can always opt out of receiving things, by and large. Or you can opt out of buying a thing. You can’t opt out of being a citizen in a country.

In advertising, they refer to things as verticals. That’s the trendy term. The automobile vertical, e-commerce – these broad categories of commercial activity that sales teams go after and are paid to address. Well, politics is its own special vertical, because politics has people with guns that can kick in that door and drag you away, legally, right, because you did something that broke the law that was set up by the legislators or the government that you elected into power. Politics is a fundamentally different thing. Right? So convincing people to buy a candidate, effectively by voting for them, is very different than getting them to buy a sweater. And so although I’m Mr. Pro-use-the-data, whatever, I’m totally on board with [Facebook] having special rules apply to political advertising, and I think going forward that’s absolutely going to have to be the case.

The Facebook IPO

…We talked about the IPO, but do you think that the IPO changed the priorities of the company?

Oh, definitely.

How so?

Monetization was a much bigger deal. I mean ads was a necessary evil, almost something forgotten before the IPO.

…So I’ve always said that geography is kind of important at Facebook, and how close you were to Zuck and his so-called Aquarium, his conference room, was always kind of important. And so Ads wasn’t quite in the ghetto, but almost. It was in the sort of back part of the engineering building, poorly lit with dirty floors. At least it was there. It wasn’t consigned to where legal was, which was another building. But yeah, it never really mattered.

Come the IPO, things change. You start getting ads in Feed eventually. Feed is a core part of the experience. I mean, that is the most valuable real estate on the internet. Right? And having commercial content there a year before that would have been inconceivable. All this data hackery that I’ve been talking about would have been inconceivable before the pressures of the IPO. But if you had told me in late 2011 that very soon I would be building a real-time ad exchange that would link all the advertisers of the world within 100 milliseconds to every Facebook user experience, I would have said you’re crazy. But true enough, that’s exactly what happened. And so yeah, the priorities of the company very much changed.

I should stress the company is still not super focused on the bottom line – like Zuck said, and no one believed him at the time, but I still think it’s true. When he filed the IPO documents, he said, “We don’t build services to make money. We make money to build services.” And I think that’s still true at Facebook. Ads is still a second-class citizen. It’s not the focus of the company at all. But that said, by the time I left, and I’m sure it’s the case now, things had changed: [on] the floor closest to Zuck, there were Ads engineers, and the Feed ads thing was a major part of the Facebook experience.

Once ads started to appear in News Feed, do you think that users were able to tell the difference between an ad and organic content?

Probably not, but it’s not as bad as it sounds. So this is deep Facebook history that you probably don’t care about. But ads in Feed had always been deeply tied to Facebook pages. You follow a page, and that’s kind of why the ad looks like a post. Originally, you’d have an organic post and you could, quote, unquote, boost it by paying for it, but most of your distribution would still be organic. As it is now, to get seen in Feed – again, you don’t see everything that you would see in Feed; Facebook chooses things for you – you basically have to pay to make that happen. And so that’s why the ads look like other page posts, because that’s kind of how they started. And again, that’s not necessarily a bad thing. Like, compare the mobile experience: one of these pop-up ads that appears on your mobile app – an interstitial is the polite term – looks horrible. Right? Well, a News Feed post actually is pretty good. And that’s part of the reason why Facebook won on mobile, because the ad creative is better. But yeah, again, it’s mostly just to not upset the user with this sort of jarring ads experience.

Right. But isn’t there something concerning about the idea that you’re scrolling through a News Feed and a user …I mean if you’re actually serving your community, which ostensibly is the mission of…

Well, Facebook is not a nonprofit. Let’s not go too far.

Well, I mean, the way that Zuckerberg talks sometimes, it sounds like the user’s primary. Right?

That’s true. Inside the company, that’s definitely true, yeah.

So if the user’s interests are primary, then isn’t it in the user’s interests for his or her health as a human being and as a member of the Facebook community to know the difference between native content, a piece of news, and an advertisement or something that’s false or fake?

Well, there’s a tiny little “sponsored” there written in gray text that you can barely discern from the white background. And so legally, you have to set off ads from regular content. I don’t know that the smoking gun in the Facebook picture is the fact that ads look like conventional content, because usually, just by the nature of the content, you can tell when you’re looking at an ad. It can be a Vice article that you would normally read that they happen to boost and so it’s an ad, but you probably would have seen it anyhow. Or there’s a picture of a product in it and I think most humans are smart enough to realize, no, that’s an ad.

I think the real big issue with Facebook is not ad versus organic; it’s organic versus organic, or fake news versus real organic, if you want to look at it that way. When something is clearly a post – like [it] doesn’t seem like anyone’s paying for it but clearly it’s serving some ulterior motive, like the Russia situation – that’s where I think it gets a little dangerous, and that’s where the human brain doesn’t distinguish between an ad for shoes and news. That’s the real problem.

The Facebook Culture

…I’m just kind of curious culturally about this wartime footing and what that indicates about the internal ethos of a place and competitive nature of it. And also, what’s helped me to think about it is the Eli Pariser thing. Right? People aren’t necessarily going to be interested in what outside critique is going to be when you’re so focused on survival or competition. So draw it out for me.

Yeah. That was my first impression of Facebook. I mean, again, to set the context, I was selling a small, little, nothing startup you’d never have heard of to both Twitter and Facebook. So I interviewed at both places, because they interview you as part of the acquisition process, and I got the measure of the companies. And from the earliest days, it was clear that Facebook seemed like this army on the march led by this general that we all know, with his very clear, gargantuan vision that they actually wanted to realize. And that’s what they wanted to achieve, and that was the attitude they had, that “We’re all in it together.” But there’s this outside force, and yeah, it kind of felt like a war footing. I mean, in its own childlike way – this is people in flip-flops, wearing little fleeces, with tousled hair, and food-stained carpets and the rest of it. It wasn’t quite that Roman, really, in looks, but in attitude it definitely was.

At Facebook, this whole “Move Fast and Break Things” – people actually internalized that. When they’d finish a comment, usually on Facebook itself, driving forward some product change or something, they’d literally close it with “Move Fast and Break Things.” People would actually recite it. Not commonly – we weren’t sitting there just reciting mantras all day. I don’t want to give that impression. But people would actually kind of take that seriously, like, yeah, “What that poster says is how we do things here. That’s for real.”

Again, I was sucked in as much as anybody. I was as much “Move Fast and Break Things” as the next guy. And it was very real, and that was their attitude. And I think you lose a lot of subtlety in that. You don’t pause to think what you’re breaking, necessarily. I guess at this point, maybe they should. I mean, yeah, it’s weird. They have this weird, juvenile, sort of embryonic feel to them even though they’re the biggest damn Goliath in the world. So for example, I don’t know if this is relevant, but Facebook doesn’t have to comply with [the] Federal Election Campaign Act, the FECA law or whatever.

2011.

Yeah, because they got an exemption in 2011 claiming, “Oh, well, we’re just this little startup.” And so they’ve had that exemption since then. It’s insane. How on Earth do you give an exemption to literally a third of the internet – I don’t know what it is – half or a third of all internet advertising dollars. It’s insane for them to think that they’re still a startup and therefore subject to exemption. Right? But they kind of do. They just don’t understand. It’s a weird duality, that on the one hand they’re arrogant as hell and just command and expect total market superiority, and even people that are partners are basically vassals, and on the other hand, a feeling, this weird paranoia and this weird sort of will to power of becoming something else. It’s a strange thing to keep those two states in your head. But somehow the company, certainly when I was there and even now, still seems to maintain that.

Facebook’s Response To The 2016 Election

Bring me through the story of the election.

Okay. You know, November 9th, 2016 rolled around. We all woke up from our respective hangovers and here we are. It hadn’t caught … Well, you don’t care about that so much, whether it caught me by surprise or not. Trump winning didn’t catch me by surprise. But, well, a little bit. I had predicted it though. I actually had a post saying…

I know, we talked about it. I think Trump is for real. I’m getting the “On This Day” things through the election, because it was a year ago. And it’s like, “Yes, I called it. Oh, yeah.”

…So Election Day rolls around, 2016, and a few days after, Zuck issues one of these personal statements as executives there do – Sheryl does the same – from his personal account, saying, I really don’t see how fake news or Facebook – I think he even used the word “ridiculous” – it’s ridiculous to suppose that Facebook actually influenced the election. And my jaw almost literally dropped. It was the most astonishing thing I think I had ever read Zuck say, precisely because up until the moment before Election Day, an enormous team of political salespeople was literally going out and telling every politician who had an ads budget, “By the way, Facebook can deliver you the election.” Like, literally saying the exact opposite of what he said, and to me it seems, speaking of ridiculous, a patently ridiculous statement to think that Facebook couldn’t sway the election.

Again, when I was there and I was there during the previous election, politics didn’t loom as a huge priority. But I think Facebook always understood that, if required, it could influence the election, for sure. And we used to joke about it in ways that I probably wouldn’t repeat here. But …

Why not? Repeat it.

Well, because there were jokes I said, in my way so I don’t want to say, “Oh, well, Facebook was saying this joke,” when it was mostly me saying it, but…

What was the joke?

Well, you know how they have that Get Out the Vote thing to remind you of an election? I’m like “Huh, so instead of selling ads and going through all this brouhaha, how about we just selectively show that to a few swing counties in a few swing states and then just like auction off the entire election? How about we just do that? Would that just be easier?” And so one day, I just came up with how about we just selectively show this reminder banner and then just swing the election that way? Anyhow, so it was a joke I would say, because it was always understood that if you ever got that wrong – like, Facebook was very careful, because if it did get it wrong, and for some reason it only showed up in Florida but not somewhere else, it’d be a major issue.

Your joke actually had some truth to it, in that you knew, inside that if you chose to target some people and tell them to go out and vote, you probably could have an appreciable impact on an election.

Right, particularly the last one. Yeah, that’s right. That’s right. When I saw Zuck say what he said, I was just gobsmacked. And sure enough, a few days later – I can’t recall the exact dates – he came out with a much more circumspect and diplomatic statement. Obviously, someone had intervened, and he came out saying – and this was also shocking for different reasons actually – he said, “Okay, look, we’ve heard you. And we’re going to take several steps to actually talk about this fake news thing.” And one of the things they mentioned – and this I also find astonishing, it was an astonishing week – was him saying effectively, “We’re going to take a role in determining whether something is fake news or not to some degree – either work with third-party sources or do it ourselves. Become an arbiter of truth.”

And that was astonishing to me, because forever Facebook had had this “algorithmic pass” that said, “Well, the feed algo just shows you stuff. We don’t take responsibility for what it shows you.” And this was Zuck literally [doing a] 180, saying, “No way we could have influenced the election,” to “Okay, we did, and by the way we’re also now editing.” That was huge to me. I thought, something has really changed inside Facebook, something has really changed in a big way for Facebook to even hint at that. And so as time went on, I think the company has really been seized by a call to action by this. …

…In that 180 that he made, what do you see the fundamental realization being? Look at what we’ve created here? Look at what we’ve actually…

Yeah, I think, again, you’re asking me to channel a guy that I don’t really know very well at all.

But you can interpret what he said.

Yeah, right. Look, here’s my impression on the whole Zuck post-election thing. He’s a visionary guy. I call him a genius in my writing. And I do think he is that. I think 2016 threw that company for a huge loop that they’re still reeling from and getting over not quite understanding. And I think what freaks out Zuck is that he created this thing, and they realized their vision. Going as far back as when I joined in 2011, they had the whole notion of “your only news source,” all that stuff. None of this has been a surprise. But the implications, the outcome of that has been … the effects of the cause they created has been a big surprise.

And I think Zuck basically doesn’t know what the hell hit him. And so, waking up from that blow and saying, “OK, I need to go visit all the 50 states in the union.” That’s not a fake thing. He’s not running for president. I think he really, legitimately, just wanted to know what happened. Like what is some dairy farmer in Wisconsin [thinking]? He had the famous photo of him feeding the cow or whatever. Obviously, they milked it for marketing reasons. That’s not why he did it. I think he really did want to have breakfast or whatever with that guy and look across the table and say, “So what’s the deal? What are you feeling? Who did you vote for? How do you use Facebook?” Just get out of that bubble in Menlo Park, or MPK as it’s called, and just see the real world for a little bit. I think that’s really what he wanted to do. I think he just felt blindsided by the reality of the election. And I think a lot of people at Facebook have.

Random data point, you know Wired came out with a cover story about Facebook and their horrible two years. And it was a very good piece and whatever. One of the key, again, shocking-to-me things, they actually managed to source about 50 Facebook sources, which is incredible when you think about it. And the company was famously protective of its privacy and very good at keeping leakers out for a variety of reasons. Morale was high. They were just inculturated in whatever – like people just would not leak. And the fact that suddenly that many people are talking, that’s a huge sign that there is internal turmoil of some sort. People are saying, “Screw this, this is not headed in the direction that I think is healthy for society. I’m going to go talk to that Wired reporter and blab about whatever.” So to me, that was a big sign that something is really afoot internally in that company.

Silicon Valley Culture

…I mean, even former employees. Why so secretive?

Well, you wouldn’t know it from me, but in Silicon Valley, there’s kind of a code of silence around things, right? I don’t quite understand why, because at some deep level, I don’t think there’s anything that deep to hide in Silicon Valley. I think, as perhaps with Facebook, it’s kind of arrived at the point in which it’s so important, it needs to be a little more transparent about how it works. Like, let’s stop the little bulls— parade about everyone in Silicon Valley creating, disrupting this and improving the world, right? It’s, in many ways, a business like any other. It’s just kind of more exciting and impactful and so [forth]. But yes, ultimately there’s a code of silence, and going out and speaking, even if you protect confidentiality (like, nothing here is subject to NDA [nondisclosure agreement] or anything; like, this is all public), even that is generally looked down upon, by and large, because again, they’re very protective of the company’s image and all the rest of it. An early Facebooker, forget it; you’re never going to get them on the record unless they actually have their own personal brand, someone like Chamath [Palihapitiya] maybe, but by and large, no way.

So it’s like it’s omertà, right? It’s like this-

I think I used exactly that word in the book. Yeah, that’s exactly right.

Tell me that.

…There usually is this omertà about Silicon Valley; there’s a, you know, almost a mafioso code of silence that you’re not supposed to talk about the business in any but the most flattering way. Right?

Basically you can’t say anything measured and truthful about the business, which I was definitely breaking with the book, and I’m doing so now. But again, I think Silicon Valley, like Facebook, has reached a point at which you have to be more transparent.

…Isn’t it a problem, both on the ads side and the content side, that we, the public, have no idea what engagement really means, how it’s biasing every bit of information or advertising toward us, and that we actually don’t know what the secret sauce is?

Right. There’s a school of thought that typically lives on the eastern part of the Atlantic that thinks that companies like Facebook and Google should reveal or divulge their algorithms to the point where perhaps they might even be regulated in terms of what they can take, their data inputs. Every country has its various taboos. Every culture has its taboos. And in the U.S., targeting around certain racial categories might be considered taboo to a certain extent. So, yeah, there’s an argument to be made there. I think purely for practical reasons, it’s very difficult, because again, that model changes literally almost every day probably due to code changes. How do you regulate that? Is a government committee going to sit there and approve a code change every day? I don’t think so. That said, I guess there could be more disclosure. They could say, “OK, well, these are the types of features we use.” Right now, all we know is next to nothing: you know, click rate, likes and comments, the basics. But beyond that, we just really don’t understand it. Yeah, there might be a case to be made for that.

I think the other problem is, here’s putting my Facebook hat back on, asking Facebook to divulge its core intellectual property, which is how the News Feed works, it’s like, I don’t know, it’s like forcing Ford to divulge the plans for its next big hot engine or whatever. They’ll be like: “What? No, of course not. That’s ours. That’s the key. That’s the secret sauce. That’s why we’re worth X billion dollars.” I think that would be the Facebook response to that.

originally posted on pbs.org