Podcast: Roger McNamee, ‘Waking Up to the Facebook Catastrophe’

Roger McNamee wrote Zucked: Waking Up to the Facebook Catastrophe.

What happened to Facebook?

Particularly in the post-2016 political campaign, the realities of data, personas and manipulation have come out into the open, from the front pages to Congressional hearings. As policymakers consider regulating companies like Facebook and Google around issues ranging from speech to monopolies, companies and consumers are thinking in new ways about the business of data privacy.

The Facebook story, of course, is instructive.

Among other areas, it’s a story about business models and incentives and what can happen to a company when the two don’t align with a stated mission – or, perhaps, the public good.

It’s also a story about privacy, data and data portability. In other words – who owns your data, and what combination of personal, corporate and regulatory action should set the rules around it?

It’s also a story about one of the major tensions of our time – to whom should a business be responsible? Shareholders? The community? Employees? And in a time of globalization, what responsibility does a company have within a country’s borders?

It’s a story that Roger McNamee has deeply explored. McNamee has been a Silicon Valley investor for 35 years, co-founding successful funds in venture, crossover and private equity – including his most recent fund, Elevation, with U2’s Bono as a co-founder. Along the way, one of McNamee’s investments was in Facebook, where he served, in part, as a mentor to CEO Mark Zuckerberg.

However, following the 2016 election – as well as the Brexit vote – McNamee decided he had seen enough. He felt that Facebook’s execution of its business model sometimes found itself at odds with a well-functioning democracy. He laid out his case in the book, “Zucked: Waking Up to the Facebook Catastrophe.”

McNamee offers part history, part blueprint for the future. He not only outlines how we – and Facebook – got here, but also how we get out of this digital mess. He offers recommendations to policymakers, of course, but also to the rest of us – businesses and individuals – about how we can change the way things are done. McNamee answers the ultimate question of who has the power – Facebook or us?


Transcript: Roger McNamee, ‘Zucked — Waking Up to the Facebook Catastrophe’

Chris Riback: Roger, thanks for joining me. I really appreciate your time.

Roger McNamee: It’s a pleasure to be here.

Chris Riback: So, how would you characterize what you’ve written, before we get into it? And there’s just so much to get into, of course. Is it a public policy call to arms? Have you created a history of Silicon Valley, or maybe a technology-based whodunit?

Roger McNamee: I would say that the book is really written from the perspective of Jimmy Stewart in Rear Window. I had spent 34 years as a tech investor and a technology optimist. I had been involved with Facebook as an advisor to Mark Zuckerberg and Sheryl Sandberg. I was an unapologetic fan, and all of a sudden in early 2016 I started to see things that just didn’t fit with my understanding and my frame for what Facebook was about. And over the course of that year, I saw more and more of them and I began to suspect that there was something wrong with the business model and algorithms that allowed bad actors to harm innocent people. And I reached out to Mark and Sheryl just before the election in 2016 to let them know that I was concerned and to warn them, because I assumed the company was the victim. That began a long period of discovery for me.

In the book I essentially describe that journey and, for anybody who’s out there, I really aimed it at parents of children, because I think they have a multi-generational incentive to be all over this stuff. And I really wrote the book using my journey as a way to reveal all the issues that matter here, explaining the pieces of the technology that you need to know, ignoring the rest, and just getting to the challenges that face policymakers and the challenges that face each and every one of us.

“That was the one that finally caused me to write to Sheryl and Mark and say, ‘Hey guys, I think there’s something really wrong here.'”

The book’s very hopeful because I think what I’ve discovered in this whole journey is that the human beings formerly known as users have a lot more power in this than we realize. You know, with these companies, and I’m really saying both Google and Facebook and all their subsidiaries, so YouTube and Instagram and WhatsApp, the people who use those products provide attention. That’s what drives the business model of those companies. And we can alter the attention we give them in ways that will have an effect on the outcome. And it’s a lot easier to do that if we kind of agree to do it together, but there’s a lot more of us than there are of them.

And not only that, policymakers who understand there’s something really wrong here need to hear from their constituents that it’s important to them to do something. And so the book provides both recommendations to policymakers and to all of us about things we can do to both improve our own lives and maybe change the way things are done.

Chris Riback: Yes, it covers all of that. And I want to ask you about many parts of what you just described. I want to ask you about whether we’re in need of something like a technology or a data #MeToo movement, and you write a little bit about that. To your point, you absolutely do describe the technology, the persuasive technology and the persuasion in the algorithms, and not just filter bubbles but everything else that goes around them. It is described in a way that makes it really clear and connects the dots along the technology, but also the public policy and just the human behavior and the social components.

Roger McNamee: Well, here’s the thing. I spent 34 years studying this stuff, and I didn’t know anything about it because I stopped being involved with Facebook before they evolved these techniques. So when people come up to me and say, “Roger, this stuff is complicated,” I know exactly what they mean. And so in trying to write this book, my goal was to make it accessible to people who didn’t have 34 years they could spend studying the tech industry and still not understand it even then. Knock wood, I’ve succeeded, and that’s the goal. The goal is just to help people improve their lives.

Chris Riback: You say yourself that you feel a little bit like the Jimmy Stewart character in Rear Window. I wonder if you also feel like the Jimmy Stewart character in Mr. Smith Goes to Washington, because in a way it struck me that this all kind of began because you’re a political junkie, didn’t it? I mean, that 2016 timeline to me was fascinating because, from my read, you were minding your own business, looking at your Facebook feed like any good American and world citizen should do.

And in fact, as you note, about one and a half to 2 billion of them do it every day. And you started to notice that a growing number of your friends were sharing highly misogynistic photos of Hillary Clinton that had allegedly been originally posted by the Bernie Sanders campaign and-

Roger McNamee: By Facebook groups that were calling themselves Bay Area for Bernie, right?

Chris Riback: Exactly.

Roger McNamee: It wouldn’t necessarily have been the campaign. It would more likely have been just fans, but nonetheless.

Chris Riback: Okay, so fans. People, fine, fans, supporters of Bernie Sanders, and something about that just didn’t smell right to you, did it?

Roger McNamee: Well, the key thing that I saw was that every day there were new images and each time there were more people sharing them, which made me think that almost certainly someone was spending money to get Facebook users into the groups that they were running. And in retrospect what we learned was that that is exactly what was going on. And it was only because I’ve run a Facebook group for my band and I understand how that works that I could see that. It would ordinarily have gone right over my head, but I just happened to have a specific form of training that caused me to see that.

Then I saw a lot of other things over the course of the year related to the harvesting of data on people who expressed an interest in Black Lives Matter and selling that to police departments, which seemed like an obvious violation of the Fourth Amendment. And then Brexit happened, and I started to wonder whether, on Facebook, incendiary political ads would have an advantage over positive or neutral ones. And then we found out later in the year that Housing and Urban Development had cited Facebook for violating the Fair Housing Act because its ad system allowed people to discriminate in housing. That was the one that finally caused me to write to Sheryl and Mark and say, “Hey guys, I think there’s something really wrong here.”

Chris Riback: And that’s what you did, right? I mean that you wrote the memo and you’ve got the actual memo in the book. In the back of the book is the first appendix where-

Roger McNamee: It was written as an op-ed, so it’s more intense than… If I had it to do over again, I would have rewritten it as a memo and made it less intense because quite clearly they were less receptive to the message than I would’ve hoped.

“I don’t think they sat there and looked at what I was suggesting as anything more than a potential public relations problem, and they wanted to make the PR problem go away. And to their credit, they didn’t dismiss me completely.”

Chris Riback: Do you think it put them on the defensive? And that’s kind of interesting too, because I’m thinking about Zeynep Tufekci, whom you quote. Her point, from the MIT Tech Review piece she wrote, is that in a social media environment, when we take on criticism, it’s like we’re getting criticism from opposing members or fans of another team. When you sent that memo to Sandberg and Zuckerberg, did they take it that way, do you think? Or did they take it a little bit defensively, as opposed to what I think you were, certainly what you would characterize yourself as at the time, and maybe even now: a friend of the court. I mean, I think you meant it to be constructive.

Roger McNamee: I definitely meant it to be constructive. I believe that your point may well be valid. That said, I think the company for business reasons had developed a culture that essentially ignored criticism and ignored any form of friction that might slow it down. They had such a successful economic engine, and the inner workings of that engine were in a black box that nobody could verify. And as a consequence it was pretty straightforward for them to ignore criticism, ignore regulators and just keep doing what they were doing. It had worked so successfully for so long, I think it had become reflexive. I don’t think they sat there and looked at what I was suggesting as anything more than a potential public relations problem, and they wanted to make the PR problem go away.

And to their credit, they didn’t dismiss me completely. They handed me off to one of their lieutenants, a guy I really liked, and for the better part of three months I interacted with him, trying to persuade him that, hang on, this is a trust business. You need to take care of the people who use your product because no matter what the law says, if they lose trust in you, your business is going to be harmed.

I was very naive in those days. I was convinced that this was being done to them and that they really had not played any role in it, without realizing that the way they had structured the business model, the incentives created by advertising, and the notion that they needed to manipulate people’s attention meant they had to do things that kept people engaged for longer periods of time. In doing that, they would spur their ad business, but the ad business would also essentially enable bad actors to manipulate what people think with disinformation, conspiracy theories and the like. And that is what happened in both the United Kingdom and the US in 2016, and in many countries since then.

That part was what got me looking in the first place. And as I dug into it, what I discovered is that it was only one piece of a much bigger tapestry of problems that include public health, products that have inappropriate content for little kids as you see on YouTube Kids, and products like Instagram that have bullying of preteens and fear of missing out for teenagers. Facebook’s just been busted for having two different product problems. One was having in-app purchases in some of their games that allowed kids to run up thousands of dollars of credit card bills without any alert to the parents, and without Facebook understanding that that might be exploitative. And then secondly, both Facebook and Google were busted by Apple-

Chris Riback: The whole Apple thing, yes.

Roger McNamee: … for having a so-called research app that was basically spying on people as young as 13 without them understanding the implications. I look at that and I say that feels like really poor judgment. But in a business environment where we’ve told entrepreneurs that they aren’t really responsible for the consequences of what they do, it’s understandable how they got there.

The trick in my mind is you’d love to see some messenger, somebody who is more effective than I am, who can get through to these folks and say, “Guys, this is the world you live in. This is the country you live in, this is the town you live in. It would really be helpful if you would be open to recognizing that some of the things you’re doing in the business are worse than traditional advertising.”

Chris Riback: You raise the business model question, and you’ve raised it now. And by the way, we could probably have a whole discussion on the Apple-Facebook interactions, the Apple, Facebook, Google interactions, over the last week or two.

Roger McNamee: We’ll save that for another day.

Chris Riback: Yes, yes. We’ll save that one, but that’s fascinating on its own. On page 87 you write how Facebook’s business model is to give the opportunity to exploit that data to just about anyone who is willing to pay for the privilege. So one, isn’t that the key and perhaps what you or we all should have known? But two, isn’t it what we all did know?

I mean, we’ve known for years that that’s Facebook’s business model. And so why should it have been surprising that, when the business is predicated on selling personal data, bad things happen?

Roger McNamee: Well, they sell access to the personal data. I would say that it came as a surprise to me, and here’s what I’ve learned. They say in traditional advertising businesses that we, the consumers, are not the customer. We’re the product. The problem with Facebook and Google is that in their model we’re not even the product. We’re the fuel. The notion is that in traditional advertising, you gather data in order to make the product or service better for the people who use it and buy it.

In the Google and Facebook model, they gather data to do a little bit of that, but much more it’s about gathering data they then use in other contexts that may in fact actually harm the people whose data they’ve gathered. And that’s what I mean by fuel. In a model where your goal is to manipulate attention and where you have continuous availability, so that people are checking your product multiple times a day, every day, there’s a real danger of habits that are formed in the early days becoming addictions. And when people are in that addicted state, they’re vulnerable to manipulation. And when they’re being manipulated, if the intent of the manipulator is not good, then bad things happen.

I believe that in an electoral context, that’s incredibly dangerous for democracy, and we’re not talking about any specific election. What I’m really saying here is that these are tools available to anyone, and there’s two kinds of abuse. There’s the foreign interference, and then there’s the domestic kind, where the ad targeting of these products is so specific and so heavily targeted towards emotional triggers that it’s really changed how political advertising works. It used to be that a candidate would sit there and say, “I’m for issue one, two and three,” and would try to convince the audience to support him or her based on one, two or three.

Now what they do is they look at each individual person and find out what their hot-button trigger is. If they think that they can convince the person to vote for them, then they accuse the opponent of being wrong on that issue, and if they think that there’s no way to move them, they just basically say both candidates are terrible and try to discourage people from voting at all. That happened to a huge degree in 2016. There was a ton of voter suppression done with that microtargeting, and that’s just really bad for democracy.

Literally, anybody can do either one of those, either the foreign interference or the political manipulation. I just think in a democracy, social media has a responsibility not to get in the way. Many people would assert that the economic power that Facebook and Google have accumulated is the result of hard work and genius, and I think that’s a perfectly reasonable conclusion. Where I think we really need to have the discussion is on their political power.

Certainly, no companies in the past century have accumulated political power like Facebook and Google, and I would argue that no company since the Dutch East India Company centuries ago has had the worldwide impact that they’ve had and the control of politics that they have. And they are not elected, they are not accountable, and they pretend as though they are neutral, when in fact for business reasons they will always or almost always align with the powerful against the weak just because they’re ubiquitous and they can’t afford to be out of step with the powerful. And that I think poses really unique and difficult challenges.

Chris Riback: And are you calling for perhaps an external intervention into their business model?

Roger McNamee: Well, to be clear, I’m mostly just trying to have a conversation, but in the book I very specifically have chapters. One of them is for policymakers on what the challenges are in public health, in democracy, in privacy, and in the economy, really in innovation. And it’s super difficult because the vocabulary we use today doesn’t match the problem.

People say to me, “Roger, all my data’s already out there. I can’t get it back. And oh, by the way, I haven’t done anything wrong.” And I sit there and go, “You’re right, I get that. But the kind of privacy we’re talking about here isn’t just about somebody hacking into your account. That’s not what we’re talking about.” We’re talking about privacy as the freedom to make choices without fear. In a world where these systems can take your data and have other people influenced by it, so that they effectively are spying on you, suddenly fear comes into your decision-making process. So we have to be careful about this issue of before and after, because this is one of the biggest changes in history.

Then the other thing I say is to remember that each of us has more power in this than we realize, that these companies depend on our attention. And we don’t have to delete the apps because, let’s face it, they’re fun. Many of them are useful and they’re incredibly convenient, but it’s really helpful to change your usage. And here’s the question I always leave with people. I say, “If you knew that by giving up a little of the convenience of these products, you could improve the functioning of democracy, you could improve your own mental health and that of your children, you could regain the right to make choices without fear, and you could do things that would make our economy a lot more fair, would you be willing to do anything?”

Would you be willing to alter your behavior? Would you maybe stop doing politics on social media? Would you maybe get your news from other sources? Would you maybe reach out to people in the real world and find points of common ground? Would you find alternative apps to use so that when you want to organize a group or share photos, you don’t have to do it on Facebook? Because these are really important functions, but there are other ways to get them done, less convenient but nonetheless out there.

I just look at these things and think to myself that each person has an opportunity to make a choice here, and all I want to do is show them the things that I can see that they can do.

Chris Riback: Do you feel like this was inevitable, or do you feel like different behavior, different decisions, may have changed the outcome? I was thinking about it, and you address this a bit at one point in the middle of the book as well. And I found myself thinking, are these just the terrible, brutal growing pains that come from what many of us might agree is an incredible good or an incredible opportunity, which is to say connecting the world?

So if one accepts that connecting the world can be or is a good thing, and I know a lot of people may disagree with that, but let’s say that maybe it’s better if societies connect and you have better communication between and among cultures globally, doing that is hard and dangerous and exploratory. And isn’t it the nature of explorations that they don’t go perfectly? If one believes that connecting the world is a good thing, that being able to connect with every culture in the world is positive, could anyone have done it, do you think, without massive issues and corrections needed?

Roger McNamee: The answer is I think it definitely could have been done better. So there are two points I would make. The first is to remember that for its first 50 years, the technology industry was absolutely committed to what Steve Jobs called bicycles for the mind, which is products that increase human capability. They make us smarter, more efficient, more whatever. But they were very focused on personal productivity and all kinds of values that were good for the people who use the product.

Google and Facebook are on a different model. They’re on a model of extraction. They treat us as fuel, and they are creating, for example, artificial intelligences. And AI is a category where the top three use cases are eliminating jobs, filter bubbles that tell us what to think, and recommendation engines that tell us what to purchase and enjoy. Those are things that go to the essence of what makes us individuals, so I would argue those are the opposite of bicycles for the mind. That’s a flaw, and that is in no way inevitable.

The second thing I’d point out is the thing that really struck me about Facebook when Mark first conceived of it: he had addressed the failure modes that had caused all prior networks to turn into cesspools at about 50 to a hundred thousand users, by eliminating, if you will, the lack of control. So he had authenticated identity in the early years, and that was transformational in my view, because basically as long as you have authenticated identity, then there is a social stigma for behaving badly. And then secondly, he did give you real control of your data.

The problem is that as the company ramped up its business model, he essentially relaxed both of those standards by pushing the policing out to the community of users, the result of which was that bad actors were able to have inauthentic identities, and therefore all the benefits of that, which creates the cesspool that we see on Twitter and on YouTube and increasingly on Facebook and Instagram. And then he basically created an ad model where the valuable data was what you did on the rest of the internet, which was then used for targeting inside Facebook. That stuff is called metadata. It’s data about data, and none of your Facebook privacy controls give you any control over metadata.

We’re in the situation where having chosen to build the business that way, having chosen to eliminate friction by eliminating authenticated identity and absolute user control, they guaranteed this horrible outcome. And this is not a growing pain. This is what it looks like. They made no plans for circuit breakers or containment strategies for emotional contagion, and that’s totally on them, because there’s a reason fire departments have containment strategies. There’s a reason the New York Stock Exchange has circuit breakers. Emotional contagion is very dangerous, and these companies believe that’s the problem of the people who use the products, not the problem of the platform.

Chris Riback: Yes. And your writing about that is incredibly powerful – “Lizard Brain” is a term that I actually hadn’t been familiar with.

Roger McNamee: Well, I want to really thank you for having me on. If people are interested in the book, it’s called Zucked and it really is my attempt to make everybody aware of the challenges we face and the options they have for dealing with it. And I’ve targeted it at basically parents of children because the last thing I want to do is have people get drowned in technobabble. That’s precisely where the problem comes from. There’s so many aspects of this problem. What’s interesting about it is a relatively small number of changes can address all the problems, and that’s what I advocate.

Chris Riback: Roger, thank you for your time.

Roger McNamee: My pleasure. Take care.