Tech'ed Up
What's happening on the frontlines of tech? Tune in for a zippy conversation about emerging technology hosted by industry veteran Niki Christoff. From the C-Suite and Capitol Hill to AI and crypto, quantum computing to the decentralized internet, Niki breaks down the trends in tech to help savvy listeners get even smarter. Guests include experts, enthusiasts, regulators, policymakers, CEOs, and reporters.
New episodes premiere bi-weekly on Thursdays. Subscribe for the latest episodes on YouTube or listen on your podcast app of choice.
Facebook: "An Ugly Truth" • Cecilia Kang (New York Times)
New York Times technology reporter and Pulitzer finalist, Cecilia Kang, joins host Niki Christoff in the studio to talk all things Facebook. Taped the same day Congress grilled company executives over Instagram's effect on kids, this conversation covers Facebook's unusual corporate structure and how its apps are used around the world. Bonus: hot takes on Mark Zuckerberg’s tweenage ambition to be a modern-day Roman emperor.
“I think the key here is that they don't for a moment tap the brakes. [Facebook’s] goal continues to be: more and more engagement and staying ahead and growth and scaling.” -Cecilia Kang
- Watch video content on YouTube
- Buy "An Ugly Truth: Inside Facebook's Battle for Domination"
- Follow Niki on Twitter
- Learn More at www.techedup.com
- Follow Niki on LinkedIn
Niki: I’m Niki Christoff and welcome to Tech’ed Up. Today, I chat all things Facebook with Cecilia Kang, whose reporting on the company earned her a spot as a Pulitzer finalist in 2018.
We explore all the ways in which Facebook is different from other social media platforms, from its unusual corporate structure to the mindset of its CEO, and how billions of people outside the United States interact with its apps.
A note to our listeners: it is my intention that this show follow the Thumper Rule. That’s right, from Bambi. If you don’t have something nice to say about someone, don’t say anything at all. But in today’s episode, we’re talking about Mark Zuckerberg and I do mock him...just a little. I promise to do better next time.
Niki: Welcome Cecilia Kang. Thank you for coming on the show.
Cecilia: Well thanks, Niki, for having me.
Niki: You're currently a national technology reporter at the New York Times. You've just published a book, An Ugly Truth: Inside Facebook's Battle for Domination. Spoiler alert.
[laughs]
They don't come out that well. And so we want to dig into Facebook today.
Cecilia: Well, thank you. And Niki, it's so great to talk to you. I have known you for so long. So it's been so great to reconnect over the book but also just to have your wisdom and to always be able to tap into your thoughts on everything that's happening in the tech sector. Because it’s been a long journey and it will continue to be a huge story in Washington.
Niki: Absolutely. And we worked together on opposite sides of the net for 10 years. So I was a flack for Big Tech. You were reporting on us. And one of the things about this book that I think is important is: it's investigative journalism. You don't editorialize. You're just laying out the facts. The facts themselves can lead any reader to certain conclusions.
And I want to talk about that. But you did thousands of hours of interviews. It's sort of ironic that the company you're writing about is known for not fact-checking, but you spent all this time on it. You've come to this conclusion of how the company's structured. And that's what I'd love to talk about: is Facebook the worst?
Cecilia: I think Facebook is different. I think Facebook stands apart from the other Big Tech companies: Google, Apple, Microsoft, and Amazon. In the sense that I think to understand why Facebook functions the way it does and how Facebook got to here, to the point that it is at right now, you have to understand that Facebook is a company that is governed and ruled by one individual.
And that's Mark Zuckerberg. It's not a company that makes decisions by committee. Often, it's Mark Zuckerberg who makes the final decision on very important issues and questions related to speech and misinformation, related to data privacy, related to how the tools are going to be used.
So that stands out as a very different thing, and what makes Facebook unique in the tech sector specifically is that Mark has 60% of the voting shares of the company. And we were really surprised, actually, to hear the level of decision-making that is all concentrated around Mark Zuckerberg.
We were surprised also to learn that the board is, in many ways, sort of a paper tiger. It does not have a lot of power. It is an advisory board, and some board members that we talked to described it that way. They offer advice, and Mark Zuckerberg can choose to take the advice or not. The other thing that we were really surprised to learn in our reporting was that other top executives, including Sheryl Sandberg and others within the company, do not push back that much when it comes to Mark's views on things. They may say, ‘Mark, we disagree with your position on antisemitic content, that it should be allowed on the platform and that Holocaust denial, specifically, is permitted.’ Sheryl Sandberg voiced that once. He knocked it down and that was about it, you know?
When a doctored video of Nancy Pelosi that showed her appearing inebriated was uploaded onto the site, Sheryl Sandberg protested. There was a huge protest within Nancy Pelosi's office that Facebook should take it down, and Sheryl Sandberg and others within the company told Mark Zuckerberg, ‘We think it should be taken down.’
And you know this is just another example of where Mark Zuckerberg said, ‘No. I think we should keep it up.’ He made some weird distinction about like doctored videos versus deep fakes. AI versus like actual human manipulation of videos. And nobody really fought back or pushed back. And today even still Nancy Pelosi's office will not talk to Facebook.
So the repercussions have been lasting. So that is, again, an example: this being a company that was built in Mark's image, that embodies who Mark is, and that is controlled by Mark Zuckerberg is something that I think is quite unique in the tech sector right now.
Niki: I’m so glad you talked about the corporate structure because all the other companies you mentioned are also public companies but they're not run by founders. They're run by professional CEOs, executives, right?
Apple's not a Steve Jobs company anymore. It's a Tim Cook company. You have Sundar Pichai at Google, right? And Google has YouTube, which has many of the similar issues that any other social media content-driven company might have, but Sundar Pichai is not an emperor.
In your book, you mentioned something about Mark that I thought was extremely interesting, which is that he models himself after a Roman emperor.
Cecilia: Yeah. It’s so fascinating. Mark Zuckerberg, from his middle school and high school days, became really fascinated with the classics, the Greek and Roman classics. And he particularly became really interested in Caesar Augustus.
And he was really enthralled with the fact that Caesar Augustus was a very controversial figure. This is actually quoting Mark Zuckerberg; he said, ‘You know, look. Caesar Augustus, through very harsh methods, ultimately established 200 years of peace.’ The takeaway is that he had a great impact; he established the Roman Empire. And the takeaway from that is that Mark Zuckerberg envisions himself and his role, and Facebook similarly, like Caesar Augustus and the establishment of the Roman Empire, in that there will be a lot of china broken along the way. There will be a lot of controversy.
There'll be a lot of mistakes. But ultimately Mark Zuckerberg views Facebook as, and he bets that Facebook will be, a net positive force, a lasting force, a historical one. It will not be MySpace or AOL, which, you know, historically may not be as significant or lasting in the history books.
He sees this as his company, as the establishment of something much bigger with historical impact and import. And he understands that he won't be liked. And this is another quote that Mark Zuckerberg has said, you know, ‘It's better to be understood than liked.’ And I think that that can apply to Caesar Augustus and Mark Zuckerberg’s view of himself.
He understands that in order to be of historical import and to build something that is so important and so impactful, there will be problems along the way. But in the end, the view is: we believe that what we're doing is good, and lasting. And I think that's a really important window into understanding how Mark Zuckerberg has been able to withstand so much public criticism over the last few years.
Niki: You think of middle schoolers maybe wanting to be president or win a Nobel prize. I don't often think of them wanting to be a Roman emperor. And I don't want to go too far down this path. But when I read that, I kept thinking he's not Octavian. He's not the first Roman emperor creating an era of peace. He might be Commodus [‘cah-ma-dus’] or Commodus [‘coe-ma-dus’]. I don't even know how to pronounce it, but the emperor who actually did end the peace, right?
Who ended up, through his blinders, creating an era of instability and chaos. And in fact, the reason the emperors could be so dangerous is that it truly is a thumbs up or a thumbs down on what's going to happen. We've seen reporting in the Wall Street Journal about the damage that Instagram can cause to young people, specifically young girls, damage that people within the company know about.
And I'd like your thoughts on that, because you mentioned people don't really push back on Mark Zuckerberg. What we're seeing now is employees leaking data but are there any consequences for him? So the board can't control him. They're just an advisory board. He owns a supermajority of voting shares. Is this the only way people are going to try to get a change? And do you think they're going to change anything?
Cecilia: I really want to commend the Wall Street Journal because they've had a really fantastic series of stories that use documents and research that have been created within Facebook for the first time to show that in fact, executives have known for a long time about many problems within the company. And there have been employees and executives who have pushed back on decisions to prioritize engagement, to not heed warnings about the toxicity of Instagram, for example, on teens and teen girls.
And the company and executives, top executives, have chosen in all of these instances to continue to prioritize engagement and growth. That's the big, high-level takeaway from these. The Instagram story that hit essentially showed that there has been research over the last few years within Facebook showing that one in three teens, particularly girls, have said in internal research that they feel bad when they use Instagram. Many have said, in the UK and the US, that it really contributes to very poor body image. It exacerbates anxiety and self-image problems. It, in the end, has really been a detriment to mental and emotional health. So this story is in many ways just validating what I think has been obvious to many people for many, many years. Including lawmakers, actually, who have said for so many years, ‘I see this with my own kids and with my own grandkids.’ And Facebook has over and over said there's no really conclusive evidence that shows that this is problematic for kids.
And so, the fact that the Journal was able to write a story based on internal research that shows, in fact, Facebook has had some evidence that this is problematic and has hidden that evidence from the public is, in many ways, indefensible. It just validates and proves what many of us as reporters, as well as public policy officials and lawmakers, have known to be abundantly evident anecdotally. And I think that that is incredibly detrimental to Facebook's--the reputation of Facebook, but also any sort of lasting trust that people have in Facebook. So it feels like a tipping point moment when it comes to action.
That particular story really frustrated me in particular because I have for years talked to Facebook about what has been so clear, and what third-party researchers have called out as, you know, problematic for a generation of teenagers and youth who have grown up on Instagram and who view themselves and the world around them through the Instagram filter.
And it's just not been positive. And Facebook has, over and over again, told me personally and other reporters, ‘You're just dead wrong, Cecilia. That's just not proven, and until you can point to research that really shows that, it's irresponsible for you to write any stories related to it.’
Niki: The fact that Instagram makes teen girls feel bad about themselves is like: grass is green. Oh, everybody knows. And so it does feel a little bit like corporate gaslighting. I used to call it the “So Sorry Tour.” Sheryl Sandberg: ‘I'm so sorry, Mark’s so sorry.’ But no changes. And--
Cecilia: We have that on the book cover! That's why we put these blurbs of like, ‘I'm sorry, I'm sorry, I'm sorry’ from Mark Zuckerberg and Sheryl Sandberg over the years.
Niki: Absolutely. And saying you're sorry is different than saying, ‘We've got a problem and we're going to try to fix it.’ And I think this goes back to the original question which is, are they the worst? Because I can guarantee that YouTube has done research on radicalization through YouTube. I am sure they have. Twitter absolutely looks at this, right? People are doing internal work to figure out what's happening. But one difference is that I do feel that you see more concerted efforts at the product level to try to fix it. Is it perfect? No. But there's more accountability through governance.
And I do think you see choices to attempt to address what is a huge, difficult issue with these platforms. And I think that's where there's something that feels insincere about Facebook's apologies. Telling a reporter, ‘It's not true, there's no basis for it,’ about what we all see as self-evident, that we can see with our own eyes. And I guess my question is: does it matter? Because really, if you had to handicap it, if Congress can do anything, what--what do you think the odds are? That there could be legislation? Or what would happen? You can bring them in for a hearing, but what's the legislative consequence?
Cecilia: Yeah, it's really hard to see what Washington can do, aside from public embarrassment and showing their anger; it's hard to see if there's anything that can happen beyond the performative, you know? I think Facebook, you know, it's been three years since Mark Zuckerberg came to Washington and testified for the first time in the Cambridge Analytica hearings, and nothing has happened. There's been a lawsuit filed by the Federal Trade Commission to try to break up Facebook, and by state attorneys general, but that'll be a long, many-years-long battle.
Cecilia: I'm sure. I do think it's--what's really interesting with the apology tour, Niki, is that they've decided to no longer apologize.
Niki: Right. They just announced this. They’re done apologizing.
Cecilia: They’re done apologizing. And it was really astounding to see Nick Clegg, who is the VP of corporate communications, essentially the top spokesperson. Nick Clegg is also the former Deputy Prime Minister of the UK, and he's now one of the top executives, and he's crafting all of this sort of messaging and PR strategy for the company now.
And his response to the Wall Street Journal series last week was, with a lot of dramatic language, basically to say that the Wall Street Journal egregiously and falsely portrays the company's intentions (this is not a direct quote, but that's essentially his takeaway), but he does not dispute any of the facts.
And so I think for the Wall Street Journal series and the reporters who worked on it, you know, the takeaway for them should be, and I imagine is: well, he's not disputing any of the actual reporting in our stories. He's refuting, like, the process and the content--and the framing.
It's also notable that when those Wall Street Journal stories came out, Mark Zuckerberg and Sheryl Sandberg were radio silent on the Instagram story, on the stories about disinformation in COVID and health, on the whole series of stories. Instead, they're posting on Facebook about, you know, Mark Zuckerberg posted a video of him fencing with an Olympic champion and then he, and then one of my--
[laughs]
Niki: I'm trying to, I'm trying to keep a straight face.
Cecilia: It's hard. I mean, he's just talking about his personal interests. And my colleagues Sheera Frenkel and Ryan Mac wrote a story about this ‘no apologies’ pivot at Facebook and this new program, or new sort of strategy they have, which is called Project Amplify, which is to go on the offense essentially and to inject positive news and positive portrayals of Facebook within people's newsfeeds, and to take Mark away from the public eye when it comes to these controversies. For the last few years, Mark Zuckerberg has been the face of the response to a lot of these problems.
And Sheryl Sandberg is as well, but you're hearing nothing from them. So they've decided essentially like, screw it.
They've become more combative, and they've also decided to be less transparent and to cut off access to data for researchers and journalists. For the longest time, journalists were able to access this tool called CrowdTangle, which shows the content that is the most engaging, that people engage with the most on Facebook.
Cecilia: And now they're starting to cut that off and they've cut off access to information to academics. Some disinformation researchers at NYU, their project was cut off by Facebook. And so the company seems like it's retrenching and going on the offense.
Niki: So essentially what you've just outlined is: no more apologies, even fauxpologies, right? No more sending Sheryl Sandberg out as the human shield to try to get positive news stories. They're just going to go silent. They're going to use their own platform to promote positive content about themselves.
They are no longer going to give access to research. In fact, I think I saw that they're not even going to do any more research; you know, why should we see what the rot is internally if people are just going to leak it? And I think that this goes to an imperviousness, a sense that there is no consequence.
Partly--we continue to see the value of the company explode. And one thing I'd love to talk about that I don't think people necessarily have front of mind. So we think about how we interact with Facebook in the United States. But in other parts of the world--so actually this is one of their weakest markets.
It's declining in revenue. I deleted my Facebook account six years ago. Not because I was mad at Mark Zuckerberg, but I just thought it was lame. I am on Instagram, but I was acquired as a user. And that's how we think of it. But in other parts of the world, Facebook is actually the only way some people get onto the internet.
More than half of their revenue is based on users outside the United States and Canada. And in some ways that might embolden them toward U.S. regulators because does it even matter if they can just keep growing globally? And I'd love your thoughts or some explanation on how other parts of the world interact with this company.
Cecilia: Facebook is so important around the world. It's still so important outside the U.S., and that's where, as you said, all the growth will continue to be. The U.S. market is saturated and people's use is declining. So I think it's helpful to go back to how they got there in the first place.
And around 2014, once they reached 1 billion users around the world, Mark Zuckerberg decided, ‘You know what? I want to reach the next 4 billion people.’ And he launched this program called Free Basics, or Internet Essentials, as they also described it, where Mark Zuckerberg traveled around the world and sent his deputies around the world to negotiate with regulators, wireless carriers, and phone manufacturers to get into emerging markets, including Myanmar, the Philippines, Sri Lanka, India, and other places.
And what they did was, he essentially decided: okay, as long as there is an internet connection, what we can do is provide internet access for free and make Facebook the first point of entry onto the internet. What that looked like in Myanmar and the Philippines, for example, was offering, for cheap or for free, very basic feature phones with nearly-free internet service from a wireless carrier. And this is, like, really basic stuff that can operate on, you know, 2G, 3G wireless networks.
And those phones came preloaded with the Facebook app--a very dumbed-down version of the Facebook app. So it didn't require a lot of bandwidth. And so for many people that was their first entree to using internet service. And so their whole experience was through essentially the Facebook filter. That's how they communicated with each other.
Cecilia: They connected with each other, and the network effects happened very quickly in those countries, where friends were able to connect with other friends and they were able to establish these networks on these phones and through the Facebook app. And it worked, and a lot of people adopted this. And they continue to, and this program is still alive in a lot of countries.
I think it's really also important to know that Facebook is still very popular through its other apps, particularly WhatsApp in emerging markets, and WhatsApp is a huge presence in countries like Brazil, Sri Lanka, and India. It should also be noted that Brazil potentially will have an election this fall, and disinformation about the election and health is a huge problem on WhatsApp, another company owned by Facebook. And so Facebook's presence is not just through the Blue app, which is how Facebook describes its flagship product internally, but through WhatsApp as well.
So it continues to be a huge, huge presence globally. And it's--it continues to be, it is viewed as a utility in these countries. It's so cheap, accessible, and important for people. It's the way that they're communicating. And it also suffers from the same problems of disinformation and privacy problems that, you know, that Blue, the Blue app, continues to suffer from in the U.S.
Niki: I think that's so important because I do feel there's something to taking a look at how they operate globally, because we always have such a U.S.-focused perspective, but it's different in other parts of the world. You know, I sometimes feel like I have to be on Instagram, but I don't have to be on Instagram. And I use Signal, I don't use WhatsApp. But in some places, it was in your book in fact, people think of the words ‘Facebook’ and ‘internet’ as interchangeable.
Cecilia: Oh yeah. Facebook is the internet to a lot of people in the world. And I think what we wrote in the book about Myanmar and Facebook's entry into Myanmar is really important to understand, because the problem still exists in that Facebook has not ramped up its security teams and the number of content moderators who speak the local languages of the countries that it is in.
In Myanmar, they had one person when disinformation was being spread by the military against the Rohingya and other Muslims in Myanmar. And this is a country that has more than a hundred languages, and they had one person. That person was not even located in Myanmar; that person was located overseas, in Dublin. And they do not--and I think about the number of people who speak variations of Arabic.
They don't have the number of people with local cultural and language knowledge to be able to police disinformation that is incredibly harmful. That is interrupting elections and affecting how people are making decisions on coronavirus vaccines, et cetera. So it continues to be a huge problem.
Niki: And I think that's a point we might even end on, which is, when I think through legislation, most legislation, it shouldn't be targeted at one company. And when you think about Facebook, there are other platforms with problematic speech on them: Reddit, NextDoor. But those are tiny companies. For Facebook to not have enough people who speak the languages in the countries where they're not just operating but where they know there is an immense amount of political risk with what's happening on the platform, they have every resource in the world to put money into it.
And they don't; it's a choice.
Cecilia: It is a choice. And I should say Facebook has definitely tried to hire, to expand its content moderation and security team. And they now have something like 40,000 content moderators. But they still admit, themselves, that they cannot catch up to the amount of content, given the scale that the company's at right now and the amount of content that flows through its various apps.
They're still behind. They absolutely will admit that. But you're absolutely right, Niki, in that this is a company with a trillion-dollar valuation.
Niki: Trillion dollars! With a T.
Cecilia: And I think the key here is that they don't for a moment tap the brakes. It all continues to be, the goal continues to be: more and more engagement and staying ahead and growth and scaling. And I think that's where the disconnect is. Like, yes, they're far behind and they're trying but they're also pushing far ahead as well.
Niki: I think that's fair. And it's a good point because they have done--I actually know some people who work on security there who are trying to help with misinformation and disinformation. Which leads to this sort of question of, okay, you're leaking internal documents to the Wall Street Journal, but you're still vesting a lot of stock in this company.
And sometimes I wonder if the real accountability is when you start losing engineers, when you start to have a retention problem. Could there be a shareholder lawsuit? Are there things that could happen that will move faster than the federal government, where people just take a principled stance?
Cecilia: Yeah. I mean, first of all, it absolutely has to be an external force that pressures them to change. And I agree there could be things before legislation. They do have trouble with retention. Some top executives have been leaving, including Mike Schroepfer, the CTO, who recently announced his departure. And two very senior women on the executive staff: Carolyn Everson, head of ads, as well as Fidji Simo, who was, I believe, head of the Facebook app itself.
And so they've announced their departures as well. So people are leaving. The engineers are the ones that are really important, as you say, Niki. I also have been thinking recently about other industries that have been under similar scrutiny and faced similar scandals, and how things changed. When it comes to energy and tobacco, for example, it was mass consumer movements. Slow-moving, but mass consumer movements, where people just decided ‘we're done with cigarettes,’ or ‘we're done with buying certain things that contribute to environmental problems.’
And that was where the power rested, actually: in the people. So, as you said, a lot of people in the U.S. are not using Facebook as much as they used to. You're seeing that un-coolness factor, like, really, you know, front and present when it comes to Facebook, the Blue app, in the U.S. And it's questionable whether even the growth of Instagram can continue to be sustained in the U.S. and in Western Europe as well. Maybe there'll be a consumer movement. You know, people thought that Yahoo would stay as popular as ever, forever. And look, you know, do we often talk about Yahoo now?
Niki: I'll say that leads to the federal government. And so there is some responsibility on the federal government for allowing this unending acquisition strategy with no, you know, no brakes on it.
Cecilia: And there's a new vanguard of people in these agencies who have a very clear, very strong and aggressive agenda when it comes to antitrust and when it comes to consumer protection. And I think the President has surprised everyone with his picks for the head of the FTC, Lina Khan, as well as the DOJ for antitrust, Jonathan Kanter, and putting in place someone like Tim Wu, who is very vocal about his criticism of Big Tech and his view that many companies should be broken up.
Niki: I think that's right. Now, of course, this is where the institutionalist in me comes out and says, ‘Congress should not pass laws that are specifically aimed at one company.’ We should be giving resources to the FTC, not necessarily putting activists in charge. However, that's my personal opinion.
And I do think, at a minimum, these agencies need more resources to enforce the laws that are already on the books and to protect consumers from harm, which is really the point: it's not to protect competitors, it's to protect consumers.
Cecilia: Absolutely. Absolutely. And, really interestingly, just a side note: a lot of competitors are very happy right now, and they are lobbying very hard for their own causes. And so the balance, and what we should be paying attention to, is whether some companies who are just as big and as powerful benefit from this as well.
Niki: Absolutely. They’re all snitching on each other to the feds. I've been in the room for that. So you make a great point which was, it may end up being the people themselves, the teenagers themselves, knowing that actually, they don't feel bad about themselves because something's wrong with them. They feel bad about themselves because there's a model, a business model, that reinforces that.
So I'm going to encourage people to buy your book. A lot of people have already bought it. An Ugly Truth: Inside Facebook's Battle for Domination. I think the important thing is, even if nobody wants to read about Mark Zuckerberg anymore, supporting investigative journalism is so important, because if we don't have the facts, we can't make informed choices. And I think you have a sterling reputation as a really meticulous reporter, and so people should support your work.
Cecilia: Thank you so much. Thank you. I appreciate that.
Niki: Thank you for coming on.
Next week our guest is Congressman Will Hurd, who explains AI and 5G, why he believes we’re in a new cold war with the Chinese government, and how the US can catch up. Be sure to subscribe to Tech’ed Up wherever you get your podcasts. Video content is available on YouTube; the link is in the show notes.