You don’t like Facebook. Democrats don’t like Facebook. Republicans don’t like Facebook. It may be the one thing on which most of the United States agrees.
But the “Facebook Problem” is almost certainly worse than you think.
That’s the core message of Frances Haugen, the former Facebook product manager who, in September 2021, disclosed over 20,000 pages of documents that shed light on the darkest of online places. The evidence fueled the reporting of The Wall Street Journal’s “The Facebook Files,” which found the company, whose corporate name is now Meta (FB), “knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”
Just some of the damning takeaways: Facebook knew its algorithms made the site “angrier” in the pursuit of engagement, it was intentionally trying to recruit preteens (even though the minimum age is 13) and it knew that Instagram is toxic for teenage girls.
Haugen then testified before the U.S. Congress, where she charged that “Facebook’s products harm children, stoke division and weaken our democracy.” Based in Puerto Rico, she now describes herself as an “advocate for accountability and transparency in social media.” She recently wrote an essay for The New York Times praising Europe’s Digital Services Act, which she thinks will “for the first time pull back the curtain on the algorithms that choose what we see and when we see it in our feeds.” She’s happy about the act, but knows it’s only a start.
One possible solution?
Blockchain.
On a recent Zoom call, Haugen suggested an arresting thought experiment: What if Facebook had been founded as a decentralized autonomous organization? “If there had been a DAO that was regulating Facebook – if it had been owned by the users – I don’t think we would’ve said, ‘Hey, keep putting random [stuff] in our accounts that we didn’t ask for,’” says Haugen.
She suspects that a “distributed” type of social network – one that’s truly knitted from our friends and family – could offer a better path forward. Haugen also explains why the problem is more global than you think, what she wants to see as a solution (hint: it’s not great news for CEO Mark Zuckerberg) and why Elon Musk buying Twitter (TWTR) might be a win for social media.
So it’s been over seven months since the “Facebook Files.” What strikes you as the biggest problem today?
I would say it’s around algorithm ranking. So in March of 2021, Nick Clegg [ex-deputy prime minister of the U.K. and vice-president of global affairs at Meta] – oh, bless his heart – came out with an article called “It Takes Two To Tango.” I strongly encourage you to go read that editorial. It’s a work of art.
He says [and I’m paraphrasing], “Hey, you keep blaming us for the stuff you see on Facebook, but let’s be honest here. You chose your friends, you chose your interests. It takes two to tango. Watch where you’re pointing that finger, because four fingers are pointing back at you.”
I’m guessing you don’t find this particularly compelling!
Talk about victim blaming. He said this while knowing that Facebook’s researchers had run the same experiment over and over again, where they took blank accounts and followed moderate interests. In the case of Instagram, they followed healthy eating accounts. And let’s be honest, we could all eat a little healthier.
Or a lot healthier, in my case.
And all [people] did was click on the first five or 10 things each day, and follow any hashtags that were suggested. Within two to three weeks, they were being sent actively pro-anorexia content and active self-harm content. There was no “two people tangoing.” It was just the escalator of engagement-based ranking.
Can you elaborate on why this is so important?
I’ll give you a little example. So I was interviewed by a journalist maybe two weeks ago, and he just had a baby and he made a new account for his baby. It’s a very cute baby. The only things they post in this account are cute baby photos. There’s one photo a day. The baby has no friends other than other cute babies, right?
It sounds like a pretty great account, to be honest.
They only post cute baby photos. And yet, 10% of his feed is of children suffering. It’s of children in the hospital with tubes coming out of them. It’s of severely deformed children. Dying children. What in the world takes you from cute baby to mangled child?
Because all the algorithm knows is that there’s a thing called babies, and that some baby content ends up getting higher engagement than other content. It turns out that even though he has not put a comment on a single one of those photos, and he has not put a Like on any one of those photos, he probably dwells on them.
That’s unsettling as hell.
Think about what that does in other contexts, right? In the case of teenagers, it leads to kids starving themselves or killing themselves. In the case of adults, it pulls people towards these extremes. When they ran the experiment with a center-left kind of [viewpoint], they got pushed towards let’s-kill-Republicans. When they did it on center-right, they got pushed towards white genocide. And this is not on a month’s time horizon. This is on a week’s time horizon.
It’s terrifying.
Think about what that does in society. And this is where it gets really scary. This is why I get up every single day. The version of Facebook we interact with in the United States is the cleanest, most sanitized version of Facebook in the world.
In 2020, they spent 87% of their misinformation budget on English, even though only 9% of users spoke English. Most people are unaware that there are at least a billion people in the world – if not 2 billion people – for whom the internet equals Facebook.
2 billion?
Facebook went into their countries and, through something called Free Basics, said, “Hey, if you use Facebook, your data is free. If you use anything on the open web, you will pay for the data yourself.”
So think of what that market dynamic does in terms of pushing everyone onto Facebook. Now you have that situation in a very fragile country. The most fragile places in the world are often linguistically diverse – they speak smaller languages – and Facebook’s business model can’t support giving them safety systems.
When we focus on censorship instead of focusing on product safety, we basically leave behind people who are in the most fragile places on Earth. And those people don’t get to leave Facebook. They don’t get to consent.
What do you see as blockchain’s role in all of this, as a possible solution?
The thing that makes me most excited is, [what if] there had been a DAO governing Facebook in 2008? To be clear, the problem with Facebook is not your family or your friends.
Facebook has run experiments where all [it does] is give you more content that you consented to. That’s content from people you actually friended, pages you actually followed, groups you actually joined. When [it does] that for free, you’ve got less hate speech, less nudity, less violence. They’re just like, “Hey, let’s trust your judgment, and give you more of what you ask for.” Not rocket science.
[But] Facebook has had to get you to consume more content every single quarter since 2008, and family and friends let Facebook down. Facebook needed them to keep producing more and more content. And when they didn’t, Facebook started doing all these weird little hacks.
So how would a DAO fit into this?
If there had been a DAO that was regulating Facebook – if it had been owned by the users – I don’t think we would’ve said, “Hey, keep putting random [crap] in our accounts that we didn’t ask for.” We’d still have something that was like Facebook of 2008. So I’m cautiously optimistic that exploring different business models could have potential.
The secondary thing is, I think it would be easier to run a version of Facebook that was about our family and friends. Family and friends are not the problem. A system of amplification that’s using algorithms to direct our attention – that’s the problem.
You know, if you can do it in a distributed way, and [build] a system that’s very much like Facebook, but it’s just your family and friends, I think that would be much safer.
What do you think it will take to ultimately solve the Facebook problem?
I think, at a minimum, it’s a corporate governance issue. One of the fundamental problems with Facebook is that [it] won’t acknowledge power. [It] can’t acknowledge responsibility. For example, let’s look at the situation where high school kids are getting broken bones because kids are picking fights to put them on Instagram. Think about that for a moment. What would it take for Instagram to take those accounts down? Why isn’t it taking them down?
They can’t acknowledge responsibility. So unless there’s a major leadership change … I spoke at a risk conference yesterday, and the CEO of the trade organization was saying, “It’s not about having checklists. It’s not about making sure someone goes through a form. It’s about having a culture of accountability.” And fundamentally, Facebook lacks a culture of accountability.
What are you hoping to see?
My hope is that the [U.S. Securities and Exchange Commission] acts. And one of the things we’re going to ask is for them to require Mark [Zuckerberg] to sell some of his stock. That would allow the shareholders to step in. So that’s my big hope. Who knows if it will happen or not?
And I think the fact that the DSA [Digital Services Act] passed means that we’re going to be able to start developing solutions.
Give us an example?
I’ll give you one that works in every language. Should you have to click on a link to re-share it?
That makes so much sense.
Twitter requires you to do it; Facebook doesn’t. And it reduces misinformation by 10% to 15%.
Speaking of Twitter, you once suggested that a private Twitter – owned by [Tesla (TSLA) CEO Elon] Musk – might actually be safer. Why is that?
So remember how I was talking about how before, if there had been a DAO for Facebook, we probably wouldn’t have gotten a bunch of [crap] injected into our accounts? It would have stayed much more about our family and friends. There are a lot of non-content based solutions.
What do you mean by that?
That means focusing on product safety, not magical AIs [artificial intelligence] that pull things out [as a form of censorship]. But the only way you can do those things is if you’re willing to sacrifice little slivers of profit and little numbers of users. So part of why I’m willing to root for Elon is that the first thing he said publicly is that we’re taking down the bots.
A thing that most people aren’t aware of – and this might be interesting for this specific [crypto] audience – is that when we talk about dollars, we have very detailed accounting laws, right? So, if you want to say I have a dollar or I have a liability, [there are] very specific rules on when you have to acknowledge these things.
We don’t have a similar set of rules for what is a person, but the values of companies are incredibly dependent on the number of users they say they have. And I’ve talked to people who run the biggest captchas in the world, and they say there are websites where 90% of the reported users are bots.
Damn.
Right. And those sites intentionally choose very lax captcha settings because they want to have bigger numbers, but the number-one threat to us is bots. And Elon was, like, “We’re going to take advantage of the fact that we don’t have to report user numbers anymore to wipe the slate clean.”
Let’s finish on a personal note. What you did was incredibly brave. If you don’t mind sharing, what has the aftermath been like for you?
You know, it’s interesting. People imagine the gnashing of teeth, the drama, all of these things. But I’ve had a remarkably uneventful transition.
I’m surprised and delighted to hear that.
I guess I get interviewed a lot more. But in Puerto Rico no one recognizes me, which is great. Even online it’s crazy. I think I’ve gotten maybe 15 to 20 mean things ever sent to me in my DMs. So if you’d like to be the first … [Both laugh.]
And the part I find remarkable is that [often] women who are public figures get really badly sexually harassed. I don’t even get sexually harassed, which is shocking to me as someone who has worked on four social networks. So I can’t really complain. It’s been pretty chill.
Fingers crossed it stays that way. Thanks for doing this, and see you on our new DAO version of Facebook!