Facebook Is Bad at Publishing
Would normal publication practices help to make social media safe and pleasant?
You saw it earlier in the day, you could swear. You now want to show it to your spouse, friend, or child. It was really cute/interesting/relevant. But now you can’t find it. Your newsfeed has morphed, like it always does. When was it published? Why couldn’t I bookmark it? Was it unpublished? Where has it gone? Who even wrote it?
Facebook, like social media generally, continues to be a source of confusion and intrusion in our lives, and more people are looking to regulate it, break it up, or change its business model. I personally think all three would help. But what would also help, and what isn’t talked about as much, is this: What if Facebook and social media generally just published well?
This raises an important question, one that is slowly coming into focus: Is Facebook a publisher? My definition is that a publisher is any person or organization assuming risk on behalf of authors in order to commercialize their works. By that definition, WordPress, Twitter, Facebook, and even Spotify are publishers.
Companies like Facebook, Google, and Twitter call themselves “platforms” so they can take advantage of an exemption from risk — Section 230 of the Communications Decency Act, which grants platforms like these immunity from libel and other liabilities, and reads in part:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
But is this defensible anymore? For years, smart people have been going after this increasingly suspect and outmoded regulatory loophole. In a favorite interview from 2017, Scott Galloway from NYU summarized it as follows [bold emphasis added]:
I think Facebook and Google both face the same issue, and that is they want to sell advertising against content and then say, “But we don’t have the responsibilities of a traditional media company.” . . . It’s total BS. . . . What if I were McDonald’s and 80 percent of the beef I was serving before the election day was fake beef, and people ended up getting encephalitis and making bad decisions? . . . [And what if to avoid getting sued out of existence,] I said, “Wait, wait, wait. Hold on, I’m not a fast food restaurant. I’m a fast food platform, so I can’t be responsible for the beef I serve.” . . . “We’re a platform, not a media company.” No you’re not. You run content, you run advertising against it. Boom, congratulations, you’re a media company. . . . You have some onus of the wonderful things that come along with being a media company, including 90 percent gross margins, influence of unbelievable magnitude, but there is a level of responsibility and wow have they let us down.
Protections like Section 230 are being eroded because the platforms have been greedy and irresponsible.
Do these exemptions matter? When comparable exemptions were removed in Germany, Facebook somehow found a way both to hire more editors per capita than anywhere else in the world and to eliminate racist, violent, and harassing posts almost entirely. In short, with risk come good publishing practices.
Suddenly, staring down the barrel of liability, Facebook became a far better publisher, a far better media company.
What does it mean to publish well?
It means publishing in a way that makes information discoverable. It means insisting on accurate and accountable authorship. It means creating and maintaining an archive of published works. It means taking responsibility when something wrong or damaging is published, which in turn means vetting things before they go up.
What if Facebook had a searchable archive? They certainly have one somewhere inside the organization, for their ad network and rudimentary AI to use, but they don’t expose it to users in any straightforward or useful way. But what if they did? Do you think there would be a faster reckoning around Section 230 and their liability? Do you think users would realize sooner that they’re being manipulated, and how? Do you think the manipulation would be less possible, because you could see everything published on Facebook in the last hour and wonder why you were shown what you saw and not shown other things?
What if Facebook had a normal, comprehensive search engine? Why don’t they?
What if you knew who, precisely, published a post and took responsibility for it? What if it weren’t a murky situation but an obvious one, as in scholarly publishing or journalism, where you can contact the author, the editor, or the owner if there is a problem?
What about the publication of advertisements? What if you knew, as you do in all good publishing venues, who is paying for the ad you’re seeing?
We now have in Facebook perhaps the most reckless and dominant publisher ever. The result? Scads of misinformation, and no full accounting of what has been published, by whom, or whom it is reaching. Nobody has the same version of Facebook — yours doesn’t match mine — so there is no Version of Record.
Of course, publication and dissemination of information isn’t Facebook’s core business. The core business is an advertising engine that uses traffic created by its distribution system to generate revenue. Facebook doesn’t really care what it disseminates, only that doing so generates traffic it can monetize. And this is the main reason Facebook is such a bad publisher — it has no soul. Most publishers are actually quite devoted to their audiences, their authors, and the interactions between the two. Facebook doesn’t care about either in any qualitative way. As long as the two are interacting — in anger, in murder, in suicide, in joy — Facebook can sell ads, which makes Facebook’s leaders happy.
Perhaps Facebook would like to learn how to publish safely, accountably, and reliably, in a manner that fits with the progression of knowledge and understanding. It does respond rationally to risk and liability. When (not if) it faces the full risk of publishing, I know a few thousand people who can teach it how to publish properly.