Halftime: Zuckerberg’s Finally, and Inevitably, Gone Full Bro
The internet is far too large to ever meaningfully govern — but what we just had is better than self-removed, invincible tech oligarchs wiping their hands of the mess altogether
Welcome back to another edition of Halftime, the links roundup counterpart to my weekly Posting Nexus essay. This week, it’s the NFL season and The Athletic has your Super Bowl predictions, a Polygon reporter’s husband had sex with a clown (!!) in Baldur’s Gate 3, and I have some personal thoughts on Meta’s decision to say “fuck it” when it comes to content moderation and fact checking.
P.S. — the photo below is on behalf of every non-Chiefs NFL fan who desperately does not want another Chiefs Super Bowl…
Data!
The NFL’s regular season has come to an end, and questions about who will win the Super Bowl are dominating group chats and betting boards. It’s a difference of 0.1% — 0.1%!! — between the Kansas City Chiefs (22.5%) and the Detroit Lions (22.4%) — (The Athletic)
“Deaths in 2024” topped Wikipedia’s most-viewed list this year — something “Deaths in 2023” couldn’t manage last year. What beat it out in 2023? ChatGPT. — (Reddit)
Three Important Arguments
Attention — The Internet is Worse than a Brainwashing Machine — (The Atlantic)
Identity — The R-Word’s Comeback is a Grim Sign of Our Political Moment — (Rolling Stone)
Platform — Meta is scaling back on moderation because history always repeats itself — (New York Times)
Seven Must Reads of the Week
The Business of History is Booming thanks to podcasts and books — (Bloomberg)
Casual One: Why Netflix Looks Like That — (N+1)
The Video Game Industry is Finally Getting Serious About Player Safety — (Wired)
Meta Now Lets Users Say That Gay and Trans People Have Mental Illnesses — (Wired)
NYC Plans Crackdown on Obscured License Plates as Congestion Pricing Starts — (Gothamist)
Peacock is Launching Mini-Games and Short Form Video — (TechCrunch)
Bonus: I wrote about why trying to build miniature TikTok clones is not the best use of anyone’s time — (Posting Nexus)
Three Fun Time Wasters
Funny One — Help! My Partner Accidentally had Clown Sex (in Baldur’s Gate 3) — (Polygon)
Wild One — Why the Walmart Birkin Struck a Nerve — (Business of Fashion)
Endearing One — A rose’s journey from a Colombian farm to a parade float in Pasadena — (Los Angeles Times)
A Movie, a TV show, a Podcast, and a Book
Movie — A Real Pain
A Halftime Thesis: Zuckerberg’s Finally, and Inevitably, Gone Full Bro
I’ve got a longer piece on this coming, but I wanted to jot down a few thoughts on this week’s Meta news about its pullback on content moderation, and specifically on fact checking. “Slippery slope” is the phrase you’ll hear over the next several years, and it’s certainly the one bouncing around my head. One careless step in either direction sends you tumbling to the bottom of the slope, preventing any real hope of course correction in the middle of a downward spiral. Missteps are also often difficult to spot, and that’s what makes them so dangerous. Take Meta’s new policy, the reasoning for which C.E.O. Mark Zuckerberg delivered via a Reels video on Instagram. I’ve collected the most important parts below:
It’s time to get back to our roots around free expression and giving people voice on our platforms
Replace fact checkers with Community Notes
Simplify our content policies and remove restrictions on topics like immigration and gender that are out of touch with mainstream discourse
Bring back civic content
Work with President Trump to push back against foreign governments going after American companies and pushing to censor more
So much of the response I’ve seen to Zuckerberg’s announcement insists that he’s enacting these new rules to please President Trump, a submission that a near-autocratic-minded politician would find enormously endearing. Trump even blatantly told reporters that Zuckerberg probably introduced the new policy because of him. It’s not difficult to see why. I don’t doubt there’s an element of a man bowing to an incoming king, presenting a sacrificial lamb to ensure that the watchful eye never hovers over his own house.
That said, I also firmly believe that Trump didn’t cause Zuckerberg to change his policy; Trump allowed for it. Like Elon Musk and X’s radical shift in content moderation policies, these moves didn’t further free speech; they championed a certain kind of speech. This wasn’t a speech or a culture that Trump defined, but one that he tapped into. I’m not a political expert, nor will I ever claim to be one, but I am someone who writes about what commands our attention and the incentives for doing so. This wave of right-wing broism that seems to define almost every piece of political and cultural news on our phones these days isn’t so much an out-of-left-field rise as a permission slip, co-signed by the president of the free world and the tech C.E.O.s who control the flow of information and communication, for those who want to indulge in pushing the boundaries.
The emphasis is on broism, not right-wing. Not to get back into the whole “Joe Rogan won Trump the election” conversation, but some of this right-wing cultural influence seems to arise from politically pacifist influencers whose specialty lies in contributing to political discourse while coming across as apolitical. When the BBC told George Bernard Shaw that he couldn’t talk about politics and religion on its early airwaves because it didn’t want to produce controversial content, Shaw responded that politics and religion were all he ever talked about. Shit, we do too, especially when we claim we’re not. Everything is politics and religion. The more it’s nowhere, the easier it proliferates everywhere.
I want to be as clear as I can be here: I do not care about people posting their preferred political, economic, or religious stances so long as those stances do not sacrifice the safety of others. As it relates to Meta, now allowing people to say that trans and queer people are mentally ill is the type of genuinely harmful thinking that could theoretically result in physical harm in states like Florida or Texas.
Either Zuckerberg fundamentally understands this idea or he doesn’t grasp it at all. There are two ways of looking at Meta’s content moderation attempts over the last few years. The first: Zuckerberg invested in content moderation and it didn’t work. Semafor’s Reed Albergotti argues this in a new column, pointing out that all the “appropriate targets of moderation — false health information, foreign propaganda, bizarre conspiracy theories — flourished anyway.” Albergotti added that the “ultimate result of the era of content moderation was to embolden the political right and add credibility and reach to lies and half-truths on social media.”
The second, and my own personal point of view, is that the internet is far too large to ever meaningfully govern, but having some course of action is better than throwing caution to the wind and embracing pure anarchy. What we’re giving up, even if it wasn’t perfect and even if it didn’t feel like it accomplished much, is better than self-removed, invincible tech oligarchs wiping their hands of the mess altogether and watching as the digital civil war plays out from their yachts in unimpeachable waters.
Bluntly, it’s easy not to give a flying fuck about what happens next when you know you’re wrapped in an extra layer of protection by leaning into the next sitting president’s interests. It’s easy not to give a flying fuck when you know people are still going to use your products no matter what you do or don’t do. It’s easy not to give a flying fuck when your revenue isn’t taking a hit and these moves embolden a user base that may increase its overall engagement and activity. Lines on the graph go up. That’s all that matters in a country soon to be captained once again by a man who epitomizes all that defines late-stage capitalism.
My much smarter friend, Ryan Broderick, laid it out well in his Garbage Day post yesterday:
“But I can tell you where this is all headed, though much of this is already happening. Under Zuckerberg’s new “censorship”-free plan, Meta’s social networks will immediately fill up with hatred and harassment. Which will make a fertile ground for terrorism and extremism. Scams and spam will clog comments and direct messages. And illicit content, like non-consensual sexual material, will proliferate in private corners of networks like group messages and private Groups. Algorithms will mindlessly spread this slop, boosted by the loudest, dumbest, most reactionary users on the platform, helping it evolve and metastasize into darker, stickier social movements. And the network will effectively break down.
But Meta is betting that the average user won’t care or notice. AI profiles will like their posts, comment on them, and even make content for them. A feedback loop of nonsense and violence. Our worst, unmoderated impulses, shared by algorithm and reaffirmed by AI. Where nothing has to be true and everything is popular. A world where if Meta does inspire conspiracy theories, race riots, or insurrections, no one will actually notice.”
My friends like to make fun of me for tying everything back to Rome. But, conceptually, republics falling before autocratic dictatorships take their place is kind of the “in” story for 2025. One of the great theories about the fall of Rome’s republic, beyond the advent of political propaganda, political violence, and power-hungry, egotistical demagogues, is that it got too large to govern. Chaos follows scale. Growth as an end point can only result in uncontained self-immolation. Combine self-interest backed by immense wealth and power, influence over unprecedentedly large bodies of people, and the destruction of trust in institutions, and you ultimately get two things: unrest and faithlessness. Those lead to war.
The only grand weapon the public can arm itself with is information. Not posts. Information. Perhaps I’d be less pessimistic if the minor efforts companies like Meta and Google have made to support established and new independent media companies — the organizations that do actual reporting, actual fact checking, actual editing, and are held to actual standards — amounted to more. At some level, we can’t solely blame the tech companies for being the places people gravitate toward. The telephone usurped the telegraph for obvious reasons, and television did the same to radio. The internet, and the social internet, was the next wave of inevitable innovation, disruption, and communication evolution. But monopolizing all the world’s attention by disguising a platform as a source of all truth will only lead to ruinous information-seeking habits in the generations ahead.
Meta’s new policy stance dares to ask the unavoidable question: what happens when those information systems collapse into the very hole of unrest and faithlessness that people are trying to climb out of?
What then?