
> In the short term, I don't see how we force them to host actively harmful content without recategorizing their role in society.

This recent flip from "Facebook et al aren't doing enough, put the screws on them" to "we can't force Facebook not to censor" is quite disingenuous. These companies and their founders are famously liberal, and were dragged kicking and screaming into ever more heavy-handed moderation, by both public opinion and veiled threats of regulation from politicians. There are plenty of statements on the record of eg Zuckerberg saying what was obvious to most of us: nobody, including Facebook, wants them in the position of deciding what is true and what is false.

Leaving aside whether content moderation is a good thing, let's not pretend that the situation here is that Facebook really wanted to become the arbiter of truth and misinformation and we can't stop them from being so.



They already were at that point, regardless of what people claim.

Facebook had the ability to paint whatever picture they wanted as truth by controlling what their users saw at what time. And they utilized it proudly long before COVID to increase engagement.


Sure, and long before Covid, that was a valid (and much-expressed) criticism of them, as well as a general criticism of using non-federated platforms.

But expanding explicitly into deciding what users are allowed to see and express to each other is a million times worse than the type of banal malevolence that arises from "show people what they like to see".


There's the fundamental difference! I love finding the "fundamental difference". The crux, the place where philosophies diverge, where the understandings break down.

I fundamentally disagree with the premise that "keeping harmful, intentional disinformation away from people" is worse than "letting people unknowingly subscribe to disinformation".

I would argue, perhaps, that there should be open policies on what topics are off-limits. That Facebook, et al. should have to document to the public what "viewpoints" and disinformation they limit - and furthermore, that more of these content-display algorithms should be auditable, competitive secrecy be damned.

I wouldn't call hundreds or thousands of people dying due to disinformation-backed vaccine skepticism "banal", either.


> "keeping harmful, intentional disinformation away from people"

This is begging the question though. It's assuming that "harmful, intentional disinformation" is 1) well-defined and 2) always going to be determined by those with your best interests at heart. It relies on a blind faith in the fundamental and eternal moral purity of Facebook and other mega-corporations. I wholeheartedly disagree that they fit this mold.

This is true even if you turn your religious passion towards government institutions instead of Facebook. Do you similarly agree that criminal trials and due process are unnecessary? After all, the same pre-hoc confidence in your ability to categorize without rigor leads to "Why would we want criminals going free due to technicalities and lawyer's tricks?". I assume you don't agree with this statement, because in that context, you've internalized the idea that institutions are both corruptible and flawed even when uncorrupted, and it behooves us as a society to have some epistemic humility instead of pretending that Truth is carved onto clay tablets and handed down from God.

If you've paid any attention to the pandemic, you'd know that even a situation where government is in full control of defining "misinformation" can be consistently and significantly misleading. "Mask truther" used to mean someone who thought wearing masks was a good idea for preventing spread, discussing the lab leak hypothesis was "misinformation", the vaccines were "rushed and unsafe" until Trump was out of office, etc etc etc. It's hard to pick a topic where it wasn't trivial to front-run official advice by several months, repeatedly, over the entire pandemic.

It's a bit of a paradox: The very certainty and (illusory) Truth-with-a-capital-T that you take for granted is forged through a process of skepticism, second-guessing, and constant poking at beliefs. Hamstringing that process is like killing the golden goose to make more room for all the eggs you plan to have.


Your entire "if you've paid any attention" paragraph is questionable.

I never heard people get mocked at any point for wearing a mask during the pandemic, even in the beginning when the CDC said it wasn't necessary.

In my PERSONAL opinion, the "lab leak" theory was never misinformation, but uninformation: Discussion pushed forward by right-wing outlets to generate an enemy, a "they" to blame. They used it as a cudgel without evidence against Anthony Fauci, and they beat the shit out of Asians because of it. Most importantly, it was completely irrelevant to the extent of us debating it, when the focus was on containing a disease for which there was no cure or vaccine.

And while there was some public skepticism about the pace of the vaccine process, I likewise don't think there was a "switch-flip" of trust in it like you suggest. When it came time to take it, everyone who wasn't a vaccine skeptic already went to get it when they could, and clearly the Trump admin was in charge through the development of the vaccine.

---

There is also a difference between someone posting "I do not trust the government not to be incompetent, or not to run a mass trial on people" (though I think those people are nuts re: the vaccine), and someone saying "I know that Bill Gates and Anthony Fauci put microchips into a bone-melting serum that will activate in a year!"

It's a huge, multi-faceted issue. In the end, the problem to TRY to solve in coming years will be sifting legitimate skepticism and good-faith debate from nation-state/Fox-News lies that intend to manipulate you into anger and division - and deciding whether private entities have an obligation to carry harmful information across their channels.



