
I don't understand what you mean by your last paragraphs, but I disagree that there's centralized moderation. On all the social media I'm aware of, moderation is mostly distributed and weighted. Flagged Instagram posts go to "a team" for review, but they weigh that against the account and the content. Reddit moderation is per-subreddit. Facebook moderation is a combination of group moderators and site moderators.

In any case, I don't think moderation is the deciding factor, because moderation is for commenters. Something like 90% of people just lurk. And that's where ads, influencers, comments, and everything else are targeted.

How can we get more people viewing TikToks or YouTube Shorts? How can we get more people to sub to our Patreon? How can we get more followers or likes? None of this has anything to do with moderation. It's the math of "what can we post that will make people 'engage' with their eyeballs and their clicks?" That's what matters. Partly because eyeballs equal ad dollars, but also because eyeballs equal influence. The science of manipulation is getting you to see what I want you to see, and getting you to come back for more. The more you come back, the more you're part of my in-group.

Another example is astroturfing. I can't remember if it started for commercial or political gain, but the point is the same: post some fake shit so that people believe there's a genuine grassroots movement and get behind it. Whether you're Vladimir Putin or DuPont, you benefit the same way: manipulation of public perception through the science of social media disinformation.



I don't mean there's a single centralized moderating authority overseeing everything, but rather that the general tools and mechanisms used for moderation on the Internet are centralized and undemocratic. They produce groups with authoritarian power structures and norms. Those groups grow over time with no limit on their size. When they get large enough, they fight over which one gets to run a country. This is how modern democratic countries can turn authoritarian very fast, and it's already happening.

When I say "group", I don't mean an actual Facebook group or subreddit (though it could be one). I mean a group of intellectually/ideologically homogeneous people. They may be distributed across many subreddits and comment sections. Forums/subreddits/servers can be separate entities in form, but not in substance. Two Discord servers that moderate content the same way are the same group in this context.

Moderation is not just for posters. It also affects lurking viewers because it changes what they will see. If a post is deleted by a moderator, then that moderator has decided for the viewers what they can and cannot see.

Up/down voting (aka likes/dislikes) is a hidden form of moderation as well. People's votes decide what other people are most likely to see, because upvoted posts rise to the top of the feed. Recommendation and ranking algorithms do the same thing.
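
For a concrete sense of how votes translate into visibility, here's a minimal gravity-style scoring sketch, modeled on the oft-cited approximation of HN's ranking formula. The constants and the exact formula any real site uses are assumptions here, not production code:

    # Gravity-style ranking sketch: votes push a post up, age pulls
    # it down. The formula and constants mimic the oft-cited
    # approximation of HN's ranking; they're assumptions, not any
    # site's real code.
    def rank_score(votes: int, age_hours: float, gravity: float = 1.8) -> float:
        return (votes - 1) / (age_hours + 2) ** gravity

    posts = [
        {"id": 1, "votes": 120, "age_hours": 6.0},
        {"id": 2, "votes": 15, "age_hours": 0.5},
    ]
    # The feed is just posts sorted by score, so every vote is a
    # small moderation decision about what everyone else sees first.
    feed = sorted(posts, key=lambda p: rank_score(p["votes"], p["age_hours"]),
                  reverse=True)

The point is that the sort happens once, server-side, for everyone: a single score function decides the order of the feed for all viewers at once.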

I'm not making a statement about who has nobler goals, be it the ad companies or Putin or the US gov or the people here on HN. I'm saying that the concept of centralized moderation on the Internet is itself the problem. Regardless of what or whose goals these tools serve, they're bad because they coagulate people into intellectually and ideologically homogeneous groups, and there is no limit on group size due to the practically infinite connectivity of the Internet. This will create nasty real-world consequences in the long run. But we can defuse this by moving all moderation and ranking to the client side.
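
As a rough sketch of what client-side moderation could look like, assuming a server that hands back an unranked, unfiltered feed. Every name and rule below is invented for illustration:

    # Client-side moderation sketch: the server returns the raw
    # feed, and filtering and ranking live entirely on the user's
    # machine. All names and rules here are hypothetical.
    from typing import Callable

    Post = dict  # e.g. {"author": ..., "text": ..., "age_hours": ...}

    def my_filter(post: Post) -> bool:
        # The user writes or installs their own rules; no central
        # moderator decides this for everyone at once.
        return post["author"] not in {"known_spammer"}

    def my_rank(post: Post) -> float:
        # Ranking preference is local too: this user prefers newer
        # posts and ignores vote counts entirely.
        return -post["age_hours"]

    def build_feed(raw_feed: list[Post],
                   keep: Callable[[Post], bool],
                   score: Callable[[Post], float]) -> list[Post]:
        return sorted(filter(keep, raw_feed), key=score, reverse=True)

Since different users can run different filters and rank functions, no two clients need to converge on the same feed, which is exactly what breaks the formation of one giant homogeneous group.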



