> I guess we're at the mod coverage problem - take volunteer mods; it's very common for mods to be asleep when a goat-related link is shared. When you get online 8 hours later, there's a page of reports.
When I consider my colleagues who work in the same department, they have very different preferred work hours (one colleague would even love to work from 11 pm to 7 am - and then go to sleep - if he were allowed to). If you ensure that you have both larks and "night owls" on your (volunteer) moderation team, this problem should be mitigated.
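As a rough sketch of why mixing chronotypes helps (in Python, with invented mod names and made-up preferred shifts in UTC): expand each mod's preferred window into the hours it covers, then look for the hours of the day with nobody online.

    # Minimal coverage check; the team and shifts below are hypothetical.

    def covered_hours(start: int, end: int) -> set[int]:
        """Expand a shift like 23-7 (which wraps past midnight) into hours."""
        if start <= end:
            return set(range(start, end))
        return set(range(start, 24)) | set(range(0, end))

    # A lark, a 9-to-5er, and the 11pm-7am colleague.
    mod_shifts = {
        "lark":      (5, 13),
        "nine2five": (9, 17),
        "night_owl": (23, 7),
    }

    online = set()
    for shift in mod_shifts.values():
        online |= covered_hours(*shift)

    gaps = sorted(set(range(24)) - online)
    print(f"Uncovered UTC hours: {gaps}")  # -> [17, 18, 19, 20, 21, 22]

Even this toy team still leaves an evening gap, which is the kind of thing you would want to see before recruiting the next volunteer.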
Then this comes back to the size of the network. HN, for example, is small enough that we have just a few moderators here, and it works.
But once the network grows large, it requires a lot of moderators, and you start running into problems of moderation quality across large groups of people.
I admit that ensuring consistent moderation quality is a harder problem than moderation coverage (the sleep-pattern problem ;-) ).
Nevertheless, I do believe that at least partial solutions exist for this problem, and many moderation-quality problems are, in my opinion, actually self-inflicted by the companies:
I see the central issue as companies having deeply inconsistent goals about what they do and don't want on their websites. And even where there is some consistency, they commonly don't communicate these boundaries clearly to the users (often for "political" or reputation reasons).
Keeping this in mind, I claim that each of the following strategies can work (though each will also infuriate at least one specific group of users, whom you will thus indirectly pressure to leave your platform), and each has been used successfully by various platforms:
1. Simply ban discussions of some well-defined topics that tend to stir up controversy and heated discussion (even when "one side may be clearly right"). This will, of course, infuriate users on the "free speech" side. People who hold the "currently politically accepted" stance on the controversial topic will also be angry that they are not allowed to post their "right" opinion on a topic that is a central part of their life.
2. Only allow arguments for one side of some controversial topics ("taking a stance"): this will infuriate people in the other camp, as well as those on the free-speech side. Also consider that for many highly controversial topics, which side is "right" can change every few years "when the political wind changes direction". The infuriated users likely won't come back.
3. Mostly allow free speech, but strongly moderate comments containing severe insults. This requires moderators whom the users can genuinely trust. Very commonly, moderators are more tolerant of insults from one side than from the other (or consider comments that are insulting, but within their own Overton window, to be acceptable). As a platform, you have to give such moderators clear warnings, or even get rid of them.
While option 3 (if done correctly) will pacify many people on the "free speech" side, be aware that it likely leads to a platform with more "heated" and "controversial" discussions, which people on the more "sensitive" and "nice" side likely won't enjoy. Advertisers are also often not fond of an environment with "heated" and "controversial" discussions (even if the platform's users actually like them).
> Simply ban discussions of some well-defined topics that tend to stir up controversy and heated discussion (even when "one side may be clearly right").
Yup. One of my favored options if you are running your own community. There are some topics that just increase conflict and are unresolvable without very active referee work (religion, politics, sex, identity).
2) This is fine? Ah, you are considering a platform like Meta, which has to give space to everyone. Don't know on this one; too many conflicting ways this can go.
3) One thing not discussed enough is how moderating affects the mods. Your experience is alien to what most users go through, since you see the 1-3% of crap others don't see. Mental health is a genuine issue for mods, with PTSD being a real risk if you are on one of the gore/child-porn queues.
These options are, to a degree, discussed and being considered. At the risk of sounding like a broken record, more "normal" users need to see the other side of running a community.
There are MANY issues with the layman's idea of free speech, and it hits real problems when it comes to online spaces and the free-for-all meeting of minds we have going on.
There are some amazing things that come out of it, like people learning entirely new dance moves, foods, or ideas. The dark parts need actual engagement, and need more people in threads like this who can chime in with their experiences and get others down into the weeds, problem solving.
I really believe that we will have to come up with a new agreement on what is "ok" when it comes to speech, and part of it is going to be realizing that we want free speech because it enables a fair marketplace of ideas. Or something else. I would rather it happen from the ground up than top-down.
"Then this comes back to size of the network. HN for example is small enough that we have just a few moderators here and it works.
But once the network grows to a large size it requires a lot of moderators and you start running into problems of moderation quality over large groups of people."
As you said, consistent moderation is a different problem from coverage. Coverage will matter for smaller teams.
There's a better alternative to all of these solutions in terms of consistency: COPE was released recently, and it's basically a lightweight LLM trained on applying policy to content. In theory, it can be used to handle both the consistency issues and the coverage issues. It's in beta, though, and needs to be tested en masse.
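If something like that pans out, the shape of the pipeline is simple: hand the model the written policy plus the content, and only escalate low-confidence calls to a human. A rough sketch of that pattern follows (in Python); COPE's actual interface may well differ, so the classifier, the policy text, and the thresholds here are all invented stand-ins.

    # Generic policy-triage pattern. `classify` is a stand-in for the model
    # call; a real deployment would invoke the LLM there instead.

    from dataclasses import dataclass

    @dataclass
    class Verdict:
        violates: bool
        confidence: float  # 0.0 - 1.0

    POLICY = "No severe personal insults directed at other users."

    def classify(policy: str, content: str) -> Verdict:
        # Toy heuristic for illustration only.
        text = content.lower()
        if any(w in text for w in ("idiot", "moron")):
            return Verdict(violates=True, confidence=0.95)
        if "stupid" in text:
            return Verdict(violates=True, confidence=0.60)  # borderline call
        return Verdict(violates=False, confidence=0.95)

    def triage(content: str) -> str:
        v = classify(POLICY, content)
        if v.confidence < 0.8:
            return "human_review"  # low confidence: keep a human in the loop
        return "remove" if v.violates else "keep"

    print(triage("You absolute moron."))         # -> remove
    print(triage("That is a stupid idea."))      # -> human_review
    print(triage("I disagree with your take."))  # -> keep

The appeal for consistency is that the same written policy gets applied the same way at 4 am as at noon, with humans reserved for the genuinely ambiguous cases.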