
Here's a possible system: replace "random people" with "people with a solid track record of previous correct reports". When something gets reported, human moderators categorize it as either "correctly reported, and should be taken down", "incorrectly reported, but possibly a misunderstanding or a borderline case", or "a clearly bad-faith report against content that no reasonable person would actually believe breaks the rules". Keep track of how many reports from each user end up in each category.

If almost all of your reports are in the first category, then you're considered trusted, and if enough trusted users report a post, then it can be automatically removed before a human moderator sees it. If you haven't reported anything before, or too many of your reports are in the second category, then your report only helps to get the submission in front of a human moderator and doesn't directly contribute to it being removed. If more than a handful of your reports ever end up in the third category, you get banned for abusing the report system.
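The proposal above can be sketched in code. This is a minimal illustration, not anyone's actual implementation; the thresholds (how many trusted reports trigger removal, what counts as "a handful" of bad-faith reports, how much history makes a user trusted) are hypothetical placeholders, since the comment leaves them unspecified.

```python
from collections import defaultdict

# Hypothetical thresholds -- the comment above does not specify exact numbers.
TRUST_RATIO = 0.9        # fraction of reports that must be "correct" to be trusted
MIN_REPORTS = 10         # history needed before a user can become trusted
AUTO_REMOVE_VOTES = 3    # trusted reports needed for automatic removal
BAD_FAITH_LIMIT = 3      # more than this many bad-faith reports -> ban

CORRECT, BORDERLINE, BAD_FAITH = "correct", "borderline", "bad_faith"

class ReportSystem:
    def __init__(self):
        # Per-user counts of how moderators categorized their past reports.
        self.history = defaultdict(lambda: {CORRECT: 0, BORDERLINE: 0, BAD_FAITH: 0})
        self.banned = set()
        self.pending = defaultdict(set)  # post -> set of trusted reporters

    def is_trusted(self, user):
        h = self.history[user]
        total = sum(h.values())
        return total >= MIN_REPORTS and h[CORRECT] / total >= TRUST_RATIO

    def moderator_verdict(self, user, category):
        """Record how a human moderator categorized one of user's reports."""
        self.history[user][category] += 1
        if self.history[user][BAD_FAITH] > BAD_FAITH_LIMIT:
            self.banned.add(user)  # abused the report system

    def report(self, user, post):
        """Returns "removed" if enough trusted users reported, else "queued"."""
        if user in self.banned:
            return "ignored"
        if self.is_trusted(user):
            self.pending[post].add(user)
            if len(self.pending[post]) >= AUTO_REMOVE_VOTES:
                return "removed"  # auto-removed before a moderator sees it
        return "queued"  # still surfaces the post to a human moderator
```

Untrusted reports only queue the post for human review; only an accumulation of trusted reports removes it automatically, and moderator verdicts continuously update each reporter's standing.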



Is that not already the case? Do you think Facebook doesn't already weight reports by the historical quality of the reporter?

That still seems like a violation of "No content should ever be taken down automatically just because a bunch of random people report it", as you wrote above.



