I think this discussion is important, because something needs to be done. It's a scary situation.
Yes, the videos are kids being themselves, and typically uploaded by the children. However, then these predators show up pretending to be other children, and they talk these kids into doing more explicit and inappropriate things. They ask children to film a video sucking on a lollipop, they ask them to make a video showing how to do the splits, or they ask them to make a video playing twister with their sister. I even saw them asking children to do the "toothpaste challenge", where they try to get them to fill their mouth with toothpaste so it looks like cum. If I start typing "toothpaste challenge" YouTube autocompletes "toothpaste challenge drool", "toothpaste challenge tongue", "toothpaste challenge little girl", "toothpaste challenge girls".
These kids don't know any better. They make 50 videos that all have 50 views, then they make a yoga video and it gets recommended over and over again to predators, and they end up with one million views. Then the above situation unfolds where they're requested to make more inappropriate content, or the children realize their gymnastics videos are by far the most popular, so they make more, and more. They think they're making tutorials for other children, but the other side of the camera is just thousands of predators watching and pushing them to go further. Then they start trying to trade contact information with the children to get them off the site.
I don't know how you fix the problem, but it is a problem. We're talking about videos with millions of views here. It's not one or two people trying to exploit these children, it's literally hundreds of thousands, and I think the kids need some help protecting themselves.
> However, then these predators show up pretending to be other children, and they talk these kids into doing more explicit and inappropriate things.
So instead of trying to ban the videos, restricting user freedom to prevent them, and deploying automation to recognize them, why don't we ban the predatory behavior, set system rules to prevent it, and/or direct AI at identifying it?
Blocking comments only addresses direct contact between predators and children.
As I said, some of these children will make 50 videos with no viewers, but then they create a gymnastics or yoga video that goes viral with predators.
Even if the comments are disabled, these children realize what subject matter grows their subscribers and view count the fastest, so they keep producing more of those videos. Then another kid sees their channel, and notices their yoga video got one million views, so now they try making yoga videos to replicate it. To their surprise, they also gain traction, and the cycle continues.
So, even without a single comment, you still end up with thousands of children making tutorials on how to do the splits, while they cannot begin to comprehend what is taking place and how they're being exploited.
> Comments are the only visible indication of the predatory behaviour
Strange, because the post describing the limitations of blocking comments identified other visible manifestations of predatory behavior.
Admittedly, any single act may be difficult to distinguish from acceptable behavior, but there is no reason an automatic detection system needs to consider each action in isolation.
I'll tell you why, and it's going to be unpopular: money. Millions of views === money, no matter WHAT or WHY it's being viewed (remember that many of these videos are monetized). Do you seriously think Google encourages its developers to build systems that will reduce viewing? The only way I see this happening is under duress, i.e. profits suffer because of bad publicity or similar social pressure.