1: No spam or advertising. This basically means no linking to your own content on blogs, YouTube, Twitch, etc.
2: No bigotry or gatekeeping. This should be obvious, but neither of those things will be tolerated. This goes for linked content too; if the site has some heavy "anti-woke" energy, you probably shouldn't be posting it here.
3: No untagged game spoilers. If the game was recently released or not yet released at all, use the Spoiler tag (the little ⚠️ button) in the body text, and avoid typing spoilers in the title. Also avoid openly discussing major story spoilers, even in old games.
Apart from what you're reading into my words, I said that if someone is harassing you or talking about, let's say, the things they did with their daughter yesterday, you can report them and have a computer look into it instead of a human.
Whatever privileges you have in your own Discord, you can't kick just anyone in every place. You normally need privileges or a moderator to do it, and my idea was to use AI to analyze the reported material.
I completely understand the sentiment of protecting children, but under that same argument you can push the most dystopian, intrusive, and overreaching legislation imaginable. It's the old balance of freedom versus safety: we can't have complete safety without giving up all freedom.
And I think constant AI-driven monitoring of everything people say in the general vicinity of a microphone is very dystopian, and that would be the eventual outcome of this.
I'm just gonna repeat myself since this is the most common answer I get in those topics:
The vast majority of people are being listened to, analyzed, and manipulated on a daily basis by far, far worse actors. Storing one minute of VC for one minute, accessible only to this hypothetical bot *if* someone reports them, with the reporter facing consequences for wrongful reports, is not comparable to real privacy threats.
You don't need to repeat yourself (nor be this condescending); I am well aware that this is happening to some degree already. That doesn't mean I have to happily concede the little that is left.
You're again reading something into my words that I didn't say. Maybe try not to play the victim in every comment. It's abrasive.
It's not happening "to some degree." It's happening left, right, and center. Denying that a computer would help with VC moderation does not help at all.
Good day.
Right back at ya buddy. I’m not putting words in your mouth.
And no matter how many times you repeat it, my Discord call doesn't constitute a threat to public safety.