The Instigator
Pro (for)
0 Points
The Contender
Con (against)
0 Points

Should sites used for gaming have tighter restrictions on the social media interface?

Post Voting Period
The voting period for this debate has ended.
After 0 votes the winner is...
It's a Tie!
Voting Style: Open | Point System: 7 Point
Started: 1/2/2018 | Category: Games
Updated: 3 years ago | Status: Post Voting Period
Viewed: 498 times | Debate No: 106316
Debate Rounds (3)
Comments (1)
Votes (0)




I believe that we should have tighter restrictions on what can be said in online gaming chats and what people can see, since the number of people being targeted on gaming sites grows larger every year as new games come out. Here are some quotes and proof I've found.

Grooming "is the process by which a child predator gains the trust of a victim by building a relationship with the child and then breaking down his or her defenses. After the predator has built a foundation of trust with the child, he begins to make some sort of sexual contact with the child, whether it is sending explicit messages and photos or attempting to meet in person to commit sexual abuse or rape."

"In fact, in 2012, Microsoft, Sony, Disney, and Electronic Arts shut down the accounts of over 3,500 registered sex offenders in a mission called "Operation Game Over."

"They highlight research suggesting 200 million girls and 100 million boys across the globe will be sexually victimised before they reach adulthood - and a significant number of those will have been lured online."


If I am interpreting your meaning correctly, you think that certain content should be automatically blocked by game servers.

Before I make my point, let me be clear that I am in NO WAY WHATSOEVER supporting people who use online chat interfaces or video games to manipulate children, or manipulate anyone for that matter.

If online chat interfaces and video game servers were programmed to block certain content, that would be pointless. People who want to talk dirty online will always find a way around restrictions. If, for instance, a server is programmed to block a certain dirty word, those people will find a MILLION loopholes and ways around it, because a computer can only block a specific word; it cannot look for meaning. I don't even talk dirty online, and within minutes I can think of countless alternate spellings for curse words that would still let the other person know what I intended.

Now wait! Before you tell me "a restriction with loopholes is better than nothing," let me suggest an alternative, which is much better than moderators wasting time and energy setting up restrictions.

Chat sites and game servers should introduce the following safety measures:

-Dirty words, and even alternate spellings like the examples above, can set off an automatic flag that alerts a human moderator working for the company, who then reads the conversation and decides whether the comment was inappropriate. If the moderator decides that it definitely was, the offender receives a warning, a suspension, or a deactivation, depending on how inappropriate or persistent the offender is. If the moderator is unsure, he can send a message to the potential victim asking for their perspective. The company can decide the specifics of which offenses result in which punishments.

-When creating an account, the server would require the new user to submit valid age verification (like a credit card) in order to be treated as an adult. If a new user does not have valid age verification, the account has to be linked to a parent account that DOES. The parent account can then approve all potential conversations, video chats, and incoming and outgoing messages, depending on the settings the parent chooses.

-(this is just an idea) EVERY user that creates an account MUST watch a mandatory video about what constitutes inappropriate behavior and how to deal with it and report it to the proper authorities, whether the victim is a child or an adult.
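The first measure above (automatic flags for dirty words and alternate spellings, reviewed by a human moderator) could be sketched roughly like this. The blocklist, the substitution table, and the function names are all hypothetical; a real moderation pipeline would be far more involved:

```python
import re

# Hypothetical blocklist; a real service would maintain a much larger one.
BLOCKED_WORDS = {"badword", "meetup"}

# Undo common character substitutions so alternate spellings
# like "b4dw0rd" or "b@dword" normalize back to the blocked form.
SUBSTITUTIONS = str.maketrans("013457$@", "oleastsa")

def normalize(text: str) -> str:
    """Lowercase, reverse common substitutions, and strip punctuation."""
    text = text.lower().translate(SUBSTITUTIONS)
    return re.sub(r"[^a-z ]", "", text)

def should_flag(message: str) -> bool:
    """True if the normalized message contains a blocked word.

    A flag only queues the conversation for a human moderator to read;
    nothing is blocked or punished automatically.
    """
    return any(word in BLOCKED_WORDS for word in normalize(message).split())
```

The human review step is what makes this workable: the normalizer will inevitably miss some spellings and falsely flag others, so it only decides what a moderator looks at, not who gets punished.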
Debate Round No. 1


While I agree that the safety measures in your post would be more secure, some games and sites already have this feature, and still people find loopholes and ways around it. For example, say that hambo657 is a sex offender and I am a child; here is a sample of what you might see:
hambo657: Hey kid.
MrTako: Yas? what is it
hambo657: Wanna come 2 my house so we can meet in real life
MrTako: Yay! now we can be friends irl
MrTako logged off

That is an example of how people can bypass an inappropriate-content filter. I went on some kids' gaming systems and found that Minecraft and Roblox have the most of this kind of bypassing.

There are many ways to bypass this system, so I propose that we simply get rid of the chat, or use a select library of phrases such as "Hello" or "Get to the point" so that there is no way a predator could abuse it. Outside of gameplay, there could be a room chat limited to very few words that blocks all configurations of inappropriate words. Also, even if an offender were caught by a moderator, he could just make a new account, friend the person he was targeting, and claim to be hambo657 by playing all the same games, eventually regaining the target's trust. The child won't know what's going on, because the children who are targeted are usually very young.
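A select-phrase chat like the one proposed here is trivial to build, which is part of its appeal; here is a minimal sketch (the phrase list and function name are made up for illustration):

```python
# Hypothetical preset-phrase chat: the client may only transmit an index
# into this fixed library, so free-form text never reaches other players.
PHRASE_LIBRARY = ["Hello", "Good game", "Get to the point", "Nice one"]

def send_phrase(index: int) -> str:
    """Return the phrase for a given index; any other input is rejected."""
    if not 0 <= index < len(PHRASE_LIBRARY):
        raise ValueError("unknown phrase index")
    return PHRASE_LIBRARY[index]
```

Because only indexes cross the wire, there is no spelling trick that can smuggle new words into the conversation; the trade-off, as the next round argues, is that players can say almost nothing.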


I see your point, and I had to think about this idea for a while. Then I realized that with the hypothetical parent (a) approving their kid's ability to talk to hambo657 and (b) monitoring all incoming and outgoing messages, hambo657's attempt to prey on the kid would be shut down before it started. Limiting chat sites to just select phrases limits freedom of speech, and besides, I don't want to use a website where I can't type what I want. Even though I'm not preying on children, I still wouldn't want to be restricted to select phrases. Think about trying to have this debate built out of generic phrases like "How are you?" It would be impossible! Even if you're just trying to discuss attack strategy in a video game, it would still be really frustrating.

You also forgot something pretty huge about the video game world: most gamers communicate by voice on headsets. You can't restrict that, but you CAN let a parent approve the child's right to talk to specific people and listen to their game sessions, live or afterward.
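The parent-account flow argued for in this round might look like the following toy model; all names and fields are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    """Toy model of the linked parent/child accounts described above."""
    username: str
    age_verified: bool = False           # e.g. confirmed via a credit card check
    parent: "Account | None" = None      # required when age_verified is False
    pending: list = field(default_factory=list)

def can_send(sender: Account, text: str) -> bool:
    """Adults send directly; a child's message waits in the parent's queue."""
    if sender.age_verified:
        return True
    if sender.parent is None:
        raise ValueError("unverified accounts must be linked to a parent account")
    sender.parent.pending.append((sender.username, text))
    return False
```

The same queue idea extends to voice: recordings of a child's game sessions could land in the parent's queue for review, live or afterward.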
Debate Round No. 2


While I agree with that, most sites don't require users to have their parents enable them to chat with people, though some may. So I agree that both of the resolutions we proposed would be sufficient; there might just be no way to fully restrict the social media interface for the time being, given our current forms of communication.

Can't wait to see how this debate ends up.
Good luck.


I would like to make the final point that the level of restrictions you originally proposed would discourage people from using the site, and also limit free speech.

Yes, I realize that if you sued a website for only allowing preset statements, the website would probably win since you still get to express opinions. However, I think that it still limits free speech because not everyone will be able to say what they want.

If you want freedom of speech, hate speech will come with it. It's a package deal.
Debate Round No. 3
1 comment has been posted on this debate.
Posted by Minddagger 3 years ago
interesting.... wish you luck in the debate
No votes have been placed for this debate.
