Closely monitor all use - safety concerns
Discord is a fully functional audio and text chat platform that allows kids to share images, videos, and text, and to participate in live audio chats. The platform is built around chat channels and servers that people can create for various interest topics. It integrates with game platforms like Steam and Xbox (allowing a linked gamer profile) and can be connected to social media accounts. My review here is about kids who are gamers and who use Discord to communicate with other kids around the world who play the same games. The platform is heavily promoted to kids for this purpose, and that's a source of concern for me as a parent for a few reasons.
On the surface of things, it's great that kids can interact with kids around the world and chat as they play. I have no problem with that. The thing is, all of the profiles are anonymous handles, and you have no idea who your kid is talking to or how old they are... nothing. A perfect environment for grooming, if you ask me. On that level, it's flat-out dangerous. There's no question about that at all.
After seeing my 11-year-old son use Discord, I decided to log on to his account to check it out. I spent a few hours looking at what was going on and also checking out the platform itself. I'm a pretty tech-savvy parent, so I can find my way around this stuff quite easily. I was pretty worried by what I came across. There's quite a bit of porn and swearing, and kids really testing boundaries with each other, trying to outdo each other by being 'tough' with this stuff.
The big problem is the anonymous profiles and the fact that there is no effective content filtering at all. Kids can create servers around a topic or theme and then add text and voice channels with sub-themes, which directs discussion on various topics. As a user running a server, I (and anyone I make an admin) can add channels and invite other kids to join. I could join as an adult, and that worries me a lot. No age checks or verification processes are in place.
Discord claims to have content filtering. Well, technically it does, but it is controlled by the user, which is a totally ridiculous idea when it comes to kids. A child can elect to filter their messages and have them 'scanned' before they arrive. But they can turn it on or off themselves! Why bother with it at all? Admittedly, a 'virtuous' child could create a channel and turn on a filter at the server level, but none of the servers I looked at had that turned on.

The server functions also allow channels to be created that are NSFW (which most of us understand as Not Suitable for Work), but the platform translates that to kids via a cartoon character called 'Wumpus', so NSFW becomes 'Not Suitable for Wumpus'. NSFW is an 18+ concept signalling porn and other adult content. I find it pretty amazing that a platform like this can promote the idea to children via a cartoon character. It's normalising stuff like porn for kids who are too young to understand it. I think this is irresponsible, and I feel a need to alert other parents to it, as it signals a kind of sick culture in the company. The attitude seems to be: 'we're being responsible because we're creating spaces away from the main channels for kids to post NSFW material.' Anyone can view these channels. The only thing that happens is a click-through warning that the content is ahead; you certify that you're 18+ with a single button click (no age verification required) and you're in. Yeah right... that's one hell of a control!! Maybe they should go and take a look at how Roblox handles this stuff to see how they might do it better, and maybe they should stop encouraging kids in this way.
So if you are going to let your child use this platform, you will need to commit to monitoring and reading the messages they are sending and receiving from time to time. You'll also need to decide whether they can be on servers that have NSFW channels; my audit of my son's servers turned up NSFW channels in most instances. Beyond this channel-naming convention, there's nothing to stop kids posting this material in regular channels if the 'kid admins' allow it. A lot of the time the admins are posting this stuff themselves. My takeaway is that you need to view this platform as entirely unregulated and make your own decisions about whether to allow participation and how to monitor your child's activity (I would suggest closely and regularly).
Another important thing to mention, which relates to game addiction, is that a lot of the servers have a rank system of seniority that the kids are obviously keen to climb. It gives them a sense of status. These ranks are usually given out according to the number of hours a user spends in the channel: you need to be seen as a 'worker' and put in the admin time on the server in order to earn ranks and promotions. This is a huge concern for me, as it incentivises excessive screen time; kids become obsessed with getting back on the server to rack up their hours and do admin tasks. Imagine having an unpaid job that ties you to a computer, where your hours are logged and clocked and used as a sign of status. I think that is extremely unhealthy. I am going to have trouble trying to wind my son back from all this.
I don't want to leave a review that mentions only the negatives (which are considerable), so what I can say is that this kind of platform does teach kids server moderation commands and technology, which I guess is a skill that can be used productively. There's also something good about kids being able to share gaming experiences in real time. I just wish the platform designers took some responsibility for making the platform kid-safe. Rather than making it safe, it seems like they are actively normalising the kinds of behaviour that can be really problematic for children.
I hope this review is helpful. My advice: go in and take a good look through your child's channels and messages. Then come back here and write a review to help others!
This title contains:
Privacy & Safety