Predators, disturbing content, identity theft attempts, and other things you wouldn’t let them experience in real life are all waiting for them.
Shockingly, 1 in 5 U.S. teenagers say they have been approached in a sexual way by strangers online; only 25% told their parents.
As a parent, that’s terrifying.
Young people have access to a nearly infinite pool of content thanks to websites like YouTube and Reddit. Worse, the Internet, the rise of smartphones, and the culture of social media make that content accessible from anywhere. No matter what you think of it or how much you know about it, platforms like YouTube are changing how children grow up.
While that’s not always a bad thing, kids have more access to new sources of information, some good and some bad. Finding useful information on YouTube is easy, but so is stumbling across harmful or even malicious content.
How can you restrict what your children watch on YouTube?
The answer to this question isn’t simple. Fortunately, there are options available to parents when it comes to controlling YouTube and Internet access.
The number one priority for parents should be to teach their children to protect themselves online and use social media safely. As a parent, you can’t monitor everything your child does all the time, and older teenagers might want (and genuinely need) some digital privacy. Instead, focus on being proactive about their safety while teaching your children how to protect themselves online.
In 2015, the Pew Research Center found that 92% of teenagers go online daily and that 75% own a smartphone. An Australian study later reported that 95% of 8- to 11-year-olds have accessed the Internet in the last month.
The most common destinations were YouTube, Facebook, and online games like Roblox. In fact, email and even text messaging have taken a backseat to social media for many younger users. Platforms like Instagram, YouTube, and Snapchat provide a nearly infinite supply of content.
Sites like Instagram and Twitter not only let you connect with close friends and family members; they also open lines of communication with complete strangers. Nearly half of Facebook users accept friend requests from people they’ve never met, and most users are familiar with stories of social media or chat requests gone wrong. But did you know that YouTube allows comments on most videos, and that those comment sections can contain links posted by predatory adults hiding behind fake profiles?
Many modern devices, apps, and web browsers offer parental controls that restrict what kids can access, but did you know that many antivirus suites also include parental controls? That’s two layers of protection from a single installation. Some popular options include:
A quick search will show you which antivirus software includes parental controls, but in our experience, the best way to encourage safe Internet browsing is education and conversation.
Many platforms, like Netflix, have built-in parental controls that restrict content behind a passcode. Netflix supports kid-focused profiles that block mature shows, and iPhones have parental controls in their Settings menu.
Some tech-savvy kids can bypass parental controls by installing certain software. You can prevent this by giving them non-administrator user accounts on your operating system; most operating systems only allow administrator accounts to install new software.
Finally, educating your kids on some smart browsing habits goes a long way toward ensuring their safety. You want to teach them to:
These are just general guidelines. It’s worth looking into the specific services and platforms your children enjoy using the most to see if they offer any customized parental controls.
Facebook is still one of the most popular social media platforms, but younger audiences are increasingly turning away from it. If your child does use Facebook, chances are the profile plays a big role in fitting in with friends at school. Not only are the usual online dangers present, but what your child posts can affect his or her prospects down the road: college admissions officers and job recruiters have reported that content on Facebook and other channels can hurt an applicant’s chances.
What does this have to do with streaming videos? Facebook introduced a streaming feature called Facebook Live. These videos are broadcast live to the user’s friends list, but a recording can be shared afterward, meaning raw, unedited footage of anything the user chose to broadcast can reach complete strangers. Unfortunately, Facebook has come under fire for some of these videos, including broadcasts of a gang rape and of the aftermath of the fatal police shooting of Philando Castile.
YouTube modernized access to digital media, and the popular video-sharing website is used by people of all ages. Many parents rely on YouTube to entertain or instruct their children, and while this works for many, there is a lot of troubling content on YouTube that masquerades as kid-friendly.
YouTube has everything: children’s shows, toy reviews, video game footage interlaced with player commentary (known as “Let’s Plays”), and so much more. YouTube’s content creators are more than eager to provide content that appeals to young children. The official Peppa Pig YouTube account, for instance, streams episodes of the show live for free.
But you can’t expect everything to be kid-friendly. Plenty of malicious and racy content can be found on YouTube. While users have the ability to “flag” objectionable videos and YouTube offers a kids’ mode, these solutions aren’t perfect. The only way to ensure safe browsing for very young children is to be aware of what they are watching.
Alongside the kid-friendly channels, it’s important to know how to recognize the questionable content your child might be exposed to on YouTube. Countless channels appear harmless on the surface; a deeper look reveals how disturbing their content actually is.
Logan Paul is a popular YouTuber with millions of subscribers, including many younger fans. The online celebrity experienced controversy in late 2017 after posting a video of a dead body while exploring Japan’s infamous Suicide Forest.
He was heavily criticized for exploiting the man’s suicide for his online show. Logan eventually took down the video and issued an official apology, but many deemed it “too little, too late.”
Another YouTube controversy is the “Elsagate” scandal, in which videos featuring iconic child-friendly characters like Spider-Man, Elsa from Frozen, and Peppa Pig surfaced with disturbing, decidedly non-child-friendly themes. These videos were produced without permission from the rights holders, yet many children watched them daily.
While some of these videos were obviously crude imitations, others used animation close to the official style, making it difficult for kids to tell the difference. In these videos, the characters would often:
There are also popular videos in which actors dress as Disney characters and create live action skits. Once again, these videos are full of disturbing content that is far from kid-friendly.
Many of these questionable videos ranked well because their creators knew how to manipulate YouTube’s algorithm. A simple search for Spider-Man or Elsa would often surface them near the top of the results because they had gone viral.
Another issue parents have to deal with is suggested content. Many children are still building their attention spans, and after a few minutes in one video, they might click a suggested video from the sidebar. Suddenly, they’re falling down the rabbit hole that is YouTube’s suggested content feature.
YouTube displays suggested content based on a set of ranking factors: if a video is extremely popular and at least somewhat related, it will be displayed. So, if your children click suggestions after watching a Frozen clip, it won’t take long for pregnant Elsa and Spider-Man videos to pop up in their feed. For parents, this means children are being served content that even adults would find creepy.
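To make that pattern concrete, here is a toy Python sketch of the kind of “popular and somewhat related” scoring described above. It is purely illustrative: the scoring rule and the example videos are invented for this demonstration and are not YouTube’s actual recommendation system.

```python
# Toy illustration of "popular and somewhat related" suggestion scoring.
# NOT YouTube's real algorithm -- the formula and the data are made up.

candidates = [
    # (title, view count, relatedness to the video just watched, 0-1)
    ("Official Frozen clip", 5_000_000, 0.9),
    ("Toy unboxing video", 800_000, 0.4),
    ("Knock-off 'Elsa' animation", 20_000_000, 0.6),
]

def suggestion_score(views: int, relatedness: float) -> float:
    """Weight popularity by relatedness: a viral video can win even if only loosely related."""
    return views * relatedness

ranked = sorted(candidates, key=lambda c: suggestion_score(c[1], c[2]), reverse=True)
for title, views, relatedness in ranked:
    print(f"{title}: score {suggestion_score(views, relatedness):,.0f}")
# The viral knock-off outranks the official clip despite weaker relatedness,
# which is how questionable videos end up in a child's suggestion sidebar.
```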
How can you stop it? Short of strict parental controls and monitoring your kids’ Internet use in real time, there is very little you can do.
Parental controls exist on YouTube. They are far from perfect and some bad content will still sneak through, but you can minimize the risk that your children will see something disturbing. Here’s what you can do through YouTube’s parental controls:
Snapchat is a messaging app for mobile devices where users can send images and videos to friends. The app is popular with teens and young adults, and surveys have shown that 32% of US teens have used Snapchat.
Unlike Facebook, Snapchat doesn’t let you monitor your child’s activity without direct access to his or her account. Instead:
In 2014, Snapchat was in the spotlight after a third-party “snap saving” app was hacked. Over 90,000 revealing photos were leaked on the Internet, many of which contained underage nudity.
Remember, “snaps” can be pictures or videos, so the same concerns you may have about Facebook Live or YouTube content apply to Snapchat.
Video games have increasingly moved toward an online multiplayer model in which players can interact and communicate with other players. Games like World of Warcraft, PUBG, and Fortnite are built around multiplayer, while others like StarCraft, Minecraft, and Call of Duty offer both single-player and multiplayer modes.
It’s important to understand that the ESRB rating system for determining age and content ratings for video games typically doesn’t consider online interactions. So, while a video game like Fortnite may seem child-friendly on the surface with its colorful graphics and cartoony art style, adult players can say anything they want to in the game’s voice chat.
Video game controversies can be extremely complex. Counter-Strike: Global Offensive (CS:GO) is an online multiplayer first-person shooter that offers in-game microtransactions, where virtual items can be purchased with real money. In particular, CS:GO sells “skins”: cosmetic designs that decorate a player’s weapons. Certain rare skins have been known to cost upwards of a thousand dollars.
It’s also possible to trade skins between accounts. This has led to the creation of websites built to let players wager the value of their skins against other users’ skins. Trading websites like these also employ famous YouTube celebrities to advertise the service to millions of viewers.
Don’t fool yourself: these are gambling websites.
On these websites, players bet skins and win or lose them based on random chance or the outcome of live matches. While this setup is essentially gambling, skin betting has managed to dodge the legal definition of gambling because no actual money changes hands, only virtual items that can later be sold for money. Nonetheless, the game’s developers have been hit with class-action lawsuits over the possibility of underage players gambling on these sites.
There was even a major scandal in 2016 when popular Counter-Strike YouTubers TmarTn and Syndicate, famous for videos of themselves gambling on these kinds of websites, were caught rigging bets on a website they owned. In their videos, the two would gamble and show viewers how much they were winning, but they never revealed that the website was theirs. Their winnings were staged, misleading their millions of followers, many of whom were underage, and encouraging them to gamble.
A similar trend is the rise of “loot boxes.” In certain games, players can purchase or earn a virtual crate that, when opened, gives the player a randomized selection of virtual items to use in-game. Because of that randomization, critics have called loot boxes gambling in disguise.
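To see why that randomization matters, here is a minimal Python sketch that simulates a player chasing one specific rare item from loot boxes. The box price and drop rate are assumptions invented for the example, not the odds of any real game.

```python
import random

# Assumed numbers for illustration only -- not the prices or odds of any real game.
BOX_PRICE = 2.00        # dollars per loot box
RARE_DROP_RATE = 0.01   # 1% chance a box contains the item the player wants

def boxes_until_rare() -> int:
    """Open boxes until the desired rare item drops; return how many were opened."""
    opened = 0
    while True:
        opened += 1
        if random.random() < RARE_DROP_RATE:
            return opened

# Simulate many players chasing that one item and average what they spend.
trials = 10_000
total_boxes = sum(boxes_until_rare() for _ in range(trials))
print(f"Average spend to get the item: ${total_boxes / trials * BOX_PRICE:.2f}")
# With a 1% drop rate, the expected number of boxes is about 100, so the
# average player pays roughly $200 -- and an unlucky one pays far more.
# The outcome is decided purely by chance, which is why critics compare
# loot boxes to gambling.
```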
In fact, the Belgian government has ruled against loot boxes, calling them a form of gambling. Major publishers continue to push loot boxes because of how lucrative they are; the industry has even coined the term “whales” for users who purchase an excessive number of them.
You might hear about loot boxes under different names. The video game Rainbow Six: Siege, for instance, refers to them as “Alpha Packs,” and Overwatch sells seasonal loot boxes tied to its in-game events.
Mobile phone games usually include microtransactions: small purchases made inside the game with real money, typically charged to whatever credit card is linked to your App Store or Google Play account. PlayerUnknown’s Battlegrounds, Pokemon Go, and Roblox are a few examples of free-to-play games that lean heavily on microtransactions.
Parents need to be aware of microtransactions. They can be addictive for your children, and in extreme cases, your child may rack up a large bill on your credit card.
For the majority of kids and teenagers, the Internet has had a positive impact on their lives. Your responsibility as a parent, though, is to ensure a healthy balance between Internet use and your kids’ online privacy and digital security. Talk to them extensively about social media and content consumption, and be prepared to step in whenever issues do come up.
Be an active listener, educate them about the dangers of specific content, and make sure they understand that their actions online can have consequences. Take some time to understand the websites and services your children use the most as well. Teaching them how to engage with others online is integral to succeeding in an increasingly Internet-centered society.
Original post:
https://www.safetydetective.com/blog/parents-guide-for-safe-youtube-and-internet-streaming-for-kids/
About the author: AVIVA ZACKS
Aviva Zacks is a content manager, writer, editor, and really good baker. When she’s not working, she enjoys reading on her porch swing with a cup of decaf.