The video social media app TikTok has been given a record-breaking fine of 5.7 million U.S. dollars (about 8 million Australian) for failing to comply with the U.S. Children’s Online Privacy Protection Act (COPPA). COPPA requires that any app collecting personal and private data, particularly without parental consent, set an age limit of 13+. It was also found that TikTok did not remove underage accounts, did not ask for a date of birth at sign-up, and did not comply with parents’ requests to delete the data and videos of their underage children. The number of children who reported being approached by adults on the app, and the ability of earlier releases of the app to narrow a user’s location to a 50-mile radius, were also dangers to children, not to mention the adult content hosted on the app.
What Happens Now?
After this fine and ruling, TikTok will now ask U.S. users to verify that they are over 13, and will redirect underage U.S. users to a restricted version of the platform.
Users of this more child-friendly version of the app will not be able to publish or share their own videos; they will only be able to follow users and watch other users’ videos. They are also not able to message others, and others cannot see their profile. I would warn parents NOT to allow even the more child-friendly side of TikTok without side-by-side supervision; parents should scroll through the videos on the app before their child uses it to ensure that no adult content is displayed.
What About Australia?
Children who reside in the U.S. and are under 13 are having their TikTok accounts set for removal. Because the owners of the app are based in China, the FTC ruling only applies to U.S. users.
The Australian and UK governments have also recently been discussing independent regulators for social media companies; I expect it will not be long before ByteDance, the Chinese company that owns TikTok, has to apply these COPPA restrictions in countries outside the U.S.
What Is The 13+ Age Rating?
Some parents are confused about the 13+ age rating on apps like TikTok, Facebook, Snapchat, and Instagram. The age rating shown on iTunes or Google Play often does not match the age requirement listed in these apps’ terms of service.
Instagram, for example, is rated 12+ on iTunes instead of 13+, which is the actual legal age requirement for use. The online app stores are seemingly arbitrary in their recommended age ratings.
Sadly, I fear these discrepancies are purposely engineered to gain more users and to bypass parental control restrictions, which allow a setting of either 9+, 12+, or 17+. I’m waiting for the day that iTunes or app developers like Facebook, Snapchat, and Instagram are legally forced to set the age rating correctly.
The 13+ age rating that appears in the terms and conditions of most social media apps was determined by the U.S. Federal Trade Commission’s Children’s Online Privacy Protection Act, which protects children’s privacy by preventing their data from being used by, or shared with, third-party apps and advertisers.
It is also safe to say that most apps with a 13+ rating have a social side or content that is only suitable for older users. For parents and users, the 13+ age rating is a good indicator that the platform may contain explicit adult content and allow online interactions with strangers.
Why Are Parents Allowing 13+ Apps?
There are many parents whose children don’t adhere to the 13+ rating. Some parents don’t actually know the age ratings for many apps: they have never checked, or perhaps they rely on the rating shown on the iTunes store or Google Play.
Some parents may view the suggested 13+ rating as similar to a PG rating, implying a lesser exposure to adult content and therefore suitability for younger children with some supervision. This is not the case: 13+ means the app is rated for teen users, if not older. Common Sense Media, for example, has rated some 13+ platforms, including TikTok, as more suitable for 16-year-old users.
Some parents have told me they have never actually looked through TikTok to check the content, and are under the impression that because many underage children are using the app, it must be suitable. Safety in numbers is no way to judge an app’s suitability for your child. Apps can go viral and pick up a lot of users, as this one has, before regulators and parents find out that they are actually very dangerous for children. Snapchat has similar issues for younger children.
If parents choose to ignore the 13+ age limit on these apps, there can be no legal recourse if your child is then harmed on these platforms by exposure to cruel behaviour or adult content. By agreeing to the app’s terms of service on behalf of a child who is under 13, you have broken the legal agreement for using the app.
More apps need to be rated 16+ if they show media that is disturbing to younger teens. Social media apps that are known to host pornography and excessive violence, and that leave some users open to predators and bullying, need to be protective of younger teens. However, some app developers know that if their apps are rated 16+, it limits users and downloads.
Banning Doesn’t “Work”….
If you think banning doesn’t “work”, you are correct: no law or ban will “work” for everyone. Throughout human history, humans have broken laws, but we don’t revoke good laws just because some people break them. Many people still exceed the speed limit in school zones, so technically that law isn’t “working” either, but it works for the majority and keeps children safer.
Kids Will Get Around Bans
Children either respect your boundaries or they do not. Some children can certainly get around any digital blocks or settings you enable, even simply by using someone else’s device to access an account, but not all children do. Many children generally respect their parents’ boundaries; if they don’t, you may have a much larger issue in the future than just access to adult apps.
Banning Things Makes Them More Attractive…
Some people say bans only make things more attractive. Yes, they can in some circumstances, particularly where the reasons for the ban haven’t been clearly explained, or where the consequences for not heeding a fair and justifiable ban have not been a deterrent. But if we follow that logic, that banning things makes them more attractive, let’s not ban or make laws around violence against others, bullying, kids watching porn, kids driving cars, drinking alcohol, or taking illicit drugs.
My Child Will Be Upset
It is important for parents to be able to withdraw apps or other things when they find out that they are unsuitable for their child, or even harmful. If your child had an allergy to chocolate, giving it up would probably be devastating for them, but if it meant saving them from a bad reaction, you would be negligent to give in to their nagging and tears just because you can’t bear to see them upset.
Parents need to be parents and sometimes show they care by restricting their child’s access to something that is unsuitable, if not harmful. It is not easy, but it is certainly part of parenting to say, “This is not suitable for you, and I’m sorry, but I cannot agree to you signing up for this.”
Do Your Research
If you are tech savvy enough, investigate an app yourself; otherwise, rely on reports from reputable sources. Understanding that an app is unsuitable for your child should give you plenty of confidence to keep your child safer by avoiding it. Read reviews, and ask your friends and educators what they have heard.
It is entirely up to you as a parent to make decisions for your child, but the likelihood that safe boundaries will be pushed or broken is not a valid reason to avoid setting them. Digital parenting is no different from offline parenting.
How To Supervise Your Child On Their Device
- Read reviews about apps before your child uses them on https://www.commonsensemedia.org and https://www.thecybersafetylady.com.au
- Employ side-by-side supervision with younger children on digital devices. Never leave a child alone with an internet-connected device.
- Discuss age ratings and online safety with your children regularly.
- Give your children a safe space to report anything they see online, good or not so good.
- Keep digital devices including gaming consoles out of the bedroom.
- Get on board as much as you can with your child’s digital world; don’t demonise the technology.
- Find out what your child enjoys on their digital device and have your child show you how to play.
- Keep control of passwords for all internet-connected devices.
- Do spot checks WITH your child on their devices; keep supervision transparent and open, with no spying.
- If you do find out that your child has an account on a 13+ app against your wishes, then deal with the infraction as you would for any rule or boundary you have set.
- Set parental controls to block apps or control downloads if your child is not respecting your boundaries, but don’t rely only on settings.
- Adult content filters, on browsers and on apps that support them, are essential.
- Set the digital rules for your home; guests must also respect your rules, within reason.
- Expect your children to respect your boundaries no matter where they go. This applies to anything both online and offline.
- Inform extended family members of your boundaries, and help them to support them.
- Set positive incentives for respecting boundaries around digital devices, rewards work better than punishments.
Don’t Give Up!
Don’t simply give up on digital boundaries just because it is hard or because you are sick of the nagging. Research the safety of platforms, really understand the risks, as you would for anything your children are involved with, and set your standards.
My manual, Keeping Kids Safe Online, has reviews and settings to help kids stay safer online.