Age Verification in the Digital Age

When Tinder announced it was shutting down all accounts of users younger than 18, many parents were surprised to learn that their 13-year-old had access to the dating app in the first place. One of a parent's worst nightmares is for their child to go out into the real world and meet a stranger they first encountered online, which is exactly what Tinder is designed to facilitate as efficiently as possible.

In today's Digital Age, children are exposed to an unending stream of social networks, apps, and games – some are designed for all ages, but many are not. Age verification is supposed to act as the doorway between these worlds. But as most parents find out, children know how to (and will) get around it.

Many apps and sites with age restrictions simply ask for a month, day, and year of birth at sign-up. Nothing prevents kids and young teens from picking a birth year that grants access to things and people no one intended them to reach; the hurdle could hardly be lower.
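To see just how low that hurdle is, here is a minimal sketch of the kind of self-reported birth-date gate these sign-up forms amount to. The function name and the minimum age of 13 are illustrative assumptions, not any particular service's implementation:

```python
from datetime import date
from typing import Optional

MIN_AGE = 13  # hypothetical threshold for this sketch


def is_old_enough(birth_year: int, birth_month: int, birth_day: int,
                  today: Optional[date] = None) -> bool:
    """Naive age gate: trusts whatever birth date the user types in."""
    today = today or date.today()
    birthday = date(birth_year, birth_month, birth_day)
    # Age = full years elapsed since the claimed birth date.
    age = today.year - birthday.year - (
        (today.month, today.day) < (birthday.month, birthday.day)
    )
    return age >= MIN_AGE


# A child who simply types an earlier birth year sails through:
print(is_old_enough(2015, 6, 1, today=date(2025, 1, 1)))  # truthful: age 9 -> False
print(is_old_enough(2000, 6, 1, today=date(2025, 1, 1)))  # claimed: age 24 -> True
```

The gate is only as honest as the person filling in the form; nothing in the code can distinguish a truthful date from an invented one.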

Today we have self-driving cars. Precision gene editing. Reusable rockets. A rover cruising around Mars sending us selfies. Yet reliable age verification remains elusive. Why can't we figure out whether someone online is a 12-year-old girl or a 50-year-old man? How will Tinder ensure that no one under 18 can sign up now that it has implemented its ban?

There are many partial solutions that have attempted to address this, from age ratings for apps to stricter terms of use, and laws like COPPA, which requires parental approval before a child under 13 can use an online service. Each is imperfect, difficult to enforce, or both. Most age-restricted apps haven't gone as far as implementing biometric security gates, such as retina or fingerprint scans; the sheer cost and security requirements of implementing them would likely stifle innovation and price many apps out of existence. And who wants all of that information about us hosted by the small start-up running the latest popular app?

The truth is there is no easy answer. But by working together, technology developers, parents, educators, communities, and the internet industry as a whole can take steps toward ensuring kids stick to the content and services that are really best for them. Below are a few ways we can do this, with the ultimate goal of helping kids thrive online:

Enforcing Industry Standards on Content Ratings

First, app developers should consistently use age ratings standards as recommended by Google and Apple.

While Google and Apple do not require developers to choose an age rating, they heavily encourage it. The more information provided about the content and intent of an app, the better informed we all will be. These standards are a big improvement over a few years ago, when there was less clarity about the ideal age for apps or about details such as whether an app contained in-app purchasing. Google introduced age ratings for apps similar to those used for video games, where "E" is for everyone, "T" is for teen, and "M" is for ages 17 and up. It took things a step further by offering developers a "Designed for Families" option, which helps families find those apps more easily.

Apple provides developers with four age brackets: 4+, 9+, 12+, and 17+. It would be easier for consumers if both companies standardized on a single format, but something is better than nothing.
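The standardization problem is easy to see in code. A sketch like the following maps both stores' labels onto a single minimum-age number so one comparison works everywhere; the label tables are simplified illustrations, not the stores' official definitions:

```python
# Illustrative mappings from each store's rating label to a minimum age.
# These simplify the real rating systems for the sake of the sketch.
GOOGLE_RATINGS = {"E": 0, "E10+": 10, "T": 13, "M": 17}
APPLE_RATINGS = {"4+": 4, "9+": 9, "12+": 12, "17+": 17}


def minimum_age(store: str, rating: str) -> int:
    """Normalize a store-specific label to a plain minimum age."""
    table = GOOGLE_RATINGS if store == "google" else APPLE_RATINGS
    return table[rating]


def allowed_for(child_age: int, store: str, rating: str) -> bool:
    """Once labels share one scale, a single comparison serves both stores."""
    return child_age >= minimum_age(store, rating)


print(allowed_for(12, "apple", "12+"))  # True: 12 meets the 12+ bracket
print(allowed_for(12, "google", "T"))   # False: "T" maps to 13 here
```

A shared scale like this is essentially what device-level parental filters have to reinvent for each store today.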

Ratings alone will not prevent kids from accessing any app they wish; parents should pay attention to the age ratings and descriptions to decide whether an app is ok for their kids. Even then, kids may still reach apps that are not designed for them, but having standard information about this media is a critical step in the right direction.

Implementing Age Verification Processes

In the real world, if you want to enter a place or buy something with an age restriction, you need to show identification. But how do you do this online? Tinder today requires people to have a Facebook account in order to sign up, but that does not prevent people from lying about their age when they open a Facebook profile in the first place. Tinder could require a credit card number to verify that someone is 18+, but that means everyone needs a credit card, and it excludes 17-year-olds (which to some may not be a bad idea). Collecting and storing credit card data also creates a security and privacy risk that didn't previously exist.

Today, it is instead incumbent upon parents to do the work of restricting their kids from apps they are too young to use. One easy way is to require a password before apps can be downloaded onto any device their children use, as long as parents keep that password to themselves. If a child is really interested in an app, they will have to come to you for permission to download it, which is a perfect time to find out more about the app and decide whether it's ok.

Setting Up Restrictions on Devices

Parents can also use features already built into many devices to help prevent kids from accessing apps or content that is not ok for them. These features go by different names and live in different places, but they are often called parental controls or restrictions, depending on the device. Today they exist in some form on mobile devices, gaming devices, and PCs/laptops, and they are also offered in PC/laptop security products (like Trend Micro's Premium Security).

On mobile devices, the age ratings listed for an app are used to set the filters. Through these settings, you can allow only apps designed for a specific age or below to be downloaded onto the device. Once the filters are in place, they can be locked with a password so that only the parent can change them.

One of the biggest issues I encounter in my travels and discussions with parents is that many are not even aware of the age ratings or of the ability to restrict content on devices. Companies that host apps and that make the mobile devices kids use could make a very big difference by putting greater effort into educating the public about both.

Educating and Communicating With Each Other

In my talks with hundreds of 10- to 12-year-old students over the last six months, I'm amazed at how many enthusiastically reveal they are on Instagram or Snapchat (both have a minimum age of 13), and how many of their parents have allowed it; some parents are not aware of the minimum age, but many are. Tinder, Facebook, Twitter, and many other apps do state they will kick you off their service if they find out you've violated their terms of use, in this case by failing to meet the minimum age requirement. They put a lot of trust in people self-reporting their ages, and would have a difficult time policing this, especially when underage users are otherwise behaving just fine.

Age ratings for more traditional media – like movies or television shows – have often been more about making the public aware of the contents than a legal imperative to restrict consumption. A theater won't allow kids under 17 into an R-rated movie, but won't stop them if they're with a parent. An increasing number of parents seem comfortable with their underage kids using social media (whether they're aware of the age requirement or not), and I suspect I will continue to find many young middle schoolers telling me about all the things they do on these apps. So while age ratings are useful for telegraphing the age-appropriateness of an app, they haven't done much to restrict use among kids.

The most powerful answer to this problem is to become educated consumers, parents, and advocates. Talking with neighbors, friends, relatives, school communities, and your own kids about which apps are popular and why, or about how other parents decide what is or isn't ok, can help you make informed decisions about what's best for your kids and family. It's a conversation that should happen as early and as often as possible.

Additionally, it's important that parents use the tools already out there, such as age ratings and restriction or filtering features, to make the most informed decisions for their kids. The best way to fully understand an app is to use it. Many parents are intimidated by this advice, but if kids can do it, why can't we?

As for Tinder's move to raise its age limit, it seems it would be in the best interest of any app with an age requirement to reach as many parents as possible and make them aware of the restriction and the reasons for it. That's a worthy investment of time and energy compared with dealing with even one public case of underage swiping gone awry.