Commentary

YouTube Tells Aussie Kids To Come Back At Age 16

Google’s video platform YouTube confirmed Wednesday that it will comply with Australia’s upcoming social-media ban and remove the accounts of Australian children under the age of 16.

Australia this month becomes the first country to ban children from some of the world's most popular social-media platforms. In addition to YouTube, the age restrictions will apply to Facebook, Instagram, TikTok, Snapchat, and X.

“This is a disappointing update to share,” Rachel Lord, public policy senior manager of Google and YouTube Australia, wrote in a blog post. “We deeply care about the safety of kids and teens on our platform; it’s why we spent more than a decade building robust protections and parental controls that families rely on for a safer YouTube experience.”

Google had no choice. The Online Safety Amendment (Social Media Minimum Age) Bill 2024 passed the Australian House of Representatives and the Senate in November 2024 and received Royal Assent on December 10, 2024.


The law has not been without controversy. Some media reports argue that it is too restrictive and express concern that children will be able to circumvent any bans.

Children with accounts will be signed out on December 10, and will be barred from signing in or from creating new accounts until they turn 16.

“This includes any supervised pre-teen and teen accounts,” a company executive wrote in a blog post. “We recognize this has important implications for users, and we encourage you to review your options.”

When signed out of their accounts, kids will lose access to features such as liking videos, subscribing, creating private playlists, and joining YouTube channel memberships. Those who have a YouTube channel will not be able to access it, and it will not be visible to others.

YouTube is allowing channel owners to download their data and the content they have created.

The Australian government insists it is not a ban, but rather a delay in kids' ability to access the sites. Australia's eSafety Commissioner ruled that several services -- including Discord, Roblox, WhatsApp, Google Classroom, Pinterest, Microsoft, GitHub, and YouTube Kids -- do not meet the criteria for age-restricted social media.

In July, Google and Microsoft were told that by the end of the year they must check the ages of Australians who are logged into their search-engine accounts. The age-assurance technology supports online-safety regulations that the eSafety Commissioner developed with the technology companies.

The security measures will affect advertising and media buying. When the age-assurance technology determines that a signed-in user is likely an Australian under the age of 18, the search engine must set safety tools such as “safe search” to the highest setting by default to filter out pornography and violence.

Search engines operating in Australia will have six months to implement technologies that check users' ages when they are logged in. The rules were published in July.

Australia has a record of strict rules to protect the country and its citizens; its biosecurity rules apply when goods, vessels, and travelers enter Australian territory. Travelers, for example, must declare food, plant materials, and animal products, and goods become subject to biosecurity control 12 nautical miles from the mainland.

Malaysia moved in early December to raise the minimum age for social-media accounts to 16 starting in 2026, requiring platforms to verify users' identities through eKYC checks using official documents as part of its Online Safety Act.

While no U.S. federal law currently sets the social-media age limit at 16, lawmakers are pushing for stricter regulations and age-verification technology that could raise the limit to 16 or 18 by requiring parental consent for minors.

With age-restriction technology for social media developed and ready for use, and with growing concerns surrounding artificial intelligence and chatbots, I would not be surprised to see some type of age restriction implemented by the end of 2026.

One challenge is that minors in the U.S. can become brand influencers, and many already are. But this involves significant legal, ethical, and safety considerations that require parental involvement and oversight.

Google’s AI Mode suggests that Nike, Lego, Target, Walmart, Crayola and others work with brand influencers who are minors. These relationships are typically managed by the influencers' parents. Brands value minor influencers because they can reach younger audiences, but the practice raises concerns about child labor and privacy.
