
I was flying home from a business trip, and next to me were a young mom and her small child strapped into a car seat. The toddler turned to his mom, and as the plane taxied away from the gate he said, "Phone." And then, more urgently, "Phone." His mom put big
headphones on him and handed him her phone with a game app open. As the plane climbed skyward, the toddler didn't look out the window, or even notice. Instead, he was deeply engrossed in
his video game. And clearly, he was good at it.
No judgments here, just a simple fact: Children use devices from a very young age -- and that doesn’t seem like it’s going to
change.
Emma Lembke and Zamaan Qureshi say the road to a safer internet for young people runs through defining the problem clearly and holding the platforms' makers responsible. They are media activists on a
mission.
“As youth activists who have grown up with Instagram accounts since middle school, we know both the benefits and the harms of social media platforms -- we live them every
day,” wrote Lembke and Qureshi on Gizmodo.
And just to be clear, they’re not advocating for two-year-olds to be on phones -- but they do believe that technology should be
built to serve youthful users, not to harm them.
“While we appreciate politicians’ increasing focus on the harms young people face online, Utah, Arkansas, and the other states are
letting Big Tech off the hook for building products that exploit and addict young people for profit by design. Big Tech doesn’t deserve a free pass on this. They should not be allowed to shirk
all responsibility,” they wrote.
They are calling for a movement to establish standards that make platforms safer, not to keep young people off platforms.
The organization they
co-lead is called Design It For Us, and its mission rests on a clear set of principles.
Design It For Us supports policy that adheres to five key principles, as spelled out on
its site:
“1. The responsibility of safety rests on Big Tech. Big Tech companies are the designers,
creators, and manufacturers of product features that have been proven harmful to kids, teens and young adults. Tech policy should put the responsibility of safety on Big Tech – not kids or
parents – to make their products safer....
2. Address the business model. Proactively and directly regulate surveillance capitalism employed by social media companies and
online platforms to exploit the wellbeing of users for profit. Centralize privacy as a fundamental and essential value, not profit....
3. Provide and prioritize user agency.
Kids, teens and young adults must be able to make their own choices about their online experiences. It is critical to give power to users to co-create an online environment that will allow us to reap
the benefits the internet has to offer....
4. Algorithmic accountability. Deprioritize algorithmic and engagement-based mechanisms and hold companies and platforms accountable
for manipulative algorithmic features that amplify harmful content and addict users....
5. Data use, minimization, and user control. From the ideation to the production
to the use of a product or platform, users should have the comprehensive and unconditional ability to have visibility into and control of their personal information at any moment or point of use."
In its mission statement, Design It For Us makes its founders' convictions plain: “The online ecosystem can be a productive and positive place. We benefit every day from the creativity it fosters, the
communities it strengthens and the vast personal and intellectual growth it enables. But the unchecked, profit-driven mechanisms employed by Big Tech on social media and online platforms have caused
immense and unnecessary harm. Big Tech has addicted and exploited our generation. Our mental, physical, and emotional wellbeing is at stake. Accountability is long overdue.”