Commentary

Hey, Children -- Welcome To Tinder

People are very good at falling into a trap of blaming everything on big companies and the government.

It isn't always without reason, but it's a dangerous trap to get into because it engenders reliance on others.

The social media debate raging now about protecting children is a great case in hand. Now, don't get me wrong -- the platforms involved have to step up to the plate and take a great deal more responsibility.

Answer me this, though. Have you not been shaking your head at the parents lining up to accuse the social media companies of letting their children down and opening them up to bullying and abuse online?

All sounds fine until you realise they are talking about children who are not old enough to have social media accounts. They have clearly lied about their age to get an account and then their parents take up a battle against the platforms, rather than the child who has lied or themselves who have let their guard down. 

I certainly know countless parents who have set up Facebook accounts for children they know are too young to be on the platform. They even tag them on posts. 

This morning, though, we have questions being asked after The Sunday Times ran an eye-catching feature concerning children signing up for dating websites resulting in serious sexual crimes being committed against them. 

Again, we have to insist the platforms do more to protect users who are too young to even sign up in the first place, but we also have to wonder what was happening at home for children to be signing up for Tinder and Grindr and then popping out to "hook up" for dates.

Yes, of course, the children involved may not have been under parental supervision, and even if they were, they were likely to be able to fabricate a story about going to the cinema with friends.

So, two things need to happen in the corridors of power -- and then one big step needs to be taken closer to home.

First up, the Government is due to announce later this month its white paper thoughts about how internet companies can be better regulated. The responsibilities they have will be more clearly laid out, and it would seem fair to imagine that the prospect of fines will be raised.

I doubt if the next step will feature in the white paper, but it would be helpful if it was made a statutory requirement to check the age of people applying for accounts.

The fines for not taking down harmful material would provide a focus to ensure teens using social legally are better protected, but it surely has to be backed up by age verification.

Just think about the trouble a bar could be get into for selling a child alcohol. It's surely the same for dating apps and social media platforms.

Checks can be deployed against the electoral register, and for over 18s, you also have the reassurance of asking for a valid credit card held in an applicant's name.

It's not foolproof, but it has to be a lot better than allowing children to self-select a birthday until the years are rolled back sufficiently to gain access. 

The other development has to be better scrutiny among parents of what children have on their smartphones. Nobody is perfect and all us parents make mistakes, but asking to at least flick through a child's screens to check for unsuitable app icons has to be a bare minimum of parental care, doesn't it?

Again, it's not perfect, but if platforms begin to ask for age verification as well, it would at least protect most children from unsuitable material on unsafe platforms.

A responsibility to remove harmful content or face fines and to check ages of account applicants must be pressed upon the internet giants.

At the same time, however, parents need to be more proactive and take more responsibility for children's online activity. Protecting minors online can't be solely the government's or big tech's job or a parent's job. It will surely work so much better when all three are involved. 

Next story loading loading..