April 20, 2024

Why you should think twice before posting photos of your children on social media

The images that Maxine saw were like any other pics of children that a mum might encounter while scrolling the ‘gram within the confines of her safe and sheltered mummy network.

Joyful pics of babies in nappies and kids in bathing suits in paddling pools, with strategically placed emoji. On the face of it, they don’t seem harmful or inappropriate. But the big difference here is context. It was alarmingly clear to Maxine that these images had been taken from other people’s pages without permission and were now being shared for inappropriate purposes.

Maxine reported the account to Instagram and received an automated reply saying it likely did not go against their community guidelines. After that, the number of similar accounts that Maxine and her mum community noticed began to snowball. “Overall, in terms of how many accounts I’ve reported, there are hundreds. Together as a mum community, thousands,” says Maxine. “Some accounts have been reported hundreds of times but nothing has been done about them.”

The feeling of utter helplessness throughout the reporting process prompted Maxine and her friends to start actively campaigning on the issue, with a petition asking Instagram to add features to protect children. “How come I can report something simply because ‘I don’t like it’, but in 2020 there isn’t a child safety button on Instagram?” asks Maxine.

Maxine, a mummy blogger and new mother to an eight-month-old boy, runs an idyllic Instagram page. She posts cute pics of her baby wearing a range of stylish outfits in scenic locations. It’s a happy and helpful space where she can share advice and experiences with other parents and be part of the mummy blogger community. She sees it as a powerful platform for sharing beautiful images.

But then something happened which changed Maxine’s point of view and behaviour overnight.

“I’m part of a few WhatsApp groups with other mums,” she tells me. “One day I got a message from a mum saying ‘block this man’.” The message was accompanied by a screenshot of the man’s Instagram account, with a bio that read ‘I like girls and kids. Message me.’ Maxine clicked onto the man’s account and found that he’d been sharing images of a range of different children on his Instagram Story. “I feel physically sick just thinking about it. They were clearly not his own kids,” she says. “One of the images was of a little girl, fully dressed, sitting with her legs slightly apart, with an inappropriate caption saying ‘open your legs wider little girl’. It honestly makes me tremble.”

Maxine believes that the people behind these accounts are operating in a coordinated way. “These people are working together,” she says. “They use hashtags like #adorablebaby and #cutebaby. What’s on their publicly available Story is one thing, but god only knows what’s being shared in their DMs. Once you start looking into it you find that so many of them are connected. They follow, like and comment on each other’s pages, and Instagram serves you similar accounts to follow.”

GLAMOUR has seen a list of 53 Instagram accounts that Maxine has reported in the past few days and has contacted Instagram to establish whether Maxine and her friends have uncovered a network of predators. There was a pattern to the language used across the sample: the term “mega links” appears frequently in usernames, comments and bios.

Accounts also provided details of more anonymous platforms like Snapchat, Kik and Wickr, indicating that Instagram is an entry point before users connect on more ephemeral spaces where they are harder to track down. Wickr is an instant messaging app that allows users to exchange end-to-end encrypted messages which expire. Kik is known for preserving user anonymity: people can register without even providing a telephone number.

The descriptions in the bios say things like: “Come to a place to enjoy the little things on sale and on offer” and “I have several mega links to exchange. Call me on chat (9 to 13 year old girls)”. Some brazenly showed they did not care about being reported. One bio read: “got deleted, I always come back. DM for Wickr.”

What Maxine noticed is not an isolated incident. “The prevalence of this kind of behaviour online is hard to put into words,” says author and tech expert Jamie Bartlett. “People share an image for one reason. But there are a lot of other people who’ll take that image and share it in ways that the original owner would never consent to.” In 2014, Bartlett wrote a book called The Dark Net, in which he sought out and tried to understand the darkest human behaviours in online forums and communities.

“There used to be a geography to crime, where things happened in specific locations. But online there are potentially eight billion people in the world who could cause you harm,” Bartlett says. It sounds extreme to put it in those terms, but given that Bartlett has spent time in the most extreme corners of the web, he’s more qualified than most to describe the potential dangers. What’s happening on Instagram is just the tip of the iceberg: publicly available forums are often an entry point to much more extreme behaviour and explicit content.

“Typically, images are harvested from social media, downloaded to an encrypted hard drive and put up as a collection on a site on the dark web which is visited by other paedophiles,” he says. “It’s a decentralised system of image sharing which is hard to control. Authorities have a big challenge in clamping down on it. By the time they detect one image, it could already have been downloaded onto hundreds of other encrypted (i.e. hidden) devices.”

Bartlett says it’s a challenge to discuss the dark web without terrifying people or normalising bad behaviour. “You don’t want people to be terrified of being online. That would be counterproductive. On the other hand, you’d be nervous taking your kid to a swimming pool if you knew that there would potentially be paedophiles watching. You have to think about it like that. Ask yourself why you need to post pics of your kids on public forums? People know they’ll get more shares and likes. But there are so many safer ways to share images of children privately.”

It’s a point that comes up again and again. Do we simply stop posting images of kids on public forums? And how does that apply to mummy bloggers like Maxine, whose content depends on showing images of children? There’s also the issue of consent: parents are creating a digital footprint for their children without fully understanding what it will mean for them in the future. There have even been instances of older children taking legal action against their parents for over-sharing their lives and breaching their privacy.

One of the first steps is understanding the risks, so that parents and children can more confidently navigate the difficult corners of these spaces. “Many mums don’t understand the risks of the darker side of the net as they haven’t been exposed to them,” says Sharon Pursey, co-founder of SafeToNet, a cyber safety company that uses artificial intelligence and other technology to help protect children from threats like cyberbullying and abuse. “I would say you should never post images of your children in bathing suits, even if you have a private account with 100 followers. Do you really know every single one of those followers and what is going on in their heads?”

While parents have a duty to educate themselves and ensure they’re keeping their children safe online, responsibility also lies with legislators, educators and tech companies. It’s clear that Instagram is aware of the scale of the problem. In the first quarter of this year, Instagram removed one million pieces of child nudity and exploitation content from the platform, 94% of which its own technology detected before anyone reported it.

The platform has 15,000 people dedicated to reviewing content in over 50 languages, part of a wider safety and security team of 35,000. The company has also worked with safety experts to devise detailed policies against child nudity and the sexual exploitation of children (these apply to both Facebook and Instagram), and it is currently working to strengthen how it finds and removes content that sexualises children or puts young people at risk.

But is it going far enough? For a platform that launched in October 2010 and has hundreds of millions of monthly active users, almost a decade passed before a child-specific reporting option was made available. In April 2020, Instagram released a new reporting option in the ‘nudity or sexual activity’ category so that users could flag content because it ‘involves a child’. It’s a welcome development, but Maxine feels children deserve the wider range of protections she’s asking for in her petition.

“This is not just a parent responsibility. We need to move on from the parent-child dynamic being the main focus. Tech companies need to do more to keep children safe,” says Abhilash Nair, a Senior Lecturer in Internet Law at Aston University and author of the book Regulation of Internet Pornography. “While I wouldn’t recommend posting anything overt that could whet the appetite of someone with a devious interest in children, there’s only so much parents can do given the spectrum of digital technology,” explains Nair. “You don’t even need a nude or semi-nude image to be at risk of pseudo-photography. The tech these days makes it easy to distort reality. From a simple Photoshop job where the face of a child is taken and blended onto a different body, to more complex deepfake imagery.”

Nair is referring to synthetic media, or “deepfakes” as they’re commonly known. This is where facial recognition and machine learning are used to combine images, creating new footage of things that never actually happened. It’s computer-generated video, but it looks so real and is so convincing that it can be extremely difficult for the naked eye to spot that it’s false. The organisation Better Internet For Kids (which works with the European Commission) gives an even simpler description: “Users of Snapchat will be familiar with the face swap or filters functions which apply transformations or augment your facial features: deepfakes are similar, but much more realistic. Fake videos can be created using a machine learning technique called a ‘generative adversarial network’ or GAN.”

The damage deepfakes cause is extensive. They’ve been used to spread disinformation (or fake news), with videos of political leaders saying things they would never say. And in theory they can be used to manipulate the image of a child, taking their face and superimposing it elsewhere. “One could argue that no harm is caused to the child as there is no physical abuse. But the psychological harm to the child is very real,” says Nair. “Indecent imagery causes harm over and over again.”

For mothers like Maxine, the application of this technology doesn’t bear thinking about: “They can take a picture of my baby and superimpose his head on something else. I had some pictures up on Instagram which I thought were OK, like, just of him sitting on my lap. But now I’m taking down everything because I’m so frightened of predators taking a picture of him with his mouth open or smiling for something crude and disgusting.”

GLAMOUR reached out to Facebook (which owns Instagram). A company spokesperson told us: “Keeping young people safe on Instagram is our top priority. We remove content that sexualises children and do not allow convicted sex offenders to have an account.” The spokesperson added: “We invest heavily in industry-leading technology which helps detect and ban accounts which may be used to exploit or endanger children. We work closely with expert organisations, share reports and consult with specialist law enforcement teams.”

But people like Nair believe there is much more that can be done in terms of the liability tech companies take on. “Platforms do a reasonably good job when it comes to illegal images. For example, scanning for known images using Microsoft PhotoDNA and other technology so that the redistribution of harmful images can be prevented. But that doesn’t go far enough,” he says. “Under existing law the liability on platforms is woefully inadequate. The law predated social media. So I’m hoping that the Online Harms Bill (eagerly awaited new legislation) gets through parliament soon, as it imposes a form of ‘duty of care’ on platforms.”

So where does one draw the line on that duty of care? This isn’t about stopping parents from posting pics of their kids online, and there’s only so much legislation can do to teach people about the dangers, according to Nair. Parents need to educate themselves and their children, but online safety also needs to be an intrinsic part of the school syllabus, he says: “Considering that one in three internet users is a child (and that figure is even higher in developing countries), people don’t talk about children enough. We need changes in norms and awareness so that kids understand how to get all of the benefits of using the internet in the safest way possible.”

That’s a point echoed by Sharon Pursey. “There is so much more for children to learn and explore online,” she says. “The possibilities are amazing. Tech is a good thing so long as we guide them.”

Maxine, meanwhile, has been making changes to her page: archiving posts, removing hashtags and sharing a list of safety tips. She’s also just switched her account to private. Her latest post reads: “My decision at present is to no longer show my son on here as I don’t feel it’s a safe platform for children. Until actions are taken by Instagram to put more extreme reporting measures in place to protect our children, I will be holding off on posting anything of him.”

So the next time you’re about to post that picture of your baby in the bath because you’re simply bursting with love for them and want to show their innocent little chubby form to the world, perhaps check your privacy settings and ask yourself if the likes and comments are worth it. Because not everyone is looking at that beautiful, innocent human being through the same lens.
