Sextortion is when a perpetrator demands more photos, sexual contact or money from a victim, under threat of leaking their private or intimate content publicly.
Tech giant Meta has announced two initiatives related to teen safety, aimed at equipping parents, educators and teens with the information they need to protect against intimate image abuse.
“Having a personal intimate image shared with others can be devastating. It feels even worse when someone threatens to share it unless you provide them with more photos, engage in sexual contact, or pay them — a crime known as sextortion.”
Meta, together with the National Center for Missing & Exploited Children (NCMEC), announced the expansion of Take It Down, a program which helps teens take back control of their intimate images and helps prevent them from being shared online.
The company has also partnered with Thorn to develop updated guidance for teens, parents and teachers on how to prevent and handle sextortion.
Meta also launched a global campaign to help raise awareness about these scammers and what people can do to avoid them, working with safety organisations around the world, including Missing Children South Africa, Media Monitoring Africa and Digify Africa.
It has also teamed up with Naledi Mallela, Nandi Khubone and Melanie Bala to help raise awareness of the Take It Down program among teens and parents in South Africa.
“This builds on Meta’s existing tools and features to help protect teens from unwanted contact – including our recent announcement to default all teens under 16 (and under 18 in certain countries) into stricter default message settings, meaning only people they follow or are already connected to can message them,” Meta said.
Take It Down
There are several ways people can use Take It Down to find and remove intimate images, or to help prevent them from being shared in the first place.
To start the process, young people under 18 who are worried their content has been, or may be, posted online can go to TakeItDown.NCMEC.org and follow the instructions to assign a unique hash — a digital fingerprint in the form of a numerical code — to their image or video, privately and securely from their device.
Teens only need to submit the hash, rather than the intimate image or video itself, which never leaves their device. Once the hash has been submitted to NCMEC, companies like Meta can find copies of the image, take them down and help prevent the content from being posted again in the future.
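The article does not specify which hashing algorithm Take It Down uses. As a minimal illustrative sketch only, the snippet below shows the general idea of a digital fingerprint: a cryptographic hash such as SHA-256 reduces an image file to a fixed-length numerical code on the user's own device, so the code can be shared for matching without the image itself ever being uploaded. (Real photo-matching services typically use purpose-built perceptual hashes rather than SHA-256; the function name here is hypothetical.)

```python
import hashlib

def fingerprint_file(path: str, chunk_size: int = 8192) -> str:
    """Compute a SHA-256 digest of a file, reading it in chunks so the
    whole image never needs to be held in memory. The hex digest is a
    64-character code that identifies the file without revealing it."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()
```

Because the same file always produces the same digest, a platform that receives only the digest can still recognise exact copies of the file if they are uploaded later.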
“Making Take it Down available in 25 languages is a pivotal step towards safeguarding children from the horrors of online exploitation all over the world,” said John Shehan, a Senior Vice President with the National Center for Missing & Exploited Children.
“We aspire to ensure that every child, regardless of language or location, has the opportunity to reclaim their dignity and privacy by having their illicit content removed from participating platforms.”
Meta said the updates build on the work it already does to help young people know there are steps they can take if someone has shared, or is threatening to share, their intimate images.
“We show Safety Notices to people on Instagram when they’re messaging someone who has shown potentially scammy or suspicious behavior. These Safety Notices urge people to be cautious, encourage them to report any account that threatens to share their private images, and remind them that they can say no to anything that makes them feel uncomfortable,” Meta said.