Facebook, Instagram ramp up efforts to fight revenge porn, partner with StopNCII.org

Once non-consensual intimate imagery is flagged, the non-profit-run platform takes certain steps.

Meta (formerly Facebook) on December 2 announced its support for StopNCII.org, a newly launched platform operated by the UK’s Revenge Porn Helpline that aims to curb the sharing of non-consensual intimate images (NCII) on the internet.

The sharing of non-consensual intimate images, also known as “revenge porn,” is a serious issue around the world because of the devastating effects it has on victims’ lives. Although there are no official revenge porn statistics for India, National Crime Records Bureau (NCRB) data shows that India recorded 50,035 cases of cybercrime in 2020, of which sexual exploitation accounted for 6.6% (3,293 cases). Furthermore, a survey conducted by the Cyber and Law Foundation found that 27% of internet users aged 13 to 45 in India have been subjected to such instances of revenge porn.

How does StopNCII.org work?

  1. Opening a case: When someone is concerned their intimate images or videos have been posted or might be posted online, they can open a case on StopNCII.org. The person opening the case must be:
    • The person who is in the image
    • 18 or older at the time the image was taken
    • Currently over 18 years old (for people who are under 18, there are other resources and organisations that can offer support, Facebook stated)
    • In possession of the image or video
    • Nude, semi-nude, or engaging in a sexual act in the image or video
  2. Upload the concerned imagery for hashing: The user is expected to upload the intimate image(s) or video(s) for hashing. For each piece of content, a unique hash value (a numerical code) is generated, creating a secure digital fingerprint. “Algorithms we use are PDQ for photos and MD5 for videos. They are open-sourced and are industry standard for applications like ours,” StopNCII.org says on its website.
    • Limitation: One major limitation of this technology arises when images are altered even slightly. If a hashed image is later cropped or has a filter applied, the altered copy produces a different hash and will no longer match; the new version must be hashed again.
  3. The digital fingerprint is sent to participating companies: Online platforms partnering with StopNCII.org receive the digital fingerprint and can use that to detect if someone has shared or is trying to share those images on their platforms.
    • Participating companies: Currently, Facebook and Instagram are the only partners. However, other large companies including social media companies, adult sites, and message boards have expressed interest in joining, Sophie Mortimer, Revenge Porn Helpline Manager, told NBC News.
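The on-device hashing step above can be illustrated with a minimal sketch. StopNCII.org says it uses MD5 for videos; the snippet below computes an MD5 digest of a local file with Python's standard library, so only the hex digest — never the file itself — would leave the device. The function name and chunk size are illustrative, not part of StopNCII.org's actual implementation.

```python
import hashlib

def fingerprint_video(path: str, chunk_size: int = 8192) -> str:
    """Compute the MD5 hex digest of a local file, reading in chunks.

    In a StopNCII.org-style flow, only this digest (the "digital
    fingerprint") is shared with the platform; the file stays local.
    """
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest()
```

Note the design trade-off this makes visible: MD5 is an exact-match hash, so changing even one byte of the file yields a completely different digest — which is precisely the limitation described above for altered copies. PDQ, used for photos, is a perceptual hash and is more tolerant of minor edits.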

“We’ve heard from victims and experts about the need for a stronger platform adopted across the tech industry that puts the victim first. By allowing potential victims to access the hashing technology directly, StopNCII.org gives them more agency, protects the privacy of their images and opens the door for other companies to participate in this effort.” – Facebook

How are privacy concerns addressed?

Back in 2017, Facebook launched a limited pilot in Australia that worked similarly to StopNCII.org. However, that initiative attracted flak because it involved human representatives reviewing the images at the point of submission.

This time around, however, the image never leaves the user’s device. “Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms. This feature prevents further circulation of that NCII content and keeps those images securely in the possession of the owner,” Facebook said in its announcement.

However, this raises another important question: how will the platform know if the uploaded image is indeed non-consensual intimate imagery? Without manual review, can this system be misused to flag non-violating content as well? Turns out, platforms carry out a manual human review if a hash match is identified. The difference seems to be that the human review happens after a match is identified and not at the upload stage, offering the victim some level of anonymity.
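The match-first, review-second flow described above can be sketched as follows. This is a hypothetical illustration, not the platforms' actual pipeline: the flagged-hash set and function names are invented for the example, and a real system would use a hash database supplied via StopNCII.org.

```python
import hashlib

# Hypothetical store of fingerprints received from StopNCII.org.
flagged_hashes = {"5d41402abc4b2a76b9719d911017c592"}

def check_upload(content: bytes) -> str:
    """Screen an upload against flagged fingerprints.

    Human review is triggered only AFTER a hash match — the
    uploaded bytes are never inspected by a person up front.
    """
    digest = hashlib.md5(content).hexdigest()
    if digest in flagged_hashes:
        return "queue_for_human_review"
    return "publish"
```

The key property is visible in the control flow: non-matching content is never routed to a reviewer, which is what preserves the victim's anonymity at the upload stage.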

“StopNCII.org represents a sea-change in the way those affected by intimate image abuse can protect themselves. At the heart of the work developing this tool have been the needs of victims and survivors by putting them in control without having to compromise their privacy.” – Sophie Mortimer, Revenge Porn Helpline Manager

Who is behind StopNCII.org?

StopNCII.org is operated by the Revenge Porn Helpline and supported by over 50 organisations across the world. The platform has been developed with “extensive input from victims, survivors, experts, advocates and other tech partners,” Meta said.

The Indian partners currently are Social Media Matters, Centre for Social Research, Safe City (Red Dot Foundation), and Breakthrough. MediaNama has reached out to these partners asking for more clarity on their role and will update this post once we get a response.

India-specific developments on women’s safety

In a related, India-specific development, Meta’s Women Safety Hub is now available in 12 Indian languages: Hindi, Marathi, Punjabi, Gujarati, Tamil, Telugu, Urdu, Bengali, Odia, Assamese, Kannada, and Malayalam. “This key initiative by Meta will ensure millions of women, especially non-English speakers, do not face a language barrier in accessing information easily that will enable them to stay safe online,” Meta said.

Meta has also appointed two Indians, Bishakha Datta, executive editor of Point of View, and Jyoti Vadhera, head of media and communications at the Centre for Social Research, to its Global Women’s Safety Expert Advisory Group, which now has 14 members.

“India is an important market for us and bringing Bishakha and Jyoti onboard to our Women’s Safety Expert Advisory Group will go a long way in further enhancing our efforts to make our platforms safer for women in India.” – Karuna Nain, director (global safety policy) at Meta Platforms

MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
