Reddit users crowdsourcing explicit images and identities
2022-08-23 01:00:00 Author: www.malwarebytes.com

A woman covers her face.


The BBC has warned of a large photograph trading ring operating on the popular forum site Reddit. These warnings relate to stolen nude photographs and other content shared without permission.

In this case, even non-explicit photos are being posted alongside frequently degrading and inappropriate comments. Some of them even tip into potential threats and harassment. What is going on here?

Non-consensual image theft on a grand scale

We’ve previously highlighted regular images stolen and used as bait to lure people to pornography sites. On this occasion, the reporter was tipped off after a contact found their own photograph posted to the subreddit (a topic-specific forum on Reddit) in question, alongside various derogatory comments. The BBC reporter quickly discovered a large ring of individuals not only “sharing, trading, and selling explicit images,” but also teaming up to figure out where the women in the photographs live.

Addresses, social media accounts, phone numbers and more were pieced together by Reddit users. Threats, blackmail, and related comments would soon be sent to their chosen targets.

In fact, this isn’t the work of “just” one subreddit but dozens. One subreddit played host to around 20,000 users and contained more than 15,000 images, 150 of which were sexually explicit. It seems at least some of these images are being shared by ex-partners without consent. Other victims have been filmed or photographed secretly, and then had the images posted online at a later date.

These can be typical actions within abusive relationships where the abuser misuses technology to exert control. In this case, the abuse is being farmed for public syndication, likes, virtual high-fives, or even a small profit.

The problem of taking down stolen images

Images are often shared between different subreddits, and also messaging apps. This makes it quite difficult to guarantee that stolen images are gone forever. Hoarding of images ensures that if the original source goes down, someone else is almost guaranteed to simply repost the theoretically lost photographs.

Several women told the BBC they have struggled to get images posted without their permission taken down. In other cases, some images were removed quickly while others took a long time to resolve.

Slow response times and failures to remove images are a persistent problem in tech generally. Back in February, the BBC reported 100 images to Telegram. A month later, 96 were still available; the other four were in groups which were no longer accessible.

Another issue depends heavily on whatever your local laws happen to be. In the UK, for example, revenge porn legislation requires proof that the sharer intended to cause distress to the victim. If they claim that the images were not shared to cause distress or harm, they can try to wriggle out of trouble through this potential legal loophole.

Sadly, for the most part, the bottom line here seems to be that people suffering this kind of abuse are mostly on their own.

One down, many more to go

The main subreddit from the BBC investigation has been shut down. The person who operated it was identified and has deleted their Reddit account. Many other similar subreddits likely exist, and many more communities operate away from Reddit entirely.

Bad people are more than willing to do awful things with your most private data. Having said that, there’s much you can do to keep everything as secure as possible.

Stay safe out there.





Article source: https://www.malwarebytes.com/blog/news/2022/08/reddit-users-crowdsourcing-explicit-images-and-identities