this post was submitted on 28 Aug 2023
5 points (100.0% liked)

Meta (lemm.ee)


Sorry for the short post, I'm not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I'm really sorry for the disruption, it's a necessary trade-off for now until we figure out the way forward.

top 12 comments
[–] [email protected] 3 points 1 year ago (1 children)

This is sick. Kudos to mods for dealing with this garbage. I hope the posters are all hunted down and punished.

[–] [email protected] 1 points 1 year ago

Yeah, the admins deserve all our support on this. Not only to protect themselves as server owners, but to stop the spread. Hopefully a long-term solution will be found soon.

[–] [email protected] 2 points 1 year ago (1 children)

We had to deal with something similar on lemmygrad a while ago. All power to you for destroying these annoying bastards.

[–] [email protected] 1 points 1 year ago (1 children)

How did you guys deal with this?

[–] [email protected] 1 points 1 year ago

We doubled the number of mods and banned anything remotely resembling that material on-site. Sadly, many times a brave lemmygrad user had to check it first and take the bullet for us by reporting it. I was one of those people on several occasions. I still cringe at the memories. It lasted a few months, iirc. I haven't seen whatever is hitting you guys, but our bots had some recognizable features, usually hiding their spam behind spoilers or links.

It really was just mobilization, lockdown, and purging everything suspicious until it stopped. That, or they found a way to block those bots. I wasn't in the command center by any means, so I don't know much about the internal decisions.

[–] [email protected] 1 points 1 year ago

My honest reaction to this:

[removed externally hosted image]

[–] [email protected] 1 points 1 year ago

I think this is a great move until we have something rock solid to prevent this. There are tons of image hosting sites you can use (most of which have the resources to already try to prevent this stuff) so it shouldn’t really cause much inconvenience.

[–] [email protected] 1 points 1 year ago

I honestly think this is the reason why message boards generally don't have the feature to attach images to posts anymore.

[–] [email protected] 1 points 1 year ago

If you're concerned about legal liability, it's worth noting that websites have some protection in this matter. For the most part, as long as you're taking "reasonable action" against it you're not liable, and most laws take into consideration the resources of the site dealing with the uploads.

Not pleasant for users, though, of course. And the speed at which it's handled is obviously a concern.

[–] [email protected] 0 points 1 year ago (1 children)

I know there are automated tools that exist for detecting CSAM. Given the challenges the fediverse has had with this issue, it really feels like it'd be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools with their platforms to better support moderators and folks running instances.
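For the curious, the core idea behind such tools is hash matching: each upload is hashed and checked against a database of known-bad hashes before it is stored. A minimal sketch of that gating logic, assuming a hypothetical denylist maintained by the instance (note that real services like Microsoft's PhotoDNA or Meta's PDQ use perceptual hashes that survive re-encoding and cropping; the plain SHA-256 shown here only catches byte-identical files):

```python
# Hypothetical sketch of denylist-based upload screening.
# Real deployments use perceptual hashes (PhotoDNA, PDQ) rather than
# SHA-256, which only matches byte-for-byte identical images.
import hashlib

def file_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of raw upload bytes."""
    return hashlib.sha256(data).hexdigest()

def is_blocked(data: bytes, denylist: set[str]) -> bool:
    """True if the upload's digest appears on the denylist."""
    return file_digest(data) in denylist

# Example: the instance keeps a set of known-bad digests and
# rejects any upload that matches before writing it to disk.
denylist = {file_digest(b"known-bad-image-bytes")}
assert is_blocked(b"known-bad-image-bytes", denylist)
assert not is_blocked(b"harmless-image-bytes", denylist)
```

The hard part in practice isn't this lookup - it's sourcing and updating the hash database, which is why integration at the platform level (rather than per-instance) keeps coming up.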

[–] [email protected] -1 points 1 year ago

Better shut the internet down, then. This will only continue to worsen now that anybody can generate whatever images they want with AI assistance. Such image hashes will not be in CSAM databases (if AI-generated imagery is even CSAM).