I posted the other day that you can clean CSAM out of your object storage using my AI-based tool. Many people asked to use it on a pict-rs install backed by local file storage, so I’ve just extended its functionality to allow exactly that.

The new lemmy_safety_local_storage.py will walk your pict-rs volume on the filesystem, scan each image for CSAM, and delete any matches. The requirements are:

  • A Linux account with read-write access to the volume files
  • Private-key authentication for that account

As my main instance uses object storage, my testing is limited to my dev instance, where it all looks OK to me. But do run it with --dry_run first if you’re worried. Afterwards you can delete lemmy_safety.db and rerun to enforce the deletions (a method to act directly on the --dry_run results is coming soon).

PS: if you were using the object storage cleanup, that script has been renamed to lemmy_safety_object_storage.py

  • webghost0101@sopuli.xyz · 1 year ago

    We Evolve!

    There’s something really satisfying about watching a community start to talk about a serious issue and, days later, seeing things already improved.

    Lemmy is now the internet, glory to all volunteer devs. Let’s make it the best place we possibly can!

  • BitOneZero @ .world@lemmy.world · edited · 1 year ago

    I hope people share the positive hits of CSAM and see how widespread the problem is…

    DRAMATIC EDIT: the records lemmy_safety_local_storage.py identifies, not the images! @[email protected] seems to think it “sounds like” I am ACTIVELY encouraging the spreading of child pornography images… NO! I mean audit data, such as timestamps, the account that uploaded, etc. Once you have the timestamp, the nginx logs from a lemmy server should help identify the IP address.
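    The timestamp-to-IP correlation the comment describes can be sketched roughly as follows. Everything here is hypothetical: the regex assumes nginx’s default “combined” log format, and the `/pictrs/image` path and `candidate_ips` helper are my own assumptions about a typical Lemmy setup, not anything from the tool itself.

    ```python
    import re
    from datetime import datetime, timedelta

    # Matches nginx "combined" format lines, e.g.:
    # 198.51.100.7 - - [10/Sep/2023:12:00:01 +0000] "POST /pictrs/image HTTP/1.1" 201 45
    LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)"')

    def candidate_ips(log_lines, upload_time, window_seconds=5):
        """Return IPs of POST requests to the (assumed) image upload
        endpoint within window_seconds of a flagged upload's timestamp."""
        ips = []
        for line in log_lines:
            m = LOG_LINE.match(line)
            if not m:
                continue
            ts = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
            if (abs(ts - upload_time) <= timedelta(seconds=window_seconds)
                    and m.group("req").startswith("POST /pictrs/image")):
                ips.append(m.group("ip"))
        return ips
    ```

    A small time window is used because the database timestamp and the nginx log entry will rarely match to the exact second.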