Through the efforts of @dandroid@dandroid.app, the lemmy-safety tool can now run via docker, which should help you run it without having to mess with python on your end.

For those who don’t remember, lemmy-safety is a script you can run to clean up potential CSAM from your pict-rs image storage.
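
For anyone who wants a quick feel for it, a minimal sketch of running it via docker might look like the below, assuming you build the image yourself from a checkout of the repo; the image tag, mount path, and options are illustrative only, so check the project’s README for the real invocation:

    # build from a local checkout of the repo (the tag name is illustrative)
    docker build -t lemmy-safety .

    # run with GPU access; the mounted path is a hypothetical placeholder,
    # not one of the tool's actual options
    docker run --rm --gpus all -v /path/to/your/config:/config lemmy-safety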

Thank you for this great service to the community.

@koper@feddit.nl

Applying AI-voodoo to a non-existing problem with unknown side effects? Sign me up!

konalt

This was created in response to the recent mass posting of CSAM on Lemmy. It might be one of the reasons it’s a “non-existing problem”.

@chrisbit@leminal.space

nvidia-container-cli: initialization error: load library failed: libnvidia-ml.so.1: cannot open shared object file: no such file or directory: unknown.

I was getting this error with docker-desktop installed, but it worked after purging it and installing docker-ce instead, then running with the --gpus all flag.
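
If you hit the same libnvidia-ml.so.1 error, a quick sanity check is the standard nvidia-smi test from NVIDIA’s docs; the CUDA base image tag here is just an example, any recent one should do:

    # should print the GPU table if the toolkit and --gpus passthrough work
    docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi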

Dandroid

Interesting. This might be a difference between podman and docker. I was using podman in my setup. Unfortunately, we might need two different sets of instructions for podman and docker.

Does this run some ML wizardry on the images?

db0 (creator)

Yes, it’s using the CLIP image-to-text model.

Fidelity9373

By the looks of it, pretty much. Not in the sense of building an AI model, but more like traditional image recognition. Seems to process everything locally too, which is a plus; no sending data off to unknown servers.

Dandroid

Thanks for the shout out!

Full disclosure, I use podman, not docker. If anyone has any issues with this using docker, let me know and I’ll get it fixed ASAP. I’m not 100% sure the --device option works the same way with docker.
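
For what it’s worth, the GPU flags do differ between the two: podman (with the nvidia-container-toolkit’s CDI setup) takes --device, while docker takes --gpus. A rough sketch, with the exact steps depending on your toolkit version:

    # podman: generate the CDI spec once, then request the GPU by CDI name
    sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml
    podman run --rm --device nvidia.com/gpu=all ubuntu nvidia-smi

    # docker equivalent
    docker run --rm --gpus all ubuntu nvidia-smi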

I added instructions on how to add the nvidia-container-toolkit repo on the two distros I have (one rpm based, one deb based). If adding the repo is different on your distro, please consider adding it to the instructions. The instructions on the nvidia-container-toolkit web page are… subpar in my opinion.
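
For reference, the rough shape of the setup on a deb-based distro is below; this is from memory of NVIDIA’s install guide, so double-check the official docs before copy-pasting. rpm-based distros are the same idea with a .repo file and dnf:

    # add NVIDIA's signing key and the nvidia-container-toolkit apt repo
    curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | \
      sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
    curl -sL https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
      sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
      sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

    # install the toolkit and point docker's runtime at it
    sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
    sudo nvidia-ctk runtime configure --runtime=docker
    sudo systemctl restart docker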

@Rescuer6394@feddit.nl

For docker, the syntax is --gpus all.

https://docs.docker.com/config/containers/resource_constraints/#expose-gpus-for-use

Bonus: the syntax to expose the GPU in a docker compose file:

    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

more at https://docs.docker.com/compose/gpu-support/#example-of-a-compose-file-for-running-a-service-with-access-to-1-gpu-device

Dandroid

Would it be too much to ask for you to test that out and update the documentation? I don’t have docker, and installing it would mess up my podman-docker setup, which would impact some things I have running. podman-docker simulates docker with podman, so I can use docker-compose with podman.
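
For anyone curious how that combination works: podman-docker ships a docker-compatible shim, and docker-compose can talk to podman’s API socket. A rough sketch, with the socket path depending on your distro and whether you run rootless:

    # enable podman's docker-compatible API socket for your user
    systemctl --user enable --now podman.socket

    # point docker-compose (or anything expecting the docker API) at it
    export DOCKER_HOST=unix:///run/user/$(id -u)/podman/podman.sock
    docker-compose up -d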

YⓄ乙

Thank you.
