A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don’t control.
Rules:
Be civil: we’re here to support and learn from one another. Insults won’t be tolerated. Flame wars are frowned upon.
No spam posting.
Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it’s not obvious how your post relates to self-hosting, please include details to make it clear.
Don’t duplicate the full text of your blog or GitHub page here. Just post the link for folks to click.
Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).
No trolling.
Resources:
Any issues on the community? Report them using the report flag.
Questions? DM the mods!
I guess not everyone treats their PC as ephemeral storage, huh? I don’t trust anything that’s available only locally to survive.
Then back up whatever you set your Docker local storage to?
Two good points here, OP. Type
docker image ls
to see all the images you currently have locally - you’ll possibly be surprised how many. All the ones tagged <none> are old versions. If you’re already using GitHub, it includes a package registry you could push retagged images to, or for something more self-hosty, a local instance of Forgejo would be a good option.
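In case it helps, the retag-and-push flow for that is short. A rough sketch against a hypothetical Forgejo instance at git.example.com (hostname, namespace, and image are placeholders - the GitHub flow is the same idea against ghcr.io):

# log in to the instance’s built-in container registry
docker login git.example.com
# retag a local image under your own namespace
docker tag nginx:1.27 git.example.com/myorg/nginx:1.27
# push it so your copy no longer depends on the upstream registry
docker push git.example.com/myorg/nginx:1.27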
I’d also like to add that you can save an image to a local file using
docker image save
and load it back using
docker image load
So, along with the options mentioned above, you have plenty of ways to back up images for offline use.

Just use a Sonatype Nexus 3 image and proxy Docker Hub, etc. Then you pull images through it.
We run this at work so we have forever copies of image tags and to reduce Docker Hub rate-limit issues. It works well even for a large dev team.
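For the save/load approach mentioned a couple of comments up, a minimal sketch (the image name and tarball path are just examples):

# export an image, layers and metadata included, to a tarball you can back up anywhere
docker image save -o nginx_1.27.tar nginx:1.27
# later, restore it on this or another machine without touching any registry
docker image load -i nginx_1.27.tar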
Sorry for the link dump - I just glanced over the content, but it seems like these might help you:
https://www.warpbuild.com/blog/docker-mirror-setup
https://medium.com/@shaikrish27/deploying-a-docker-registry-mirror-as-a-container-59565ff92c48
https://blog.alexellis.io/how-to-configure-multiple-docker-registry-mirrors/
https://stackoverflow.com/a/41593925
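For anyone who doesn’t want to click through, the rough shape of those setups is: run the official registry image in pull-through-cache mode and point the Docker daemon at it. A minimal sketch, with hostnames, ports, and paths as placeholders and only Docker Hub being mirrored:

# run the registry image as a pull-through cache of Docker Hub
# (REGISTRY_PROXY_REMOTEURL is what switches it from a plain registry to a cache)
docker run -d --name hub-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  -v /srv/registry:/var/lib/registry \
  registry:2
# then add { "registry-mirrors": ["http://localhost:5000"] } to /etc/docker/daemon.json
# and restart the daemon; Hub pulls now go through (and get cached by) the mirror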
Isn’t a Docker registry just HTTP? Would a caching proxy be too hard to use for this?
For most of you suggesting hosting a repository - yes, but:
Host Forgejo. Just host the git mirror. It comes with a package registry out of the box, so then you have both the source code and the Docker images.
An alternative method is to run an Actions workflow that syncs upstream images directly, which is what Forgejo actually does:
https://code.forgejo.org/forgejo/oci-mirror
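I haven’t dug into how that repo does it, but the core of such a sync job is just copying a list of upstream tags into your own registry on a schedule. A rough sketch using skopeo (the destination registry and image list are made up, and it assumes you’ve already run skopeo login for the destination):

#!/usr/bin/env bash
# naive mirror job: copy pinned upstream images into a self-hosted registry
set -euo pipefail
MIRROR=git.example.com/mirror
for image in library/nginx:1.27 library/postgres:16; do
  # --all copies every architecture in the manifest list, not just the local one
  skopeo copy --all "docker://docker.io/${image}" "docker://${MIRROR}/${image#library/}"
done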
oh freaking awesome, this looks amazing! Thank you so much for this!
Or Gitea if you want to run the upstream.
At my job, we run goharbor.io and use its Replications feature to do just that.
I mean, you have the current image cached on the local server when you use it.
I don’t know if this will help you, but I wrote a tutorial on how to set up a local registry on the LAN on Fedora Server or a RHEL-compatible server. https://techne.hyperreal.coffee/tutorials/setup-a-lan-container-registry-with-podman-and-self-signed-certs/
But anyway, it’s unlikely docker.io or quay.io or ghcr.io will go completely offline. If anything, they might experience a DDoS, in which case I imagine they have competent DevOps employees who would get things functional again within a matter of hours.
Very interesting, thanks!
The vast majority of selfhosters probably don’t, but if you want one, it’s called a private registry:
https://www.digitalocean.com/community/tutorials/how-to-set-up-a-private-docker-registry-on-ubuntu-20-04
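The short version of that kind of setup (minus the TLS and auth the tutorial walks through, so LAN/testing only) looks roughly like this:

# run a bare-bones private registry on port 5000
docker run -d --name registry -p 5000:5000 -v /srv/registry:/var/lib/registry registry:2
# retag a local image to point at it, then push
docker tag nginx:1.27 localhost:5000/nginx:1.27
docker push localhost:5000/nginx:1.27
# pull it back later - upstream registry outages no longer matter for this image
docker pull localhost:5000/nginx:1.27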