• 6 Posts
  • 31 Comments
Joined 1Y ago
Cake day: Jun 12, 2023


Damn! I missed that one. Working now. Thanks!


Won’t connect on either port using http or https.


Can’t access immich on local network
I've been banging my head on this for a few days now, and I can't figure it out. When I start up the Immich container, I see this in `docker ps`:

```
CONTAINER ID   IMAGE                                      COMMAND                  CREATED              STATUS                        PORTS                                                    NAMES
1c496e061c5c   ghcr.io/immich-app/immich-server:release   "tini -- /bin/bash s…"   About a minute ago   Up About a minute (healthy)   2283/tcp, 0.0.0.0:2284->3001/tcp, [::]:2283->3001/tcp   immich
```

`netstat` shows that port 2283 is listening, but I cannot access `http://IP_ADDRESS:2283` from a Windows, Linux, or Mac host. If I SSH in and run a browser back through that, I can't access it via localhost either. I even tried changing the port to 2284. I can see the change in the `netstat` and `docker ps` outputs, but still no luck accessing it. I also can't telnet to either port on the host.

I know Immich is up because it's accessible via the swag reverse proxy (I've also tried bringing it up w/ that disabled). I don't see anything in the logs of any of the Immich containers, or in any of the host system logs, when I try to access it.

All of this came about because I ran into the Cloudflare upload size limit, and it seems I can't get around it for the strangest reason!
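For reference, the port mapping I'd expect to see in the compose file looks roughly like this (a simplified sketch based on the `docker ps` output above; the internal port 3001 and the service name are assumptions, not my exact config):

```
services:
  immich-server:
    image: ghcr.io/immich-app/immich-server:release
    ports:
      # host:container - publish the web UI/API on the LAN
      # (0.0.0.0 is Docker's default bind, so other machines should be able to reach it)
      - "2283:3001"
```

If the mapping were bound to 127.0.0.1 instead of 0.0.0.0 I'd understand the symptoms, but the output above shows 0.0.0.0.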

This is really helpful. I’ll look into that. Thanks!


I can upload files outside of the docroot, but if they stay there for too long, I get a nasty email from Dreamhost reminding me that this is for web space and not offsite storage (something they also sell). I haven’t tried uploading something inside the docroot and just setting permissions to 400 or something!


I haven’t played w/ memory limits, but when I tried messing w/ bulk download of raw TIF files, it ran out of memory pretty quick. I may look into what I can do about the limits, though.


Same. I have a mediawiki install on the shared hosting still, but I haven’t updated it in forever. For the $10.99/month I’m paying for shared hosting, I could save a little and do a more powerful VPS to host similar stuff… Or just keep doing what I’m doing w/ my S12 Pro & Synology. Might look at some kind of failover down the road.


Fair point. Currently, everything that requires off-site backup is sent to my father’s Synology using hyperbackup. So off-site is sorta self-hosted already. Was thinking in terms of a second fallback option.


Anyone still use “traditional” shared hosting?
A long, long time ago, I bought a domain or two and a shared hosting plan from Dreamhost w/ unlimited bandwidth/storage. I don't have root access and can't run containers on it. It's been useful for a Piwigo instance to share scanned family photos. The problem I have is that the limited resources really limit Piwigo's ability to handle the large TIF files involved in the archival scans. There are ways around this, but they all add time to a workflow that already eats into my free time enough.

I'm looking at moving Piwigo to my local server, which has plenty of available resources. That leaves me with little reason to keep the Dreamhost space. So what's a decent use case for cheap, shared hosting space anymore?

To be clear, I'm not looking for suggestions to move to a cheap VPS. I've looked into them, and might use one in the future, but don't need one right now. The shared hosting costs about $10.99/month at the moment. If there were a way I could leverage the unlimited bandwidth/storage as an offsite backup, that would be amazing, but I'm not sure it would be a great idea backing up stuff to a webserver where the best security I can add is via an .htaccess file.

At the end of the day, it’s about what you’re comfortable working on. My daily driver is a MacBook Pro. I have a BeeLink S12 Pro that runs most of my self-hosted stuff, and a Synology that runs a couple of things. I also have an HP Z440 as a test-bed box (powered off unless I’m working on something). I’m comfortable working with Linux, and power draw was important for me in setting up my always-on server (my power bill is already high).

The only minor concern I would have with a mini is that you’re limiting your support base. This isn’t to say there’s no support, there’s just less. Most self-hosters are using something like an Unraid box, a Beelink, or an old micro Dell/HP/Lenovo. Because of that, there’s a ton of stuff out there about getting various services running on those setups. The M-based mini environment is going to be a little more unique.


Just reread your comment, and I guess it’s the network that will cause issues. To be clear, I think I can make the Cloudflare portion work one way or another (I have a second domain I can use if necessary). If my thinking is correct, the tailnet communication would be over that IP space, not trying to route to my LAN net. Unless I’m missing something.


So I learned today that I need to play with the Cloudflare tunnel if I want two systems using one domain. I’m hoping a second API key will help. Honestly, until I tested the second server on the tunnel, it’s been rock solid. Or are you saying using both networks will inject flakiness?

Also, I appreciate the clustering suggestion, but none of this is mission critical. If it’s down until I can log in and fix it, I’m OK with that. Only 2-3 people are using it.


Agreed. I’m not much of a coder, so the best contribution I can give is probably $$. At least until I get off my ass and learn something new!


True, it’s a good percentage, and probably better than most free software. That said, given the communities these self-hosted apps support, the excitement for the products, and the essential nature some of these apps have for their users, it would be nice to see the yes/no numbers closer to 50/50, at least.


VPS services connecting to local services
I currently have my home services set up in a way I like, and think I understand. I have an S12 Pro w/ *arr, Overseerr, Immich, Paperless, etc. running. The only things exposed are Immich, Paperless, and Overseerr. This is via swag/dockerproxy over a Cloudflare tunnel. It means I don't have to do anything on the Cloudflare end or my router to add a new service: DockerProxy picks up a new container, and swag configures a reverse proxy automatically (assuming it recognizes the container, but it also supports custom configs) using the container_id as the subdomain.

I'm looking at setting up a VPS to host Authentik and Uptime Kuma (to start; maybe ntfy in the future). What I'd like to do is have the public interface on these containers use the same Cloudflare tunnel I'm currently using... or a second one, if necessary. For the interface back to my home server, I'd like to use Tailscale. I already have it running on my home server, and I expect I'll install it on my VPS. The goal here is that the "public" connection uses the Cloudflare tunnel, and the backend connection is over Tailscale. I've tested that I can spin up swag/dockerproxy on a second box in my lab and it will connect to Cloudflare. I have not yet tested standing up a container on that box to see if the proxy works as expected.

So, questions (a rough sketch of what I mean by "two networks" follows the list):

- Tailscale on VPS: container or no? Obviously, if I can't install it locally, I'll put it in a container.
- How do I configure a container to use these 2 networks? I'm fairly good on getting the Cloudflare part working. The TS part is new to me, and all the documentation I've seen doesn't really cover other containers using the tailnet.
- Am I overthinking this? If I put these services on the tailnet alone, will the Cloudflare tunnel... tunnel back and forth to/from clients not on the tailnet?
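Here's the kind of layout I've been sketching for the VPS (untested; the service names, docker network name, and env var are placeholders, and it assumes the official tailscale/tailscale image as a sidecar):

```
services:
  tailscale:
    image: tailscale/tailscale:latest
    hostname: vps-services                  # device name on the tailnet (placeholder)
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}            # auth key generated in the Tailscale admin console
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./tailscale-state:/var/lib/tailscale
    devices:
      - /dev/net/tun:/dev/net/tun
    cap_add:
      - NET_ADMIN
    networks:
      - proxy                               # same docker network swag/dockerproxy sit on

  uptime-kuma:
    image: louislam/uptime-kuma:1
    # Shares the sidecar's network namespace, so it's reachable over the
    # tailnet *and* via the "tailscale" service name on the proxy network.
    network_mode: "service:tailscale"
    depends_on:
      - tailscale

networks:
  proxy:
    external: true
```

In that layout, the swag instance on the VPS would proxy to `tailscale:3001` (Uptime Kuma's default port) for the public/Cloudflare side, while my home server reaches the same container over its tailnet address.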

I think this is pretty troubling. Including myself in the sentiment that the self-hosting community needs to do better. Aside from funding individual projects, are there any organizations that help fund self-hosting projects?


Compatible? Should be. Identical? No (at least not always). Only identical FRUs are interchangeable.


I use nginx & docker-proxy. Because the model I copied used that setup. Having messed with it a bit, I’m understanding it more and more. Before that, the last time I messed with a web server (Apache), nginx wasn’t around. Lately, I’ve seen a similar docker setup to mine that doesn’t use docker-proxy. If I find time, I’ll probably play with that some on my dev rig.


2 Swag Instances to 1 Cloudflare domain
I have the *arr stack and Immich running on a Beelink S12 Pro, based on [geekau mediastack](https://github.com/geekau/mediastack) on GitHub. Basically, and I'm sure my understanding is maybe a bit flawed, it uses docker-proxy to detect containers and passes that to swag, which then sets up subdomains via a tunnel to Cloudflare. I have access to my services outside of my LAN without any port forwarding on my router. If I'm not mistaken, that access is via the encrypted tunnel between swag & Cloudflare (please, correct me if I'm wrong).

That little Beelink is running out of resources! It's running 20 containers, and when Immich has to make any changes, it quickly runs low on memory. What I would like to do is set up a second box that would also run the same "infrastructure" containers (swag, docker-proxy) and connect to the same Cloudflare account. I'm guessing I need to set up a second tunnel? I'm not sure how to proceed.
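If a second tunnel is indeed the way to go, this is roughly what I'm picturing on the new box (untested sketch; the token would come from creating a second tunnel in the Cloudflare Zero Trust dashboard, and the env var name is just a placeholder):

```
services:
  cloudflared:
    image: cloudflare/cloudflared:latest
    restart: unless-stopped
    # Token for a *second* tunnel created in the Zero Trust dashboard;
    # each tunnel gets its own token, so the two boxes stay independent.
    command: tunnel --no-autoupdate run --token ${TUNNEL_TOKEN_BOX2}
    networks:
      - proxy

networks:
  proxy:
    external: true
```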



Not really sure what you’re getting at here. I’ve had a network outage for the past 2 days and was able to watch stuff on my local NAS just fine. I haven’t done anything special to make it do that.


The title of this post is a bit misleading. You’re suggesting the article spells out how Disney’s (and other companies’) rabid protection of their IP is a Bad Thing, when it’s really more of a history and primer on what’s changed with Steamboat Willie entering the public domain.


Whether or not they comply with law enforcement is not the issue. Any company will comply with their local law enforcement if they want to keep their doors open. What’s important is what data they keep on their users. Unless I’m mistaken, Nord, like many others, only keeps billing info and limited connection info for load-balancing purposes (deleted after something like 15 minutes). So, if the Panamanian government (where they’re headquartered, which IIRC has no data retention laws and isn’t part of Five Eyes) asks for logs, they will get something, but not much to tie a specific customer to anything.

Also, Nord has been independently audited multiple times in the past. Something quite a few other providers can’t say.

It’s popular to bash on Nord b/c they advertise a lot, but I haven’t seen a legit reason not to use them. If it exists, I’d love to see it.



There’s nothing wrong with the small PC/NAS route. Certainly more powerful and flexible. I’m currently running the *arr stuff in containers on a Synology 1520 (also storing a bunch of other stuff), with Plex running on a Shield Pro. It’s pretty low power draw, and so far does everything I need.

Main thing with running Plex on the NAS is transcoding - audio and/or video. Depending on what your Plex client is, you want to make sure everything you’re streaming can direct play.


Adding to this, there’s probably a general feeling that, especially with publicly traded companies (which Nord isn’t… yet), profit motive will inevitably cause a company to make decisions that don’t align with its customers’ best interests. The idealist in me thinks it’s possible for a company to be profitable without being shitty towards its customers. The cynic in me thinks there’s probably more profit in being shitty.

That said, profit keeps companies in business. If you’re getting it for free, you’re either the product, pirating it, or relying on others to keep it going. I won’t say paying for it guarantees future availability and development, but that profit motive also motivates continuing development. Kind of a double edged sword, there.


Obvious next question: how’s the privacy policy on 3rd party stereo makers like Pioneer, Kenwood, Alpine, Jensen, etc.?


The folks on similar IPs to me really like porn.

I’m on Nord. I know a lot of folks on here diss it, but I’ve been mostly happy with it.


Adobe and Microsoft only kinda care about you. You’re one person. All the freelancers out there are still a fairly small part of their respective balance sheets. If you’re a freelance worker, some of your customers might require you to show valid licenses for the software you use, because they want to make sure their partners are ethical (at least, in this regard). Alternatively, you could use FOSS apps.

As someone else already said, if you are making money using commercial software, you really should be paying for it. The cost of your software should be factored into what you charge your customers. They should understand that.


I have VPN, BitTorrent and prowlarr in one “stack” (a project in Synology Container Manager). Everything else is bundled into a separate project. Not sure how portainer would make this work differently. I don’t have much experience with that.


FWIW, all of my *arr and VPN containers use the same network bridge. Prowlarr and torrent use the VPN service, though having Prowlarr on there is maybe overkill. They’re all able to access one another using the bridge gateway + port as the host, e.g. 172.20.0.1:5050.

I mostly used this guide, where he suggests:

> I have split out Prowlarr as you may want this running on a VPN connection if your ISP blocks certain indexers. If not copy this section into your compose as well. See my Gluetun guides for more information on adding to a VPN.

One thing I had to make sure of was that the ports for Prowlarr were included in the VPN container setup, rather than the Prowlarr section (b/c it’s just connecting to the VPN service):

    ports:
      - 8888:8888/tcp # HTTP proxy
      - 8388:8388/tcp # Shadowsocks
      - 8388:8388/udp # Shadowsocks
      - 8090:8090 # port for qbittorrent
      - 9696:9696 # For Prowlarr
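
For anyone hitting the same thing: the reason those ports live on the VPN container is that the containers behind it share its network namespace. A stripped-down sketch of that part of the compose (image and service names may differ from the guide’s exact setup, and the VPN provider credentials/env are omitted):

```
services:
  gluetun:
    image: qmcgaw/gluetun:latest
    cap_add:
      - NET_ADMIN
    ports:
      - 8090:8090   # qBittorrent web UI, published here instead of on the qbittorrent service
      - 9696:9696   # Prowlarr, published here instead of on the prowlarr service

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    network_mode: "service:gluetun"   # all traffic leaves through the VPN container

  prowlarr:
    image: lscr.io/linuxserver/prowlarr:latest
    network_mode: "service:gluetun"
```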

The 4k77 guys pretty much provided what fans have been asking for. Lucas had his chance and chose to charge the fans for something they didn’t ask for.


It’s succinct. I’ll give you that!


Photoprism or Immich
I’ve seen a lot of recommendations for Immich on here, so I have an idea what the answer is going to be, but I’m looking for some comparisons between it and Photoprism. I’m currently using Synology Photos, and I think my biggest issue is its lack of metadata management. I’ve gotten around that with MetaImage and NeoFinder. I’m considering moving to something not tied to the Synology environment.


Backing up a Blu-ray movie doesn't work
Trying to back up a TV show I have on BD to my Plex server. The other discs in the season work fine w/ MakeMKV, but this one will only read the deleted scenes. So far, I've tried MakeMKV and dd (which returns an I/O error on this disc). Any other thoughts?