Let me know if you want to mod any communities I’ve set up here on lemmy.world, thank you.

Cake day: Jun 14, 2023

I want to present my files - wherever they may be - to all sorts of different applications which let me interact with them in different ways.

Only some self-hosted software grants us this portability.

I’d say almost everything is already covered by Samba shares and Docker bind mounts. With Samba shares, the data is presented across the network to my Kodi clients, the file browser on my phone, and the file browsers of all my computers. And with Docker bind mounts, those files are presented to any services I want to run.
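As a sketch of that pattern (the paths and service names here are just examples), the same directory can back both a Samba share and a container bind mount:

```shell
# 1. Samba share (fragment for /etc/samba/smb.conf):
#    [media]
#    path = /srv/media
#    read only = yes
#    guest ok = yes

# 2. The same directory bind-mounted (read-only) into a container,
#    e.g. a media server that scans it directly:
docker run -d --name jellyfin \
  -v /srv/media:/media:ro \
  -p 8096:8096 \
  jellyfin/jellyfin
```

Both consumers see the same files, so anything added on disk shows up everywhere at once.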


It isn’t. You can get SFF PCs for as little as $75 on eBay that have Quick Sync CPUs and will run circles around an RPi, especially if you have to do any transcoding. They are also really power efficient: 7-20 W at idle.

https://www.ebay.com/itm/195163970881

SBCs really should no longer be considered for selfhosting unless you are A) in an extremely power-constrained environment like an off-grid RV or vanlife situation, or B) clustering.


Some dashboards can do this; check out homepage as one example. Glances is a great tool if you want to create your own. Really, though, you shouldn’t rely on checking status pages for health; instead, set up monitoring and notifications.

I use healthchecks.io and smtp_to_telegram to be instantly notified of SMART failures, storage limits, backup and other script failures, and if docker services go down. As with all selfhosting though, there are plenty of other options for both the monitoring system and the notification system.
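The healthchecks.io side of that is just a ping URL hit from whatever job you are monitoring. A minimal cron wrapper might look like this (the check UUID and the backup script name are placeholders):

```shell
#!/bin/sh
# Run a job and report the result to healthchecks.io.
# If the success ping stops arriving (or the /fail endpoint is hit),
# healthchecks fires its configured notification channel.
HC_URL="https://hc-ping.com/00000000-0000-0000-0000-000000000000"

if /usr/local/bin/nightly-backup.sh; then
    curl -fsS -m 10 --retry 3 "$HC_URL" > /dev/null
else
    curl -fsS -m 10 --retry 3 "$HC_URL/fail" > /dev/null
fi
```

With healthchecks set to notify over SMTP, smtp_to_telegram then turns the failure email into an instant Telegram message.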


Cake Wallet is available for Linux since May of this year, but it’d probably be better to install Feather Wallet in TAILS.


That actually sounds pretty awesome in terms of not having big risk and being easy enough that someone like me can pull it off. But does just moving it to another wallet make it anon?

(non-anon wallet) => transfer A => (anon wallet)

You have the idea; that’s all you need to do. The only thing I will add is that I would also recommend using subaddresses on each of the wallets. It’s not a good idea to reuse the same address over and over if you are going to do future transactions.

Also… maybe a dumb question and I know this is completely different but i’m used to having tools like ipleak.net to check for dns leaks and stuff like that. Is there a way to check if your current wallet is anonymous or can be traced back to you?

Set up your first Cake Wallet on your phone without regard to anonymity, since you are funding it with a known account like Zelle or whatever anyway. Then boot into TAILS, enable persistent storage, and create a Cake Wallet in there; that will be your anonymous wallet, and you can fund it from your known wallet. That TAILS Cake Wallet will then be anonymous, and you will not need to worry about tracking or fingerprinting in TAILS while you are connected to Tor. Just make sure not to accidentally identify yourself through other means.


Yeah, you would look at the list of vendors trading with cash, and then choose one that you like. Check reviews.

does that mean i can only trade other crypto for it?

Not sure what you mean exactly by this, can you rephrase that?

like i have a credit card and cash but no crypto, no wallet, etc currently…

You are going to need a wallet regardless of how you fund it, I would recommend Cake Wallet.

so that only leaves cash and it sounds like you are supposed to put actual paper cash monies in an envelope and physically mail it out? doesn’t that mean if you mailed like $50 or $100 or whatever, that it could get lost in the mail and you’re just fucked and have to eat it?

Yeah, there is a little bit of risk doing it this way. You would want to make sure no one can tell it’s just cash in an envelope.

Also, doesn’t post office requiere a return address? i thought they refuse to mail without that cz of bombs/hatemail/etc? Am I wrong about that?

You can write anything you want in the return address; it doesn’t matter. Obviously, if it does get returned, you’d lose it. That is the risk you take for being completely anonymous this way.

Alternatively, you could use a known account like PayPal, Zelle, or Cash App to fund an initial wallet, but then you have to make sure to send it on to another XMR wallet so it’s anonymous.


Get XMR like the other person said if you want anonymity.

https://agoradesk.com/

Pay by cash, or if you buy it with a known account then send it to another XMR wallet afterwards.


I have a Paperwhite 2015 version that I got back in 2016 for only $30 when they had a big sale on them to unload for their new version. Looks like on eBay that 2015 version goes for $30-50 today.

I transfer books to it over USB using Calibre. It doesn’t need WiFi, and I don’t connect it. Newer models might also be able to work over USB only, I don’t know, but I know my 2015 works that way.


Yeah, my recommendation is basically this:

Do you need to share passwords?

No - use KeePass

Yes - use Bitwarden


I’ve managed to keep my KeePass database going for almost 20 years, back to when I was a dumb teenager. Back then it was as simple as having a couple of extra copies on USB drives and Google Drive, but now I keep proper backups.

My take is, I’d rather control it myself, I am responsible enough to take care of my data, and I actually wouldn’t trust someone else to do it. That’s a huge reason I selfhost in the first place, a lack of trust in others’ services. Also, online services are a bigger target because of the number of customers, and maybe even the importance of some of their customers, whereas I’m not a target at all. No one is going to go after me specifically.


That looks very nice, gives a lot more options which I love so I will have to look into it.


I know this doesn’t fit your criteria OP, but if anyone else is looking for some kind of notification service, I use: SMTP to Telegram

I get instantly notified on my phone for healthchecks.io failures, cronjob reports for different scripts like borg backups or ddns update failures, certain Home Assistant scripts, and Sonarr completions so I know when a new TV episode has finished downloading, plus a bunch of other things set to notify on failure: SMART errors, snapraid-runner failures, distro updates… so many things. It’s nice having the peace of mind that if I haven’t been notified, everything is working and I don’t need to check on it. It’s one of my favorite services that I run.

I don’t think I need to say it, but this is obviously not something you would put facing WAN as there is no TLS nor authentication.
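For reference, standing up smtp_to_telegram is roughly one container acting as a LAN-only SMTP endpoint that forwards every message to a Telegram chat. This sketch is from memory, so verify the image name and environment variables against the project’s README:

```shell
# LAN-only SMTP-to-Telegram relay; every service points its email
# notifications at <host>:2525. No TLS or auth, so never expose to WAN.
docker run -d --name smtp_to_telegram \
  -p 2525:2525 \
  -e ST_SMTP_LISTEN="0.0.0.0:2525" \
  -e ST_TELEGRAM_BOT_TOKEN="<bot-token-from-@BotFather>" \
  -e ST_TELEGRAM_CHAT_IDS="<chat-id>" \
  kostyaesmukov/smtp_to_telegram
```

Each service’s “SMTP server” setting then becomes the Docker host on port 2525, with no credentials.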


It also means that if Plex ever goes under, remote access stops working for you; your service only works as long as the company still exists. If Jellyfin ever goes under, nothing changes. I realize this isn’t a selfhosting sub, so I won’t go too in depth on my tirade, but personally, I want to selfhost everything, and I don’t want to rely on any cloud-based services outside my control like Plex.


Most of the Amcrest cameras have RTSP and don’t require cloud access; in fact, I block mine from the WAN altogether.

I have one wired PoE outdoor camera and one wireless indoor camera from them. Both are great cameras that I can fully control locally. Just make sure it has RTSP, because I’m not sure every model they make has it.
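A quick way to confirm a camera’s local RTSP feed works is ffprobe/ffplay. The credentials, IP, and URL path below are placeholders; the exact path varies by model, so check the camera’s manual:

```shell
# Probe the stream without opening a window; prints codec and
# resolution info if the camera answers over TCP.
ffprobe -rtsp_transport tcp \
  "rtsp://admin:password@192.168.1.50:554/cam/realmonitor?channel=1&subtype=0"

# Or view it directly:
ffplay -rtsp_transport tcp \
  "rtsp://admin:password@192.168.1.50:554/cam/realmonitor?channel=1&subtype=0"
```

If this works on the LAN with the camera blocked from the WAN, you have fully local control.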


I read it as the lists are awesome, not necessarily everything in the lists.

I will tell you right now that I also think your idea is bad because I wouldn’t follow a list with subjective criteria and selections. I don’t want someone making those subjective decisions for me. Who is to say what awesome is? You don’t know what I’m looking for in a service, you don’t know what I value. If I prioritize privacy and security over form and function, I guarantee it is not going to be the popular or “awesome” option.

Example:

The tide is changing in this regard, but three years ago Jellyfin was much less mature and Plex was really the most popular option for streaming media. Honestly, very few people talked about Jellyfin, and if they did, it was usually about its deficiencies. So three years ago, by most people’s criteria, Plex might have been the top option on a list, maybe even the only option with a couple of honorable mentions. But according to me, I wouldn’t even put Plex on a list, because I don’t consider it selfhosting given that it relies on 3rd-party servers. So who is right? There is no right or wrong; it’s subjective, and everyone has to make their own decisions. So you see the problem. That is merely one example of countless, because everyone prioritizes things differently.



I just wanted to give you my two cents and say that I appreciate the way you have it. And also thank you for all the thought you’ve put into it because I don’t want someone making subjective decisions for me and I’m glad you understand that position.


Interesting. Yeah, maybe report it as an issue on GitHub. I use a browser link to my Home Assistant dashboard instead of the app, so it hasn’t happened to me. I almost installed the app the other day to get presence detection but decided on another way.


Yeah, I haven’t had any problems with it, what apps have been an issue for you?

The app I use the most during that transition window is Ultrasonic, streaming music from the Airsonic service as I get in my vehicle and drive away or arrive back home. Even that transitions flawlessly without skipping a beat, since it is set to cache songs.


It would be extra overhead for no reason. Why keep it on when Tasker automates it?


You are talking about security when that is not the purpose of it. So yes, you are off on a tangent and missing the point of it.

It should be clear to people who don’t understand security that running a protocol on a different port doesn’t mean shit for safety.

It is clear, it’s clear to everyone, so why did you randomly interject irrelevant information? Because you incorrectly assumed someone thought it had to do with security, but no one here thought it had anything to do with security. Everyone understood it but you, and you were corrected not only by me but by the other person.

“Because it doesn’t get as much attention” wouldn’t mean anything to any enterprise firewall the moment it’s not an http header.

As I’ve said, I’ve used it a few times to escape firewalls, and it works. Will it always work? No. I never claimed this will bypass all firewalls; the strictest of firewalls will block it, but there are other ways around those, e.g. proxytunnel or stunnel4.


I think you may still be missing the point, because it was never implied that the port change is for security; the security is in disabling password authentication and only accepting key-based authentication. The reason I put it on 443 is that it is a port usually allowed by firewalls and one that doesn’t get as much attention. So if I am on a network that blocks the standard VPN or SSH ports, that might be just enough to bypass it. And it’s a port that sees a lot of other encrypted traffic, so it looks more natural than popping some other random port that could potentially raise an alarm.


Unless you need to share or provide services to the public, you shouldn’t be setting up reverse proxies or Cloudflare tunnels, in my opinion. All you need is WireGuard for you and the handful of users that might be using it.

I have two ports open for:

  1. WireGuard

  2. SSH Tunnel

Both of these services will only accept key based authentication.

WireGuard is the main way my wife and I access the services away from home. When our phones disconnect from our home’s SSID, Tasker automatically connects the WireGuard tunnel, so we never lose access to services.

The SSH tunnel is just a fallback in case I get behind a firewall that might be doing DPI and blocking VPN traffic. It operates on 443 to hopefully pass as SSL traffic and be allowed through. I’ve used it a very limited number of times to get out from behind strict corporate firewalls.
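That setup is roughly two pieces: key-only auth on 443 server-side, and a tunnel client-side. A sketch, with placeholder hostnames:

```shell
# Server side: fragment for /etc/ssh/sshd_config
#   Port 443
#   PasswordAuthentication no
#   KbdInteractiveAuthentication no
#   PubkeyAuthentication yes

# Client side: open a SOCKS proxy through the tunnel, then point
# the browser (or any app) at localhost:1080 to route traffic
# out past the restrictive network.
ssh -p 443 -D 1080 -N user@home.example.com
```

Since only port 443 is used and the payload is encrypted, a firewall without DPI sees something shaped like ordinary TLS traffic.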


  • Scheduled Jobs
    • script to update subdomain (e.g. home.domain.com) with external home IP address
    • script to run snapraid-runner
    • script to check docker services and report healthchecks
    • script to update and clean kodi libraries
    • script to backup with borg
  • Snapraid on 4x8TB
  • NAS - Samba shares
    • backups
      • computers
      • phones
    • public
    • media
      • music
      • tv
      • movies
  • SSH Tunnel
  • WireGuard (primary way to access services away from home)
  • Print server
  • Docker
    • Server 1 (ThinkCentre M93p, Intel i5-4570T 8GB RAM)
      • healthchecks (monitors services and makes sure scripts run otherwise notifies me)
      • smtp_to_telegram (most services support email notification; this is a way to use the built-in notification of most services but be notified instantly)
      • trilium (notes with tree structure organization)
      • pinry (image board, think pinterest)
      • portainer (GUI to manage docker services)
      • adguardhome (DNS adblocking like pihole but better in my opinion)
      • rustdesk (remote admin software, think remote desktop)
      • ulogger (what I use to map my motorcycle rides)
      • dozzle (docker log viewer)
      • mariadb (database for services that require mysql)
      • postgres (database for services that require postgres)
    • Server 2 (ThinkCentre M93p, Intel i5-4570, 20GB RAM)
      • omada-controller (controller for my tp-link router/switches/aps)
      • home assistant (control smart devices, setup automations)
      • airsonic (stream my music)
      • airsonic-refix (an alternative GUI for airsonic)
      • paperless-ngx (searchable document archive, I keep manuals and some receipts and tax documents)
      • redis (dependency for some services)
      • lidarr (manages music and auto downloads monitored artists/albums)
      • jackett (manages torrent trackers and can combine them into one query for things like lidarr/sonarr/etc.)
      • openbooks (download ebooks for my paperwhite)
      • sabnzbd (client for usenet downloads, integrates into lidarr/sonarr/etc.)
      • sonarr (manages tv shows and auto downloads them)
      • esphome (makes flashing firmware on devices easier)
      • agendav (web calendar, integrates with baikal or any caldav service)
      • baikal (keeps my calendar and contacts)
      • photoprism (photo manager, prefer over immich until immich has better read only integration)
      • stash (nsfw)
      • deluge (torrent client, integrates with lidarr/sonarr/etc.)
      • portainer (GUI to manage docker services)
      • dozzle (docker log viewer)
      • nginx proxy manager (use it to set subdomains for the services, e.g. airsonic.home.lan)
      • wallabag (save webpages for later viewing; doesn’t seem to work on a lot of sites, so I usually just use SingleFile and save to a folder on the NAS instead, so I might take this down)
      • syncthing (mainly use it to backup all the photos and /sdcard/ dir on my phone, but also keep some configs synced between laptops/desktops)
      • adguardhome (backup to the other adguard dns)
      • nginx
        • Homer dashboard (my favorite dashboard, but been looking at homepage lately)
        • DokuWiki (favorite wiki, prefer the classic styling)
        • minimalist-web-notepad (very fast and easy notes for quick and temporary notes)
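The DDNS job in the scheduled scripts above can be as small as this sketch (the IP-lookup service, update URL, and token are placeholders for whatever your DNS provider offers):

```shell
#!/bin/sh
# Update home.domain.com only when the external IP actually changes.
CACHE=/var/cache/ddns_last_ip
ip=$(curl -fsS https://ifconfig.me) || exit 1

if [ "$ip" != "$(cat "$CACHE" 2>/dev/null)" ]; then
    # Provider-specific update endpoint; placeholder URL and token.
    curl -fsS "https://dns.example.com/update?host=home.domain.com&ip=${ip}&token=SECRET" \
        && echo "$ip" > "$CACHE"
fi
```

Caching the last IP keeps the script from hammering the provider’s API on every cron run, and a non-zero exit is what the notification wrapper picks up on failure.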

I refuse to buy any smart devices that require online cloud services or that I can’t control locally.


It sounds like we have similar setups. I do the same with Syncthing; it works great and backs up not only my photos but everything else on my phone: custom ringtones, notifications, exported backups from many different apps along with full neo-backup exports… basically all the common /sdcard/ directories like Audio, Backups, DCIM, Downloads, Pictures, Documents, Screenshots, etc.

I’m interested in Immich for its multiuser sharing so I can easily share photos with others in the house. I have a huge directory of images, all sorted into folders, so until I can add that read-only, Immich isn’t an option for me. I tried setting it up with the monolithic Docker image, and it didn’t import the directory the way I wanted; it seemingly made full copies of all the images into its own upload directory when I tried importing with the CLI tool. I was looking at it recently, and the read-only mode seems early-stage. How do you like it so far?

Immich’s aim seems to be first and foremost a phone photo backup solution, and that is not what I want; I already have a backup solution. All I really want is a mobile-friendly way to look at the photos I already have. PhotoPrism works exactly how I want, but the one feature it lacks that I would really like is multiuser. I have seen there is a workaround for sharing with PhotoPrism where you run individual instances for each user and share a common directory, and right now that is preferable to Immich for me unless they sort out the read-only feature.



It’s always a good idea to confirm the error in case it was just a glitch in the matrix.


Yep, and it’s also nice to have that buffer when I ride through a few dead zones where I drop data for 5-10 minutes; I set Ultrasonic to preload/cache 8 songs, so it doesn’t disrupt playback then either.


Same here.

A Tasker script automatically connects my phone to the WireGuard tunnel as soon as I disconnect from my home WiFi, so I always have access to my services. It’s seamless: if I’m streaming music from Airsonic on my phone and jump on my bike and take off, playback doesn’t even skip a beat.


https://somedaysoon.xyz/posts/tech/backups/

But tl;dr of that:

  • OS backed up with timeshift
  • Data backed up with both snapraid on the data drive pool and then also borg to other drives+devices+locations
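The borg piece of that can be a short script along these lines (repo path, passphrase handling, and retention counts are just illustrative):

```shell
#!/bin/sh
# Nightly borg backup with pruning; a non-zero exit status is what
# the cron/healthchecks wrapper turns into a failure notification.
export BORG_REPO=/mnt/backup/borg
export BORG_PASSCOMMAND="cat /root/.borg-passphrase"

borg create --stats --compression zstd \
    ::'home-{now:%Y-%m-%d}' /srv/data /etc || exit 1

# Keep 7 daily, 4 weekly, and 6 monthly archives.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
```

Snapraid protects the live pool against drive failure, while borg covers the separate-device and offsite copies, so the two complement each other.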

I mean, that is just another way of checking your dashboards.

It’s not another way of checking dashboards… dashboards don’t even come into play for me with this notification system. If I get a notification that my backup script didn’t run, I’m dropping straight to an SSH session and checking logs and fixing it. There is no dashboard in this equation.

Unless you are dealing with a high availability setup, it matters a lot less whether you do a push/pull model for notifications so long as you are regularly checking then.

My home is not high availability; it’s just me and my wife. That doesn’t change the fact that this is a better solution than having to constantly check in on services. Also, high availability isn’t the reason for this; it’s having the peace of mind that things are working while doing literally nothing to know it. Right now, I know all my services are working. How do I know? Because I haven’t received a notification telling me there is a problem. Do you know if all your services are working right now? Not unless you actively check on them. That’s the difference between my way and yours: I always know the status of my services; you don’t unless you check in on them.

But listen, I’m not trying to persuade you. If you prefer to take time to check in and babysit your services to make sure everything is running correctly instead of setting up a simple notification system, that’s your preference, but in my opinion it’s not the best way to do it. This is about working smarter, not harder.


I agree that an “average joe” shouldn’t be selfhosting unless they first understand that they are responsible for their data and make proper backups.

unless you are regularly checking your dashboards, they will happen in rapid succession

One thing I disagree with, though: you shouldn’t have to regularly check dashboards. I understand this goes beyond the “average joe” realm of things, but you should have notifications set up to alert you if something is not working. Personally, I use SMTP to Telegram, because almost every service has an email option for notifications, and I want to be notified instantly.

So when my healthchecks script runs and fails, I’m instantly notified that one of my containers is down. If my snapraid scrub/sync fails to run or has errors, or my borg backup script fails to run or has errors, I’m instantly notified. If my ddns script fails to update, again, instant notification. I’m also notified if the server has higher CPU load averages or RAM usage than expected, if drive space is running out, and of SMART failures. I’m even notified whenever a login to my OpenMediaVault dashboard occurs. My Omada Controller has its own network notifications, and so does Home Assistant for different integrations.

Basically, I will be notified if any problem arises that needs my attention; you shouldn’t be depending on scheduling time to look at dashboards to ensure services are running properly. And if you set up a good notification system, you can mostly just set and forget your services.


I agree, Nicotine+ and Soulseek is the way to get music these days.

And for those saying using streaming services is easy and affordable so they don’t bother, I would remind you, it is… for now.

Look at what has happened time and time again with these companies: they slowly squeeze their users over time or just flat-out kill the service entirely. As someone who is really into selfhosting and prefers to be in control of my data and privacy, I would urge you to move away from those services. Set up Airsonic, Funkwhale, or some other streaming music service and control it yourselves.


Yeah, that’s what I use it for. I have access to my entire 200GB+ music collection no matter where I am, and it has some other cool features too, like radio mode, where it plays similar artists. It’s pretty sweet. If you are going to try it, I would suggest the linuxserver.io/airsonic-advanced docker image.
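A minimal way to stand that image up might look like this (the paths, user IDs, and timezone are examples; check the linuxserver.io docs for the full option set):

```shell
# linuxserver.io airsonic-advanced; the music library is mounted
# read-only from the same directory the NAS shares out.
docker run -d --name airsonic-advanced \
  -e PUID=1000 -e PGID=1000 -e TZ=Etc/UTC \
  -p 4040:4040 \
  -v /srv/appdata/airsonic:/config \
  -v /srv/media/music:/music:ro \
  lscr.io/linuxserver/airsonic-advanced:latest
```

The web UI then lives on port 4040, and clients like Ultrasonic connect to the same address using the Subsonic API.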


Airsonic on the server and Ultrasonic for Android clients would be my recommendation. With Airsonic you can set up music, podcasts, and radio stations. It supports multiple users and gives you really good control over what each user can access.


If you had issues with search results, then the problem is your indexers. Sonarr queries the sources you give it; if you give it bad sources, it’s going to have bad results.

I use Sonarr and Lidarr with private trackers and usenet where usenet is the preferred download source. I also use Jackett to combine the trackers so I can make one query from Sonarr to multiple trackers. Previously, for the longest time, going back to 2007, I used a torrent client with an RSS feed to download new TV show releases. It worked, but Sonarr provides far more granularity. And it is wife friendly, she can easily go on and add a show by herself. I don’t know of anything that works better than Sonarr, so OP… what would we have been using instead of it?



I never stated I knew everything. I asked you to back up your claim and you’ve failed to do it.

The only reason we are arguing is because I’m holding your claims to the burden of proof. You can stop being a blowhard whenever you want but you’re right, people are here to learn, so I’m doing them a service by calling out your misinformation.