• 0 Posts
  • 42 Comments
Joined 1Y ago
Cake day: Jul 25, 2023


Blocklists are ineffective by design. Each and every member of the swarm can collect all the data necessary to flag you to your ISP. Obviously any professional collecting this kind of data can avoid a blocklist. There is no such thing as a better blocklist.


Teach us then 😭

I think this hits on another big generational difference. Those who grew up in the early days of personal computing and the Internet didn't have teachers or a hallucinating language model to spoon-feed them instant answers. They had to actually RTFM thoroughly before they could even think of asking in some arcane BBS, forum, or IRC for help from elders who had absolutely zero tolerance for incompetence or ignorance. Man pages and help files came bundled, but the Internet (if you had it) was metered and inconvenient, on a scale more like going to the library than ordering a pizza. They had to figure out how to ask the right questions. They had to figure out how to find their own answers. The Internet was so slow that all the really interesting bits were often just text, and so much of it was so loosely indexed and categorized that one might need to learn a little more just to find the right details in that sea of text. There was a lot less instant gratification, and no one expected to be able to solve their problems just by asking for help.

I’ve seen way too many kids give up at the first pebble in their path because they are so accustomed to the instant gratification that has pervaded our culture since the dawn of smart phones.


A decade ago we figured out blacklists were ineffective. What’s changed?


Downloading from YouTube or Spotify is still piracy. And those sources offer mostly shit quality far removed from the artist’s intent.

Believe it or not, there are things that aren't on Spotify, YouTube, TIDAL, Apple Music, Bandcamp, or any streaming service. Sometimes when a streaming service does have a song or album, it's either not the best quality or only a radio-censored version is available, even if Spotify claims it's the explicit version. And that explicit tag feels like slander, because the original intent should be the default and the radio edits should be the ones with the CENSORED tag.

There is great music out there you can’t purchase or stream a digital release of.

There are old and often played CDs in my collection that can’t be ripped properly (by me) for one reason or another.

There are some really high quality vinyl recordings out there, done by people with better hardware and more skill than I. Again, many of these vinyl releases are not available in any other format and are no longer available for purchase anywhere.

The real primary reason I got into it, back in the long-ago times of Napster, was that I liked to make mixtapes/discs. When radio was no longer playing the songs I wanted on those tapes, the wilds of the Internet were the answer.

I still regularly support the artists I like as directly as I can: buying albums and merch directly from them at shows or their own websites. And I spend more of that money on more artists and especially less popular artists specifically because of the habits listed above.


In my experience, two devices will ultimately save you effort and frustration. Anything you choose as a good NAS/seedbox is unlikely to have a good from-the-couch interface or to handle Netflix reliably and easily. A small Android TV box may have a much better interface, simple app setup, and support for all the streaming services, but it probably won't be very powerful or convenient to use as a NAS. The NAS is always on, plugged directly into the Internet access point, and tucked away out of sight and sound. The Android TV or Apple TV box is silent, small, and can be mounted directly to the beamer/projector.

Yes, Kodi exists and its add-ons can bridge this gap. But I still think that an SBC NAS running Jellyfin or Plex, plus an Nvidia Shield with Jellyfin, Plex, Netflix, Spotify, YouTube, Amazon, etc., will be much easier to set up, manage, find support for, and upgrade.
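For anyone curious, the Jellyfin half of that can be as small as a single Docker Compose file. This is just a sketch with made-up paths and the default port, not a copy of my actual config:

```
# docker-compose.yml -- minimal Jellyfin sketch; paths are placeholders
services:
  jellyfin:
    image: jellyfin/jellyfin
    container_name: jellyfin
    ports:
      - "8096:8096"            # Jellyfin's default web/app port
    volumes:
      - ./config:/config        # Jellyfin's own settings and metadata
      - /mnt/media:/media:ro    # your library; read-only is enough for playback
    restart: unless-stopped
```

Then `docker compose up -d` from that directory brings it up, and the Shield (or anything else) just points its Jellyfin client at port 8096 on the NAS.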

I have a similar setup, even though my server has a direct HDMI link to my TV. I'm not a fan of using the server itself from the couch: setting up IR remotes always sucks, and it's confusing for anyone but me to use. But if my Nvidia Shield dies or I'm having network trouble, VLC is a pretty good backup.


Maybe they are illuminating their living room with the front end of a BMW.

Better yet, it’s a Pimp My Ride style makeover that replaces those unused turn signals with a projection system for an instant drive-in movie experience.



I think the more nuanced take is that we should be making “piracy” legal by expanding and protecting fair use and rights to make personal copies. There are lots of things that are called piracy now that really shouldn’t be. Making “piracy” legal still leaves plenty of room for artists to get paid.


I’m just curious how much RAM you think that is.


Docker Compose is just a settings file for a container. It's the same advantage you get using an ssh config file instead of typing out a user, IP, port, and private key each time. What's the advantage of putting all my containers into one compose file? It's not as if I'm running docker commands from the terminal manually to start and stop them unless something goes wrong; I let systemd handle that. And I'd much rather systemd be able to start, stop, and monitor individual containers than have to bring them all down if one fails.
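To make that concrete, here's roughly the pattern: one compose file per service, each wrapped in its own tiny systemd unit. The names and paths below are made up for illustration, not lifted from my actual setup:

```
# /etc/systemd/system/jellyfin.service  (illustrative; adjust paths to taste)
[Unit]
Description=Jellyfin via docker compose
Requires=docker.service
After=docker.service network-online.target

[Service]
# Directory containing this service's own docker-compose.yml
WorkingDirectory=/opt/stacks/jellyfin
ExecStart=/usr/bin/docker compose up
ExecStop=/usr/bin/docker compose down
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

With one of these per container, `systemctl restart jellyfin` touches only Jellyfin; everything else keeps running.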


Original? No. Usenet, BBSs, and IRC are the originals. Napster made it hip. Soulseek made it better. Then there were LimeWire, DirectConnect, and some others. Then there was BitTorrent, which I really did use to download Linux ISOs before the rise of popular public and private trackers.


You don't need to get too complicated with scripts if you let Picard do all the tagging and renaming. In my experience it works pretty well with the default, out-of-the-box configuration. Just don't try to do your whole library at once; go album by album and check that each one is matched to the correct release. I was in the same boat about a decade ago and did the same, just a few albums a day getting tagged and renamed into a fresh music directory. And of course, make a backup first, just in case.

Lately I've been going through this process again because I messed up configuring Lidarr and many files got improperly renamed. Since they were all still properly tagged, fixing them has been easy, especially with Picard. I haven't really bothered to find all the stray files yet (they're still roughly in the right location) because Plex ignores the paths and just reads the tags, so the misnamed files aren't even noticeable in Plex.


When you say Plex interface remotely, are you referring to the Plex app or PlexAmp app? I feel like PlexAmp fixed all of my complaints about listening to music through Plex (the same app I use for videos).


EasyTag works pretty well for me on Linux, when I'm not just using Picard. I use EasyTag mostly for fixing and normalizing the tags on audiobooks these days.


Jack of all trades, master of none. Forcing a router reboot to get the home Internet working again has become a thing of the past since I set up a unifi router and APs.

I'd had router/WiFi combos before, running DD-WRT, OpenWrt, or Tomato. None of them were stable. But I suspect that was because the hardware just couldn't keep up, not because the open source software was faulty.


Why? If the power has gone out there are very few situations (I can’t actually think of any except brownouts or other transient power loss) where it would be useful to power my server for much longer than it takes to shut down safely.


If you're running it under your current user, theoretically it can do anything your user can do (which usually means access all your personal files)

That would be poorly configured permissions. There's very little reason you should let any game run under a user's own permissions, especially if you got it from a less than reputable source. Proper permissions would give it only enough access to run, nothing more.


Proper permissions would not give the game access to anything it didn't actually need to run. It should be running either as its own user or under Wine. You don't need a container. How do you think containers get locked down anyway? They run as a user with very limited access.
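A rough sketch of what I mean, with a made-up user name and game path (and assuming a Wine game):

```
# Create an unprivileged user that exists only to run this one game
sudo useradd --create-home shady-game

# Give that user its own copy of the game files and nothing else
sudo cp -r ~/Downloads/SketchyGame /home/shady-game/
sudo chown -R shady-game:shady-game /home/shady-game

# Let it draw on your display, then run the game as that user
xhost +SI:localuser:shady-game
sudo -u shady-game env HOME=/home/shady-game WINEPREFIX=/home/shady-game/.wine \
    wine /home/shady-game/SketchyGame/game.exe
```

The game can trash its own home directory all it likes; your files stay out of reach (assuming your own home directory isn't world-readable).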


And that would achieve what exactly? The exploits won’t be the same. The permission structure shouldn’t allow it to do anything that would compromise the system. Maybe it can phone home, but to what effect?


It's kinda trivial to limit their ability to do anything in Linux, though. It's not as if virus authors are gonna waste their time trying to exploit a demographic that is both small and extremely fragmented when they can just write for Windows.


They are pretty similar. It’s hard to judge because they are different sizes, different boundaries, and different brightness. If VLC is playing a bit dimmer, it makes sense that some artifacts would be less visible.


None of what you’ve just said here is true. They don’t work like house keys. Your system and my system are VERY different because I’m not making copies of my private keys anywhere. They never leave the safe place I created them. I only ever transfer the public keys. I could post my public keys here and there would be no security compromise for me. You came here asking for help. I tried to help you. I’m sorry it wasn’t what you wanted to hear. Your attitude sucks.


No, it is inherently bad to copy around private keys. You have some fundamental misunderstandings of how key authentication security works. RTFM.


No, you’re missing the point and creating a false choice here. You’re supposed to generate new keys for each client device and load their various public keys into the authorized keys file in your server user’s home folder. Copying around your private key like that is just BAD security and not how public key authentication is designed to work. It’s not as if the only two options are your bad way or passwords.

As an example: you copy your single private key to various devices and even carry a (probably unencrypted) copy around with you on a thumb drive, while I generate a fresh key pair on each client that I use to connect. When your private key is compromised (when, NOT if), you must remove its public key from your server to lock out the bad actor, and unless you have physical or password access to the machine at that moment, that locks you out completely too. When one of my keys is compromised, I can just remove that machine's key from the authorized_keys list on the server and continue accessing my machine remotely from any of the other, uncompromised clients.
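In practice the per-client workflow is only a couple of commands. The host and comment strings here are just examples:

```
# On the new client device: generate its own key pair;
# the private key never leaves this machine
ssh-keygen -t ed25519 -C "laptop-2024"

# Push only the PUBLIC half into ~/.ssh/authorized_keys on the server
ssh-copy-id user@myserver

# If that client is ever lost or compromised, delete just its line from
# ~/.ssh/authorized_keys on the server; every other client keeps working
```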


Why are you trying to reuse an ssh key? That seems like a really bad practice. It’s just not the way key pair authentication is supposed to work. Passing around and sharing private keys is BAD. Client devices create their own private keys and only share public keys. Just create a new key from ConnectBot and get it to your server via other methods. If you’re already away from home without any other means of connecting, that last part is admittedly tricky and you may be SOL.

Isn't ConnectBot a dead project anyway? Last I checked, it hadn't been updated in years. Well, I guess I was wrong here. I can't find a simple, full list of all the past updates, but I seem to remember moving away from ConnectBot because it lacked some feature I wanted and no longer worked on my new Android device. I've been satisfied with JuiceSSH, but I'm happy that ConnectBot is still alive, since it was one of the first apps I installed on a first-generation Android phone.


Maybe if you're new to all this and/or have no interest. But if you've been tinkering for more than a few years, it's just the PC version of a project car: something you tinker with on the weekends, adding and refining as you go. I would never be able to negotiate multiple streaming services in a unified way to my satisfaction, so it's not as if I really even have the option of paying for what I actually want from a service.


Well, I've tried both (yes, over twenty years) and writable optical discs have been pretty flaky compared to HDDs. I never suggested SSDs were good for anything but temporary storage. But you're totally missing the point: the medium matters much less than consistently making copies.


The storage medium you choose really isn't as critical as making multiple copies, storing them in separate physical locations, and testing that you can recover the data when you need it. Diversity in the physical media you choose is probably a good thing long term, too. Archival discs aren't really that long-lived, though. You could try, but unless you are regularly checking the discs and making additional copies, you're going to lose data eventually. I gave up using discs as any kind of backup because it was too much hassle. Copying hard drives was much more straightforward and reliable.
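For the "just copy hard drives" approach, something as simple as the sketch below has been enough for me. The mount points are made up; adapt them to your own drives:

```
# Mirror the library onto a second drive; -a keeps permissions and timestamps,
# --delete makes it an exact mirror (leave it off to keep deleted files around)
rsync -a --delete /mnt/library/ /mnt/backup/library/

# Occasionally verify the copy is actually readable before you need it
diff -rq /mnt/library /mnt/backup/library
```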



Your wording is a little weird, so hopefully we're understanding your situation and what you want. Symlinks won't work, since they're basically just links to files or directories, i.e. they do not contain the actual data, and most of the software you'd use to torrent or to play media is going to struggle with following a bunch of symlinks. Hardlinks are better suited to seeding a torrent from one directory while maintaining a copy elsewhere that fits your media filename standards, without doubling the storage used.

If symlinking is like forwarding your mail to a new address, hardlinking is like having one house with two or more addresses. Each address brings you to the "real" house. Deleting one address (maybe because you're done seeding) does not remove the house or the other addresses. If you move or delete the target of a symlink, that link (and any other symlink pointing to that location) breaks. The actual data of a file doesn't get deleted until ALL of the hardlinks have been deleted.
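You can see the difference in about thirty seconds in a scratch directory:

```
echo "movie data" > original.mkv

ln    original.mkv hardlink.mkv    # hard link: a second name for the same data
ln -s original.mkv symlink.mkv     # symlink: a pointer to the *name* original.mkv

rm original.mkv                    # e.g. the torrent client cleans up after seeding

cat hardlink.mkv                   # still prints "movie data"; the data lives on
cat symlink.mkv                    # error: the name it pointed at no longer exists
```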


Once you get Plex set up you can have all your videos, photos, and music with you wherever you are. You can stream via the Plex app on the Firestick, Chromecast to any supported device with the help of your phone, or download media to your phone for offline viewing (and even output it via a USB to HDMI adapter). The Firestick Plex app is basically just a client, but the Plex app on an Nvidia Shield can additionally act as a server whose library resides on a USB drive connected directly to the Shield, or on any Samba/Windows share the Shield can see on your network.


What do you mean by professional made? The color of any dyes doesn’t really enter into it.

Mass-produced CDs were physically stamped foil laminated in plastic. Writable discs, regardless of quality, professional or otherwise, work on a completely different principle, one that can fade (or rot) over time. Pretty much every other problem is physical damage, not rot.


It sounds like a physical hardware problem, not software. If you played it once and there is no apparent physical damage to the disc, it may be a problem with your disc drive. If you can, try playing it on another piece of hardware, like a dedicated DVD player. Also, you could try playing a known good disc (that you don't mind losing) in your PC drive. This will help narrow down the cause to either the disc or the drive. I've had more drive failures over the years than disc failures. The discs that did fail were usually writable discs or obviously damaged; most of the damage looked like scratches on the read side or label damage.


You're half right, for the wrong reasons. Disc rot just doesn't happen to stamped original discs; only writable discs rot. Old cheap discs might degrade for other reasons, of course (like scratches, or labels delaminating and tearing away at a substandard construction), but the data layer of original stamped discs doesn't decompose, because the data is mechanically stamped into it. Original discs would have been stamped foil pressed between two layers of plastic. Cheap discs sometimes just skipped the top layer of plastic, so that the data layer sat just under the painted label; writable discs especially used this cost-saving technique. Thus any damage to the top label would damage the data layer.

Writable discs rot because the bits are burned into a different kind of data-layer film that can fade or otherwise decompose, but I doubt you'd be able to actually see dots from rot. Using the wrong kind of pen or using sticker labels could easily damage the data layer. If you hold a disc up to a light source and see dots of light through it, the foil layer has been scratched and it will be unplayable, but this is physical damage, not rot.


Do you currently have a bunch of active torrents, each with a bunch of connected peers? What’s your network topology like? Aging combo modem/wifi/router? Have you tried limiting the total number of connected peers in your torrent manager? Torrents can really clog up a network. Sometimes routing too many connections overwhelmed my old router, forcing a reboot before any traffic could get through again.


I’ll have to take your word for that. I’ve never encountered such a thing in any of my podcasts. That sounds like a technical possibility in theory, but a practical fantasy given the way most podcasts are distributed.


I think most of the nerds that might be able to strip inline ads would prefer to support (or at least appear to support) podcast producers and so wouldn’t have much interest in developing said app.

Also, stripping ads isn't an easy problem given how unpredictable they can be. Some podcasts' ads are read live by the same voices presenting the podcast, without skipping a beat (particularly common with a popular DnD podcast).

The more practical solution is to use a player that lets you set distinct intervals for skipping forward and back. For example, in my podcast app, when I skip forward it jumps 30 seconds ahead, but when I skip backwards it only jumps 10 seconds. That means I can jump ahead in roughly commercial-length intervals until I no longer hear the ads, and if I've gone just a little too far, skipping back once or twice usually gets me close enough to the start of the real content again. If you rely on an app to strip the ads, you're practically guaranteed to remove some actual content. You've already put more effort into avoiding the ads than I have just hitting skip once or twice, and if the app removed content you wanted to hear, you've got to go back to the original podcast and listen through the ads anyway. Why bother? Podcasts aren't like YouTube, where a third party inserts the ads in random, obnoxious places that interrupt the narrative or musical flow. Are they?



Combination WiFi/router devices are notoriously unstable, and those provided by ISPs are particularly bad. If you have the ability and the funds, spend a little more to get a prosumer router and wireless AP as separate devices that connect to your modem in bridge mode. In the long run, for me anyway, the stability and reliability of this kind of setup paid for itself quickly in less of my time wasted.

My setup: my own cable modem per the specs my ISP provided, a Ubiquiti EdgeRouter X, and a UniFi AP. I already had a server, so I installed the AP management software on it, but Ubiquiti also sells a single-board device to run it. Everything except the AP lives in a little electronics cabinet tucked away. The AP gets its power over Ethernet, so it can be mounted somewhere better placed with regard to walls, doors, pipes, etc., on a wall or ceiling, with only a length of Ethernet cable running to the router. The AP itself just looks like a hand-sized bump of white on the wall; I turned off its status lights once it was set up so that it stays as discreet as possible. Adding a UniFi WiFi repeater nearer to the one room I still had a little trouble with was almost as easy as plugging it into the wall outlet.

Not everyone can or should go this route, and it was a learning experience for me with some growing pains, but in the end it was worth it to me. UniFi isn't the only game in town either. Either way, separating your network devices so that each only does one job (the modem connects, the router routes, and the AP does WiFi) means that one underpowered chip isn't being crushed under the weight of too many tasks at once.


You could spend a little for a prosumer router and AP. I have a very similar setup with a cable modem, an EdgeRouter X (Ubiquiti), a single UniFi AP, and the controller software running on my server (this could be replaced with a separate hardware device or a Raspberry Pi, but the server is going to be running anyway). It's been rock solid since I set it up, compared to the WiFi/router combo running OpenWrt I had before, which struggled and needed restarting regularly.