If you have the technical knowledge (or the ability to follow instructions) you can set up your torrents behind a VPN using qBittorrent routed through a gluetun container in Docker, with Plex/Jellyfin in another container and a shared volume between them. It gives you near-bulletproof protection thanks to the isolated environment, and it prevents accidentally clicking those .mp4.exes since it abstracts file management away from you. It’s also super user friendly once set up, even if you’re not running a media server 24/7 and just want it for your desktop.
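If it helps, here’s a minimal docker-compose sketch of that layout. The VPN provider, credentials, and host paths are placeholders for whatever your setup actually uses, so treat it as a starting point rather than a tested config:

```yaml
services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN                       # gluetun needs this to manage the tunnel
    environment:
      - VPN_SERVICE_PROVIDER=protonvpn  # placeholder: use your provider
      - OPENVPN_USER=your_user          # placeholder credentials
      - OPENVPN_PASSWORD=your_pass
    ports:
      - 8080:8080                       # qBittorrent web UI, exposed via gluetun

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent
    network_mode: "service:gluetun"     # all torrent traffic rides the VPN tunnel
    environment:
      - WEBUI_PORT=8080
    volumes:
      - /path/to/media:/downloads       # same host folder Jellyfin reads below

  jellyfin:
    image: lscr.io/linuxserver/jellyfin
    ports:
      - 8096:8096                       # Jellyfin web UI, not behind the VPN
    volumes:
      - /path/to/media:/media           # shared volume between the two
```

The nice property of this layout is that if gluetun’s tunnel drops, qBittorrent loses all connectivity rather than leaking your real IP.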
If you wanna get fancy you can use the *arr suite of software to do some magic fairy shit, including automatic indexing (searching dozens of torrent sites simultaneously), but that can quickly become a deep rabbit hole. Once set up, though, it’s seamless and kicks ass.
I’m a Zoomer with a Dell OptiPlex running Ubuntu Server, an 18 TB HDD, and 35 years of combined seed time. I’ll let you fill in the gaps. Many of us are extremely tech literate and often share our Plex/Jellyfin instances with friends. Many of these not-so-tech-literate friends ask how they can do this for themselves on their own computers, and we shoot them over instructions.
Piracy is infinitely easier/more accessible than ever. It’s spreading like wildfire and thanks to the FOSS community anyone with a spare evening can get themselves up and running very quickly.
I’d say it’s more convenience than elitism.
I’m in BTN and it’s the only indexer I use for my Sonarr instance because it has absolutely everything. I’ve never not been able to find something and almost everything I download will saturate my 1.2 Gbps connection.
For Radarr I don’t have any private trackers, and it takes 35 public trackers to get coverage that’s almost as good. The options I’m given are way less organized and download speeds are a gamble. It’s not really an issue because I rarely watch movies, but I definitely understand why private trackers are so sought after. I’ll eventually try to get into some of the smaller ones, which tend to be pretty easy to join.
That was a pretty interesting read. However, I think it’s conflating correlation and causation a little too strongly. The overall vibe of the article was that developers who use Copilot are writing worse code across the board. I don’t think that’s necessarily the case, for a few reasons.
The first is that Copilot is just a tool, and like any tool it can easily be misused. It makes programming accessible to people it would not have been accessible to before, and it’s letting a lot of people who are very new to programming build large programs they otherwise couldn’t have. It’s also going to be leaned on more heavily by newer developers because it’s a more useful tool to them, but it will also let them learn more quickly.
The second is that they use a graph with an unlabeled y-axis to show an increase in reverts, and never indicate whether it’s measuring raw lines of code or a percentage of lines of code. That matters because Copilot lets people write a fuck ton more code; it legitimately makes me write at least 40% more. Any increase in revisions is partly just a function of writing more code. I actually feel like I revert a smaller percentage of lines, because it forces me to reread the code the AI outputs multiple times to ensure its validity.
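To put toy numbers on that (mine, not the article’s):

```python
# Made-up numbers showing how raw reverts can rise even while the
# revert *rate* falls, once a developer is writing more code overall.
before_lines, before_rate = 1000, 0.10  # 10% of 1000 lines -> 100 reverted
after_lines, after_rate = 1400, 0.09    # 40% more code, lower revert rate

print(before_lines * before_rate)  # 100.0 lines reverted before
print(after_lines * after_rate)    # 126.0 lines reverted after
```

A raw-count graph shows reverts up 26%; a percentage graph would show quality slightly improving. Without a labeled y-axis you can’t tell which story the data actually supports.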
This ultimately comes down to the developer using the AI. It shouldn’t be writing massive, complex functions. It’s an advanced, context-aware autocomplete that happens to save a ton of typing. Sure, you can let it run off and write huge parts of your code base, but that’s akin to hitting the next-word suggestion on your phone keyboard a few dozen times and expecting something coherent.
I don’t see it much differently than when high-level languages first became a thing. The introduction of Python allowed a lot of people who would never have written code in their life to immediately jump in and be productive. Both provide accessibility to more people than the tools before them, and I don’t think that’s a bad thing even if there are some negative side effects. Besides, anything that really matters should have thorough code reviews and strict standards; if janky AI-generated code is getting into production, that’s a process issue, not a tooling issue.
I mean if you have access but are not using Copilot at work you’re just slowing yourself down. It works extremely well for boilerplate/repetitive declarations.
I’ve been working with third-party APIs recently and have written some wrappers around them. Generally, by the third method it’s correctly autosuggesting the entire method given only a name, and I can point out mistakes in plain English or quickly fix them myself. It also makes working in languages I’m not familiar with way easier.
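To illustrate the kind of wrapper I mean (the API, endpoints, and names here are made up): once a couple of methods like these exist, typing the next method name is usually enough for Copilot to fill in the whole body.

```python
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical third-party API


class ExampleClient:
    """Thin wrapper around a REST API; every method follows one pattern."""

    def __init__(self, token: str):
        self.session = requests.Session()
        self.session.headers["Authorization"] = f"Bearer {token}"

    def get_user(self, user_id: str) -> dict:
        resp = self.session.get(f"{BASE_URL}/users/{user_id}")
        resp.raise_for_status()
        return resp.json()

    def get_order(self, order_id: str) -> dict:
        resp = self.session.get(f"{BASE_URL}/orders/{order_id}")
        resp.raise_for_status()
        return resp.json()

    # By this point, typing "def get_invoice" is typically all the
    # prompting Copilot needs to suggest the body below verbatim.
    def get_invoice(self, invoice_id: str) -> dict:
        resp = self.session.get(f"{BASE_URL}/invoices/{invoice_id}")
        resp.raise_for_status()
        return resp.json()
```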
AI as a programming assistant is one of its most productive uses.
Not OP, but my main preference for macOS comes from the UI/UX of an absolutely rock-solid OS on top of a Unix-like shell. I regularly go months without rebooting my machine with zero issues like software hanging on wake.
I know there are a lot of exclusive creative apps, but all I really use my MacBook for is code, typical browser stuff, music, slicer/web interface for my 3D printer, and to interact with my home server. I’m not an open-source/Linux purist by any means, but pretty much all the software I use is widely available on all platforms. It probably helps that I bought a MacBook after growing up with Windows/Linux, so I came into it with a set of software I was familiar with that already existed on other platforms.
Just because you’re not writing high-performance software doesn’t mean performance shouldn’t be a consideration. Sure, I’m not gonna micro-optimize memory when I’m writing an API in Python, but that doesn’t mean I’m not going to write it efficiently.
If I have to store and then do lookups on some structured data I’m gonna use a hash table to store it instead of an array. If I need to contact a DB multiple times I’m only gonna close my connection after the last query. None of this is particularly difficult, but knowing when to use certain DSA principles efficiently falls pretty firmly into the computer science realm.
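A minimal sketch of both habits in Python (the data and table here are made up):

```python
import sqlite3

# Structured-data lookups: index the records by key once, so each
# lookup is O(1) on average instead of an O(n) scan of an array.
records = [("alice", 42), ("bob", 7), ("carol", 99)]
by_name = {name: score for name, score in records}  # hash table
print(by_name["bob"])  # -> 7, no scan required

# DB access: open one connection, run every query, then close once,
# rather than reconnecting for each individual query.
conn = sqlite3.connect(":memory:")
try:
    cur = conn.cursor()
    cur.execute("CREATE TABLE users (name TEXT, score INT)")
    cur.executemany("INSERT INTO users VALUES (?, ?)", records)
    cur.execute("SELECT name FROM users WHERE score > ?", (10,))
    print(cur.fetchall())  # -> [('alice',), ('carol',)]
    conn.commit()
finally:
    conn.close()  # only after the last query
```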
If you need someone to hyper-optimize some computations then a mathematician might be a better bet, but even those problems are rarely mathematician level difficult. Generally software engineers have taken multivariate calculus/differential equations/linear algebra, so we’re decently well versed in math. Doesn’t mean we don’t hate the one time a year we have to pull out some gradients or matrices though.
I’ve had the opposite experience at my past and current job.
I’ve always been given the choice of Windows or macOS, with a remote Linux machine available if needed (at my first job I ran remote IDEs on it; at my second I’ve gone fully local). Same with IDEs: as long as I could properly write and test code, it didn’t matter what I used. Both companies had licenses for the top IDEs (JetBrains suite, Visual Studio, etc.) and would buy one-offs if you wanted to use something else. There was always a general team convention simply due to ease of use, but I occasionally opted for a heavily modified VS Code workspace over PyCharm and the like.
Same. I play the same 5 games throughout the year and rarely buy anything, but a few games I’d been looking at went on sale. I could’ve pirated them, but it was just so much easier to click buy on my Steam Deck and instantly download and play them. Not to mention free cloud saves, remote play, and the ability to dock the thing to my 65" 4K TV.
Steam has robbed me of more money than any streaming service ever could, and I’m not even mad, because they provide the best service I’ve ever received no matter how many or few games I buy. They recently identified that one of the biggest drivers of refunds and piracy is people wanting to verify a game will run well on their system, especially on Steam Deck. As a result they’re working on a demo feature so you can test a game before buying it.
I promise you it’s dead simple to install if you wanna check it out. ModDrop is probably the easiest installation route; just follow the instructions in that link and you’ll be set up!
I think in this case AM4 is fine. I recommended it because OP mentioned the price was a bit much, and AM4 at the moment gets you a lot of value, especially since they primarily play indie games with the occasional heavier title and aren’t chasing the latest AAA releases. I’m actually very similar: I’ll play the occasional AAA game, but I mainly stick to Minecraft and KSP (which is stupidly CPU intensive). My R5 3600 was more than enough for this and my upgrade was 100% unnecessary, so the 5600X should last them quite a while. There’s also a decent upgrade path from a 5600X to a 5800X3D or 5900X.
We’re starting to see gaps between generations get smaller as Moore’s Law fails, so I think parts are going to start lasting a bit longer now anyway. Hell, my 4790K lasted me almost 7 years, and my mom ran it in her work PC I built her for another 3 after that.
I honestly don’t think either path is a bad one, just up to them if they want to save some money or get a little bit more upgradability.
I’m going to preface this by saying this computer will last quite a while, but you won’t have nearly as much of an upgrade path as you would on the AM5 platform (AMD’s latest CPU socket) with DDR5 (the latest generation of RAM). That said, your use case doesn’t seem to require keeping up with the latest games, so if you want to save some money, this is what I would do.
NOTE: Prices are from Amazon, you can likely find a few components cheaper elsewhere.
CPU: You don’t need an R5 7600. I was running an R5 3600 up until a few months ago, and the only reason I upgraded was that I found a 5800X3D for a good price. I’d go for an R5 5600X, which is $60 cheaper than the 7600 and will be more than enough for Cities: Skylines 2.
Motherboard: You can now get a B450 DS3H board for that CPU for $40 less.
RAM: You’ll now be on DDR4. Get a 16GB CL16 DDR4 kit; it will be about the same price as the DDR5 you have. You may want to go for 32GB since sim games eat RAM, but that’s ultimately up to you. You can always buy more down the road if needed, since a 32GB kit is only about $5 cheaper than two 16GB kits.
Case: The no-name brand cases on Amazon are actually quite good. You can get a nice case for ~$50. Hell, I just found a Thermaltake Versa H18 for that price. Another $55 saved.
GPU: I haven’t kept up to date on GPUs, but I’ve heard good things about the 6700 XT, and benchmarks look respectable for BG3 and Cities: Skylines 2. You could likely get away with something a bit less powerful, but price-to-performance seems to side with the 6700 XT.
This brings the price down to $831. You could ditch the aftermarket cooler and get it under $800 as the 5600X comes with a cooler, but I’m never going to knock aftermarket coolers as they tend to be much quieter and less whiny than stock.
IntelliJ for Java
PyCharm for Python
VS Code for everything else
I use the JetBrains IDEs through Gateway to my dev desktop, and VS Code through SSH.
I work at AWS, and the tight integration of the JetBrains IDEs with our internal package manager/build system is a must. I frequently need to do some lighter scripting or text formatting, at which point I just use VS Code because it’s faster. I could realistically use any of them for everything, but I’ve realized that using 3 IDEs that suit my use cases perfectly is more enjoyable than using one IDE that does one thing perfectly and everything else just okay.
I never open any ports to the open Internet other than the two my torrent client uses.
For remote access I use a P2P VPN called ZeroTier, leaving it always running on the Pi and switching it on for the remote device when needed. It’s free for up to like 50 users and is very powerful, but dead simple.
I have a Pi 3B+ running qBittorrent, Plex, ProtonVPN through WireGuard, and a Samba share, and I’ve had zero issues. It’s connected to a 2 TB external SSD, which is where the Plex media library lives and, coincidentally, where qBittorrent downloads to by default, wink wink. I also have a P2P VPN called ZeroTier that allows me to securely connect to the Pi from anywhere. You should be golden with a Pi 4.
I’ve had zero issues even transcoding 4K Blu-ray content, but it required adding active cooling to prevent the Pi from overheating. Thankfully you can get a tiny heatsink and fan for under $10.
Edit: Accidentally said RPi 5 which didn’t exist… Fixed.
Ask and ye shall receive