• 0 Posts
  • 22 Comments
Joined 1Y ago
Cake day: Aug 25, 2023


lol. Did this in my old building - the dryer was on an improperly rated circuit and the breaker would trip half the time, eating my money and leaving wet clothes.

It was one of the old “insert coin, push metal chute in” types. Turns out you could bend a coat hanger and fish it through a hole in the back to trip the lever that the push mechanism was supposed to engage. Showed everyone in the building.

The landlord came by the building a month later and asked why there was no money in the machines. I told him, “we all started going to the laundromat down the street because it was cheaper.”



You can just point your domain at your local IP, e.g. 192.168.0.100
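For example (the domain and record here are made up - this assumes you own the domain and manage its DNS), an A record with a private address works fine:

```
# Shown in zone-file syntax just for illustration; most registrars let you add this in their web UI:
#   media.example.com.  300  IN  A  192.168.0.100
# Then check it resolves from inside your network:
dig +short media.example.com
# 192.168.0.100
```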


Not with 64 GB of RAM and 16+ cores on that budget


You could set it up in Docker while still on Windows, and then all you need to do is copy your compose file onto your new Linux machine. That way you aren’t struggling to learn two things at the same time (it alleviates the “I don’t know if the problem is with my Docker config or my host OS” problem).
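Roughly, the workflow looks like this (the hostnames and paths are just placeholders):

```
# On Windows (Docker Desktop), from the folder containing your docker-compose.yml:
docker compose up -d

# Later, copy that same compose file (plus any bind-mounted data) to the Linux box:
scp docker-compose.yml user@new-linux-box:~/myapp/

# On the Linux machine, the identical file brings the stack straight back up:
cd ~/myapp && docker compose up -d
```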


“how dare they use the right tool for the job without taking the time to learn how to do it sub optimally first”


In my experience, about 8% better but 4x slower to transcode


Brew day is ~8 hours, and I would say it’s half nannying: there’s usually 2 hours where you can fully walk away, but the rest is either active cleaning or pressing a button or stirring something every 10 minutes, so you are glued to your pot

Bottling is another ~2 hours (sanitizing bottles and capping them, cleaning the used fermenter) - you can cut that down to half an hour if you forgo bottling and keg instead, but that’s another ~$1500 in capital costs for kegging equipment


Yes, but that is also contingent on you placing absolutely zero value on your time.

An absolute bottom-of-the-barrel recipe (10 lb 2-row, 1 lb C-10, 1 oz Hallertau, S-04) will run you about $30-40 per 20 L batch. So after you spend hundreds of dollars on equipment, you are only saving something like $40 per 10 hours spent brewing


If you are looking to save money, take the fraction-of-a-cent price increase in stride

Signed: guy who has spent thousands of dollars on home brewing equipment


scientific research papers

When JSTOR comes knocking, you are going to wish it was the MPAA instead


To elaborate a bit more, there is the MySQL resource usage and the Docker overhead. If you run two containers from the same image, the Docker overhead (shared image layers, the daemon itself) only dings you once, but the actual MySQL process will consume its own CPU and memory inside each container.

So by running two containers you are going to be using an extra couple hundred MB of RAM (whatever MySQL’s minimum memory footprint is)
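You can see that per-container footprint directly (container names and numbers below are illustrative):

```
docker stats --no-stream
# NAME          CPU %   MEM USAGE / LIMIT
# mysql-app1    0.4%    ~350MiB / ...
# mysql-app2    0.4%    ~350MiB / ...
# each MySQL container carries its own few hundred MB of resident memory
```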


it won’t necessarily take twice the resources of a single mysql container

It will, as far as runtime resources go

You can (and should) just use the one MySQL container for all your applications. Set up a different database/schema in it for each application
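A rough sketch of that setup (container names, database names, and passwords are all made up):

```
# One shared MySQL container:
docker run -d --name shared-mysql \
  -e MYSQL_ROOT_PASSWORD=changeme \
  -v mysql-data:/var/lib/mysql \
  mysql:8

# A separate database/schema and user per application:
docker exec -i shared-mysql mysql -uroot -pchangeme <<'SQL'
CREATE DATABASE nextcloud;
CREATE USER 'nextcloud'@'%' IDENTIFIED BY 'nc-password';
GRANT ALL PRIVILEGES ON nextcloud.* TO 'nextcloud'@'%';

CREATE DATABASE wordpress;
CREATE USER 'wordpress'@'%' IDENTIFIED BY 'wp-password';
GRANT ALL PRIVILEGES ON wordpress.* TO 'wordpress'@'%';
SQL
```

Then each app container gets pointed at shared-mysql with its own credentials.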


Too bad the people who need this most aren’t the type who believe in fact checking


Yes, which is EXACTLY like a pip freeze’d requirements.txt, storing the exact version of every package and downstream dependency you have installed


But running those pip commands you mentioned is only going to affect what version gets installed initially.

I don’t follow. If my package-lock.json specifies package X v1.1, nothing stops me from manually telling npm to install package X v1.2; it will just update my package.json and package-lock.json afterwards

If a requirements.txt specifies X==1.1, pip will install v1.1, not 1.2 or a newer version. If I THEN install package Y that depends on X>1.1, the pip install output will say 1.1 is not compatible and that it is being upgraded to 1.2 to satisfy package Y’s requirements. If package Y works fine on v1.1 and does not require the upgrade, it will leave package X at the version you had previously installed.
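Concretely, something like this (the package names/versions are just placeholders to show the behaviour):

```
echo "requests==2.28.0" > requirements.txt
pip install -r requirements.txt
# installs exactly 2.28.0, never a newer release

pip install some-other-package   # hypothetical package
# if it declares a requirement like requests>2.28.0, pip reports that and upgrades requests;
# if 2.28.0 already satisfies it, requests stays exactly where it was
```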


Would that just create a list of the current packages/versions

Yes, and all downstream dependencies

without actually locking anything?

What do you mean? Nothing stops someone from manually installing an npm package version that differs from package-lock.json - this behaves the same. If you pip install -r requirements.txt, it installs the exact versions the package maintainer specified, just like npm install; the only difference is that Python requires you to specify the “lock file” instead of implicitly reading one from the CWD


How is it not a lock file?

package.json doesn’t contain the exact version number of every downstream dependency; this does


pip also has lock files

pip freeze > requirements.txt
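For example (output is illustrative - your packages and versions will differ):

```
pip freeze > requirements.txt
cat requirements.txt
# certifi==2022.9.24
# charset-normalizer==2.1.1
# idna==3.4
# requests==2.28.0
# urllib3==1.26.12
```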



By default, rsync checks the files and only issues the copy if the file size/modified dates are different. --ignore-existing will not overwrite a changed file, afaik.

If the file is large, it only sends the changed blocks (e.g. if you have a 100 GB database and only a dozen 4 MB blocks have been modified, it won’t send the full 100 GB across the network)
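For reference, a typical invocation that behaves the way described above (the paths/hostname are placeholders):

```
rsync -av --progress /srv/data/ backup-host:/srv/data/
#   -a  archive mode: files whose size and modification time already match are skipped
#   files that did change are sent with the delta algorithm, so only the
#   modified blocks of a big file cross the network
# adding --ignore-existing would skip anything already present on the
# destination, even if it has since changed on the source
```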


Should we keep people from dying from lung cancer because they smoked? Should we not try to help people dying from liver disease because they’re alcoholics?

When the smoker/drinker fully admits they have zero intention of quitting, I would much rather give my lung/liver to someone who is going to get a full, healthy life out of it than to someone who clearly would rather continue abusing it and burn through it in a couple of years.

Organs are a limited resource; that’s why there is a list - and we should absolutely dedicate limited resources to doing as much good as possible