• 1 Post
  • 24 Comments
Joined 1Y ago
Cake day: Aug 19, 2023


So Google is a monopoly, and cutting its funding of Firefox will help it not be a monopoly? That does not sound right. Rather the opposite.

Nothing has been decided or done yet. Most likely they will just be forced to stop abusing their position, for example no longer advertising Chrome on www.google.com, not bundling Chrome with Android, and such things.

I believe there will always be an alternative to Chrome available, as the open source community will find a way together.


Block Chrome and use anything that is not Chrome-based. In other words, use Firefox.


If the government did it, I would be very angry. Do you know how much the internet helps democracy around the world? Think of it as shutting down all libraries and forbidding any kind of communication.

I would of course try to set up some communication and gather people. I currently only have one IRC server. I guess people would want something web-based, as they probably don't have the client software.

I would probably lose my job as well, since it is a software-as-a-service company that requires the internet to function.



A general protocol standard just for sync would be nice, but then there is the problem of getting all the big players on board. There are open protocols, like the ones Syncthing and Seafile use, but they are not compatible with each other.


Just stop supporting the biggest actor in the market.



If there is a problem measuring at low power draw, you can easily solve it by temporarily adding a known load, say a 40 watt lamp, and then subtracting it from the result. For example, if the meter reads 47 W with the lamp on, the device itself draws about 7 W.


People will not leave YouTube. It has too many videos. I believe they will just find some proxy solution or similar. For creators, using anything else just means they get paid less.


Yes, they can, like any other company or organization can. But they can't remove the humans producing the content, which means those humans will just go somewhere else. YouTube is a standalone product that they probably want to keep, as I think it pays for itself with that amount of ads.



YouTube is just the database of videos, so just use a different frontend. The problem is if I actually want recommended videos without Google knowing about me; that is hard due to the massive number of videos. Only Google has the money to scan everything.


I love that it is written in Rust. That should mean fast and efficient, with low memory usage.

Edit: It uses MySQL as its database, so it is heavy.


How much load will this save on the production Mozilla Firefox servers?


KeePassXC + Syncthing to my phone in read-only mode and to another machine. So three copies on different machines, and one of them is always on me.


I use very simple software for this. My firewall supports route monitoring with failover and policy-based routing. When the uplink is down, I just send all traffic to another machine that does the diagnosis part: it pings through the firewall and fetches some info from it. The page itself is not pretty, but it says what is wrong, enough for my parents to read what the error is. I also send DNS traffic to a special DNS server that answers every query with the same static IP address, which is enough for the browser to continue with an HTTP GET that the firewall forwards to my landing page. Sadly I haven't had any more problems since I changed ISP.
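If you want to do the same catch-all DNS trick with dnsmasq, one line is enough (a sketch only; dnsmasq is just one option, and the IP is a placeholder for your landing page):

```
# Answer every DNS query with the landing page's address (placeholder 192.168.1.1).
# dnsmasq's address=/#/... syntax matches any domain.
echo 'address=/#/192.168.1.1' | sudo tee /etc/dnsmasq.d/captive.conf
sudo systemctl restart dnsmasq
```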

Had a scenario where the page said the gateway was reachable but nothing more. ISP issue: there was a fiber cut between our town and the next, and the DHCP lease slowly ran out. Not much I could do about it. I just configured the IP statically and could reach some friends in the same city through IRC so we could talk about it.

The web page itself was written in PHP; it read the ICMP logs and showed the relevant up/down entries. Very simple.


The DVDs are fine for offline use, but I don't know how to keep them updated. It would probably take loads of space, as I guess they are equivalent to a repo mirror.


I use it with Kubuntu. Doing apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the real problem was that there are so many small files. Running nala to fetch the files in parallel helps, of course, but with apt-cacher-ng I don't need nala at all. The low latency and the gigabit connection to my server make access fast. I just need to find a good way to fill it with new updates; a sketch of one idea follows after the timings below.
A second problem is figuring out whether anything can be done to speed up apt upgrade itself, which I guess is not possible. A workaround with snapshots and sending diffs does not sound efficient either, even on older hardware.

apt update - 4 seconds vs 16 seconds.

apt upgrade --download-only - 10 seconds vs 84 seconds.
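For filling the cache, one idea (just a sketch, not something I have settled on) is to let any client that already uses the proxy download, but not install, upgrades on a schedule, so apt-cacher-ng stores the new packages before I ask for them:

```
#!/bin/sh
# Run from cron on a machine already configured to use the cache as its apt proxy.
# Downloading without installing makes apt-cacher-ng fetch and keep the new .debs.
apt-get update -qq
apt-get upgrade --download-only -y -qq
```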


First off, if the internet goes down I have an HTTP captive portal that does some diagnosis and shows where the problem is: link on the network interface, gateway reachable, DNS working, and the DHCP lease (a rough sketch of those checks is below). Second, once it is down, it shows the timestamp of when it went down. Third, it lists the phone numbers of the ISP and the city fiber network owner.

Fourth, I watch my local RSS feed and email folder. I also have something from YouTube or Twitch downloaded locally to watch.
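The checks mentioned above boil down to roughly this (a sketch; the interface name, gateway address, and test hostname are placeholders, and in my setup the details come from the firewall itself):

```
#!/bin/sh
# Placeholder values: eth0, 192.168.1.1, debian.org.
ip link show eth0 | grep -q 'state UP'  && echo "link: up"           || echo "link: DOWN"
ping -c1 -W2 192.168.1.1 > /dev/null    && echo "gateway: reachable" || echo "gateway: UNREACHABLE"
getent hosts debian.org > /dev/null     && echo "dns: working"       || echo "dns: FAILING"
# The DHCP lease time is read from the firewall's lease table, not shown here.
```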


Use Veeam. If you hit the limit, just configure it to send to an SMB share and you need no license.


It might be enough to just rsync stuff to the secondary regularly and have the inactive machine monitor the active one, starting all the services when the active machine stops responding.
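Roughly like this on the standby side (a sketch only; host and service names are placeholders, and it does not handle failback or split-brain):

```
#!/bin/sh
# Data is synced separately, e.g. from cron on the active machine:
#   rsync -a --delete /srv/data/ standby:/srv/data/
# The standby polls the active machine and takes over when it stops answering.
ACTIVE=active.lan          # placeholder hostname
FAILS=0
while true; do
    if ping -c1 -W2 "$ACTIVE" > /dev/null; then
        FAILS=0
    else
        FAILS=$((FAILS + 1))
    fi
    # Require a few consecutive failures so one lost ping does not trigger a takeover.
    if [ "$FAILS" -ge 3 ]; then
        systemctl start myapp.service   # placeholder for the real services
        break
    fi
    sleep 10
done
```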



Local repository for Linux packages
I just installed apt-cacher-ng for caching my apt upgrade packages and saw a huge time improvement, even though I have a good internet connection. It acts as a proxy and caches the response packages. Do you run something similar? Or maybe even a local repo mirror? Warning: mirrors are really big, so I don't think that is recommended unless you really need almost everything.
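If you want to try it, pointing a client at the cache is a one-liner (a sketch; the host name is a placeholder, and 3142 is apt-cacher-ng's default port):

```
# Tell apt to fetch everything through the cache (placeholder host apt-cache.lan).
echo 'Acquire::http::Proxy "http://apt-cache.lan:3142";' | sudo tee /etc/apt/apt.conf.d/01proxy
```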

I hate that Python 3 requires parentheses for print. Python 2 accepted print ‘hi’ vs. print(‘hi’).


24 MiB is too little. Not even enough for nginx/apache. What installation instructions did you follow?