Sorry, but I can’t think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way to do it is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires.” That’s one week to the minute from the time of the initial request.

I don’t have some kind of fancy California internet, I just have normal home internet and there is just no way to download a 50 GB (or 2 GB) file in one go - there are always interruptions that require restarting the download. But if you try to download the files too many times, Google gives you another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive in one piece either; you have to select each file part individually.

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google gives me another error.

There’s no financial incentive for them to make it easy to leave Google. Takeout only exists to comply with regulations (e.g. the Digital Markets Act), and as usual, they’re doing the bare minimum to not get sued.

Avid Amoeba

Or why is Google Takeout as good as it is? It’s got no business being as useful as it is in a profit-maximizing corpo. 😂 It can be way worse while still technically compliant. Or expect Takeout to get worse over time as Google looks into undermaximized profit streams.

@BodilessGaze@sh.itjust.works

Probably because the individual engineers working on Takeout care about doing a good job, even though the higher-ups would prefer something half-assed. I work for a major tech company and I’ve been in that same situation before, e.g. when I was working on GDPR compliance. I read the GDPR and tried hard to comply with the spirit of the law, but it was abundantly clear everyone above me hadn’t read it and only cared about doing the bare minimum.

Avid Amoeba

Most likely. Plus Takeout appeared way before Google was showing any signs of profit maximization, back when it didn’t even hold the monopoly position it holds today.

@YurkshireLad@lemmy.ca

Because Google don’t want you to export your photos. They want you to depend on them 100%.

@TCB13@lemmy.world

It’s called: vendor lock-in.

@irotsoma@lemmy.world

Use Drive, or if it’s more than 15 GB or whatever the max is these days, pay a couple of dollars for one month of storage on one of the supported platforms and download from there.
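Incidentally, once the export lands in Drive (or Dropbox, etc.), a command-line sync tool can deal with the flaky-connection problem for you. A minimal sketch with rclone, assuming you’ve already configured a remote named gdrive and the archives ended up in the default Takeout folder (both names are placeholders for your own setup):

    # Copy the Takeout archives down to local storage.
    # rclone retries failed transfers on its own, and re-running the same
    # command skips anything that already copied cleanly.
    rclone copy "gdrive:Takeout" /srv/photos/takeout --progress --transfers 2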

Possibly linux

You need a solid wired connection. Maybe phone a friend for help.

Alternatively you could use curl. I think it has a resume option.
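For what it’s worth, it does: a rough sketch, assuming you’ve copied the direct download URL out of the browser while signed in (Takeout links are tied to your session, so this can still be finicky):

    # -L follows redirects; -C - resumes from wherever the partial file left off,
    # so the same command can simply be re-run after every interruption.
    curl -L -C - -o takeout-001.zip "$TAKEOUT_URL"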

@rambos@lemm.ee

I’m surprised that feature exists, tbh. It worked fine for my 20 GB split into 2 GB archives, if I remember correctly.

@gedaliyah@lemmy.world (OP)

I used it for my music collection not that long ago and had no issues. The family’s photo library is an order of magnitude larger, so it’s putting me up against some of the limitations I didn’t run into before.

It sucked when I closed my accounts years ago. I had to do it manually for the most part.

Flax

Try this, then do them one at a time. You have to start the download in your browser first, but you can click “pause” and leave the browser open as it downloads to your server.

You could try using rclone’s Google Photos backend. It’s a command line tool, sort of like rsync but for cloud storage. https://rclone.org/
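Roughly, that would look like the sketch below, assuming you’ve created a remote named gphotos via rclone config (the media/all path is how the backend exposes the whole library, if I’m reading the docs right) - though see the caveats in the reply that follows:

    # Pull everything the Google Photos API will hand over into a local folder.
    rclone copy "gphotos:media/all" /srv/photos/gphotos --progress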

@gedaliyah@lemmy.world (OP)

Looked promising until

When Images are downloaded this strips EXIF location (according to the docs and my tests). This is a limitation of the Google Photos API and is covered by bug #112096115.

The current google API does not allow photos to be downloaded at original resolution. This is very important if you are, for example, relying on “Google Photos” as a backup of your photos. You will not be able to use rclone to redownload original images. You could use ‘google takeout’ to recover the original photos as a last resort

deleted by creator

Oh dang, sorry about that. I’ve used rclone with great results (slurping content out of Dropbox, Google Drive, etc.), but I never actually tried the Google Photos backend.

@helenslunch@feddit.nl

I just have normal home internet and there is just no way to download a 50 GB (or 2 GB) file in one go

“Normal” home internet shouldn’t have any problem downloading 50 GB files. I download games larger than this multiple times a week.

@gedaliyah@lemmy.world (OP)

Well then read it as “shitty rural internet.” Use context clues.

@helenslunch@feddit.nl

Which context clues should I be using to blame your “shitty rural internet” on Google?

@1rre@discuss.tchncs.de

Yeah, of course it varies from place to place, but I think for the majority of at least somewhat developed countries, and urban areas in less developed countries, 50 Mbps is a reasonable figure for “normal home internet” - even at 25 Mbps you’re looking at about 4½ hours for 50 GB, which is very doable if you leave it going while you’re at work or just in the background over the course of an evening.
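For reference, the back-of-the-envelope math behind that figure (ignoring protocol overhead):

    50 GB × 8 bits/byte = 400 gigabits = 400,000 megabits
    400,000 Mb ÷ 25 Mbps = 16,000 s ≈ 4.4 hours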

Edit: I was curious and looked it up. Global average download is around 50-60Mbps and upload is 10-12Mbps.

they must have dialup or live in the middle of nowhere

@helenslunch@feddit.nl

That’s fair but also not Google’s fault.

@gedaliyah@lemmy.world (OP)

The part that is Google’s fault is that they limit the number of download attempts and the files expire after 1 week. That should be clear from the post.

@Swarfega@lemm.ee

Not really helping you here. But when I started using Google Photos, I still manually downloaded files from my phone to local storage. I did this mainly to ensure I have the original copies of my photos and not some compressed image. Turns out that was a wise move as exporting photos from Google is a pretty damned awful experience.

@redxef@scribe.disroot.org

Honestly I thought you were going to bitch about them separating your metadata from the photos and you then having to remerge them with a special tool to get them to work with any other program.

omg they WHAT
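(For anyone wondering what that remerge looks like: the tool people usually reach for is exiftool, copying fields from the JSON sidecars back into the images. A rough sketch, assuming the sidecars sit next to the photos and are named like IMG_1234.jpg.json - the flattened tag names come from Google’s JSON structure, so double-check them against your own export before trusting it:)

    # Copy capture time and GPS coordinates from each photo's .json sidecar
    # back into the photo itself, recursively, skipping the .json files themselves.
    exiftool -r -d %s -tagsfromfile "%d%F.json" \
      "-DateTimeOriginal<PhotoTakenTimeTimestamp" \
      "-GPSLatitude<GeoDataLatitude" "-GPSLongitude<GeoDataLongitude" \
      -overwrite_original --ext json Takeout/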

@Discover5164@lemm.ee

Immich has a great guide for moving a Takeout from Google into Immich.

Flax

Links or it didn’t happen

@gedaliyah@lemmy.world (OP)

Thank you! The goal is to set up immich. It’s my first real foray into self hosting, and it seems close enough to feature parity with Google that the family will go for it. I ran a test with my local photos and it works great, so this is the next step.

@Discover5164@lemm.ee

https://github.com/simone-viozzi/my-server

This is my setup; of course it’s still (and will always be) a work in progress.

Lmao I am both amused and horrified that I had somehow never come across this datapoint before

@gedaliyah@lemmy.world (OP)

I’m not really looking forward to that step either

@smeeps@lemmy.mtate.me.uk

I think this is a bit unfair. Most Google Takeout requests are fulfilled in seconds or minutes. Obviously collating 100GB of photos into a zip takes time.

And it’s not Google’s fault you have internet issues: even a fairly modest 20 Mbps connection can do 50 GB in about 6 hours. If you have outages, that’s on your ISP, not Google. As others have said, have it download to a VPS or Dropbox etc. and then sync it from there. Or call your ISP and tell them to sort your line out - I’ve had 100% uptime on my VDSL copper line for over 2 years.

I was able to use Google Takeout and my relatively modest 50 Mbps connection to successfully take out 200 GB of data in a couple of days.

@gedaliyah@lemmy.world (OP)

What download manager did you use? I’ve tried with whatever’s built into Firefox on two different networks, with similar results. The downloads freeze every so often and I have to restart them (it picks up where it left off). Sometimes it just won’t reconnect, which I’m guessing is a timeout issue with Google, although I really have no idea.

I don’t ever have to manage downloads of this size, so sorry if it’s an obvious question

@yonder@sh.itjust.works

A download manager I found to work well generally was aria2c. It’s only really worth it if you are on Linux, but it is simple yet powerful.
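In case a concrete example helps (the URL is a placeholder - as with any Takeout download, you’d need the real signed-in link copied from your browser):

    # -c resumes a partial download; -x/-s split it across multiple connections
    # to the server, which can speed things up considerably.
    aria2c -c -x 4 -s 4 -o takeout-001.zip "$TAKEOUT_URL"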

@machinin@lemmy.world

Not OP, but I use this download manager. It has been good.

https://www.downthemall.org/

Definitely misread that as Download The Mall and was quite amused by the name until I checked the link to see more lol

@Symphonic@lemmy.world

I have fancy California internet and the downloads were surprisingly slow and kept slowing down and cutting out. It was such a pain to get my data out of Takeout.

@stepan@lemmy.cafe

There was an option to split the download into archives of customizable size IIRC

@gedaliyah@lemmy.world (OP)

Yeah, that introduces an issue of queuing and monitoring dozens of downloads rather than just a few. I had similar results.

As my family is continuing to add photos over the week, I see no way to verify that previously downloaded parts are identical to the same parts in another Takeout, if that makes sense.
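One partial sanity check, if you do end up with parts from two different exports on disk, is to hash them and compare. A quick sketch, assuming the two sets of archives live in separate folders (the folder names are placeholders). Byte-identical parts will hash identically, though in practice Takeout likely repacks the archives, so don’t be surprised if little or nothing matches:

    # Hash every part from both exports; byte-identical files produce identical hashes.
    sha256sum first-takeout/*.zip second-takeout/*.zip | sort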

@Willdrick@lemmy.world

You could try a download manager like DownThemAll on Firefox, set a queue with all the links and a depth of 1 download at a time.

DtA was a godsend when I had shitty ADSL. It splits downloads into multiple parts and manages to survive micro-interruptions in the service.

@gedaliyah@lemmy.world (OP)

I couldn’t get it working, but I didn’t try too hard. I may give it another shot. I’m trying a different approach right now.

@gedaliyah@lemmy.world (OP)

DownThemAll seems to be helping. I’ll update the original post with the details once I have success. In this case, I had to first start the downloads in the browser, then copy the download links and add them to DtA. Someone smarter than me will be able to explain why the extra step was necessary, or how to avoid it.
