I’m trying to decide how to import my Google Photos Takeout backup. I see two general ways to go about it.
Has anyone done it one way or the other? Any recommendation, pros/cons or gotchas?
I recommend using this: https://github.com/TheLastGimbus/GooglePhotosTakeoutHelper
A couple of years ago, Google decided that instead of exporting photos with their EXIF data intact, exactly as you uploaded them (the original behavior, and how platforms such as OneDrive still do it), it would strip the EXIF from the image and instead write a .json sidecar containing the original data in a non-standard format. This script is a free and open alternative to a paid tool: it goes through each image, finds the corresponding .json, and writes the EXIF data back.
If you don’t do that, when you re-upload these photos to a new service, the date will revert to the day you downloaded them and the location data will be missing entirely.
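To make the problem concrete, here’s a minimal sketch (not the tool itself) of the kind of work it automates: reading the sidecar and writing the capture date back into EXIF. The IMG_1234.jpg.json naming and the photoTakenTime field are assumptions based on typical Takeout exports, and piexif is just one way to write the tag (JPEG only); real exports vary a lot, which is exactly why the dedicated tool exists.

```python
#!/usr/bin/env python3
"""Sketch: restore the original capture date from a Takeout .json sidecar.

Assumes the common layout of IMG_1234.jpg next to IMG_1234.jpg.json with a
photoTakenTime block. GPS restoration is omitted for brevity.
"""

import json
import sys
from datetime import datetime, timezone
from pathlib import Path

import piexif  # pip install piexif (JPEG only)


def restore_date_taken(photo: Path) -> None:
    # Takeout usually names the sidecar <photo filename>.json.
    sidecar = photo.with_name(photo.name + ".json")
    if not sidecar.exists():
        print(f"no sidecar for {photo.name}, skipping")
        return

    meta = json.loads(sidecar.read_text())
    # The capture time is stored as a Unix timestamp string.
    ts = int(meta["photoTakenTime"]["timestamp"])
    taken = datetime.fromtimestamp(ts, tz=timezone.utc)
    exif_date = taken.strftime("%Y:%m:%d %H:%M:%S").encode()

    # Write DateTimeOriginal (and the plain DateTime tag) back into the file.
    exif = piexif.load(str(photo))
    exif["Exif"][piexif.ExifIFD.DateTimeOriginal] = exif_date
    exif["0th"][piexif.ImageIFD.DateTime] = exif_date
    piexif.insert(piexif.dump(exif), str(photo))
    print(f"{photo.name}: DateTimeOriginal set to {taken.isoformat()}")


if __name__ == "__main__":
    for arg in sys.argv[1:]:
        restore_date_taken(Path(arg))
```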
This was the tool I used. It worked great for me.
Google reminds me more and more of the Microsoft of the ’90s. That’s exactly the kind of asinine, compatibility-breaking move MS would have pulled 30 years ago. Sigh…
Very shitty of them
Yes! I imported 23k media files into a new platform, and the Takeout process was such a pain. My destination could handle the media zipped or unzipped, but issues occasionally cropped up, like when a file ended up in one archive while its .json landed in the previous one. That resulted in orphaned files stamped with the upload date instead of the date taken.
Ultimately, I think I had the best experience extracting all 123 GB and uploading the albums/folders that way.
Would have been SO much easier with an API that allowed cloud to cloud.
I wonder if this is worth doing even if I import with immich-go, which seems to combine this data too.