Working around Google Photos limitation
As of June 1, 2021, Google Photos started counting new photos and videos against the free 15GB/account storage quota.
15GB isn't much nowadays, so I looked for alternatives.
Signing up for paid storage is an option which I discarded:
- first, because I don't want yet another lifetime subscription
- second, because I don't want to be tied to an ecosystem. What if prices double overnight? What if I reach the storage limit and need to upgrade to the next tier?
That said, I love Google Photos and its machine learning wizardry.
A suboptimal experience, yet better than nothing, would be to:
- upload low quality photos to Google Photos
- upload high/original quality photos to alternative storages
This way, Google Photos will still keep reminding me of past trips and let me search for "bridge" or "Rio de Janeiro" across my thousands of photos. Given an image ID/date, I can then download the high/original quality photo from one of the other providers.
Whatever the solution turned out to be, I wanted it to last at least 3.5 years. If we assume 10 photos/day on average, we're talking about ~13k photos here.
I ended up creating/reusing the following accounts:
| Provider | Available storage | Photo size | Trust? |
|----------|-------------------|------------|--------|
The pseudo-code is as follows:
```
every day at 2am:
  for each photo in /sdcard/DCIM:
    compress photo to /sdcard/Cloud/GooglePhotos/⋯.jpg
    compress and zip photo to /sdcard/Cloud/pCloud/⋯.jpg.7z
    ...
    zip photo to /sdcard/Cloud/Terabox/⋯.jpg.7z
    move photo to /sdcard/Cloud/Telegram/⋯.jpg

every day at 3am, if wifi is connected:
  for each file in /sdcard/Cloud/<provider>:
    move file to <provider>
```
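As a concrete sketch of the routing above, a small POSIX-shell helper can decide where each photo's staged copy goes (the provider names and the trusted/untrusted split follow the pseudo-code; everything else is illustrative, not my actual Tasker task):

```shell
#!/bin/sh
# Maps a photo to its staging path for a given provider (sketch).
# Untrusted providers get a ".7z" suffix: their copy will be a
# password-protected archive rather than a plain JPEG.
CLOUD=${CLOUD:-/sdcard/Cloud}

staging_path() {
  provider=$1
  base=$(basename "$2")
  case $provider in
    pCloud|Terabox)        echo "$CLOUD/$provider/$base.7z" ;;
    GooglePhotos|Telegram) echo "$CLOUD/$provider/$base" ;;
  esac
}
```

For example, `staging_path pCloud /sdcard/DCIM/IMG_00.jpg` prints `/sdcard/Cloud/pCloud/IMG_00.jpg.7z`.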
A photo is compressed for each storage provider accordingly.
The output image is limited by the provider's *Photo size* and, if *Trust?* is false, it is 7-zipped using a password.
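The archive step itself can be a single 7-Zip invocation; the helper below only builds the command so the flags are visible (`-p` sets the password, `-mhe=on` encrypts the file names inside the archive too — this is a sketch, not my actual task):

```shell
#!/bin/sh
# Builds (but does not run) the 7-Zip command for an untrusted provider.
seven_zip_cmd() {
  password=$1
  photo=$2
  printf '7z a -p%s -mhe=on %s.7z %s\n' "$password" "$photo" "$photo"
}
```

`seven_zip_cmd "$PASSWORD" IMG_00.jpg` prints the command to run; in a real task you would call `7z` directly instead.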
The output image size is limited by playing with Tasker's image-resizing properties (*max dimension* and *compression quality*). These are estimated from the *Photo size* target and the original image's properties.
One caveat is that, for some reason, Tasker doesn't preserve the EXIF attributes when resizing images, so I needed to use `exiftool` to overwrite the new file's EXIF attributes with the original ones.
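A sketch of that fix, assuming `exiftool` is installed (in Termux: `pkg install exiftool`); the wrapper no-ops when the tool or the files are missing:

```shell
#!/bin/sh
# Copy all EXIF/metadata from the original photo onto the resized one.
# -TagsFromFile copies the tags; -overwrite_original skips the backup file.
restore_exif() {
  command -v exiftool >/dev/null 2>&1 || return 0
  [ -f "$1" ] && [ -f "$2" ] || return 0
  exiftool -overwrite_original -TagsFromFile "$1" -all:all "$2"
}
```

For example: `restore_exif /sdcard/DCIM/IMG_00.jpg /sdcard/Cloud/GooglePhotos/IMG_00.jpg`.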
Uploading files to decent, real storage providers
This was very straightforward and was automated using rclone.
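With one remote configured per provider via `rclone config`, the nightly upload reduces to a short loop. The remote names and target folder below are hypothetical and must match your own setup:

```shell
#!/bin/sh
# Push each provider's staging directory to its rclone remote.
# `rclone move` uploads the files and deletes the local copies on success.
CLOUD=${CLOUD:-/sdcard/Cloud}

for remote in pCloud Terabox; do
  dir="$CLOUD/$remote"
  [ -d "$dir" ] || continue   # nothing staged for this provider
  rclone move "$dir" "$remote:Photos"
done
```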
Uploading files to bad, real storage providers
I plan to do that once a month.
The good news is that all files will already be easily located in
/sdcard/Cloud/Degoo, for example.
Uploading files to Telegram
Telegram isn't a storage provider but, to my knowledge, a given chat can hold an unlimited number of attachments. 🤷
Its API allows file uploads using a simple POST request; the response body contains a `file_id`. We can later download the attachment by resolving that `file_id` to a server-side path via the `getFile` method and fetching it from `https://api.telegram.org/file/bot<token>/<file_path>`.
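A sketch of both directions with `curl`, assuming a bot created via @BotFather (`TOKEN` and `CHAT_ID` are placeholders, and the `sed` one-liner is a crude stand-in for a real JSON parser like `jq`):

```shell
#!/bin/sh
# Extract the (last) "file_id" value from a Bot API JSON response.
extract_file_id() {
  sed -n 's/.*"file_id":"\([^"]*\)".*/\1/p'
}

# Upload one file to the chat and print its file_id.
upload() {
  curl -s -F chat_id="$CHAT_ID" -F "document=@$1" \
    "https://api.telegram.org/bot$TOKEN/sendDocument" | extract_file_id
}

# Resolve a file_id to a server-side path, then download the file to $2.
download() {
  path=$(curl -s "https://api.telegram.org/bot$TOKEN/getFile?file_id=$1" \
    | sed -n 's/.*"file_path":"\([^"]*\)".*/\1/p')
  curl -s -o "$2" "https://api.telegram.org/file/bot$TOKEN/$path"
}
```

Note that, as far as I know, the Bot API caps bot uploads at 50 MB per file and `getFile` downloads at 20 MB, which matters for original-quality photos.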
We thus need to keep this
file_id in a database. My Tasker task stores this in
/sdcard/Tasker/db/telegram_uploads.txt and its contents look like this:
```
/Pictures/IMG_00.jpg;123456
/Pictures/IMG_01.jpg;789012
/Pictures/IMG_02.jpg;654321
```
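The append and lookup operations on that file are trivial in shell (a sketch; the real task does the equivalent inside Tasker):

```shell
#!/bin/sh
# "Database" helpers: one "path;file_id" line per uploaded photo.
DB=${DB:-/sdcard/Tasker/db/telegram_uploads.txt}

record_upload() {   # e.g. record_upload /Pictures/IMG_00.jpg 123456
  printf '%s;%s\n' "$1" "$2" >> "$DB"
}

lookup_file_id() {  # prints the file_id stored for a given path
  grep -F "$1;" "$DB" | cut -d';' -f2
}
```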
Having this database in hand, we could even build a primitive web-based file manager on top of it, for example.
The database is replicated across all other storage providers, using the aforementioned methods.
I'm happy with the solution because Google Photos will still keep doing its magic and photos are replicated. If one storage provider goes down (I'm sure that down the road at least one of them will), I won't lose my photos.
The Tasker tasks aren't public because they rely on a very specific setup (and because I've never met anyone in person who uses Tasker - or Termux, for that matter).
In the unlikely event that I get requests for them, I'll share them on TaskerNet.