More Flickr export & photo experiments
My last post was about dealing with my Flickr photo exports and experimenting with them in Jekyll. Now that I've got the zip files all backed up in Google Drive, I started looking at getting them all in one spot. Ideally, this would happen without having to download, unzip, and then move them again somewhere else.
As well, I currently auto-upload photos from my phone to Dropbox, and then as Dropbox fills up (I don't have a paid account), I archive them elsewhere: either in my Google Drive account (paid for, runs my personal email, holds my main working GDoc files) or in OneDrive (paid, the "family" account, includes 1TB of storage, my "long term" archive / backup).
Right now, this is purely an archive function. I'm not organizing the photos, and they're not available online anywhere unless I put them on my blog here or on my wiki.
On to the snippets of research!
Storing (and serving) photos from Amazon S3
I realized that storing 20GB of photos, at full resolution, would today only cost about 50 cents per month (2.5 cents per GB). That's perfectly reasonable to effectively pay forever, and way cheaper than the ~$10 per month of Dropbox or comparable paid plans.
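As a quick sanity check on that math (the per-GB rate is the one quoted above; real S3 pricing varies by region and storage class):

```python
# Sketch of the storage-cost math above. The rate is the assumed
# 2.5 cents per GB per month; actual S3 pricing varies by region.
PRICE_PER_GB_MONTH = 0.025  # USD

def monthly_cost(gigabytes: float) -> float:
    """Approximate monthly S3 storage cost in USD."""
    return gigabytes * PRICE_PER_GB_MONTH

print(f"20 GB: ${monthly_cost(20):.2f}/month")   # 50 cents
print(f"100 GB: ${monthly_cost(100):.2f}/month")
```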
I already have images.bmann.ca, which I've used on and off, and I previously wrote about how to archive photos from iOS to S3.
This means using the centralized Amazon S3 service as long term storage. As I move to IPFS, the files still need to be "stored" somewhere.
Using S3 with my own domain name, I get "regular" web URLs that I control. Importing the photos to IPFS will give each file a permanent content address, essentially a "native" Web3 address to reach them at. And I can move where the files are stored, like storing them on a home server, or even on my phone.
Zapier for Dropbox to Amazon S3
A Zapier Premium account supports Amazon S3. Unfortunately, they don't currently support the Canada Central region, so this doesn't actually work at the moment.
I was testing automating my Dropbox auto-uploads to my own S3 account. This is definitely a good option, and probably better than the apps I researched before.
MultCloud Cloud Drive Syncing
The next thing I found was MultCloud. It lets you connect multiple cloud services.
I was pleasantly surprised that it actually supports Flickr directly! So, although I have all my Flickr exports backed up, it may actually be easier to move them from Flickr to S3 using MultCloud.
The free account limits transfer speed, doesn't allow scheduled transfers, and only supports two kinds of sync: One Way and Two Way.
It also has an option to delete files from the source once they are transferred. So, this is a pretty perfect solution for archiving photos from iOS: I can enable Camera Uploads in Dropbox on mobile, and then periodically use MultCloud to copy them to my S3 account. Other than having to manually trigger the sync, the 50GB of transfer per month on the free account would be more than enough.
Advanced sync modes include:
- Mirror: files in the target directory are deleted to keep it the same as the source
- Move: source files are deleted after transfer to the target
- Cumulative: target directory files are not deleted even if deleted from the source
- Update: target files are deleted, and then everything from the source is transferred
- Incremental Backup: a new subdirectory is created on the target for each transfer of added / modified source files
- Full Backup: a new subdirectory is created on the target where the current state of the source is synced on each transfer
I don't know enough about the company behind MultCloud to give advice on trusting them long term. You are giving it a lot of access to your accounts, which potentially hold a lot of sensitive information. I'm going to use it to shuffle around some files and then disconnect my account authorization.
RClone
Rclone is open source software that calls itself "rsync for cloud storage". It's a command line tool for various file and directory transfers. I haven't investigated it further, but I would use this long term rather than a service like MultCloud.
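For reference, a Dropbox-to-S3 transfer like the archiving flow described above might look like this with rclone. This is a sketch: the remote names `dropbox` and `s3` are whatever you name them during `rclone config`, and the bucket path is hypothetical.

```shell
# One-time setup: interactively configure the Dropbox and S3 remotes
rclone config

# Copy files from Dropbox's Camera Uploads folder into an S3 bucket
rclone copy "dropbox:Camera Uploads" "s3:images.bmann.ca/camera-uploads" --progress

# Or use 'move' to delete from the source after transfer,
# like MultCloud's Move mode
rclone move "dropbox:Camera Uploads" "s3:images.bmann.ca/camera-uploads" --progress
```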
MultCloud Flickr to S3 Test
I synced an album from Flickr to S3 as a test. What better one to use than some history from the beginning of Web2? Here's my original Flickr album for BarCamp and Mesh in Toronto.
The files are now all in my S3 bucket, like this photo of Roland at Liberty Cafe (Flickr original here):
The photos are relatively terrible in quality because they were taken with a Nokia 6630 cameraphone. I guess I still have to take the blame for the bad composition, though!
Since the metadata export from Flickr is one JSON file per photo, and in this MultCloud transfer mode I only have the title of the file, not its unique Flickr ID, there isn't much I can do with this.
WeServ Image Cache & Resize
I found weserv previously, a service that enables on-the-fly image caching and resizing.
Since my images.bmann.ca S3 bucket didn't have SSL enabled, I was getting mixed content errors on older content in my Netlify builds.
I temporarily went through and just changed the image sources to be served up through weserv:
<img src="https://images.weserv.nl/?url=images.bmann.ca/dropshare/illustrator-bizcards-front.jpg" class="full" />
Another third party in the loop isn't great long term, which is why I'm fixing this with Cloudflare.
However, this solves another problem I had identified previously. Over the last 20 years, the digital photos in my archive have gotten bigger and bigger, and embedding the originals in a web page isn't feasible. So, generating thumbnails on the fly with weserv solves that, too.
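The resizing itself is just a query parameter: weserv documents a `w` parameter for output width. A tiny helper like this (my own illustration, not anything from weserv) can turn any archived original into a thumbnail URL:

```python
from urllib.parse import urlencode

def weserv_thumb(image_url: str, width: int = 400) -> str:
    """Build an images.weserv.nl URL that serves a resized, cached copy.

    weserv takes the source URL in the `url` query parameter;
    `w` sets the output width in pixels.
    """
    query = urlencode({"url": image_url, "w": width})
    return f"https://images.weserv.nl/?{query}"

print(weserv_thumb("images.bmann.ca/dropshare/illustrator-bizcards-front.jpg"))
```

The full-size original stays in S3; weserv fetches it once and then serves the cached, resized copy.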
I wonder if there is something photo-specific in IPFS that could link or associate generated thumbnails with an original. I think thumbnail generation is somewhat non-deterministic, meaning unique hashes, and thus different content addresses depending on how the thumbnail was generated.
The weserv code is open source and available on GitHub, so you can run your own instance if you need this capability. The team is from a small Dutch town called Sneek, which I learned from the README on GitHub :)
HTTPS for Amazon S3 Buckets with Cloudflare
Enabling SSL on an Amazon S3 bucket is easier with Cloudflare than it is with Amazon's own tools. There is a ton of room for great tools on lower level systems.
— Boris Mann (@bmann) January 6, 2019
The basic instructions: transfer your DNS to Cloudflare and change your Crypto > SSL setting to "Flexible", and your S3 content should be delivered over HTTPS. You can also set a Page Rule to automatically forward HTTP to HTTPS, but I'm just going to use HTTPS for new content for now.
Here's a blog post on full S3 bucket setup with Cloudflare if you want step-by-step instructions.
IPFS gateway with Cloudflare
I did a little bit of research previously, but other than running a whole cloud server, it's not currently easy to run your own IPFS gateway, meaning a way to access files on IPFS through a regular web browser at your own domain.
But Cloudflare is running an IPFS gateway. They let you connect your own domain with a CNAME, which means you can upload / serve content using IPFS but still have it accessed at your own mystuff.example.com.
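Concretely, Cloudflare's documented setup is a CNAME pointing at their gateway plus a `_dnslink` TXT record telling the gateway which IPFS content to serve. Roughly, with placeholder hostname and content hash:

```
; Hypothetical DNS records for serving IPFS content at your own hostname
; via Cloudflare's gateway. The content hash below is a placeholder.
mystuff.example.com.           CNAME  cloudflare-ipfs.com.
_dnslink.mystuff.example.com.  TXT    "dnslink=/ipfs/<your-content-hash>"
```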
A static site generator that is IPFS aware is something I'll be researching. It "works today" with relative URLs, but I think there's an opportunity for more here.
I label this kind of stuff as Web 2.5: using "regular" DNS, browsers, and standard web requests to access decentralized web content.