Yeah, everything is pretty much all fucked up. Rclone has been banned from Amazon Cloud Drive and won’t get a new API key, and Amazon is supposedly killing off the “unlimited” drive offer anyway.
So my remote Plex is offline and this whole entry is pretty much obsolete. Additionally, the already uploaded data is encrypted, and to decrypt it I would have to download the whole shebang and decrypt it locally with rclone. This supposedly works, but I ain’t got the time or storage for that!
Just glad I didn’t upload anything I wasn’t comfortable with “losing” anyway, but what really sucks is that without rclone or acd_cli there is no comfortable way to use Cloud Drive programmatically for Linux backup shell scripts etc. Sucks big time.
I just wanted to wrap up a little project I set up lately, which solved two problems for me: a) backing up my media collection to an offsite location and b) cutting the cord and circumventing the limited upload capabilities of my in-house media server.
It is very common here in Germany to have quite reasonable download rates, especially on cable connections, but pretty limited upload rates. I, for example, have a convenient 120 Mbit/s download speed but only about 6 Mbit/s upload.
Additionally, the storage space on my media server was slowly filling up, so I tried to find a cheap backup solution for my “not so utterly important” data. My personal photos, documents and such get regularly backed up to AWS S3, where I can access and retrieve them without any 3rd-party software or plugin for a reasonable price. But when your storage needs grow, S3 gets quite pricey, so this was not an option.
At first I had a look at Crashplan, which worked nicely but had quite a few disadvantages as well. For starters, it was quite a pain in the ass to set up and configure on my headless Ubuntu server. You have to have the Crashplan agent installed on another machine and copy parts of the config files, or set up an SSH tunnel. All manageable, but not very convenient to achieve. In addition, the client itself is Java-based and was, at least on my server, very heavy on memory and CPU. I had it running in a Docker container so it wouldn’t mess up my installation, but boy, it was a constant heavy load on the machine.
Last, but most importantly, backups on Crashplan are not easily restored or accessible without using the Crashplan software. So it is quite OK and cheap for a backup you hopefully never need, but not so great for my use case.
Then I had a look at Amazon Cloud Drive, which now offers unlimited (we will see…) storage for all your documents, videos and files for 70€ per year.
Sounds awesome, and they had a 3-month trial, so I created an account. At first I was not very convinced, because initially I had the same problem as with Crashplan: I could not find a way to upload and access my files from my Ubuntu server.
But then I came across rclone, which solved all of these problems for me.
Rclone handles the headless authorization of your Cloud Drive account; you can list, copy and sync files, and even mount your Cloud Drive via FUSE into your local filesystem and use it like a local folder. Awesome!
And as a final touch, rclone even supports client-side encryption of your whole Cloud Drive or just a specific folder.
So here is what I did:
First I authorized rclone to access my Cloud Drive. Then I used “rclone config” to set up my whole Drive as a remote location called “acd”. In the next step I created a folder in my Cloud Drive and used “rclone config” again to set up this folder as a new, encrypted remote location called “crypt”. The configuration is interactive, and you get to set your password, salt and so on there. So now you have two remote locations, “acd” and “crypt”, where “crypt” is the encrypted view of “acd:subfolder”.
So now, when copying a file to the “crypt” remote, the file gets encrypted and lands with a garbled filename in your “acd:subfolder” remote. Still with me so far? Good.
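For illustration, the resulting “.rclone.conf” looks roughly like this after both config runs (token and passwords are placeholders, and the exact fields may differ between rclone versions):

```ini
[acd]
type = amazon cloud drive
client_id =
client_secret =
token = {"access_token":"...","expiry":"..."}

[crypt]
type = crypt
remote = acd:subfolder
filename_encryption = standard
password = *** ENCRYPTED ***
password2 = *** ENCRYPTED ***
```

“password2” is the salt you set during the interactive configuration; both are stored obscured, not in plain text.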
The best thing is that you can also “rclone mount” this encrypted remote, and with rclone version 1.34 it is even possible to seek within video files.
You see where I am heading? Right now I am uploading the media collection on my in-house server via “rclone copy” to the encrypted Cloud Drive.
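The upload itself is a one-liner. A sketch, assuming the collection lives under /srv/media (the path and the transfer count are my choices, adjust to taste):

```shell
# Copy the collection to the encrypted remote; files are encrypted on the fly.
rclone copy /srv/media crypt:media --transfers 4 -v

# Sanity check: the decrypted view on "crypt"...
rclone ls crypt:media
# ...versus the garbled filenames that actually land on Amazon.
rclone ls acd:subfolder
```

“rclone copy” only transfers files that are missing or changed on the destination, so re-running it later works as an incremental sync of new media.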
Then I fired up the smallest VPS instance on Scaleway, which costs about 3€ a month as of now with unmetered bandwidth and 50GB of storage, put Ubuntu 16.04 on it and installed fuse, Plex and the latest rclone release.
To use the same Drive as on my home server, I just copied the “.rclone.conf” file, which gets created in your user’s home folder, over to the VPS and was then able to use rclone out of the box without having to configure anything.
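Copying the config over is a single scp (the hostname is a placeholder for your VPS):

```shell
# ~/.rclone.conf holds the ACD token and the crypt passwords, so treat it like a secret.
scp ~/.rclone.conf root@my-vps.example.com:~/
```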
When using “rclone mount”, I advise you to use the flags “--allow-other” and “--default-permissions” to prevent permission issues.
So after running “rclone mount crypt: /mnt/plex/ --allow-other --default-permissions”, the remote location was mounted and could be configured as the media source for Plex via the web interface.
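For reference, the whole mount dance on the VPS looks roughly like this (the mountpoint is my choice; note that “--allow-other” requires the “user_allow_other” line in /etc/fuse.conf when you don’t mount as root):

```shell
mkdir -p /mnt/plex

# Mount the encrypted remote in the background; Plex reads it like a local folder.
rclone mount crypt: /mnt/plex --allow-other --default-permissions &

# Check that the decrypted media shows up.
ls /mnt/plex

# To unmount again later:
fusermount -u /mnt/plex
```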
So this cuts the cord on my poor home upload, as the Plex server has a fast, direct connection to my Cloud Drive.
Of course, the VPS is not very powerful when it comes to transcoding video, but most of my media files can be streamed directly without the need for transcoding. Regarding bandwidth or other caveats: I have no idea whether there is a file size limitation or any other problems lurking, but once you have all your service and file permissions right, there should be no issues. Every video I have tested so far played flawlessly via Chrome or the Android app. Let’s hope it stays that way and the unlimited storage offer really stays unlimited.