Rclone is 10 years old today (rclone.org)
147 points by vanburen on Nov 24, 2022 | 26 comments


Rclone is another great example of open source being better than a commercial product. I'd rather use it than gsutil or s3cmd: it's simple, the commands make sense, and it doesn't bring in a lot of other cruft.


Yup, it's great. I used it to shuffle stuff from s3 into Dropbox and it was just the right tool for the job.
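
For what it's worth, the whole job boiled down to a single command once the remotes were set up; the names "s3" and "dropbox" below are just what I called them in rclone config, so treat them as illustrative:

    rclone lsd s3:                                     # sanity-check the source remote
    rclone copy s3:my-bucket/exports dropbox:exports --progress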


If rclone’s author is around… You should update the page with an easy and VISIBLE link for donations. I could not find it while browsing (iOS), although it appears somewhere in the FAQ.

Thanks for the product. Unbelievable.


One of the contributors here: if you use Azure Blob, you'll be running some of my code. Have fun moving petabytes; it works quite well.


Only S3. In any case, GREAT work!


So funny! I just spent a few hours last night setting up rclone for the first time on my Macbook!

I've got one of those legacy Google Drive unlimited storage plans, and for years I've been using it to store all my code, video projects I work on, etc.

The way I normally used it was inefficient: whenever I was importing videos for a project, I'd upload them through the Google Drive web UI, then organize them in Drive later, etc.

Over the years I tried a variety of Google Drive syncing tools, but they all eventually stopped working, or were so unbearably slow that I couldn't justify using them.

Last night I spent some time and set up rclone for Drive, along with a persistent mount so that Google Drive is always accessible via ~/Drive. So far it seems like everything is working well -- really amazing software! <3
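
For anyone wanting to replicate it, the setup was roughly the three steps below; "gdrive" is just the name I gave the remote during rclone config, and the mount needs macFUSE installed:

    rclone config                    # interactive wizard: new remote "gdrive", type "drive"
    mkdir -p ~/Drive
    rclone mount gdrive: ~/Drive --vfs-cache-mode full --daemon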


Isn’t Google getting rid of all the old legacy unlimited plans?


I certainly haven't heard anything about it as a G Suite subscriber.


They got rid of them already. Unenforced so far, as far as I can tell.


Here is a step-by-step, keystroke-by-keystroke HOWTO for rclone, with S3 buckets as an example target:

https://rsync.net/resources/howto/rclone.html

... this HOWTO uses an rsync.net account as an endpoint/throughpoint, but you can ignore that - it is applicable to pretty much any configuration.
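
If you just want the flavour of it before reading the full HOWTO, the S3 part is roughly this (remote/bucket names and the key placeholders are mine, not taken from the HOWTO):

    rclone config create mys3 s3 provider AWS access_key_id XXX secret_access_key YYY region us-east-1
    rclone lsd mys3:                              # list buckets to confirm the remote works
    rclone copy /local/dir mys3:my-bucket/backup --progress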


I love rclone; I've been using it to reliably move large files between different cloud providers (thanks Nick!). GUI applications were constantly crashing, but rclone has always been reliable.

Make sure to support your favorite open source projects! https://rclone.org/donate/


We use Rclone as a "disk in the cloud" solution for otherwise stateless Docker containers. Works perfectly well. And the best thing is that it supports transparent end-to-end encryption.

Among all the cloud storage services we've tried with Rclone, Azure Blob Storage has turned out to be the fastest so far.
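
For anyone curious about the encryption part: it's rclone's crypt backend layered on top of the storage remote. A rough sketch (remote and container names invented) looks like:

    rclone config              # create "azstore" (type azureblob), then "secure" (type crypt)
                               # pointing at azstore:mycontainer/encrypted
    rclone copy ./data secure: --progress         # contents are encrypted before they leave the machine
    rclone mount secure: /mnt/clouddisk --vfs-cache-mode writes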

The bitter part of the story is that I didn't know Rclone existed until a year or two ago. I still feel ashamed about that.


Can you elaborate a bit? Do you mount something like an S3 bucket inside of a container?


Yes, exactly. The mount point is then used as a general disk for container data.
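
To be concrete, one common way to wire it up (paths and names invented; rclone also ships a Docker volume plugin that does roughly the same thing) is to mount on the host and bind-mount into the container. Note that --allow-other needs user_allow_other enabled in /etc/fuse.conf:

    rclone mount secure: /mnt/clouddisk --vfs-cache-mode writes --allow-other --daemon
    docker run -v /mnt/clouddisk:/data myimage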


Interesting, so why do that? Is it just to simplify your client code? Instead of using the S3 API, you just save files in a standard (virtual) file system? Any other benefits, or reliability/performance drawbacks?

Maybe to avoid uuids and use standard fs paths?


I use it, for example, for Gitea storage. My host doesn't have a lot of storage, but I use rclone so that it all goes directly onto Google Drive (unlimited paid storage).

Gitea also has a private Docker container registry built in, which quickly grows to several hundred gigabytes. It all works perfectly well with rclone.

This makes my host stateless: just run the Gitea Docker image with Google Drive-backed storage. It works great because both git repositories and Docker images are backed by files that are essentially immutable.

An example that would not work well is a container whose storage is a SQLite file that updates often. Trying to sync that SQLite file to Google Drive with rclone would be a bad idea.


Ok, so one reason is that it’s someone else’s app, where adding cloud support would be impractical.


The cloud drive approach considerably simplifies the code. You can use shell scripts, Unix tools, and third-party apps, and you can combine them. This approach gives a lot of freedom and power. It is just Unix, but with a cloud.

Such an approach also protects you from vendor lock-in: you can use any cloud storage you like today.

The performance drawbacks are evident: if a file is not yet in the local cache, it takes some time to fetch it. But that doesn't really matter for most apps, because the initial lag is relatively short.

Reliability has been perfect so far.
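
To make that concrete: once the remote is mounted, ordinary Unix tools just operate on it, and the VFS cache flags bound how much is kept locally (the remote name, paths, and size limit below are illustrative):

    rclone mount secure: ~/cloud --vfs-cache-mode full --vfs-cache-max-size 10G --daemon
    grep -r "TODO" ~/cloud/notes/
    tar czf ~/cloud/backups/projects-$(date +%F).tgz ~/projects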


Neat, thanks.


Rclone is such an epic tool. Props to the creators. Thank you for making cloud storage easily accessible for Linux users.


I'm a heavy user of rsync and sftp between personal devices, but have never used rclone. Is it only used with cloud storage from the big companies? I have never used any of those so far.


No. While it supports a number of big-name cloud providers' storage systems, it also covers smaller providers and plain protocols like HTTP, FTP, SFTP, and SMB, as well as the local filesystem and anything "S3 compatible".
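
So it fits your use case too. For example, an SFTP remote takes one line to define and then behaves like any other remote (host, user, and paths below are placeholders):

    rclone config create myserver sftp host example.com user bob      # falls back to ssh-agent for auth if no key/password is set
    rclone sync ~/documents myserver:backup/documents --progress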


I use it for Jottacloud directory sync, as Jottacloud only supports full sync. And I use it for Zotero attachment sync from/into Jottacloud.

rclone is just awesome and well documented!
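
In case it helps anyone with a similar setup, the Zotero part is just a one-line sync per direction (the remote name and paths are mine, so adjust to taste):

    rclone sync ~/Zotero/storage jottacloud:zotero/storage --progress
    rclone sync jottacloud:zotero/storage ~/Zotero/storage --dry-run     # preview before pulling the other way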


Rclone is great! I used it in a CI/CD pipeline to sync files from a git repo to S3 for CDN distribution.
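
In case it's useful to anyone, the pipeline step is essentially one sync; in CI I define the remote through environment variables rather than a config file (the bucket name and build dir below are placeholders):

    export RCLONE_CONFIG_S3_TYPE=s3
    export RCLONE_CONFIG_S3_ENV_AUTH=true      # reuse the usual AWS_* credentials from the CI environment
    rclone sync ./public s3:my-cdn-bucket/site --checksum --fast-list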


I love it that it has a mount option.


rclone with git-remote-gcrypt is the best git provider.
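
If anyone wants to try that combo: git-remote-gcrypt can use an rclone remote as its backend. From memory it looks roughly like the lines below, but "mydrive" is a placeholder remote and the exact URL syntax is worth checking against the git-remote-gcrypt README:

    git remote add cloud gcrypt::rclone://mydrive:repos/myproject
    git push cloud main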



