My new backup approach

I did two mentoring sessions in the morning, talking about life, some MongoDB details, and Cypress testing. I love the mentoring setting; everyone is really open-minded, and most mentees are professionals from other fields (finance, engineering).

Then I started a 7h stint of rethinking my approach to managing my data. What do I do if my device breaks? Without backups, the data is lost. That's why we create backups (I hope you do!).

My current approach is fragile at best. After every boot, my home directory gets backed up into a directory outside my home directory. That's fragile: if the drive dies, the backup dies with it. That's why I manually move the folder to an external drive. The time between creating the backup and moving it to the external drive is a weak spot; the bigger the gap, the higher the probability of losing data. Currently my gap is embarrassingly large, probably 30 days, because I don't want to plug in my external drive all the time. That's why my plan is to put the backup online. But yes, private data in the cloud, I know... that's why I will encrypt it locally first.
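
For context, the boot-time copy itself is just a local mirror. I won't paste my real setup here; this is a minimal sketch assuming rsync and made-up paths, triggered e.g. via cron's @reboot:

```bash
#!/usr/bin/env bash
# Sketch of the boot-time home backup (hypothetical paths).
set -euo pipefail

SRC="/home/me/"
DEST="/backup/home/"   # outside /home, but still on the same drive!

# -a preserves permissions and timestamps, --delete mirrors deletions
rsync -a --delete "$SRC" "$DEST"
```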

So my first sketch looks like this:

```
/home
|--> external drive (daily, unencrypted; simple to access, someone would have to break into my house)
|--> cloud (daily, encrypted; you don't get my data)
```
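
Spelled out as commands, the naive version of the cloud branch would be one full archive per run: encrypt locally, then upload. A rough sketch assuming 7z for the encryption (the paths and the upload step are placeholders):

```bash
#!/usr/bin/env bash
# Naive cloud branch: one full, locally encrypted archive per run.
# Paths are hypothetical; -p prompts for a passphrase, -mhe=on also
# encrypts the file names inside the archive.
set -euo pipefail

STAMP="$(date +%F)"
7z a -p -mhe=on "/backup/home-$STAMP.7z" /home/me
# ...then push home-$STAMP.7z to the cloud provider of choice.
```

This full-archive variant is exactly what runs into the timing problems described next.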

All of this should be as app-agnostic as possible and FOSS. I'm very data-minimalistic, meaning I only own about 20 GB of data. The current problems are the slow encryption (30 minutes with 7z on an i7 CPU) and the slow upload: 20 GB at 25 Mbit/s takes roughly two hours (20 GB ≈ 160 Gbit, and 160 Gbit / 25 Mbit/s ≈ 6,400 s). So incremental backups are probably the better solution. My next step is to find a simple tool I can automate with a bash script; I will have a look at Borg and Rclone.
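
I haven't tried either yet, but the incremental flow I have in mind with those two would look roughly like this. A sketch, assuming a local Borg repository that gets mirrored with rclone (the repo path, the remote name, and the passphrase handling are all made up):

```bash
#!/usr/bin/env bash
# Sketch: incremental, encrypted backups with Borg, mirrored to the
# cloud with rclone. All names here are hypothetical.
set -euo pipefail

export BORG_REPO="/backup/borg-repo"   # one-time: borg init --encryption=repokey "$BORG_REPO"
export BORG_PASSPHRASE="$(cat ~/.borg-pass)"

# Deduplicated, encrypted snapshot; only changed chunks are stored,
# so daily runs stay far below the full 20 GB.
borg create --stats "::home-{now:%Y-%m-%d}" /home/me

# Thin out old snapshots.
borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=6

# Mirror the repo to a pre-configured rclone remote.
rclone sync "$BORG_REPO" mycloud:backup/borg-repo
```

The nice part: Borg encrypts the repository client-side and only stores changed chunks, so the slow full 7z run disappears and rclone only has to transfer the deltas.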