Problem with Cloudflare R2 Rclone backups

I hope you’re all doing well. I’m reaching out to seek some assistance with an ongoing issue I’ve been facing while using Cloudflare’s R2 bucket as backup in Hestia with Rclone. I’ve been struggling with this problem for quite some time now and have previously sought help in the Discord community, but unfortunately, I haven’t had any luck finding a solution. So, I’m hoping you can lend a hand.

Here’s the problem: The R2 bucket backup doesn’t delete old backup files after creating new ones. As a result, my bucket is getting filled up with loads of outdated backups that I no longer need. It’s becoming a real headache to manage and it’s eating up valuable storage space.

I’ve also double-checked my backup configurations to ensure they’re aligned with the recommended settings. However, I haven’t been able to find a way to make the service automatically delete the old backups once new ones are created.

Please let me know if there’s any additional information or logs I can provide to help you understand the situation better. I understand that more context might be necessary to diagnose and resolve the problem effectively.

Thank you so much in advance for your time and assistance. I’m looking forward to your response and to finally overcoming this backup challenge.

This is my R2 config file:

type = s3
provider = Cloudflare
access_key_id = xxxx
secret_access_key = xxxx
endpoint =
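
For reference, a working rclone remote for R2 needs the account-specific S3 endpoint. A minimal sketch of the full shape (ACCOUNT_ID and the key values are placeholders, not values from this thread):

```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = https://ACCOUNT_ID.r2.cloudflarestorage.com
```

The endpoint value above was presumably just redacted in the post, since uploads would not work at all without it.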

I use the following settings:

type = s3
provider = Cloudflare
access_key_id = xxx
secret_access_key = xxx
endpoint =


Where “hestiacp” is the bucket name in my case.

If I now run v-backup-user admin a few times, I will see:

2023-07-20 09:55:00 Upload With Rclone to r2: admin.2023-07-20_09-55-00.tar
2023/07/20 09:55:04 INFO  : admin.2023-07-20_09-55-00.tar: Copied (new)
2023/07/20 09:55:04 INFO  : 
Transferred:   	   87.627 MiB / 87.627 MiB, 100%, 43.828 MiB/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:         2.9s

Delete file: admin.2023-07-20_09-53-55.tar
2023-07-20 09:55:05 Rotated: 2023-07-20_09-53-55
2023-07-20 09:55:05 Local: /backup/admin.2023-07-20_09-55-00.tar
2023-07-20 09:55:05 Size: 88 MB
2023-07-20 09:55:05 Runtime: 1 minute

Thanks for your reply!

However, if I run that command 3 times, I get 3 backups.

2023-07-20 12:09:05 Upload With Rclone to backupstowycloud: admin.2023-07-20_12-09-05.tar
2023/07/20 12:09:06 INFO  : S3 bucket backupstowycloud: Bucket "backupstowycloud" created with ACL "private"
2023/07/20 12:09:06 INFO  : admin.2023-07-20_12-09-05.tar: Copied (new)
2023/07/20 12:09:06 INFO  : 
Transferred:        3.213M / 3.213 MBytes, 100%, 5.590 MBytes/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:         0.7s

2023-07-20 12:09:06 Size: 4 MB
2023-07-20 12:09:06 Runtime: 1 minute

It simply does not delete them. I checked the API key; it has the right permissions.

Package limits?

If you mean Cloudflare’s package limits, then no. It’s a pay-as-you-go service.

No, the limits for the number of backups in Hestia.

Ah, sorry. It’s set to 1.
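
For context, with the backup limit set to 1, the rotation Hestia is expected to perform amounts to listing the remote tarballs and deleting everything but the newest one. A minimal sketch of that selection logic (this is not Hestia’s actual code, just an illustration; the filenames follow the admin.YYYY-MM-DD_HH-MM-SS.tar pattern visible in the logs above):

```python
import re

def backups_to_delete(filenames, keep=1):
    """Given tar names like admin.2023-07-20_09-55-00.tar, return the
    ones older than the newest `keep` backups. The timestamp format
    sorts correctly as a plain string, so no date parsing is needed."""
    pat = re.compile(r"\.(\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2})\.tar$")
    dated = [f for f in filenames if pat.search(f)]
    # Newest first, then drop the first `keep` entries.
    dated.sort(key=lambda f: pat.search(f).group(1), reverse=True)
    return dated[keep:]

files = ["admin.2023-07-20_09-53-55.tar", "admin.2023-07-20_09-55-00.tar"]
print(backups_to_delete(files, keep=1))  # -> ['admin.2023-07-20_09-53-55.tar']
```

With keep=1 this leaves only the newest tarball; each returned name would then correspond to one remote deletion, which matches the “Delete file: …” line in the working log earlier in the thread.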

rclone lsf $HOST:$BPATH

Where $HOST is the remote name and $BPATH is the name of your bucket.

towysudo@Towux:/backup$ rclone lsf backupstowycloud:backupstowycloud

How is the config in hestiacp?

Why are there 2 files/folders in there?

I actually don’t know to be honest. I just deleted the file now, it seems to have been useless. The other is a directory where the backups are stored.

Anything I can try? I kind of believe this issue lies with Hestia.

What version are you on?

I have tested it this morning with 1.8.2 without any issues.

:confused: That’s very weird. Any way to debug?

The root of Cloudflare R2 should look like:

And when opening the bucket, there should be no subfolders.

If there are, something is wrong with your configuration, but without system access it is hard for me to debug and guess.

For me, the root looks like this:

All the backups are in the directory, and it creates the file “backupstowycloud” by itself after deleting.

Then there must be something wrong with the configuration

Have no idea what

Hmm, I’ve set up a cronjob to do it for now.

0 3 * * 1 /usr/bin/rclone delete remote:path --min-age 1w > /home/username/rclone.log 2>&1
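
To sanity-check this workaround before it deletes anything, the same filter can be run interactively with --dry-run first (remote:path here is a placeholder for the actual remote and bucket path):

```
# Preview what would be removed; --dry-run makes no changes
rclone delete remote:path --min-age 1w --dry-run -v
```

Once the preview looks right, the cron entry above then performs the real deletion weekly at 03:00 on Mondays and appends its output to the log file. Note that rclone delete removes matching files but leaves the path itself in place, which is what you want here.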