There are several options you could consider for this.
Firstly, you can create a transfer job using the Storage Transfer Service (navigate to 'Storage' > 'Transfer' in the Cloud Console). This can be configured to automatically back up data from one bucket to another (you can also configure it to back up AWS S3 buckets to Google Cloud Storage).
The Transfer Service is a fairly flexible tool and, amongst other things, allows you to select the files to transfer based on file prefix or modification time, or to target specific object URLs.
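In recent gcloud versions you can also create transfer jobs from the command line rather than the Console; as a sketch (bucket names are placeholders, and the exact flags available depend on your gcloud version):

$ gcloud transfer jobs create gs://SOURCE_BUCKET_NAME gs://DESTINATION_BUCKET_NAME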
Another option would be to use the gsutil command to copy or sync files from one bucket to another. If you wanted to automate this process, you could add the command as a cron job on an instance and run it at your chosen times/intervals.
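For example, a crontab entry similar to the following (the gsutil path, bucket names, log path and schedule are all illustrative) would sync the buckets every night at 02:00:

0 2 * * * /usr/bin/gsutil rsync -r gs://SOURCE_BUCKET_NAME gs://DESTINATION_BUCKET_NAME >> /var/log/bucket-sync.log 2>&1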
For example, to copy everything in a source bucket to a destination bucket, you could use a command similar to this:
$ gsutil cp -r gs://SOURCE_BUCKET_NAME/* gs://DESTINATION_BUCKET_NAME/
Alternatively you could use gsutil rsync with the -r switch to synchronise the contents of a source bucket with a destination bucket. For example:
$ gsutil rsync -r gs://SOURCE_BUCKET_NAME gs://DESTINATION_BUCKET_NAME/
If you are concerned about deleting files, it's worth looking into Cloud Storage Object Versioning. With versioning enabled, overwriting or deleting an object in the bucket creates an archived version of the original object, so the original can be retrieved at a later date if required. This essentially protects objects from accidental deletion.
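If you decide to use it, versioning can be enabled on a bucket with gsutil (BUCKET_NAME is a placeholder):

$ gsutil versioning set on gs://BUCKET_NAME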
It's worth noting with Object Versioning that each archived object takes up as much space as the live object version, and you are charged the same amount for archived storage as for live storage. The archived objects can be managed (for example, automatically deleted when they reach a certain age) by utilising Object Lifecycle Management.
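As a sketch, a lifecycle configuration that deletes archived (non-current) object versions once they are 30 days old could look like this (the 30-day age is just an example), saved as lifecycle.json and applied with the gsutil lifecycle command:

{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 30, "isLive": false}
    }
  ]
}

$ gsutil lifecycle set lifecycle.json gs://BUCKET_NAME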
Best Answer
I understand that you are using a backend bucket behind an HTTP(S) Load Balancer.
At the moment, it is not possible to set permissions or authentication checks at the load balancer level when accessing objects in the backend.
However, you can set bucket permissions using the gsutil "acl" command or via the Google Cloud Console. For example, to make the bucket publicly readable so that it can be served through the load balancer, you need to add a read permission for AllUsers:
$ gsutil acl ch -u AllUsers:R gs://BUCKET
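If the bucket uses uniform bucket-level access (IAM) rather than ACLs, the equivalent would be to grant the objectViewer role to allUsers:

$ gsutil iam ch allUsers:objectViewer gs://BUCKET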
There are other design choices that could achieve your original goal of controlling access to the Cloud Storage bucket, such as the Users API for Python 2, Cloud Storage authentication, or hosting a static website. For instance, you could set up authentication in the frontend and only proxy requests that were properly authorised. This would require some additional coding, though.
You can also set security policies for the load balancer using Cloud Armor.
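As a rough sketch of the Cloud Armor approach (the policy name and IP range are placeholders, and how the policy attaches to your load balancer depends on whether the backend is a backend service or a backend bucket), you could create a policy and add a rule that blocks a given source range:

$ gcloud compute security-policies create lb-policy
$ gcloud compute security-policies rules create 1000 --security-policy lb-policy --src-ip-ranges "203.0.113.0/24" --action deny-403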