Since your script is just copying the files over to the destination using rsync, you can do a simple integrity test to ensure all files were copied successfully.
Here's what I'd do: when your backup script is collecting the list of files to copy, hash them using something like XXHash (or any other fast non-cryptographic hashing algorithm). Record a file -> hash mapping in a manifest and back that manifest up along with the rest.
To test the integrity, iterate over each backed-up file and hash it, checking that it matches the hash recorded when the file was backed up.
Depending on how active the system is, you may also want to look at file locking (flock) to prevent writes to a file while it is being hashed and copied.
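A minimal sketch of the manifest-and-verify idea above. It uses Python's standard-library `hashlib` (SHA-256) in place of XXHash so it runs with no extra dependencies; for speed on large trees you could swap in the third-party `xxhash` module. The function names and JSON manifest format are my own choices, not anything prescribed by the original script.

```python
import hashlib
import json
from pathlib import Path

def hash_file(path: Path) -> str:
    """Stream the file through the hash so large files don't fill memory."""
    h = hashlib.sha256()  # stand-in for xxhash.xxh64() if that module is installed
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def record_hashes(source_dir: Path, manifest: Path) -> None:
    """Build the file -> hash manifest before rsync runs; back it up too."""
    hashes = {
        str(p.relative_to(source_dir)): hash_file(p)
        for p in source_dir.rglob("*")
        if p.is_file()
    }
    manifest.write_text(json.dumps(hashes, indent=2))

def verify_backup(backup_dir: Path, manifest: Path) -> list[str]:
    """Return the relative paths whose backed-up copy is missing or mismatched."""
    hashes = json.loads(manifest.read_text())
    return [
        rel
        for rel, expected in hashes.items()
        if not (backup_dir / rel).is_file()
        or hash_file(backup_dir / rel) != expected
    ]
```

You'd call `record_hashes` just before the rsync step and `verify_backup` against the destination afterwards; a non-empty return value means those files need re-copying.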
Thank you, I really appreciate the detail of the response. I'm going to look into this and make an effort to implement it with an update so everyone can benefit. Much appreciated.
Great suggestion. What's your preference for testing backups?