s3fs-copy-local resume function and error handling

Created on 25 February 2016
Updated 14 January 2025

I tried to copy my local files to an S3 bucket with drush s3fs-copy-local.
Some of the files had names containing non-standard characters.
drush s3fs-copy-local stopped copying as soon as it reached the first file in the queue whose name contained such characters.
I had to abort the command, manually find and rename or delete these files, and re-run the command. The main problem is that there were a lot of files, and s3fs-copy-local started again from the beginning, so it re-copied a lot of files, which cost a lot of time.
It would be nice to:
- implement error handling for files it cannot read or copy: skip the file, continue with the next one, and list the skipped files at the end of the run.
- store which files have been successfully sent to the bucket and, based on that, implement a resume function so files that were already copied are not copied again (a rough sketch of both ideas follows below).
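A minimal sketch of both suggestions, in plain PHP rather than the module's actual drush command code: the copy_to_bucket() call and the copy-log file are hypothetical placeholders for whatever s3fs really uses to upload a file; the point is the skip-and-report error handling and the resume log.

```php
<?php
// Sketch only: copy_to_bucket() and the log file name are illustrative
// placeholders, not part of the s3fs module's real API.

/**
 * Copies local files to the bucket, skipping unreadable/failed files and
 * resuming from a log of paths that were already copied on a previous run.
 */
function s3fs_copy_local_sketch(array $local_files, $log_path) {
  // Load the list of files copied on a previous run (resume support).
  $already_copied = file_exists($log_path)
    ? array_flip(file($log_path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES))
    : array();

  $skipped = array();
  $log = fopen($log_path, 'a');

  foreach ($local_files as $path) {
    // Resume: anything already recorded in the log is not re-copied.
    if (isset($already_copied[$path])) {
      continue;
    }
    try {
      if (!is_readable($path)) {
        throw new RuntimeException("File is not readable: $path");
      }
      // Placeholder for the actual S3 upload performed by the module.
      copy_to_bucket($path);
      // Record the success immediately so an aborted run can resume here.
      fwrite($log, $path . "\n");
    }
    catch (Exception $e) {
      // Error handling: remember the failure and move on to the next file.
      $skipped[] = $path;
    }
  }
  fclose($log);

  // At the end of the run, list every file that had to be skipped.
  if ($skipped) {
    echo "Skipped files:\n" . implode("\n", $skipped) . "\n";
  }
}
```

With a log like this, re-running the command after an abort only processes files that are not yet recorded, and problem files no longer stop the whole run.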

πŸ› Bug report
Status

Active

Version

2.0

Component

Code

Created by

🇭🇺 Hungary kepesv


Comments & Activities

Not all content is available!

It's likely this issue predates Contrib.social: some issue and comment data are missing.
