- 🇺🇸United States cmlara
Drupal 7 end-of-life triage:
Drupal 7 reached end of life on January 5th, 2025. The 7.x branches of S3FS do not have any additional planned releases.
The 8.x-3.x branch and newer already support this feature.
I tried to copy my local files to an S3 bucket with drush s3fs-copy-local.
Some of the files had names containing non-standard characters.
drush s3fs-copy-local stopped copying as soon as it reached the first file in the queue whose name contained such characters.
I had to abort the command, manually find and rename or delete these files, and re-run the command. The main problem is that there were a lot of files, and s3fs-copy-local started copying from the beginning again, so it re-copied a lot of files, which cost a lot of time.
It would be nice to
- implement error handling for files it cannot read or copy: skip the file, move on to the next one, and list the skipped files at the end of the run.
- record which files were successfully sent to the bucket and, based on that record, implement a resume function that avoids re-copying files copied on a previous run (see the sketch after this list).
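A minimal sketch of that skip-and-resume behaviour, assuming the AWS SDK for PHP is available. It does not reproduce the module's real copy-local code, and the bucket name, source directory, and log-file path are placeholder values:

```php
<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$bucket = 'example-bucket';                       // placeholder bucket name
$source_dir = '/var/www/html/sites/default/files'; // placeholder source dir
$done_log = '/tmp/s3fs-copy-done.log';            // keys already uploaded

$client = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// Resume support: load the set of keys copied on previous runs.
$done = file_exists($done_log)
  ? array_flip(file($done_log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES))
  : [];
$skipped = [];

$iterator = new RecursiveIteratorIterator(
  new RecursiveDirectoryIterator($source_dir, FilesystemIterator::SKIP_DOTS)
);

foreach ($iterator as $file) {
  if (!$file->isFile()) {
    continue;
  }
  $path = $file->getPathname();
  $key = ltrim(substr($path, strlen($source_dir)), '/');

  // Resume: skip anything copied on a previous run.
  if (isset($done[$key])) {
    continue;
  }

  try {
    // Unreadable files are skipped instead of aborting the whole run.
    if (!is_readable($path)) {
      throw new RuntimeException('File is not readable.');
    }
    $client->putObject([
      'Bucket' => $bucket,
      'Key' => $key,
      'SourceFile' => $path,
    ]);
    // Record success immediately so an aborted run can resume here.
    file_put_contents($done_log, $key . "\n", FILE_APPEND);
  }
  catch (AwsException | RuntimeException $e) {
    $skipped[] = "$path ({$e->getMessage()})";
  }
}

// At the end of the run, list every file that was skipped.
if ($skipped) {
  echo "Skipped files:\n" . implode("\n", $skipped) . "\n";
}
```

Appending to the log immediately after each successful upload means an interrupted run resumes from where it stopped rather than from the beginning.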
Active
2.0
Code