Slow uploads on large files (multi-GB) to S3

Created on 21 January 2025

Problem/Motivation

This issue arises when trying to chunk out large files for upload to S3. The fopen and fwrite calls this module uses read the current partial file back from S3, seam in the next chunk, then upload that new, larger partial file to S3.
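The pattern above can be modeled with a minimal sketch (this is illustrative Python, not the module's actual PHP; `bucket` is a plain dict standing in for S3, and `append_chunk` is a hypothetical name):

```python
# Illustrative model of the append pattern: every chunk triggers a full
# download of the partial object, a local seam, and a full re-upload.
def append_chunk(bucket, key, chunk, traffic):
    partial = bucket.get(key, b"")        # download the current partial file
    traffic["down"] += len(partial)
    partial += chunk                      # seam in the next chunk locally
    bucket[key] = partial                 # re-upload the grown partial file
    traffic["up"] += len(partial)

bucket, traffic = {}, {"down": 0, "up": 0}
for _ in range(3):                        # three 2 MB chunks
    append_chunk(bucket, "video.mp4", b"x" * (2 * 1024**2), traffic)

print(traffic["down"] + traffic["up"])    # 18874368 bytes (18 MB) moved for a 6 MB file
```

Even at three chunks the transfer is already three times the file size, and the ratio keeps growing with every chunk.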

For the first chunk, the network traffic and local system memory cost is a 2MB upload. The next chunk requires a 2MB download, seaming in the next 2MB chunk, then pushing the resulting 4MB file back to S3, totaling 6MB of traffic for that chunk alone. This process continues, and for a 2GB file the total network traffic ends up many times larger than the original file size.
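Summing that pattern gives a simple closed form. With k chunks of size c, the re-uploads move c·k·(k+1)/2 bytes and the re-downloads c·k·(k-1)/2 bytes, i.e. c·k² in total. A quick back-of-envelope check (illustrative Python; `total_traffic` is a hypothetical helper, not part of the module):

```python
# Total traffic for the download/seam/re-upload pattern described above.
def total_traffic(file_size, chunk_size):
    k = file_size // chunk_size           # number of chunks
    up = chunk_size * k * (k + 1) // 2    # each step re-uploads the whole partial file
    down = chunk_size * k * (k - 1) // 2  # each step first re-downloads it
    return up + down                      # equals chunk_size * k**2

gib = 1024**3
print(total_traffic(2 * gib, 2 * 1024**2) // gib)  # 2048: a 2 GB file moves ~2 TB
```

So at the default 2MB chunk size, a 2GB file moves on the order of 2TB over the network, which matches the slowdown observed.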

For smaller files this network traffic and local system memory usage is negligible and unnoticeable. Once you get into larger files, it becomes much more apparent and time consuming. Adjusting the chunk size through settings.php doesn't help either; you end up with the same issue as the upload progresses.
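This also explains why tuning the chunk size only postpones the problem: total traffic is roughly file_size²/chunk_size, so a larger chunk divides the overhead by a constant, but the growth stays quadratic in file size. A sweep over a few chunk sizes (illustrative Python, hypothetical helper name):

```python
# Total traffic is chunk_size * k**2 ~= file_size**2 / chunk_size, so bigger
# chunks reduce the overhead by a constant factor but never remove it.
def total_traffic(file_size, chunk_size):
    k = file_size // chunk_size
    return chunk_size * k * k

gib, mib = 1024**3, 1024**2
for c in (2 * mib, 8 * mib, 32 * mib):
    print(c // mib, total_traffic(2 * gib, c) // gib)
# 2 MB chunks -> 2048 GB, 8 MB -> 512 GB, 32 MB -> 128 GB: all far above the 2 GB file
```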

Note that the workaround in the related issue ✨ Slow file uploads when used with S3FS (Active) will not work for us: we run multiple instances for our production server, and the temporary:// scheme doesn't point to a single instance, so the partial file cannot be found. That solution won't work for our use case.

Steps to reproduce

Create an AWS S3 bucket and configure Drupal with the s3fs module to upload to the bucket. Enable this module and use it on a file field. Try uploading a large file, 2-3GB or more.

πŸ› Bug report
Status

Active

Version

2.0

Component

Code

Created by

πŸ‡ΊπŸ‡ΈUnited States peachez
