Summary
When downloading an object from S3 using download_object_as_stream or as_stream, the chunk_size argument is ignored. Instead, the hardcoded CHUNK_SIZE in libcloud.storage.drivers.s3 dictates the number of bytes returned by the iterator on each next() call.
Detailed Information
This is with libcloud 3.6.1 and Python 3.10.8 on an M1 MacBook Pro running macOS Monterey 12.6.1.
Here's an outline for how to reproduce:
from libcloud.storage.providers import get_driver
from libcloud.storage.types import Provider

driver = get_driver(Provider.S3_US_EAST2)("Your account key", "Your secret")
# Fetch an existing container that holds a sufficiently large object
bucket = driver.get_container("bucket")
# Request 10MB chunks from the download stream
data = bucket.get_object("a file").as_stream(chunk_size=1024 * 1024 * 10)
print(len(next(data)))
If the file is large enough, the length printed is 5MB despite the 10MB chunk_size.
I can work around this by patching CHUNK_SIZE on the module, but that's not ideal.
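For reference, the workaround looks roughly like this (the 10MB value is just what my use case needs; note the override is process-wide and relies on the module-level constant not changing in future releases):

import libcloud.storage.drivers.s3 as s3_driver

# Override the module-level constant before opening the stream.
# This affects every S3 download stream in the process, not just this one.
s3_driver.CHUNK_SIZE = 1024 * 1024 * 10  # 10MB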
I understand that the chunk size must be at least 5MB for upload, but that restriction doesn't appear to apply to downloads.
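To make the expected behavior concrete, here's a hypothetical helper (not the actual driver code) showing the resolution I'd expect for downloads:

def resolve_download_chunk_size(chunk_size=None, default=5 * 1024 * 1024):
    # Hypothetical: honor the caller's chunk_size when provided,
    # otherwise fall back to the driver's 5MB default.
    return chunk_size if chunk_size else default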