Yesterday, Chris Blackwell published a tutorial on how to upload files to S3 using Laravel.

This is the code he used (slightly redacted):

[code lang="php"]
$disk = Storage::disk('s3');

$disk->put($targetFile, file_get_contents($sourceFile));
[/code]

This works well for small files, but be aware that file_get_contents reads the entire file into memory before sending it to S3. That can be problematic for large files.

If you want to upload big files, use streams instead. Here's the code to do it:

[code lang="php"]
$disk = Storage::disk('s3');

$disk->put($targetFile, fopen($sourceFile, 'r+'));
[/code]

PHP will only require a few MB of RAM even if you upload a file of several GB.
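You can see this difference for yourself with a small, plain-PHP sketch (no Laravel or S3 involved; the file names and the 50 MB size are arbitrary choices for the demo). It copies a large local file first via a stream, then via file_get_contents, and prints the peak memory usage after each:

[code lang="php"]
<?php
// Illustrative sketch: compare peak memory when copying a large local file
// via a stream versus loading it entirely into memory.
$source = tempnam(sys_get_temp_dir(), 'src');
$target = tempnam(sys_get_temp_dir(), 'dst');

// Write ~50 MB in 1 MB chunks so building the fixture itself stays cheap.
$fh = fopen($source, 'w');
for ($i = 0; $i < 50; $i++) {
    fwrite($fh, str_repeat('x', 1024 * 1024));
}
fclose($fh);

// Streamed copy: memory stays flat regardless of file size.
$in = fopen($source, 'r');
$out = fopen($target, 'w');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
echo 'peak after streamed copy: '
    . round(memory_get_peak_usage(true) / 1024 / 1024) . " MB\n";

// Loading everything at once: peak memory jumps by roughly the file size.
file_put_contents($target, file_get_contents($source));
echo 'peak after file_get_contents: '
    . round(memory_get_peak_usage(true) / 1024 / 1024) . " MB\n";

unlink($source);
unlink($target);
[/code]

The first number stays in the low single-digit MB range; the second grows with the file.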

You can also use streams to download a file from S3 to the local file system:

[code lang="php"]
$disk = Storage::disk('s3');

$stream = $disk->getDriver()->readStream($sourceFileOnS3);

// Pass the stream resource directly: file_put_contents will copy it in
// chunks. Wrapping it in stream_get_contents would read the whole file
// into memory first, defeating the point of streaming.
file_put_contents($targetFile, $stream);
[/code]

You can even use streams to copy a file from one disk to another without touching the local filesystem:

[code lang="php"]
$stream = Storage::disk('s3')->getDriver()->readStream($sourceFile);

Storage::disk('sftp')->put($targetFile, $stream);
[/code]
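One caveat worth adding: depending on the Flysystem adapter, put() may not close the read stream for you after the copy. A defensive cleanup (continuing from the snippet above) is cheap:

[code lang="php"]
// If the adapter has not already closed the read stream, close it
// ourselves to release the underlying handle.
if (is_resource($stream)) {
    fclose($stream);
}
[/code]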