This is the approach I arrived at for downloading hundreds of JPEG image files from an Amazon S3 directory. The accepted method, using Laravel's response() helper function, failed miserably for me. What a PITA working with cloud web services can be. Anyway, here is the web route that does the job.
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('/dev/s3/download-to-local-storage', function () {
    // Hundreds of files can blow past the default execution time.
    set_time_limit(0);
    ini_set('memory_limit', '128M');

    // Make sure the local target directory exists before writing to it.
    $target = public_path('storage/images/tmp');
    if (! is_dir($target)) {
        mkdir($target, 0755, true);
    }

    // Grab every file under the 'original' directory on the S3 disk.
    $files = Storage::disk('s3')->allFiles('original');

    foreach ($files as $file) {
        // Drop the 'original/' prefix to keep only the filename.
        $tmp  = explode('/', $file);
        $path = Storage::disk('s3')->url($file);

        file_put_contents($target.'/'.$tmp[1], file_get_contents($path));
    }

    set_time_limit(60);
});
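A caveat on the design: Storage::disk('s3')->url($file) produces a plain object URL, so file_get_contents() can only fetch it if the objects are publicly readable (and allow_url_fopen is enabled), and each file sits fully in memory before it's written out. If your bucket is private, something along these lines should do the same job. This is just a minimal sketch using Laravel's readStream(), with basename() standing in for the explode:

use Illuminate\Support\Facades\Storage;

// Sketch: stream each S3 object straight to local disk, so private
// buckets work and no file is ever held fully in memory.
foreach (Storage::disk('s3')->allFiles('original') as $file) {
    $source = Storage::disk('s3')->readStream($file);
    $dest   = fopen(public_path('storage/images/tmp/'.basename($file)), 'w');

    stream_copy_to_stream($source, $dest);

    fclose($source);
    fclose($dest);
}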
In the route above, the directory in question is called original, and I explode $file to extract the bare filename without the directory prefix.
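To make that concrete, here's what the explode yields for a hypothetical key (PHP's built-in basename() gets the same result in one call, and also copes with nested prefixes):

$file = 'original/photo-123.jpg';   // hypothetical S3 key
$tmp  = explode('/', $file);        // ['original', 'photo-123.jpg']
echo $tmp[1];                       // photo-123.jpg
echo basename($file);               // photo-123.jpg, same thing

So if you've had niggling problems (and who hasn't?) downloading files from S3, perhaps this approach will produce better results for you.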