Is the bug applicable and reproducible to the latest version of the package and hasn't it been reported before?
Yes, it's still reproducible
What version of Laravel Excel are you using?
^3.1
What version of Laravel are you using?
^10.0
What version of PHP are you using?
8.2.27
Describe your issue
I am using this package to import file entries into the database from a remote file stored on S3, using chunk reading and queues.
Our server setup is built on top of Docker and uses the AWS cloud container service (ECS).
The issue is that the imports work properly when the build is run locally, but fail when run with this configuration on AWS ECS.
I have checked the S3 object permissions and updated the excel.php config file as described in the docs.
I have also checked the file permissions; they are set to the web server group www-data.
I tried setting numprocs on supervisor to 1.
S3 is set up using the package league/flysystem-aws-s3-v3; filesystems.php was checked and uploading files works.
I also tried upscaling the resources. The file we were trying to process here is just 170 bytes and the server currently has zero traffic, so a resource bottleneck can already be ruled out.
I tried both ECS cluster types, EC2 and Fargate, with the same issue. However, pulling and running the same image on a local machine (Mac arm64) also works fine.
Using import as below
Import class file
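(The screenshots for the import call and the import class did not survive, so below is a minimal, hypothetical sketch of the kind of queued, chunk-reading import described above; the class, model, and column names are illustrative, not the reporter's actual code.)

```php
<?php

namespace App\Imports;

use App\Models\Payment;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

// Hypothetical import class: chunk reading + queued chunks,
// matching the setup described in the issue.
class PaymentsImport implements ToModel, WithChunkReading, ShouldQueue
{
    public function model(array $row)
    {
        return new Payment([
            'reference' => $row[0],
            'amount'    => $row[1],
        ]);
    }

    public function chunkSize(): int
    {
        return 100;
    }
}
```

Invoked against the S3 disk along the lines of `Excel::import(new PaymentsImport, 'payment-csv-files/payments.csv', 's3');` (file path is a placeholder).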
Docker file
```dockerfile
# Use Ubuntu as the base image
FROM ubuntu:22.04

# Set the working directory
WORKDIR /var/www

# Set environment variables
ENV DEBIAN_FRONTEND=noninteractive TZ=FJT

# Install system dependencies and add PHP PPA
RUN apt-get update && apt-get install -y \
    software-properties-common \
    lsb-release \
    curl \
    unzip \
    zip \
    cron \
    git \
    redis-server \
    supervisor \
    apache2 \
    tzdata \
    && add-apt-repository ppa:ondrej/php -y && apt-get update

# Install PHP 8.2 and extensions
RUN apt-get install -y \
    php8.2 \
    php8.2-cli \
    php8.2-curl \
    php8.2-gd \
    php8.2-mysql \
    php8.2-zip \
    php8.2-mbstring \
    php8.2-bcmath \
    php8.2-opcache \
    php8.2-intl \
    php8.2-readline \
    php8.2-redis \
    php8.2-imagick \
    libapache2-mod-php8.2 \
    php8.2-xml \
    && apt-get clean && rm -rf /var/lib/apt/lists/*

# Enable Apache modules
RUN a2enmod rewrite

# Install Composer
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer

# Copy the existing application directory contents
COPY . /var/www

# Install Composer dependencies
RUN composer install --no-dev --optimize-autoloader --no-interaction

# Set permissions for Laravel directories
RUN chown -R www-data:www-data /var/www \
    && chown -R www-data:www-data /var/www/storage \
    && chown -R www-data:www-data /var/www/bootstrap/cache

# Create necessary log directories and set permissions
RUN mkdir -p /var/log/redis /var/www \
    && touch /var/log/redis/redis-server.log /var/www/worker.log \
    && chown -R www-data:www-data /var/log/redis /var/www

# Copy Apache configuration
COPY ./apache2/000-default.conf /etc/apache2/sites-available/000-default.conf

# Copy Supervisor configuration
COPY ./supervisor/supervisord.conf /etc/supervisor/conf.d/supervisord.conf

# Create a new cron file for Laravel
RUN echo "SHELL=/bin/sh" >> /etc/cron.d/laravel-cron && \
    echo "PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin" >> /etc/cron.d/laravel-cron && \
    echo "* * * * * cd /var/www && php artisan schedule:run >> /var/log/cron.log 2>&1\n" >> /etc/cron.d/laravel-cron

# Give execution rights on the cron job file
RUN chmod 0644 /etc/cron.d/laravel-cron

# Apply the cron job
RUN crontab /etc/cron.d/laravel-cron

RUN touch /var/log/cron.log

# Expose port 80
EXPOSE 80

# Start Apache, Cron, and Supervisor
CMD ["sh", "-c", "service apache2 start && service cron start && /usr/bin/supervisord -n -c /etc/supervisor/conf.d/supervisord.conf"]
```
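One thing worth verifying inside the running ECS task (e.g. via `aws ecs execute-command` or `docker exec`) is that the temporary path is writable by the user the queue worker runs as. A minimal sketch with a stand-in path (the real path per the config below would be `/var/www/storage/framework/cache/laravel-excel`):

```shell
# Stand-in for storage/framework/cache/laravel-excel inside the container
TMP_DIR=/tmp/laravel-excel-check
mkdir -p "$TMP_DIR"
chmod 0775 "$TMP_DIR"

# Report whether the current user can write to the temp directory
if [ -w "$TMP_DIR" ]; then
    echo "temp dir writable"
else
    echo "temp dir NOT writable"
fi

rm -rf "$TMP_DIR"
```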
excel.php temporary files section
```php
'temporary_files' => [

    /*
    |--------------------------------------------------------------------------
    | Local Temporary Path
    |--------------------------------------------------------------------------
    |
    | When exporting and importing files, we use a temporary file, before
    | storing reading or downloading. Here you can customize that path.
    | permissions is an array with the permission flags for the directory (dir)
    | and the create file (file).
    |
    */
    'local_path' => storage_path('framework/cache/laravel-excel'),

    /*
    |--------------------------------------------------------------------------
    | Local Temporary Path Permissions
    |--------------------------------------------------------------------------
    |
    | Permissions is an array with the permission flags for the directory (dir)
    | and the create file (file).
    | If omitted the default permissions of the filesystem will be used.
    |
    */
    'local_permissions' => [
        // 'dir' => 0755,
        // 'file' => 0644,
    ],

    /*
    |--------------------------------------------------------------------------
    | Remote Temporary Disk
    |--------------------------------------------------------------------------
    |
    | When dealing with a multi server setup with queues in which you
    | cannot rely on having a shared local temporary path, you might
    | want to store the temporary file on a shared disk. During the
    | queue executing, we'll retrieve the temporary file from that
    | location instead. When left to null, it will always use
    | the local path. This setting only has effect when using
    | in conjunction with queued imports and exports.
    |
    */
    'remote_disk' => 's3',
    'remote_prefix' => 'payment-csv-files/',

    /*
    |--------------------------------------------------------------------------
    | Force Resync
    |--------------------------------------------------------------------------
    |
    | When dealing with a multi server setup as above, it's possible
    | for the clean up that occurs after entire queue has been run to only
    | cleanup the server that the last AfterImportJob runs on. The rest of the server
    | would still have the local temporary file stored on it. In this case your
    | local storage limits can be exceeded and future imports won't be processed.
    | To mitigate this you can set this config value to be true, so that after every
    | queued chunk is processed the local temporary file is deleted on the server that
    | processed it.
    |
    */
    'force_resync_remote' => true,
],
```
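Given the "resource not found" failure, one difference worth ruling out is that on ECS the queue worker typically authenticates to S3 via the task role rather than the access keys used locally, so it may be able to upload the file yet fail to read the temporary copy back. A hedged sketch to run in tinker or a test route inside the container (disk name matches the config above; the object key is a placeholder):

```php
use Illuminate\Support\Facades\Storage;

// Confirm the worker's credentials can actually read from the remote
// temporary prefix, not only write to it.
$disk = Storage::disk('s3');

var_dump($disk->exists('payment-csv-files/payments.csv')); // should be true
$contents = $disk->get('payment-csv-files/payments.csv');  // should not throw
```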
After debugging, I found that the resource is not found on this line (the exception is raised in failed_jobs).
How can the issue be reproduced?
It can only be reproduced on AWS ECS, by running the Docker image, deploying it, and testing an import via S3. Since the same image runs as expected on a local machine (Docker Desktop on Windows 10), I have not been able to figure out what goes wrong when it runs on AWS ECS.
What should be the expected behaviour?
The expected behaviour is to stream the data from S3, store it in a local temporary file, and import the rows into the database.