My server will host multiple shared websites, and we will frequently upload relatively large files to /phpmyadmin.
I would like to know if it is possible to set higher values specifically for /phpmyadmin without affecting the global configuration.
I don’t want to have to increase global values every time an upload is needed.
I ran some tests, and unfortunately using only a .user.ini file with max_execution_time is not enough: the Apache Timeout and several Nginx directives also affect the upload, which in my case can take up to an hour to complete.
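For context, this is the kind of per-directory .user.ini I mean — the values are illustrative, not a recommendation:

```ini
; Illustrative .user.ini values. Only PHP itself honors these --
; Apache's Timeout and Nginx's fastcgi/proxy timeouts still apply,
; which is why this file alone isn't enough for hour-long uploads.
max_execution_time = 7200
max_input_time     = 7200
upload_max_filesize = 512M
post_max_size       = 512M
```

Note that .user.ini files are only read when PHP runs as CGI/FastCGI (e.g. PHP-FPM), and changes are picked up after the user_ini.cache_ttl interval rather than immediately.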
I’d personally vote for increased limits for phpMyAdmin, but I’ve only had to use phpMyAdmin 3-4 times in the last 4 years, I’d guess, so it’s probably not really critical.
With the deprecation of MySQL Workbench, I’ll probably use phpMyAdmin more in the future.
I did something functional, but I still need support to know if it’s secure since I’m a beginner.
I start by disabling /etc/apache2/conf.d/phpmyadmin.inc and /etc/nginx/conf.d/phpmyadmin.inc, renaming them to .bak and recreating them as empty files.
Next, I increase the limits in /etc/phpmyadmin/conf.d/01-localhost.php by adding $cfg['ExecTimeLimit'] = 7200;
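In full, that file looks roughly like this — note that ExecTimeLimit only lifts phpMyAdmin's own script-time cap, not the web server's timeouts:

```php
<?php
// /etc/phpmyadmin/conf.d/01-localhost.php
// Allow phpMyAdmin scripts (imports/exports) to run up to 2 hours.
// A value of 0 would mean "no limit at all".
$cfg['ExecTimeLimit'] = 7200;
```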
I create a custom nginx.ssl.conf_custom for the main domain and add a location block for /phpmyadmin with higher limits there.
I wasn’t sure whether to set fastcgi_param SCRIPT_FILENAME to /usr/share/phpmyadmin/$1 or to $document_root$fastcgi_script_name, which seems to be the default.
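For reference, a sketch of what such a location block might contain — the limit values, alias path, and PHP-FPM socket name are assumptions that will differ per setup:

```nginx
# Hypothetical /phpmyadmin block for nginx.ssl.conf_custom.
location ~ ^/phpmyadmin/(.*\.php)$ {
    alias /usr/share/phpmyadmin/$1;

    # Raise the upload and timeout limits only for this location,
    # leaving the server-wide defaults untouched.
    client_max_body_size 512m;
    fastcgi_read_timeout 3600s;
    fastcgi_send_timeout 3600s;

    include /etc/nginx/fastcgi_params;
    fastcgi_param SCRIPT_FILENAME /usr/share/phpmyadmin/$1;
    fastcgi_pass unix:/run/php/php-fpm.sock;  # adjust to your socket
}
```

On the SCRIPT_FILENAME question: since phpMyAdmin lives outside the vhost’s document root, $document_root$fastcgi_script_name would resolve to a path under the site’s docroot and miss the actual files, so with an alias plus a regex capture the explicit /usr/share/phpmyadmin/$1 form is the one that works.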
You say that it might take an hour to upload a database. That suggests you’re going to be pushing the limits of phpMyAdmin. I’d go along with the suggestion of uploading the dump over SSH/SFTP and importing it from the command line. Maybe you can create a script to make it easier for your users.
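Such a helper could be sketched like this — import_db is a hypothetical name, and it assumes the mysql client is installed with credentials supplied via ~/.my.cnf so no password appears on the command line:

```shell
# Hypothetical helper a user could source into their shell.
# Imports an SFTP-uploaded SQL dump directly, with no PHP or
# web-server timeouts involved.
import_db() {
    if [ "$#" -ne 2 ]; then
        echo "Usage: import_db <database> <dumpfile>" >&2
        return 1
    fi
    db=$1
    dump=$2
    if [ ! -f "$dump" ]; then
        echo "Dump file not found: $dump" >&2
        return 1
    fi
    mysql "$db" < "$dump"
}
```

Usage would then be something like: import_db shop_db /home/user/dump.sql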
phpMyAdmin also has a facility for nominating an import directory, so maybe that might be useful: you could upload by SFTP and then import with phpMyAdmin. https://docs.phpmyadmin.net/en/latest/config.html#web-server-upload-save-import-directories
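For completeness, a sketch of that setting — the directory path is an assumption, and it must be readable by the web-server user:

```php
<?php
// Hypothetical snippet for config.inc.php: dumps placed in this
// directory via SFTP appear in phpMyAdmin's Import dialog, so no
// HTTP upload (and none of its size/timeout limits) is involved.
$cfg['UploadDir'] = '/var/lib/phpmyadmin/upload';
```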