I will check my backup installs, as I am no longer using Hestia in production (I need extra features it doesn't offer). I did not install ModSecurity, but you can replace the relevant lines with whatever you require.
You need to create a file like this in /etc/apt/apt.conf.d/.
Mine is simply called 06nginxmodules, with the lines below that hook into your script /usr/local/sbin/nginx-mod-preinstall whenever an update occurs:
// Hook to build and install dynamic modules before NGINX upgrades
// Script calls individual build scripts and passes back error codes
// Place this file in /etc/apt/apt.conf.d/
DPkg::Pre-Install-Pkgs {"/usr/local/sbin/nginx-mod-preinstall";};
In nginx-mod-preinstall you need something like the below, but I only use Brotli.
nginx-mod-preinstall contents:
#!/bin/bash
# Call NGINX module build scripts and pass error codes to apt hook
# Get NGINX version to upgrade to
read ngfile < <(grep '/nginx_') || exit 0
ngver=$(echo "$ngfile" | sed 's/-.*//' | sed 's/.*_//')
# List of build scripts to run:
/usr/local/sbin/makebrotli $ngver || exit $?
/usr/local/sbin/makemodsec $ngver || exit $?     # or whatever your build script is called
/usr/local/sbin/makepagespeed $ngver || exit $?
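The version parsing above can be sanity-checked on its own. Given the sort of .deb path apt hands a Pre-Install-Pkgs hook on stdin (the path below is an illustrative example, not from the original post), the two sed expressions peel off the release suffix and the package-name prefix:

```shell
#!/bin/sh
# Hypothetical .deb path of the kind apt feeds a Pre-Install-Pkgs hook
ngfile="/var/cache/apt/archives/nginx_1.18.0-1~focal_amd64.deb"

# First sed drops everything from the first '-' (Debian revision and arch);
# second sed drops everything up to the remaining '_' (directory and package name)
ngver=$(echo "$ngfile" | sed 's/-.*//' | sed 's/.*_//')

echo "$ngver"   # prints 1.18.0
```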
Thanks for the hints @salnz. I have managed to get it working like this.
So, I created a hook file:
/etc/apt/apt.conf.d/handleModSec
Code:
DPkg::Pre-Install-Pkgs {"/root/modsec/handleModSec"; }
Now in /root/modsec/handleModSec:
#!/bin/bash
# Read the package list apt pipes in; bail out quietly if nginx is not being installed
read debfile < <(grep '/nginx_') || exit 0
# Extract the upstream version number from the .deb filename
nginxVersion=$(echo "$debfile" | grep -o -e 'nginx_.*-' | sed 's/[^.0-9]*//g')
echo "version is https://nginx.org/download/nginx-$nginxVersion.tar.gz"
cd /root/modsec || exit 0
# Fetch the matching nginx source; if the download fails, let the upgrade proceed
wget "https://nginx.org/download/nginx-$nginxVersion.tar.gz" || exit 0
tar -xvf "nginx-$nginxVersion.tar.gz"
cd "nginx-$nginxVersion" || exit 0
# Build only the dynamic module against the new source tree
./configure --with-compat --add-dynamic-module=../ModSecurity-nginx && make modules
rm -f /etc/nginx/modules/ngx_http_modsecurity_module.so
cp -f objs/ngx_http_modsecurity_module.so /etc/nginx/modules
cd ..
rm -rf "nginx-$nginxVersion.tar.gz" "nginx-$nginxVersion"
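For completeness: the rebuilt module still has to be loaded by nginx. Assuming the copy destination used in the script above, a line like this near the top of nginx.conf does it (check where your distribution expects load_module directives to live):

```nginx
# nginx.conf -- load the dynamic module rebuilt by the hook above
load_module /etc/nginx/modules/ngx_http_modsecurity_module.so;
```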
It looks to be working fine for me: before installing nginx, it checks for and builds the module. I'm no expert on this, I just tried it myself.
Regarding aggressive bots, couldn't you also disallow them in robots.txt (as long as they don't violate standards)? For some websites I only allow Googlebot and Bingbot:
# robots.txt
User-agent: Googlebot
Disallow: /cgi-bin/
Disallow: /tmp/

User-agent: bingbot
Disallow: /cgi-bin/
Disallow: /tmp/

User-agent: *
Disallow: /
and if any bots ignore robots.txt and still hit the website, then I usually block their IP ranges and/or their user-agent strings in Apache.
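The post doesn't show the Apache side; one way to do the agent-string blocking on Apache 2.4 is an env-var match combined with a Require rule in the relevant Location, Directory, or VirtualHost block ("BadBot" here is a placeholder substring, not from the original post):

```apache
# Tag requests whose User-Agent contains the (hypothetical) substring "BadBot"
BrowserMatchNoCase "BadBot" bad_bot

<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Location>
```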
I went with that approach approx. 15 years ago - it was a total PITA. Google/Bing also like to ignore robots.txt, BTW; for example, ignoring /? directives meant to prevent crawling of parameters. Consider this: by adding a Disallow, you are advertising a location that you don't want crawlers to go - counter-productive? I actually use this technique as a honeypot on one of my servers.
Why re-invent the wheel when mod_sec can do an excellent job? Here's one from yesterday:
[Thu Jul 16 22:04:30.300767 2020] [:error] [pid 16368] [client 154.113.16.226:62796] [client 154.113.16.226] ModSecurity: Access denied with code 406 (phase 2). Pattern match "^[\\d.:]+$" at REQUEST_HEADERS:Host. [file "/etc/modsecurity.d/REQUEST-920-PROTOCOL-ENFORCEMENT.conf"] [line "735"] [id "920350"] [msg "Host header is a numeric IP address"] [data "xxx.xxx.xxx.xxx"] [severity "WARNING"] [ver "OWASP_CRS/3.3.0"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-protocol"] [tag "paranoia-level/1"] [tag "OWASP_CRS"] [tag "capec/1000/210/272"] [tag "PCI/6.5.10"] [hostname "xxx.xxx.xxx.xxx"] [uri "/thinkphp/html/public/index.php"] [unique_id "XxCyzp3GaVmVrWnpbrcUNAAAAAU"]
[Thu Jul 16 22:04:30.531760 2020] [:error] [pid 8218] [client 154.113.16.226:63176] [client 154.113.16.226] ModSecurity: Access denied with code 406 (phase 2). Pattern match "^[\\d.:]+$" at REQUEST_HEADERS:Host. [file "/etc/modsecurity.d/REQUEST-920-PROTOCOL-ENFORCEMENT.conf"] [line "735"] [id "920350"] [msg "Host header is a numeric IP address"] [data "xxx.xxx.xxx.xxx"] [severity "WARNING"] [ver "OWASP_CRS/3.3.0"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-protocol"] [tag "paranoia-level/1"] [tag "OWASP_CRS"] [tag "capec/1000/210/272"] [tag "PCI/6.5.10"] [hostname "xxx.xxx.xxx.xxx"] [uri "/html/public/index.php"] [unique_id "XxCyzrgCK5nTxWyAK6gzeQAAAAA"]
Note the scans for nonexistent files, as well as the protocol violation. Five attempts and they're blocked. A Nigerian hack attempt, by the looks of things.
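The post doesn't say how the five-strike ban is enforced; one common companion setup is fail2ban, which ships a stock modsecurity filter. A jail along these lines would give that behaviour (logpath and times are assumptions for a typical Debian/Apache install):

```ini
# /etc/fail2ban/jail.d/modsecurity.local -- hypothetical jail using the
# bundled modsecurity filter; adjust logpath to your Apache error log
[modsecurity]
enabled  = true
filter   = modsecurity
logpath  = /var/log/apache2/error.log
maxretry = 5
bantime  = 86400
```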
Another one:
[Tue Jul 14 00:30:27.732034 2020] [:error] [pid 5299] [client 93.158.66.41:38414] [client 93.158.66.41] ModSecurity: Access denied with code 406 (phase 2). Matched phrase "/.git/" at REQUEST_FILENAME. [file "/etc/modsecurity.d/REQUEST-930-APPLICATION-ATTACK-LFI.conf"] [line "124"] [id "930130"] [msg "Restricted File Access Attempt"] [data "Matched Data: /.git/ found within REQUEST_FILENAME: /.git/head"] [severity "CRITICAL"] [ver "OWASP_CRS/3.3.0"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-lfi"] [tag "paranoia-level/1"] [tag "OWASP_CRS"] [tag "capec/1000/255/153/126"] [tag "PCI/6.5.4"] [hostname "idle-spare.server.com"] [uri "/.git/HEAD"] [unique_id "Xwzgg1VF4Q3efdj6-fzgmAAAAAE"]
ModSecurity is much more powerful and effective than just blocking the many rogue web crawlers.