
Web Server Hardening – Part 1

This is the first of a series of posts discussing hardening a web server through careful Apache configuration and use of the following tools:

  • denyhosts
  • fail2ban
  • iptables
  • mod-security
  • integrit

In this part, we’ll cover some quick and easy configuration settings that can have an immediate positive impact.

Information about the software on your server isn’t really anyone’s business. If you broadcast it, script kiddies will use it to target the specific software and versions you are running. So, configure Apache not to give up that kind of info. Make sure you have the following settings:

ServerTokens Prod
ServerSignature Off
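
As a quick sanity check, compare what the `Server` response header looks like before and after. The header values below are illustrative (not live output), and the commented `curl` line assumes a server running on localhost:

```shell
# With ServerTokens Full (the default), Apache advertises everything
# (illustrative sample value):
echo 'Server: Apache/2.4.58 (Ubuntu) OpenSSL/3.0.2'
# With ServerTokens Prod, it gives away only the product name:
echo 'Server: Apache'
# To check a live server (hypothetical localhost URL):
#   curl -sI http://localhost/ | grep -i '^Server:'
```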

If you run PHP, get rid of the X-Powered-By header with this setting in your php.ini:

expose_php = Off
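
To confirm the header is actually gone after restarting, grep the response headers for `X-Powered-By`. This is a sketch using simulated header sets; the localhost URL in the comment is an assumption:

```shell
# Simulated response headers (illustrative, not live output).
before='Server: Apache
X-Powered-By: PHP/8.1.2'
after='Server: Apache'

# With expose_php = On, the X-Powered-By header is present:
printf '%s\n' "$before" | grep -ci '^x-powered-by'   # prints 1
# With expose_php = Off, it is gone:
printf '%s\n' "$after" | grep -ci '^x-powered-by'    # prints 0
# On a live server (hypothetical URL):
#   curl -sI http://localhost/ | grep -i '^x-powered-by'
```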

Unless you have a Chinese and/or Russian audience and really want to be indexed by the Chinese search engine Baidu and the Russian search engine Yandex, it’s a good idea to deny access from user agents that identify themselves as their crawlers. The intent isn’t really to block the search engine robots themselves; they’re both well-behaved, and robots.txt would be sufficient for that. We want to block the spammers and script kiddies who disguise themselves with these user agent strings. Here’s how I do it; I add the following to my Apache configuration:

BrowserMatchNoCase baidu bad_ua=yes
BrowserMatchNoCase yandex bad_ua=yes
<Limit GET PUT POST>
    Order Deny,Allow
    Deny from env=bad_ua
</Limit>
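
BrowserMatchNoCase does a case-insensitive substring match against the User-Agent header, so it catches Baiduspider, YandexBot, and any spoofed variants alike. A rough approximation of that matching logic in shell, with made-up user-agent strings for illustration:

```shell
# Approximate BrowserMatchNoCase's case-insensitive substring match with grep -i
# over some hypothetical user-agent strings.
for ua in 'Baiduspider/2.0' 'Mozilla/5.0 (compatible; YandexBot/3.0)' 'Mozilla/5.0 Firefox/120.0'; do
  if printf '%s' "$ua" | grep -qiE 'baidu|yandex'; then
    echo "denied:  $ua"    # would get bad_ua=yes and be denied
  else
    echo "allowed: $ua"
  fi
done
```

The first two strings are denied and the Firefox one is allowed, which is exactly how the env=bad_ua rule behaves in Apache.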

Stay tuned for part 2.

Published in Geek Stuff
