Jeremy Madea

Systems Architect & Consultant

Web Server Hardening – Part 1

This is the first of a series of posts discussing hardening a web server through careful Apache configuration and use of the following tools:

  • denyhosts
  • fail2ban
  • iptables
  • mod_security
  • integrit

In this part, we’ll cover some quick and easy configuration settings that can have an immediate positive impact.

Information about the software on your server isn’t really anyone’s business. If you broadcast it, script kiddies will use it to target the specific software and versions you are running. So, configure Apache not to give up that kind of info. Make sure you have the following settings:

ServerTokens Prod
ServerSignature Off
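
You can verify the change with a quick look at the response headers. A minimal check, assuming curl is installed and the server answers on localhost:

# With ServerTokens Prod, this should print just "Server: Apache"
# with no version number or OS details.
curl -sI http://localhost/ | grep -i '^Server:'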

If you run PHP, get rid of the X-Powered-By header with this setting in your php.ini:

expose_php = Off
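
To confirm the setting took, you can check from the command line. Note this assumes the CLI reads the same php.ini as your web server’s PHP (often it doesn’t, so verify both):

# Should print: expose_php => Off => Off
php -i | grep expose_php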

Unless you have a Chinese and/or Russian audience and really want to be indexed by the Chinese search engine Baidu and the Russian search engine Yandex, it’s a good idea to deny access from user agents that identify themselves as their crawlers. The intent isn’t really to avoid the search engine robots themselves—they’re both well-behaved and robots.txt would be sufficient for that—we want to avoid the spammers and script kiddies who disguise themselves with these user agent strings. Here’s how I do it; I add the following to my Apache configuration:

BrowserMatchNoCase baidu bad_ua=yes
BrowserMatchNoCase yandex bad_ua=yes
<Limit GET PUT POST>
    Order Deny,Allow
    Deny from env=bad_ua
</Limit>
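
The Order/Deny syntax above is for Apache 2.2. On Apache 2.4, mod_authz_core replaces it; a sketch of the equivalent:

<Limit GET PUT POST>
    <RequireAll>
        # Allow everyone except requests that set the bad_ua flag above.
        Require all granted
        Require not env bad_ua
    </RequireAll>
</Limit>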


Stay tuned for part 2.

Strange PhpDocumentor Issue

This was a frustrating issue I banged my head against for a while, so I’m explaining it here in the hope that it helps someone else. It isn’t actually specific to PhpDocumentor at all; that’s just where I saw it come up. Phpdoc was generating its intermediate files just fine but failing to generate the final HTML, and it was spewing a bunch of messages that looked like these:

PHP Warning: XSLTProcessor::importStylesheet(): error in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 62

PHP Warning: XSLTProcessor::importStylesheet(): Local file read for /usr/share/php/phpDocumentor/data/templates/responsive/layout.xsl refused in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 62

PHP Warning: XSLTProcessor::importStylesheet(): error in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 62

PHP Warning: XSLTProcessor::importStylesheet(): xsltLoadStyleDocument: read rights for /usr/share/php/phpDocumentor/data/templates/responsive/layout.xsl denied in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 62

PHP Warning: XSLTProcessor::importStylesheet(): compilation error: file /usr/share/php/phpDocumentor/data/templates/responsive/index.xsl line 3 element include in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 62

PHP Warning: XSLTProcessor::importStylesheet(): xsl:include : unable to load /usr/share/php/phpDocumentor/data/templates/responsive/layout.xsl in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 62

PHP Warning: XSLTProcessor::transformToUri(): No stylesheet associated to this object in /usr/share/php/phpDocumentor/src/phpDocumentor/Plugin/Core/Transformer/Writer/Xsl.php on line 138

A quick spin on Google helped me find other people with the same or similar problems but didn’t yield a solution. After too much time spent methodically backing out some recent changes on the system, I found that disabling PHP support for librdf (aka the Redland PHP interface) made the problem go away.

With that knowledge, a new Google search turned up this bug report.

For now, my workaround has been to avoid using Redland PHP together with libxslt, but some additional digging suggests that this problem may be resolved by using libraptor version 1.9 or later.
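
If you run into the same thing, the workaround boils down to making sure the Redland extension isn’t loaded when phpdoc runs. A rough sketch; the extension and file names here are assumptions, since distros lay out PHP configuration differently:

; In php.ini, or whichever conf.d snippet loads the Redland bindings
; (the name here is hypothetical; check your system), comment out:
;extension=redland.so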


Current Interests

Redis has caught my attention again, and in a big way this time. Its feature set is expanding sanely. With version 2.6, Lua scripting has been added, giving me a good reason to learn Lua.
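
For the curious, here’s a quick taste of the new scripting support via redis-cli, assuming a Redis 2.6 server running locally:

# EVAL takes a Lua script, the number of keys, then the keys and arguments.
redis-cli EVAL "return redis.call('SET', KEYS[1], ARGV[1])" 1 greeting hello
redis-cli EVAL "return redis.call('GET', KEYS[1])" 1 greeting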

I’ve also been retraining myself in CLIPS after finding R-DEVICE, an apparently defunct project that may nonetheless be a good jumping-off point for using CLIPS as a semantic reasoner over RDF data. I’ve started updating R-DEVICE and tweaking it to run on Linux with Redland for the RDF support. I’m unsure of its licensing, though, so for now I’m just doing this to satisfy some personal curiosity.

I took a look at Haxe this week. It’s another interesting project. Its claim that it “can be compiled to all popular programming platforms” would be better stated as a goal, but it does compile to several, including C++, Java, JavaScript, PHP, Flash, and C#. That’s a pretty interesting variety. I think it has some potential to grow into a good alternative to Flex. I also think they might have a real win if they start supporting Objective-C. I’ll keep an eye on it.


*facepalm*

I host sites, including this one, on virtual server instances running on a cloud computing infrastructure. This offers many advantages, but the most important one to me is the isolation from hardware problems. The cloud is built on fully redundant hardware, and if a hardware failure occurs, my virtual server just fails over to another node. For many years I hosted sites on dedicated servers, and disk failures were an occasional and very irritating problem. Switching to cloud-based virtual servers was supposed to end all that.

But here’s the thing: the virtual servers have virtual disks. And those virtual disks can, in strange circumstances, become corrupted just as real disks can. It’s not really a hardware problem; it’s a virtual hardware problem.

It’s rare. I have it on good authority that the engineers for the company that provides the cloud management system to the hosting company I use have only seen the issue a handful of times in almost half a decade.

I saw it twice in two days.

The two days immediately after I sent links to my resumé out to more than half a dozen employers.

*facepalm*


A Robustness Corollary

You’re probably familiar with the robustness principle, which is usually stated as:

Be liberal in what you accept, and conservative in what you send.

When it comes to standards, I’d like to offer a corollary:

Be precise in your description and flexible in your interpretation. 


An example of where this corollary could be applied is some browser developers’ handling of HTTP’s 301 “Moved Permanently” responses. According to RFC 2616, section 10.3.2, “301 Moved Permanently”:

The requested resource has been assigned a new permanent URI and any future references to this resource SHOULD use one of the returned URIs. Clients with link editing capabilities ought to automatically re-link references to the Request-URI to one or more of the new references returned by the server, where possible. This response is cacheable unless indicated otherwise.

That’s all well and good, but permanence is an elusive thing on the web. Websites and applications tend to change quite a lot, especially during development. And when aren’t they under development? Even if mistakes were never made, the ownership of domains (and therefore URIs) is subject to change.

None of that would pose a problem in this instance if it weren’t for browser developers who cache 301 responses for a resource indefinitely and use one of the returned URIs for all future references to that resource.

But, but, but! . . . That’s what the RFC says they SHOULD do, right? And isn’t it a good thing that they want to be called “unconditionally compliant”?

Yes. But that misses the point. The phrase “any future references to this resource” can be quite flexibly interpreted. Any future references handled by what, exactly? Some browser developers have decided it means any future references handled by an installation of their browser. It would be a lot more reasonable, especially from the perspective of web developers, if they interpreted it to mean a running instance of the browser instead, at least in the absence of specific caching instructions.
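
Web developers can also defend themselves by making the caching explicit. A minimal Apache sketch with hypothetical paths, assuming mod_headers is enabled: a 301 that browsers may cache, but only for an hour:

Redirect permanent /old-page /new-page
<Location /old-page>
    # "always" is needed so the header is added to 3xx responses too.
    Header always set Cache-Control "max-age=3600"
</Location>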


Feeble Visions

Here is an awesome rant on the feeble vision some big players have for the future of interaction design.

The author, Bret Victor, a onetime “human interface inventor” at Apple, complains that the vision of the future where we are sliding around “pictures behind glass” isn’t really visionary but just a tiny step from where we are now.

I’m inclined to agree.

Moreover, I think this vision isn’t just limited; it’s limiting too. If we can’t conceive of richer ways to interact with our computers, we’ll be restricting our ability to see what problems we can solve with them.