Respect Yourself, Protect Yourself

By KENT SHARKEY 8/25/2008 3:46:01 PM

Like a lot of other people lately, I've been having a botnet play with my Web site. It takes the form of occasional requests that append a SQL query to the end of the request. On my little site (not even the bots care much), it's been trivial, but on others' sites it has effectively become a denial of service and taken them down. It's a pretty crude hack, and of course it shouldn't work (and didn't on this site). However, a search for the culprits suggests it's worked for them so far.

I was curious, so I decided to look a bit more closely at the request. It's a URL-encoded string consisting of a DECLARE statement, an assignment of a long hex value to that variable, and the inevitable EXEC statement. Overall, the query string is about 1300 characters long (depending on the actual URL it is hitting). The figure to the right shows a portion of that query (broken into separate lines, and with the EXEC replaced by a PRINT so I could see just what the request was). This translates to the following SQL calls (converted to a graphic to avoid anyone mistakenly running it, and/or giving the slimes any Google juice).
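
If you want to poke at one of these requests yourself, the same PRINT trick works in a query window: assign the hex literal to a variable and PRINT it instead of EXECing it. A minimal sketch (the hex literal here is a harmless placeholder, not the actual payload):

DECLARE @S VARCHAR(4000)
-- The real payload is a hex literal well over a thousand characters
-- long; this placeholder just decodes to 'Hello, world'
SET @S = CAST(0x48656C6C6F2C20776F726C64 AS VARCHAR(4000))
-- PRINT instead of EXEC: you see the decoded SQL without running it
PRINT @S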

In short, it grabs all the user-created tables and inserts a block of HTML into each of them. This HTML calls a JavaScript file (I debated including it here, but decided against it; I figured any interested parties could track it down). That JS file inserts two 0-pixel iFrames (see, I told you iFrames were evil) into any page the JS executes on. So, if you pull data out of the database and slap it up on a page, it comes with friends.
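
If you're worried the bots have already been by, one way to check (my suggestion here, not part of the original attack) is to walk the character columns of your user tables looking for the injected markup. A rough sketch, assuming SQL Server; the '%<script%' pattern is a stand-in for whatever marker actually shows up in your data:

DECLARE @tbl SYSNAME, @col SYSNAME, @sql NVARCHAR(4000)

-- Every character column in every user table (views excluded)
DECLARE cols CURSOR FOR
    SELECT c.TABLE_NAME, c.COLUMN_NAME
    FROM INFORMATION_SCHEMA.COLUMNS c
    JOIN INFORMATION_SCHEMA.TABLES t ON t.TABLE_NAME = c.TABLE_NAME
    WHERE t.TABLE_TYPE = 'BASE TABLE'
      AND c.DATA_TYPE IN ('varchar', 'nvarchar', 'text', 'ntext')

OPEN cols
FETCH NEXT FROM cols INTO @tbl, @col
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Report any column that contains the suspicious marker
    SET @sql = N'SELECT ''' + @tbl + N'.' + @col + N''' AS [column], COUNT(*) AS hits ' +
               N'FROM [' + @tbl + N'] WHERE [' + @col + N'] LIKE ''%<script%'' ' +
               N'HAVING COUNT(*) > 0'
    EXEC sp_executesql @sql
    FETCH NEXT FROM cols INTO @tbl, @col
END
CLOSE cols
DEALLOCATE cols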


All of this should lead to two questions (that you may already have an answer to):

  • Do I have the same botnet saying hello to my Web site?
  • How can I stop it?

The answer to the first question is easy, assuming you have log files. You can use the excellent (and free, which is more excellent) LogParser tool to scan them and find out. You really should be doing this on an ongoing basis, building up a set of standard queries or reports, so you don't have people like me complaining that you may have missed something. The query below will scan all of your Web site directories (in IIS you will have one per virtual root). You may need to adjust the path to LogParser or to the log files (I ran this from c:\windows\system32\logfiles, the normal directory for such things, and LogParser is on my PATH; also note the single-percent %a form is for the command prompt, so double it to %%a if you put this in a batch file). In addition, you may have to change the input parameter (-i) if you're using a different log format than the default IIS W3C (for example, if you're using a different Web server).

for /d %a in (w3svc*) do logparser "SELECT c-ip, COUNT(*), STRLEN(cs-uri-query) as LENGTH, cs-uri-query FROM %a\*.log GROUP BY Length, cs-uri-query, c-ip HAVING Length > 500 ORDER BY LENGTH DESC" -i:IISW3C -rtp:-1 > %aHits.txt

This query produces one text file per v-root, and they should basically be empty. What you don't want to see is something like:

Directory of C:\WINDOWS\system32\LogFiles

08/25/2008 03:43 PM 2,079,560 W3SVC1677268996Hits.txt

Because of the length of the query strings, it looks worse than it actually was: there were only 1549 hits from this particular problem.

As for solving the problem, well, that takes a bit more work. First, those requests should not actually be getting into your database because you are using defensive techniques with your database, right? That basically means:

  • Don't concatenate user input into your SQL statements
  • Use stored procedures (or at least parameterized queries; there's a sketch after this list)
  • Don't assume your validation works
  • HTML encode user input before displaying it
  • Use an ASP.NET Identity that has limited access to the application's database (and only that database ideally)
  • etc. Really -- read that entire Web security chapter. Then go back and read the rest of the book. Then get another copy and give it to a developer you know.
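
To make that second bullet concrete, here's a minimal sketch of the stored-procedure approach (the table and procedure names are made up for illustration):

-- A parameterized procedure: @Author is always treated as data, never
-- as code, so a value like ';DECLARE @S VARCHAR(4000)...' can't execute
CREATE PROCEDURE dbo.GetArticlesByAuthor
    @Author NVARCHAR(50)
AS
    SELECT Title, PostedOn
    FROM dbo.Articles
    WHERE Author = @Author
GO

-- The vulnerable pattern, for contrast: concatenating the value into
-- dynamic SQL is exactly the hole these bots are probing for
-- EXEC('SELECT Title FROM dbo.Articles WHERE Author = ''' + @Author + '''')

On the ASP.NET side, call the procedure through a SqlCommand with CommandType.StoredProcedure and supply the value through the Parameters collection, rather than building up the command text yourself.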

However, there is still a chance that the sheer volume of incoming requests could take down your server, so it's a good idea to layer something like URLScan in front of your Web site. URLScan is an ISAPI filter for IIS that restricts incoming requests based on criteria you configure. Two rules that will be useful here are restricting the overall length of the query string (using the MaxQueryString setting) and adding checks for standard SQL injection sequences.
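
For a rough idea of what that looks like in urlscan.ini (the section and setting names below are from the URLScan 3.x reference, so verify them against your version, and test any deny sequences carefully, since too broad a sequence will block legitimate requests):

; urlscan.ini (excerpt)
[RequestLimits]
; Reject any request whose query string is longer than this; the
; attack strings run well over 1000 characters
MaxQueryString=1024

[DenyQueryStringSequences]
; Sequences typical of this attack; tune these for your application
DECLARE
CAST(
EXEC(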

Remember: these two tools are not guarantees, nor are they the complete solution, but they will add an additional layer of security to your application.

Resources

  • LogParser 2.2
  • URLScan 3.0
  • More information on URLScan
  • More information on the attacks