Wednesday, June 24, 2009

Using WIVET to test your crawler

WIVET is a wonderful project for web application security scanner (WASS) developers. Using WIVET, you can analyse the link extraction/crawling ability of your WASS.

I recommend downloading the latest version or an SVN copy to your local web server and testing your scanner's crawling performance. You can also test your scanner's JavaScript, Flash and form parsing abilities using this web application.

Don't forget to exclude the offscanpages folder and the logoff link! Your crawler should also have cookie support enabled, since WIVET tracks crawling coverage via a cookie.

So in order to succeed, your scanner should already have:
  • cookie support
  • an exclude capability
  • JavaScript support (to compete with other commercial scanners)
  • Flash support (to compete with other commercial scanners)
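The exclusion and cookie requirements above might be configured roughly like this. This is a minimal sketch: the `should_visit` helper, the substring patterns, and the use of a shared cookie jar are illustrative assumptions, not WIVET's or any particular scanner's actual API.

```python
from http.cookiejar import CookieJar
from urllib.parse import urlparse

# Hypothetical exclusion rules for a WIVET run: skip the offscanpages
# folder and the logoff link so the coverage cookie is not destroyed.
EXCLUDED_SUBSTRINGS = ("offscanpages/", "logoff")

def should_visit(url):
    """Return True if the crawler is allowed to request this URL."""
    path = urlparse(url).path.lower()
    return not any(pattern in path for pattern in EXCLUDED_SUBSTRINGS)

# A shared cookie jar gives the crawler the cookie support WIVET needs
# to track which links were actually reached.
cookie_jar = CookieJar()
```

With rules like these, `should_visit("http://wivet.local/innerpages/1.php")` is allowed while anything under offscanpages is skipped.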
Project home
And here are the latest coverage results of some commercial scanners tested by the WIVET author. Have fun!

Friday, June 19, 2009

Detecting new URLs by disabling cookie support

An article for web application security scanner (WASS) developers.

You probably want to find every page in a web application. What if a smart web developer built a vulnerable "cookie support in your browser is disabled" page for website users? Can your scanner find that page?

Sample Cookies disabled error page
Let's think like a web developer. How can I detect whether a user's browser accepts cookies? I think the best way[1] is to set a temporary cookie while redirecting the user to a controller page. That page should then check whether the previously set temporary cookie is sent with the client's new request.
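The developer-side check just described can be sketched as a pair of handlers. The function names, cookie name, and the (status, headers, body) return shape are all hypothetical; this only illustrates the set-cookie-then-redirect-then-check pattern.

```python
# Hypothetical temporary cookie used only for the capability check.
TEMP_COOKIE = "cookietest"

def entry_page():
    """First response: set a temporary cookie while redirecting
    the client to the controller page."""
    headers = {"Set-Cookie": TEMP_COOKIE + "=1",
               "Location": "/cookiecheck"}
    return (302, headers, b"")

def controller_page(request_cookies):
    """Controller page: did the client echo the temporary cookie
    back in its new request?"""
    if TEMP_COOKIE in request_cookies:
        return (200, {}, b"Cookies enabled, welcome!")
    # The cookie never came back, so cookies must be disabled.
    return (200, {}, b"Cookie support in your browser is disabled")
```

A browser with cookies enabled follows the 302, resends the cookie, and sees the welcome page; a cookieless client lands on the error page instead.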

A WASS can find this page. Here is one way of doing so:
  1. While crawling with cookie support enabled, remember all pages that set cookies and redirect at the same time. We can create a NoCookieQueue for this. By the way, cookies are set using the "Set-Cookie" response header, and redirection is done with a "Location" header in a 3xx HTTP response.
  2. After the crawling phase is complete, if NoCookieQueue is not empty, our scanner should disable cookie support in its crawler module and re-visit the pages in NoCookieQueue. This way we can see whether those pages redirect to a location that doesn't exist in our scanner's completed or error queues.
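Step 1 above reduces to a simple predicate on each response. A minimal sketch, assuming the crawler hands us the status code and a dict of response headers (the function names and the callback shape are hypothetical):

```python
def sets_cookie_and_redirects(status, headers):
    """True if a response both sets a cookie and redirects, i.e.
    a 3xx status carrying both Set-Cookie and Location headers:
    a candidate for the NoCookieQueue."""
    return (300 <= status < 400
            and "Set-Cookie" in headers
            and "Location" in headers)

no_cookie_queue = []

def on_response(url, status, headers):
    # Called by the crawler for every fetched page during phase 1.
    if sets_cookie_and_redirects(status, headers):
        no_cookie_queue.append(url)
```

In phase 2 the scanner would replay every URL in `no_cookie_queue` with cookies disabled and diff the resulting Location targets against its completed and error queues.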

Now your WASS can find new pages or testable parameters that a cookieless client would see. While these cookie error pages are mostly static, you might find a vulnerable dynamic page, another directory on the server, or an HTML comment with sensitive information.

[1] A second method is to set a cookie and then check with JavaScript whether it was actually set (if it wasn't, redirect the client to the cookie error page). But these "no cookie" pages can also be found with a JavaScript parser in our WASS.