Tuesday, January 25, 2011

Myth Breaker - The Best Open Source Web Application Vulnerability Scanner

(The original benchmark post - comparison of 43 web application vulnerability scanners:

http://sectooladdict.blogspot.com/2010/12/web-application-scanner-benchmark.html)

It’s been a couple of weeks since the initial benchmark was published, and I used that time to contact most of the vendors and to reach some conclusions as to which tool combinations are ideal for each task.

I believe that those of you who use these tools on a daily basis will find my conclusions interesting.

Please note that the conclusions refer to the condition of the tools on the day the benchmark was released (see the full explanation at the end of the post).

Glossary

AND – combining the tools is required to obtain the best results.

OR – using either one of the tools will provide nearly identical results.

AND/OR – it is currently unknown if combining them will provide additional benefits.

SAFE scan – a scan method in which the tester can select which URLs to scan, in order to prevent the scanner from accessing links that could delete data, lock user accounts or cause any other unintentional hazard (generally requires the scanner to have a proxy/manual crawling/URL file parsing/pre-configured URL restriction module); recommended when scanning the internal section of an application that resides in a production environment.

UNSAFE scan – a scan method that scans all the URLs, without any restrictions or limitations; recommended when scanning the public section of an application, and when scanning the internal section of an application that resides in the testing/development environment.
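To make the SAFE scan concept concrete, below is a minimal sketch (in Python) of the kind of pre-configured URL restriction module described above. The URL list, the danger keywords and the page names are hypothetical illustrations, not taken from any of the benchmarked tools:

```python
# Minimal sketch of a SAFE-scan URL filter (illustrative only; the URL
# list, keywords and page names are hypothetical).
from urllib.parse import urlparse

# URLs the tester explicitly approved for scanning.
ALLOWED_URLS = {
    "http://target.example/app/search.php",
    "http://target.example/app/profile.php",
}

# Substrings that hint at destructive links (delete data, lock accounts, log out).
DANGEROUS_HINTS = ("delete", "logout", "remove", "lock")

def is_safe_to_scan(url: str) -> bool:
    """Scan a URL only if it was pre-approved and looks non-destructive."""
    path = urlparse(url).path.lower()
    if any(hint in path for hint in DANGEROUS_HINTS):
        return False
    return url in ALLOWED_URLS

# Example: filter the crawler's output before the attack phase.
crawled = [
    "http://target.example/app/search.php",       # scanned
    "http://target.example/app/delete_user.php",  # skipped: destructive
    "http://target.example/app/admin.php",        # skipped: not approved
]
for url in crawled:
    print(url, "->", "scan" if is_safe_to_scan(url) else "skip")
```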

The Ideal Combination of Tools (Relevant to the release date of the initial benchmark – 26/12/2010):

(Constructed according to the cases detected by each tool, and according to tool capabilities and application scope restrictions)

Each entry below gives the scan type and target, followed by the recommended tool combination for Reflected XSS and for SQL Injection (MySQL).

Initial Public Scan – initial scan of the application’s public (unauthenticated) section
(Purpose: gather as many “Low Hanging Fruit” exposures as possible with a minimal amount of false positives)
  • Reflected XSS: Netsparker AND Acunetix AND N-Stalker AND SkipFish (nearly false-positive-free combination)
  • SQL Injection (MySQL): ProxyStrike AND WebCruiser (nearly false-positive-free combination)

Internal Scan – Unsafe – the application’s internal (authenticated) section
  • Reflected XSS: Netsparker AND Acunetix AND SkipFish (nearly false-positive-free combination)
  • SQL Injection (MySQL): Wapiti (verification with other tools is recommended to reduce false positives – ProxyStrike AND WebCruiser, in addition to one of the following: W3AF/Andiparos/ZAP/Netsparker/Sandcat/Oedipus)

Internal Scan – Safe – the application’s internal (authenticated) section
(Method: scan internal application pages without activating any delete, logout or other dangerous operations)
  • Reflected XSS: ZAP AND W3AF (safe combination with relatively efficient accuracy)
  • SQL Injection (MySQL): W3AF AND Andiparos/Paros AND Oedipus AND ProxyStrike

Additional Public Scan – detect additional potential exposures that require manual verification and aren’t covered by the previous tools (public section)
  • Reflected XSS: ProxyStrike OR Sandcat (Grabber detects 1-2 additional POST cases – optional)
  • SQL Injection (MySQL): Wapiti

2nd Internal Scan – Unsafe – detect additional potential exposures that require manual verification and aren’t covered by the previous tools
  • Reflected XSS: ProxyStrike OR Sandcat
  • SQL Injection (MySQL): Wapiti (no substantial change, so there’s no need to run another scan)

2nd Internal Scan – Safe – detect additional potential exposures that require manual verification and aren’t covered by the previous tools
(Method: scan internal application pages for additional exposure instances without activating any delete, logout or other dangerous operations)
  • Reflected XSS: ProxyStrike
  • SQL Injection (MySQL): W3AF AND Andiparos/Paros AND Oedipus AND ProxyStrike (no substantial change, so there’s no need to run another scan)

Complementary Scan for Additional Exposures – scan the application with scanners that have a wider range of features, to cover additional security flaws
  • Both categories: W3AF AND/OR Arachni AND/OR Skipfish AND/OR Sandcat

Notable Open Source & Freeware Tools – SQL Injection Detection

The highest SQLi detection ratio among open source & freeware tools belongs to Wapiti, currently the undisputed winner in this category.

A bit behind Wapiti were AndiParos, ZAP and Paros Proxy (the first two being forks of the original Paros project), followed closely by Netsparker and W3AF (two tools that triggered fewer false-positive test cases than all of the tools described so far – 30%, compared to 40% or 50%).

* It is important to mention that Netsparker CE 1.5 does not contain Netsparker’s Blind SQL Injection module (disabled in this version) – only the regular SQL Injection module and the Boolean SQL Injection module.

However, we cannot ignore the fact that the following tools had pretty decent accuracy with 0 false positives(!): WebCruiser (55.88%) and ProxyStrike (52.21%), making them ideal tools for an initial scan (Mini MySqlat0r and Scrawler had 0 false positives as well, but with lower accuracy).
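For readers unfamiliar with the Boolean SQL Injection technique mentioned above, the following sketch shows the general idea in Python. It is a generic illustration under simplified assumptions (the target URL and parameter are hypothetical), not the detection logic of Netsparker, Wapiti or any other benchmarked tool:

```python
# Generic sketch of boolean-based SQL injection detection (illustrative
# only; the target URL and parameter are hypothetical). Requires `requests`.
import requests

TARGET = "http://testbed.example/product.php"  # hypothetical test page
PARAM = "id"
BASE = "1"

def fetch(value: str) -> str:
    return requests.get(TARGET, params={PARAM: value}, timeout=10).text

def boolean_sqli_signal() -> bool:
    """Compare responses to an always-true and an always-false condition."""
    baseline = fetch(BASE)
    true_resp = fetch(BASE + "' AND '1'='1")   # should not change the result
    false_resp = fetch(BASE + "' AND '1'='2")  # should empty/alter the result
    # If the true condition preserves the baseline while the false one
    # changes it, the parameter likely reaches a SQL query unsanitized.
    # (Real scanners diff responses far more robustly than plain equality.)
    return true_resp == baseline and false_resp != baseline

if __name__ == "__main__":
    print("Possible SQLi" if boolean_sqli_signal() else "No boolean SQLi signal")
```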

Notable Open Source & Freeware Tools – XSS Detection

The highest XSS detection ratio overall belongs to Sandcat, which detected nearly 100% of the test cases (although, like ProxyStrike & Grabber, it was misled by a few extra false-positive test cases).

The highest XSS detection ratio among open source tools (and 2nd best in total) belongs to ProxyStrike (Grabber detected more POST test cases, but had a higher false-positive ratio, and did not detect GET cases).

The best overall XSS detection ratio (taking the low number of false positives into account) belongs to Netsparker CE (63.64%, and 3rd in the efficiency order, right after ProxyStrike), followed closely by N-Stalker and by Acunetix FE; and since these tools and Skipfish complement each other’s missing test cases, they are ideal for initial scans – they all have 0 false positives!

The best overall XSS detection ratio (taking the low number of false positives into account) among open source tools belongs to WebSecurify.

The best HTTP GET XSS detection ratio (taking the low number of false positives into account) among open source tools belongs to XSSer.

The following open source tools had XSS detection modules that were free of false positives (while still having a relatively efficient detection ratio) – Grendel-Scan (GET) and Skipfish (Secubat had 0 false positives as well, but its detection ratio was a bit lower).
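As background to these XSS numbers: in its simplest form, a reflected XSS check injects a uniquely marked payload and tests whether it comes back unencoded, roughly as in the sketch below (a generic illustration with a hypothetical URL and parameter, not the detection logic of any tool above). The false positives counted in the benchmark typically come from reflections in contexts where the payload cannot actually execute, which is why manual verification is still needed:

```python
# Generic sketch of a reflected XSS probe (illustrative only; the target
# URL and parameter are hypothetical). Requires `requests`.
import requests
import uuid

TARGET = "http://testbed.example/search.php"  # hypothetical test page
PARAM = "q"

def reflects_unencoded(url: str, param: str) -> bool:
    """Inject a uniquely marked script payload and look for it unencoded."""
    marker = uuid.uuid4().hex[:8]
    payload = f"<script>alert('{marker}')</script>"
    body = requests.get(url, params={param: payload}, timeout=10).text
    # An HTML-encoded reflection (&lt;script&gt;...) would not count;
    # an unencoded one still needs a manual check of its output context.
    return payload in body

if __name__ == "__main__":
    hit = reflects_unencoded(TARGET, PARAM)
    print("Possible reflected XSS" if hit else "No unencoded reflection")
```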

Notes

  • When using ProxyStrike for the initial scan, it’s probably best to use an external spider instead of the built-in spider (e.g. use ProxyStrike as an outgoing/upstream proxy for Burp Suite FE or Paros/ZAP/Andiparos, and then use the spider feature of the external tool through ProxyStrike); a minimal sketch of this proxy-chaining setup follows these notes.

  • As mentioned before, the conclusions reflect the condition of the various tools on the date the initial benchmark was published. Since then, many vendors have released new versions (some even in response to the benchmark), so this list of conclusions will change as soon as the next benchmark is released; I know for a fact that some vendors invested so much effort in improving their detection modules that some of the new versions reach a nearly 100% detection ratio (but since I don’t have updated statistics, we’ll have to wait).
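Here is a minimal sketch of the proxy-chaining idea from the first note: any client or spider that routes its traffic through ProxyStrike’s listener gets analyzed by ProxyStrike’s plugins. The listener address and port below are assumptions (check your own ProxyStrike configuration), and the target URL is hypothetical:

```python
# Sketch: route an external HTTP client through a local ProxyStrike
# listener so its plugins see the traffic. The address/port are
# assumptions; use whatever your ProxyStrike instance listens on.
import requests

PROXYSTRIKE = "http://127.0.0.1:8008"  # assumed ProxyStrike listener
proxies = {"http": PROXYSTRIKE, "https": PROXYSTRIKE}

# A full external spider (Burp Suite FE, ZAP, Andiparos) would instead be
# pointed at the same listener via its upstream/outgoing proxy option.
resp = requests.get("http://target.example/app/", proxies=proxies, timeout=10)
print(resp.status_code, len(resp.text), "bytes fetched via ProxyStrike")
```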

Conclusions

So… it seems that I didn't find “the best web application vulnerability scanner” after all… but I did find combinations of open source & freeware tools that get pretty good results.

As I mentioned in previous posts, my work is only beginning.

Various open source vendors have already released new versions that should be tested; tools that were improperly executed (or had a bug) should be retested as soon as their issues are mitigated; additional research led me to discover a couple of additional open source web application scanner projects; and at least one new open source web application scanner was released in the last couple of weeks (and I haven’t even mentioned commercial scanners).

Time to get back to work…

Comments:

  1. Warning: You do not appear to put value on the ability of a human to remove false positives in recurring HTTP and link extraction idioms. Application scanner vendors make this same mistake. "Training" a scanner or set of tools may require a background in QA functional testing to some degree, but the basic theory comes down to equivalence classification. Additionally, experienced testers will elect to zone in on particular URI and parameter keys, values, or combinatorial elements of those URIs, keys, and values -- in order to save time, prove focus, and to prevent themselves from testing "everything in every combination including the kitchen sink" (which can be problematic in a large or complex application). Unfortunately, a lot of domain-specific, framework-specific, and deployment-specific testing issues will arise without in-depth knowledge of all 3 areas. This is not something commonly found in a single tester or human. Thus, specialized testers are required to cover these corner cases -- or potentially a continuous penetration test where the tester(s) pivot based on new and evolving information.

    Your analysis focuses too much on free tools as well as a point-and-shoot methodology in order to be fully interesting or correct. Keep up the good work, though -- you've clearly put some time and thought into web application penetration-testing when others have not!

  2. Suggestions:
    1) Try Netsparker CE in crawl-only mode (with or without cookies) instead of Burp Suite FE, ZAP, Andiparos/Paros as the spider for ProxyStrike
    2) Run Netsparker CE in crawl-only mode through Fiddler2. You can process the SAZ file offline with Casaba Watcher, mark all XSS/HTMLi in Fiddler, and replay the requests a few at a time with Casaba x5s
    3) I highly encourage you to check out XSS Rays. It's another highly effective tool

    Reasoning: Netsparker CE appears to do better at link extraction than any of the other tools that you mentioned. XSS Rays also does a very good job at link extraction with regards to Javascript and the DOM. x5s may not give direct XSS findings, but it is much more thorough at detection of HTMLi (for Reflected XSS) because it implements a lot of abusable character injection techniques that no other tool has implemented.

    Additional recommendations: There is a growing concern that you should include Burp Suite Pro measurements. The low cost of the tool, combined with the ease of use and power of configuration, make it a key candidate for commercial tool review against the open-source tools.

    As an active, seasoned user of these tools (see the OWASP Phoenix Tools project, or my http://www.tssci-security.com/archives/category/itsm/ "Part 2" tool recommendation sections), I find it difficult and annoying to run multiple tools without bringing them all to center. This aids in my own individual analysis as well as sharing with other testers. In other words, I prefer to have all of my findings in a single pane of glass that is easily shared. My favorite tool for this use is Burp Repeater along with the Comparer tool -- both of which can be used in the free version. Fiddler2 is also good at viewing HTTP protocol traffic and provides a similar ability to replay requests (but does not provide URL/etc encoding/decoding, SQL constructed string conversion, or ViewState decoding). However, Fiddler2 can save SAZ files, which can be shared more easily than by saving Burp configuration, session files, or log files.

    I find a lot of value sending Burp Scanner findings (or Intruder with some standardized fault-injections, such as the ones found in fuzzdb) to Repeater (where additional manual checks can be done), and then saving the HTTP request to a local file where sqlmap can import it. This can lead to SQLi exploitability analysis very quickly, especially when using the sqlmap "sql-shell". A similar technique can be employed using other free (or open-source) tools when Burp Pro is out of reach.

    In this comment, I have described much easier and more efficient ways to get to XSS or SQLi, including the sharing of findings and exploitability information. This is true whether you consider XSS/SQLi to be domain-agnostic or domain-specific (http://seclists.org/securecoding/2011/q1/11).

    What I'd like to see out of a project like this is more in-depth analysis of domain-agnostic (but potentially deployment-specific) application vulnerabilities that are of different classes (e.g. HTTP header injection, LDAPi, XPATHi, CMDi, predictable resource locations, path/file/information/source disclosures, read/write inclusion vulnerabilities, framework-based session management, framework-based error handling, etc) in addition to domain-specific vulnerabilities such as file upload vulnerabilities, custom session management (including anti-CSRF token generation/verification and anti-automation), authentication, authorization, concurrency, custom error handling, logging, et al.

    I have hope for some tools such as Context App Tool (CAT) and WATOBO to stabilize, as they provide a lot of domain-specific analysis that no other free tools are capable of (CAT has amazing CSRF and ClickJacking tooltips; both have great anti-anti-automation capabilities). However, I have found that doing these things with Burp Pro is often easier, simply because it is much more stable (albeit that it may require a complicated Intruder configuration that includes recursive grep payload sets).

  3. Great work! Very useful. Congratulations Shay-Chen.

     According to your list of scanner features, Sandcat has a GUI, but I think the free edition is console-only.

  4. Hi Luis.

     As far as I know, Syhunt recently stopped distributing freeware versions with a GUI (since Sandcat 4.0.3.0), and no longer distributes the previous releases on their website (I know for certain that versions 3.6-4.0 of the free edition had a GUI at some point, since I tested them all and read the license).

     Their GUI versions used to have several restrictions (such as an intentional scanning delay), and I assume that they replaced these versions with Sandcat Mini (the console version) for legitimate, business-oriented reasons.

     I haven't tested Sandcat Mini yet, but I will soon.
