Skipfish 2.10 Beta
Fully automated, active web application security reconnaissance tool
Skipfish generates an interactive sitemap for the targeted website by carrying out dictionary-based probes and a recursive crawl.
The resulting map is then automatically annotated with the output from a number of active (yet hopefully non-disruptive) security checks.
The final report generated using the Skipfish tool is meant to be used as a foundation for professional web app security assessments.
A rough list of the security checks offered by the tool is outlined below:
High risk flaws (potentially leading to system compromise):
· Server-side SQL / PHP injection (including blind vectors, numerical parameters).
· Explicit SQL-like syntax in GET or POST parameters.
· Server-side shell command injection (including blind vectors).
· Server-side XML / XPath injection (including blind vectors).
· Format string vulnerabilities.
· Integer overflow vulnerabilities.
· Locations accepting HTTP PUT.
Medium risk flaws (potentially leading to data compromise):
· Stored and reflected XSS vectors in document body (minimal JS XSS support present).
· Stored and reflected XSS vectors via HTTP redirects.
· Stored and reflected XSS vectors via HTTP header splitting.
· Directory traversal / file inclusion (including constrained vectors).
· Assorted file POIs (server-side sources, configs, etc).
· Attacker-supplied script and CSS inclusion vectors (stored and reflected).
· External untrusted script and CSS inclusion vectors.
· Mixed content problems on script and CSS resources (optional).
· Password forms submitting from or to non-SSL pages (optional).
· Incorrect or missing MIME types on renderables.
· Generic MIME types on renderables.
· Incorrect or missing charsets on renderables.
· Conflicting MIME / charset info on renderables.
· Bad caching directives on cookie setting responses.
Low risk issues (limited impact or low specificity):
· Directory listing bypass vectors.
· Redirection to attacker-supplied URLs (stored and reflected).
· Attacker-supplied embedded content (stored and reflected).
· External untrusted embedded content.
· Mixed content on non-scriptable subresources (optional).
· HTTP credentials in URLs.
· Expired or not-yet-valid SSL certificates.
· HTML forms with no XSRF protection.
· Self-signed SSL certificates.
· SSL certificate host name mismatches.
· Bad caching directives on less sensitive content.
· Failed resource fetch attempts.
· Exceeded crawl limits.
· Failed 404 behavior checks.
· IPS filtering detected.
· Unexpected response variations.
· Seemingly misclassified crawl nodes.
Non-specific informational entries:
· General SSL certificate information.
· Significantly changing HTTP cookies.
· Changing Server, Via, or X-... headers.
· New 404 signatures.
· Resources that cannot be accessed.
· Resources requiring HTTP authentication.
· Broken links.
· Server errors.
· All external links not classified otherwise (optional).
· All external e-mails (optional).
· All external URL redirectors (optional).
· Links to unknown protocols.
· Form fields that could not be autocompleted.
· Password entry forms (for external brute-force).
· File upload forms.
· Other HTML forms (not classified otherwise).
· Numerical file names (for external brute-force).
· User-supplied links otherwise rendered on a page.
· Incorrect or missing MIME type on less significant content.
· Generic MIME type on less significant content.
· Incorrect or missing charset on less significant content.
· Conflicting MIME / charset information on less significant content.
· OGNL-like parameter passing conventions.
How to install and run
Unpack the archive, open a Terminal window, change into the Skipfish directory, and run the following command:
sudo make install
Next, you need to copy the desired dictionary file from dictionaries/ to skipfish.wl. Please read dictionaries/README-FIRST carefully to make the right choice. This step has a profound impact on the quality of scan results later on.
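As a sketch of this step, assuming your copy of Skipfish ships a wordlist named minimal.wl (the file names vary between releases; dictionaries/README-FIRST lists the actual options and trade-offs):

```shell
# Copy the chosen wordlist to the location the scanner reads by default.
# "minimal.wl" is an illustrative name; pick whichever file README-FIRST
# recommends for your target.
cp dictionaries/minimal.wl skipfish.wl
```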
Once you have the dictionary selected, you can try:
./skipfish -o output_dir http://www.example.com/some/starting/path.txt
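A more complete invocation might look like the sketch below. The flags shown (-o for the report directory, -A for HTTP authentication credentials, -X to exclude URLs containing a given substring) are standard skipfish options, but the credentials and paths are placeholders:

```shell
# Hypothetical example: an authenticated scan that skips the logout link,
# so the crawler does not terminate its own session mid-scan.
./skipfish -o output_dir \
           -A admin:secret \
           -X /logout \
           http://www.example.com/some/starting/path.txt
```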
- High performance: 500+ requests per second against responsive Internet targets, 2000+ requests per second on LAN / MAN networks, and 7000+ requests per second against local instances have been observed, with a very modest CPU, network, and memory footprint. This can be attributed to:
- Multiplexing single-thread, fully asynchronous network I/O and data processing model that eliminates memory management, scheduling, and IPC inefficiencies present in some multi-threaded clients.
- Advanced HTTP/1.1 features such as range requests, content compression, and keep-alive connections, as well as forced response size limiting, to keep network-level overhead in check.
- Smart response caching and advanced server behavior heuristics are used to minimize unnecessary traffic.
- Performance-oriented, pure C implementation, including a custom HTTP stack.
- Ease of use: skipfish is highly adaptive and reliable. The scanner features:
- Heuristic recognition of obscure path- and query-based parameter handling schemes.
- Graceful handling of multi-framework sites where certain paths obey completely different semantics, or are subject to different filtering rules.
- Automatic wordlist construction based on site content analysis.
- Probabilistic scanning features to allow periodic, time-bound assessments of arbitrarily complex sites.
- Well-designed security checks: the tool is meant to provide accurate and meaningful results:
- Handcrafted dictionaries offer excellent coverage and permit thorough $keyword.$extension testing in a reasonable timeframe.
- Three-step differential probes are preferred to signature checks for detecting vulnerabilities.
- Ratproxy-style logic is used to spot subtle security problems: cross-site request forgery, cross-site script inclusion, mixed content issues, MIME- and charset mismatches, incorrect caching directives, etc.
- Bundled security checks are designed to handle tricky scenarios: stored XSS (path, parameters, headers), blind SQL or XML injection, or blind shell injection.
- Report post-processing drastically reduces the noise caused by any remaining false positives or server gimmicks by identifying repetitive patterns.
What's New in This Release:
- Updated the HTML tags and attributes checked for URL XSS injections to also include several HTML5-specific ones.
- Updated the test and description for semicolon injection in HTML meta refresh tags (this is IE6-specific).
- Relaxed HTML parsing slightly to allow spaces between HTML tag attributes and their values (e.g. "foo =bar").
- Major update to the LFI tests, adding more dynamic tests (double encoding, a dynamic number of ../ sequences for web.xml). The total number of tests for this vulnerability is now 40 per injection point.