
System, Method and Apparatus for Effective Detection of Tracked Parameters
Disclosure Number: IPCOM000235637D
Publication Date: 2014-Mar-17
Document File: 5 page(s) / 668K

Publishing Venue

The Prior Art Database


A new approach is described for detection of tracked parameters in Web applications. What distinguishes this technique from previous ones is the fact that a single login sequence is enough for the scanner to detect the session parameters. The key idea is to manipulate parameter values and test the effect they have on the session state. If the session state depends on a parameter, then changing its value will cause an out-of-session state (which would prevent the user from using the application until logging in again).
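The mutation-and-probe idea can be sketched as follows. This is a minimal illustration only, not the disclosed implementation; the helpers `replay_with_value`, `probe_in_session`, and `relogin` are hypothetical stand-ins for the scanner's transport layer.

```python
def detect_tracked_params(candidates, replay_with_value, probe_in_session, relogin):
    """Identify tracked parameters using a single login sequence.

    candidates: dict mapping parameter name -> its current (valid) value.
    replay_with_value(name, value): hypothetical helper that sends a probe
        request with the named parameter overridden to `value`.
    probe_in_session(): hypothetical helper returning True while the
        session is still valid.
    relogin(): hypothetical helper that restores a valid session.
    """
    tracked = []
    for name, value in candidates.items():
        # Corrupt only this parameter and observe the effect on the session.
        replay_with_value(name, value + "_mutated")
        if not probe_in_session():
            tracked.append(name)  # session state depends on this parameter
            relogin()             # restore a valid session before the next probe
    return tracked
```

Because each parameter is mutated in isolation, one recorded login suffices: a broken session after the probe directly implicates the mutated parameter.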

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 34% of the total text.


Method and Apparatus for Effective Detection of Tracked Parameters

Background. Functional web crawling often requires that the crawler maintain stateful crawling sessions, whereby the crawler first logs into the website as a registered role (using user-specified credentials) and only then starts exploring the website's functionality. For example, in black-box security products such as IBM Security AppScan Standard and Enterprise Edition, the scanner asks the user to provide login information so that the tool can search for candidate attack points within the business logic of the application.

The difficulty with this setting is that the scanner may go out of session during crawling. This can happen if (i) the automated scanner unintentionally triggers a logout operation, or (ii) the target site forces users out after a certain amount of time or number of operations, or when it detects anomalous interaction.

Once the crawler detects that it is out of session, it needs to log into the application again before the crawling session resumes. In practice, however, this action alone does not suffice. The reason is that the crawler also has an internal state consisting of one or more outstanding request queues. These are currently configured with parameters and cookies whose values date from the previous crawling session (before the crawler logged in again), and so draining the queues after logging in again renders the outstanding requests invalid.
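The repair step this implies can be sketched as follows, assuming a simplified model in which each queued request is a dict of parameter names to values (the names `tracked` and `fresh_values` are illustrative, not from the disclosure):

```python
def refresh_outstanding_requests(queue, tracked, fresh_values):
    """Rewrite tracked parameter/cookie values in queued requests after re-login.

    queue: list of dicts, each mapping parameter name -> value
        (a simplified stand-in for a real request object).
    tracked: set of parameter names known to be session-tracked.
    fresh_values: dict of values captured from the new login sequence.
    """
    for request in queue:
        for name in request:
            if name in tracked:
                # Only tracked parameters are overwritten; all others keep
                # the values recorded before the session was lost.
                request[name] = fresh_values[name]
    return queue
```

Overwriting only the tracked parameters is what keeps the drained requests valid: untracked values are left untouched, so the functional payload of each request is preserved.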

The parameters and/or cookies whose values in the outstanding requests need to be updated after logging in again, in order to make the requests valid, are referred to as the tracked parameters. The problem of identifying tracked parameters in (stateful) functional crawling is well known and important. Automating the specification of tracked parameters involves comparing two login sequences to determine which parameters change across the two sequences and how.
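That two-login comparison can be sketched as a simple value diff, assuming each recorded login is modeled as a dict of parameter names to observed values (a simplification of real recorded traffic):

```python
def diff_login_sequences(seq_a, seq_b):
    """Two-login heuristic: parameters whose values differ across two
    recorded login sequences are candidate tracked parameters.

    seq_a, seq_b: dicts of parameter name -> value observed in each login.
    """
    return {name for name in seq_a
            if name in seq_b and seq_a[name] != seq_b[name]}
```

Note that this diff only yields candidates: a parameter may vary across logins for reasons unrelated to session state (e.g. a timestamp), which is one source of the false positives discussed below.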

The problem of pinpointing exactly which parameters need to be tracked is important:

· If a false negative arises - i.e., a parameter requires tracking but is not specified as such - then the outstanding requests are left invalid, which leads to incomplete crawling.

· On the other hand, if a false positive arises - i.e., a parameter that need not be tracked is treated as a tracked parameter - then all occurrences of this parameter in outstanding requests are wrongly overwritten with its new value, which could corrupt the outstanding requests.

In both cases, the impact on the quality and coverage of the crawling process may be significant. This highlights the importance of detecting which parameters require tracking with high precision.

Background art. There are existing heuristics for the detection of tracked parameters. These include:

1. Coarse heuristics, such as marking parameters that match certain...