
SYSTEMS AND METHOD FOR PREVENTING AUTOMATED ATTACKS

IP.com Disclosure Number: IPCOM000236521D
Publication Date: 2014-May-01
Document File: 6 page(s) / 79K

Publishing Venue

The IP.com Prior Art Database

Abstract

Various forms of automated web traffic may be directed at websites for malicious or abusive purposes. Two common examples of automated abusive traffic are Denial of Service (DoS) attacks and scraping. In general, DoS attacks may include attempts to make a network resource, or a given network-connected machine, unavailable to other users, for example, by inundating the network resource with such a high volume of requests or traffic that the network resource is unable to respond to legitimate requests, or is only able to do so at a reduced capacity or rate. Scraping may include processes of automatically harvesting information from websites. Various other types of abusive web traffic may also be perpetrated against websites. One commonly employed approach to protecting against abusive traffic is the use of simple traffic analysis. Traffic analysis may allow anomalous behavior (e.g., in terms of external requests or traffic) to be detected and stopped.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 31% of the total text.



    Various forms of automated web traffic may be directed at websites for malicious or abusive purposes. Two common examples of automated abusive traffic are Denial of Service (DoS) attacks and scraping. In general, DoS attacks may include attempts to make a network resource, or a given network-connected machine, unavailable to other users, for example, by inundating the network resource with such a high volume of requests or traffic that the network resource is unable to respond to legitimate requests, or is only able to do so at a reduced capacity or rate. Scraping may include processes of automatically harvesting information from websites. Various other types of abusive web traffic may also be perpetrated against websites.

    The various types of automated abusive traffic may be perpetrated for malicious or harassing purposes, or may have greater criminal intent. Therefore, whether simply to maintain its capacity to continue providing services to legitimate traffic, or to protect against greater criminal intent, a website must typically implement systems to protect itself from automated abusive traffic such as DoS attacks, scraping, and the like. One commonly employed approach to protecting against abusive traffic is the use of simple traffic analysis. Traffic analysis may allow anomalous behavior (e.g., in terms of external requests or traffic) to be detected and stopped.
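The disclosure does not specify how such traffic analysis is performed; one minimal sketch, under the assumption that "anomalous behavior" is defined as an excessive per-client request rate within a sliding time window, might look as follows. The window length and threshold are illustrative values, not taken from the source.

```python
import time
from collections import defaultdict, deque

# Illustrative parameters (assumptions, not from the disclosure).
WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 100

# Per-client timestamps of recent requests.
_request_times = defaultdict(deque)

def is_anomalous(client_ip, now=None):
    """Record a request from client_ip and return True if its request
    rate within the sliding window exceeds the allowed threshold."""
    now = time.time() if now is None else now
    times = _request_times[client_ip]
    times.append(now)
    # Discard timestamps that have fallen out of the sliding window.
    while times and now - times[0] > WINDOW_SECONDS:
        times.popleft()
    return len(times) > MAX_REQUESTS_PER_WINDOW
```

A protection system built on this idea would refuse or throttle requests from any client for which `is_anomalous` returns True, which is sufficient to stop the simplest single-source flooding or scraping described above.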

    While protection systems, such as simple traffic analysis, may be implemented to reduce or prevent attacks, attackers may often persist in their attempts to disrupt or otherwise attack a website, at times even viewing the protection measures implemented by websites as a challenge to be overcome. For example, successful attacks against search engines, such as Google or Bing, may allow web searches to be scraped to learn the ranking algorithms utilized by the search engines, or may censor specific blogs or overwhelm them with traffic such that they become inaccessible to legitimate users. As such, attackers may have much to gain from circumventing protection measures that may be implemented by websites. One way in which attackers may attempt to circumvent the protection measures implemented by a website is through reverse engineering the protection measures, for example, to identify ways in which the protection may be defeated. Trial-and-error attacks may be utilized to learn about a website's defenses. As a trivial example, an attacker may guess that a particular IP address has been blacklisted. Such a guess may be confirmed by sending abusive traffic from different IP addresses, and may allow the attacks to continue from unblocked IP addresses.
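The trivial blacklist example above can be sketched as follows. The handler, the blacklist contents, and the probing sequence are all hypothetical; the addresses come from the reserved documentation ranges. The point is that a request refused from one source address but served from another confirms, by trial and error, that the defense is address-based.

```python
# Hypothetical static IP blacklist (addresses are from reserved
# documentation ranges, used purely for illustration).
BLACKLIST = {"203.0.113.7"}

def handle_request(source_ip):
    """Return an HTTP-style status code for a request from source_ip."""
    if source_ip in BLACKLIST:
        return 403  # request refused: source address is blacklisted
    return 200      # request served normally

# An attacker probing the defense by varying the source address:
blocked = handle_request("203.0.113.7")    # refused
unblocked = handle_request("198.51.100.9") # served; the attack can continue
```

Because the check keys only on the source address, the defense is defeated the moment the attacker switches to any address outside the blacklist, which is exactly the circumvention the disclosure describes.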

    In response to circumvention of the protection systems by attackers, a website defending against attacks may modify the protection algorithms that it uses. However, attackers will often persist with their trial-and-error attempts to circumvent the modified p...