
System and Method to detect unauthorized behavior in a virtual world Disclosure Number: IPCOM000168877D
Original Publication Date: 2008-Apr-01
Included in the Prior Art Database: 2008-Apr-01
Document File: 5 page(s) / 139K


Executive Summary: This idea provides a system and method by which unsolicited behavior by avatars in a virtual world is identified, stopped, or registered. Such behavior is analogous to email spam in today's real world and the spam-blocking systems built to counter it. In the virtual world, peer-to-peer communication is the normal vehicle for exchanges between avatars. Having a mechanism to identify and regulate 'problem' avatars who are the source of unwanted data, to stop the source's stream, or to register the source on a blacklist are all aspects of this idea.

This text was extracted from a PDF file.
At least one non-text object (such as an image or picture) has been suppressed.
This is the abbreviated version, containing approximately 30% of the total text.

The basis of this idea is to provide a method by which unsolicited communications within the virtual world are identified, stopped, or registered to a blacklist. As virtual worlds begin to attract a greater number of users and a wider variety of demographics, unsolicited behavior will increase, and to many users it will become unwanted and intolerable.

The method for this idea may allow a virtual world to identify, stop, or register unwanted data sources. This may be used to keep misbehaving users away from children or from harassing other members of the virtual world. Actively monitoring the interactions of users once they are identified may lessen the risk that they abuse the system.

Rationale or Problem to Solve:

This idea solves several problems and challenges in detecting suspicious activity within a virtual universe and provides a general system for monitoring a virtual community for similar behavior patterns. These patterns may be identified through an avatar's behavior ratings (as referenced in another idea), and the offending source may be quarantined by shutting down its access point or registered on a blacklist.
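The complaint-driven quarantine described above could be sketched as follows. This is a minimal illustration, not the disclosure's implementation; the names (`BehaviorRegistry`, `COMPLAINT_LIMIT`) and the fixed report threshold are assumptions chosen for clarity.

```python
from collections import defaultdict

# Assumed policy: after this many reports, the source is quarantined.
COMPLAINT_LIMIT = 3

class BehaviorRegistry:
    """Tracks reports against avatars and blacklists repeat offenders."""

    def __init__(self) -> None:
        self.complaints: defaultdict[str, int] = defaultdict(int)
        self.blacklist: set[str] = set()

    def report(self, avatar_id: str) -> None:
        """Record a complaint; quarantine the source once the limit is hit."""
        self.complaints[avatar_id] += 1
        if self.complaints[avatar_id] >= COMPLAINT_LIMIT:
            self.blacklist.add(avatar_id)

    def is_blacklisted(self, avatar_id: str) -> bool:
        return avatar_id in self.blacklist
```

In practice the threshold would likely be weighted by the severity of each report rather than a flat count, but the registration flow is the same.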

For example, in order to restrict unsolicited behavior from other avatars, such behavior may be identified via visual or audio alerts indicating that an avatar has been flagged under certain circumstances, such as prior unsolicited behavior or a less than favorable avatar rating. This would give the virtual world the ability to identify a problem avatar before or during a peer-to-peer exchange, allowing an avatar the opportunity to decide whether or not to engage.
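A pre-exchange check of this kind might look like the sketch below. The `Avatar` type, the `RATING_THRESHOLD` value, and the alert strings are all hypothetical assumptions; the disclosure does not specify a rating scale.

```python
from dataclasses import dataclass
from typing import Optional

# Assumed minimum acceptable behavior rating (lower means more complaints).
RATING_THRESHOLD = 2.5

@dataclass
class Avatar:
    name: str
    behavior_rating: float
    blacklisted: bool = False

def pre_exchange_alert(target: Avatar) -> Optional[str]:
    """Return an alert message if the target avatar is flagged,
    otherwise None, leaving the decision to engage with the user."""
    if target.blacklisted:
        return f"{target.name} is blacklisted: exchange blocked"
    if target.behavior_rating < RATING_THRESHOLD:
        return f"Warning: {target.name} has a low behavior rating"
    return None
```

The returned message corresponds to the visual or audio alert in the text; a client would surface it before opening the peer-to-peer channel.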

An example may include the areas of Las Vegas where adult material is handed out to pedestrians walking past. In the real world, one can easily determine that these individuals are hoping to interact on an unsolicited basis to market an unwanted product. In the virtual world, it may be difficult to determine someone's true intentions. By identifying such avatars based on certain criteria, an avatar can readily determine whether or not to interact with them.

Business Value and Advantages:

Virtual communities may benefit from this idea by providing a safer environment for both children and other avatars. It may also keep solicitations and unwanted interactions to a minimum by making such avatars easy to identify visually or audibly. Once their behavior is identified, the rest of the avatar community can make a better-informed decision about whether or not to engage with them.


This idea may provide a safer and less commercial environment for all who wish to participate. This functionality may also draw credible vendors and additional users who wish to...