Preventing inappropriate pictures from being shared

IP.com Disclosure Number: IPCOM000235948D
Publication Date: 2014-Mar-31
Document File: 2 page(s) / 30K

Publishing Venue

The IP.com Prior Art Database

Abstract

Disclosed is a system for learning what content is appropriate vs. inappropriate in a photograph taken with a smart phone based on the user’s profile, system training, and machine learning. The system then accepts the photograph as appropriate to share, alters inappropriate portions of the image, or denies the entire photograph for sharing because it is entirely inappropriate.

A method is needed for preventing inappropriate pictures from being sent to others or posted on a social networking site. Prior art exists for monitoring and reviewing video content, but little is available for addressing the distribution of inappropriate photographs.

The novel contribution is a system for learning what content is appropriate vs. inappropriate in a photograph taken with a smart phone based on the user's profile, system training, and machine learning. The system then accepts the photograph as appropriate to share, alters inappropriate portions of the image, or denies the entire photograph for sharing because it is entirely inappropriate.

The system, as well as the user, is trained to identify appropriate vs. inappropriate images. The system analyzes the photograph and assigns a score based on the level of inappropriate content. Enabling art is used to score the images. Based on the score, the system either denies the picture in its entirety, or alters (e.g., blurs, removes) aspects of the picture.
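The accept/alter/deny decision described above can be sketched as follows. This is a minimal illustration only: the 0-to-1 score scale, the two threshold values, and the region structure are assumptions, not values from the disclosure.

```python
from dataclasses import dataclass

# Hypothetical thresholds (assumptions for illustration, not from the disclosure).
ALTER_THRESHOLD = 0.3   # regions scoring above this are blurred/removed
DENY_THRESHOLD = 0.8    # any region above this causes the whole photo to be denied

@dataclass
class Region:
    label: str    # content label from an image classifier, e.g., "alcohol"
    score: float  # 0.0 (appropriate) .. 1.0 (inappropriate)

def review_photo(regions: list[Region]) -> tuple[str, list[str]]:
    """Return (decision, labels_to_alter) for a photo scored region by region."""
    # Deny the entire photo if any region is severely inappropriate.
    if any(r.score >= DENY_THRESHOLD for r in regions):
        return "deny", []
    # Otherwise blur/remove only the moderately inappropriate regions.
    to_alter = [r.label for r in regions if r.score >= ALTER_THRESHOLD]
    if to_alter:
        return "alter", to_alter
    return "accept", []
```

For example, a photo whose only region is a dog scored 0.05 would be accepted outright, while one containing a region scored 0.5 would be shared with that region altered.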

The system can perform a continuous analysis of the camera view and signal the user before the picture is taken when inappropriate content is in view. The system communicates to the user what is considered inappropriate by circling those parts of the denied or altered photograph. In addition, with more granular filtering, the system can provide scoring flags to indicate to the user the level of propriety (e.g., if 5% of the picture is low-inappropriate, the picture is acceptable; if 5% of the picture is highly inappropriate, the picture is not acceptable). Communicating this helps the user learn what is appropriate vs. inappropriate in a photograph.
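The granular flagging example above combines two signals: how much of the picture is flagged and how severe the flagged content is. A sketch of that rule, using the 5% and severity levels from the example (the 10% cutoff for low-severity content is an added assumption):

```python
def propriety_flag(area_fraction: float, severity: str) -> str:
    """Combine the flagged fraction of the image with its severity level.

    area_fraction: portion of the image flagged, 0.0-1.0
    severity: "low" or "high" (a two-level scale assumed for illustration)
    """
    # Even a small highly inappropriate region makes the picture unacceptable.
    if severity == "high" and area_fraction >= 0.05:
        return "not acceptable"
    # A small low-severity region is tolerated (10% cutoff is an assumption).
    if severity == "low" and area_fraction <= 0.10:
        return "acceptable"
    return "needs review"
```

So 5% low-inappropriate content yields "acceptable", while the same 5% at high severity yields "not acceptable", matching the worked example in the text.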

The system can also utilize existing facial recognition technology to add another variable to the scoring method (e.g., do not show images of a person's child).
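Folding facial recognition into the scoring method might look like the sketch below: when a recognized face matches the user's protected list (e.g., the user's child), the inappropriateness score is raised. The identity strings and the penalty value are illustrative assumptions; a real system would take identity matches from a face-recognition model.

```python
def adjust_score(base_score: float, detected_faces: set[str],
                 protected_faces: set[str]) -> float:
    """Raise the inappropriateness score when a protected person appears.

    detected_faces: identities matched by face recognition (hypothetical labels)
    protected_faces: identities the user never wants shared, e.g., {"child"}
    """
    if detected_faces & protected_faces:
        # 0.5 is a hypothetical penalty; score stays capped at 1.0.
        return min(1.0, base_score + 0.5)
    return base_score
```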

To implement the system for intercepting inappropriate pictures before the user shares those images:

1. System contains base information and is trained to determine whether the content is appropriate.

A. The system utilizes the user profile on the smart phone as a standard threshold for all pictures and houses the information in a database:

i. types of appropriate picture content for each age level or range (e.g., pictures of animals are appropriate for all ages; pictures of other people are appropriate for ages eight and up; pictures of swimwear or cigarettes are appropriate for ages 18 and up; pictures of alcohol are appropriate for ages 21 and up, etc.);

ii. a picture of the main user (i.e., related to the profile) to use for facial recognition of people in the image;

iii. could have a database of images with labels (e.g., picture of a bookshelf with the label bookshelf, a picture of a beer bottle with label beer/alcohol, etc.);

B. System is trained to understand what is appropriate for the main user:

i. (for youth) par...