System and Method for Sensitive Personal Detector for Social Sharing

IP.com Disclosure Number: IPCOM000249065D
Publication Date: 2017-Jan-31
Document File: 2 page(s) / 35K

Publishing Venue

The IP.com Prior Art Database

Abstract

Disclosed are a smart system and method to dynamically identify items in a user’s video that, if shared online, can lead to negative consequences (e.g., theft, physical harm, etc.). The system comprises a smart camera filter that identifies personally identifiable items and uses a filtering system for videos or images that highlights, blocks, grays-out, or replaces the items.



As users of social media and streaming video websites add videos for sharing, the risk of inadvertently sharing too much personal information (e.g., home address, current location, personal possessions, etc.) is high. A dynamic method or system is needed to assist users in identifying and filtering or blocking personally identifiable items from appearing in shared video content.

The novel contribution is a smart system and method to dynamically identify items in a user’s video that, if shared online, can lead to negative consequences (e.g., theft, physical harm, etc.). The system comprises a smart camera filter that identifies personally identifiable items and uses a filtering system for videos or images that highlights, blocks, grays-out, or replaces the items.
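The filtering step described above can be sketched in code. The following is a minimal illustration (all names and the frame representation are assumptions, not part of the disclosure): given a frame and a set of detected sensitive regions, the filter applies the chosen action (highlight, block, or gray-out) to each region.

```python
# Hypothetical sketch of the filtering step: a frame is modeled as a
# 2-D grid of grayscale pixel values, and each detected sensitive item
# carries a bounding box and an action chosen by the user or system.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str                 # e.g. "certificate", "cash"
    x0: int; y0: int; x1: int; y1: int  # bounding box, half-open
    action: str = "highlight"  # "highlight" | "block" | "gray_out"

def apply_filters(frame: List[List[int]], detections: List[Detection]) -> List[List[int]]:
    """Return a copy of the frame with each detected region filtered."""
    out = [row[:] for row in frame]
    for d in detections:
        for y in range(d.y0, d.y1):
            for x in range(d.x0, d.x1):
                if d.action == "block":
                    out[y][x] = 0               # solid fill over the item
                elif d.action == "gray_out":
                    out[y][x] = out[y][x] // 2  # darken the item
                elif d.action == "highlight":
                    # only outline the region so the user can review it
                    if y in (d.y0, d.y1 - 1) or x in (d.x0, d.x1 - 1):
                        out[y][x] = 255
    return out
```

A production system would run an object detector per frame and apply the same per-region actions to real video buffers; the sketch only fixes the data flow from detections to filtered output.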

In addition, the system learns from the user's history to recognize behaviors and patterns, enabling it to dynamically block items for the user. Many users record home videos in the same room repeatedly. The system stores and learns the user’s behaviors and patterns to dynamically remove any personally identifiable items present in that room. The system creates metadata tags for the items and highlights or blocks those that should not be shared in an image or video.
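The learning behavior described above can be sketched as a simple preference store (the class and method names below are hypothetical): once the user blocks a tagged item in a given room, the same tag is blocked automatically in later videos from that room, while unfamiliar items are only highlighted for review.

```python
# Hypothetical sketch of learning from the user's history: metadata tags
# the user has blocked are remembered per room, so repeat recordings in
# the same room are filtered automatically.
class PreferenceStore:
    def __init__(self) -> None:
        self._blocked: dict[str, set[str]] = {}  # room -> blocked item tags

    def record_block(self, room: str, tag: str) -> None:
        """Remember that the user blocked this item tag in this room."""
        self._blocked.setdefault(room, set()).add(tag)

    def auto_action(self, room: str, tag: str) -> str:
        """Previously blocked tags are blocked without asking again;
        unknown tags are merely highlighted for user review."""
        if tag in self._blocked.get(room, set()):
            return "block"
        return "highlight"
```

Keying the store by room matches the disclosure's observation that users tend to reuse the same room; a richer system might also key by detected scene or channel.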

If the video shows multiple individuals (especially if the same group of individuals is consistently present in a series of videos), then the system can combine all of their preferences. In one embodiment, the system can assign a different highlight/blocking color for each person. This enables users to identify who may want certain items removed from the video.
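The preference-combining step might look like the following sketch (the palette, data model, and merge rule are assumptions for illustration): each person is assigned a distinct color, and for any item tag a "block" request from anyone overrides a mere "highlight".

```python
# Hypothetical sketch of combining preferences when several people appear
# in a video: each person gets a distinct highlight color, and "block"
# takes precedence over "highlight" for any given item tag.
PALETTE = ["red", "yellow", "cyan", "magenta"]

def combine_preferences(people: list) -> dict:
    """people: list of (name, {item_tag: action}) pairs, in order of
    appearance. Returns item_tag -> (action, color); the color identifies
    which person's preference is being applied."""
    combined: dict = {}
    for i, (name, prefs) in enumerate(people):
        color = PALETTE[i % len(PALETTE)]
        for tag, action in prefs.items():
            prev = combined.get(tag)
            # First preference wins unless a later person escalates
            # the action from "highlight" to "block".
            if prev is None or (action == "block" and prev[0] != "block"):
                combined[tag] = (action, color)
    return combined
```

The per-person color lets the reviewing user see at a glance whose preference triggered each highlighted or blocked region, as the disclosure describes.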

For example, a user captures videos in an office for a streaming video channel. The user has a professional certificate hanging on a wall in the background, and that certificate shows personal information about the user. The system identifies the presence of personal information, and then tags and highlights the professional certificate for the user. This indicates to the user that this item should perhaps NOT be shared in the video. The user selects an option to block the item. The system then dynamically removes the professional certificate from all of the user’s videos from that point forward.

In another example, a user may take a self-video in the kitchen with some money placed on the counter in the background. The system identifies this item because it is unlikely that the user wants to show money in the house over the Web. The user may not even realize this could be a harmful situation, but the system recognizes that it can show other viewers where the user keeps money, which could be harmful to be shared over a social p...