
Video Search using Smart Watch Gestures

IP.com Disclosure Number: IPCOM000247834D
Publication Date: 2016-Oct-06
Document File: 6 page(s) / 166K

Publishing Venue

The IP.com Prior Art Database

Abstract

Disclosed is a method and system for searching video in order to find a particular object moving in a particular way. The video search criteria combine cognitive image analysis of a user-selected object with smart watch gestures that define how that object should move.



As libraries of video content continue to grow at a rapid rate, users need an effective method for searching that content. Current search methods typically rely on text-based searching, whereby a video is assigned a title, a description, and tags, and search queries are executed against this text. Image-analytics tools enable smarter searches by identifying individual objects or people in a video stream, even if those objects were not tagged in the video.

Existing systems can identify objects in a video stream, and others can detect human motion activities, such as whether a person is walking. However, no existing art combines video object detection with smart watch gestures to create a movement profile for video search.

The novel contribution is a method and system that performs a video search by combining recognition of objects identified in a video with wrist-based gestures captured by a smart watch to create a movement profile.

The system combines object analytics with movement gestures issued on a smart watch to identify objects in a video. The user selects an object in the video and then issues a gesture indicating the type of movement for which the user is looking (e.g., up, down, left, right, twist, etc.). The smart watch registers (i.e., picks up) that movement. The system then searches for video content that contains the selected object and matches the movement pattern.

With this system, a user can find an object such as a beach ball whose movement profile matches the movements of the user's wrist, or find a person whose movement profile matches the movements of the user's body (e.g., walk five steps, jump, run, etc.).
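As an illustration only, the following Python sketch shows how a movement profile might pair the selected object with a gesture derived from smart watch motion data. The names (MovementProfile, classify_gesture) and the dominant-axis heuristic are assumptions for the sketch, not part of the disclosure; a real smart watch would use its own gesture-recognition facilities.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class MovementProfile:
    object_label: str   # label produced by image analysis of the touched object
    movement: str       # e.g., "up", "down", "left", "right", "twist"


def classify_gesture(accel_samples: List[Tuple[float, float, float]]) -> str:
    """Crudely classify a wrist gesture from (x, y, z) accelerometer samples.

    Toy heuristic: pick the dominant axis of net acceleration and map its
    sign to a movement label.
    """
    sx = sum(s[0] for s in accel_samples)
    sy = sum(s[1] for s in accel_samples)
    sz = sum(s[2] for s in accel_samples)
    axis, value = max(zip("xyz", (sx, sy, sz)), key=lambda p: abs(p[1]))
    if axis == "y":
        return "up" if value > 0 else "down"
    if axis == "x":
        return "right" if value > 0 else "left"
    return "twist"


# Example: the user touches a beach ball and sweeps the wrist upward.
profile = MovementProfile("beach ball",
                          classify_gesture([(0.1, 0.9, 0.0), (0.0, 1.2, 0.1)]))
print(profile)  # MovementProfile(object_label='beach ball', movement='up')
```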

Figure 1: Video search is performed based on the movement profile as measured by a smart watch


A user touches an object shown in a streaming video on a mobile device. In addition, the user issues a wrist- or body-based gesture with an associated smart watch. The combination of the selected object and the smart watch gesture creates a movement profile. The system applies the movement profile to search for instances of the given object moving in the indicated direction: it analyzes the current streaming video and other videos in a video library to find segments in which the movement profile is matched, and then presents the user with a result set of all videos and video segments matching that profile.
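A minimal sketch of this matching step appears below. It assumes a hypothetical data model in which an image-analytics pipeline has already annotated each video segment with the detected object and its dominant movement, so the search reduces to filtering segments on both fields; VideoSegment and search_library are illustrative names, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class VideoSegment:
    video_id: str
    start_s: float      # segment start, in seconds
    end_s: float        # segment end, in seconds
    object_label: str   # object detected in this segment
    movement: str       # dominant movement observed for that object


def search_library(segments: List[VideoSegment],
                   object_label: str, movement: str) -> List[VideoSegment]:
    """Return the segments whose detected object and movement match the profile."""
    return [s for s in segments
            if s.object_label == object_label and s.movement == movement]


library = [
    VideoSegment("beach.mp4", 12.0, 15.5, "beach ball", "up"),
    VideoSegment("beach.mp4", 40.0, 43.0, "beach ball", "left"),
    VideoSegment("park.mp4", 5.0, 9.0, "person", "walk"),
]
print(search_library(library, "beach ball", "up"))
# -> [VideoSegment(video_id='beach.mp4', start_s=12.0, end_s=15.5, ...)]
```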

The invention makes use of the following components (Figure 2):


• Mobile device - a mobile device to play a video
• Video - a streaming video playing on the mobile device
• Touch screen - a touch-sensitive screen to select a given object in a playing video
• Local network connection - a method to communicate with the smart watch (e.g., Bluetooth*)
• Internet connection - a method to receive the video stream
• Smart watch - a smart watch worn on a user's wrist
• Movement sensors - sensors that can detect mo...