Measuring Audience Engagement via Visual Recognition and Social Media Activity

IP.com Disclosure Number: IPCOM000247936D
Publication Date: 2016-Oct-12
Document File: 3 page(s) / 31K

Publishing Venue

The IP.com Prior Art Database

Abstract

Disclosed are a method and system for measuring audience engagement during a live presentation. The system identifies the number of active mobile device screens in the audience and how often people look at the mobile devices, measures engagement via social media, analyzes the numbers, and ultimately communicates an engagement score to the presenter.



The problem addressed herein is that live presenters cannot effectively determine the audience's level of engagement. This is especially true given the ubiquity of mobile devices: an audience is easily distracted by messages and social media.

The novel solution is a method and system for measuring audience engagement (i.e., how many people are listening to the presenter) by identifying the number of active mobile device screens in the audience and how often people look at the mobile devices. The system also measures engagement via social media. The system is intended for use during a live presentation such as an industry conference keynote presentation.

The system comprises:

• Gaze detection systems to determine where audience members are looking
• Image recognition systems to detect audience members taking photographs
• Image recognition systems to track active mobile device screens
• Online tracking systems that monitor for:
  - Certain keywords and hashtags on social media sites
  - Traffic on sites named in the presentation (e.g., if the keynote speaker names a demo site, the system watches for increased traffic on that site)
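The keyword and hashtag monitoring component can be sketched as a simple filter over a batch of social media posts. This is a minimal illustration, not the disclosure's implementation; the post format (plain strings) and the function name are assumptions, and a real system would read from a streaming social media API.

```python
def count_keyword_posts(posts, keywords):
    """Count posts that mention any tracked keyword or hashtag.

    posts    -- iterable of post texts (assumed plain strings)
    keywords -- tracked keywords/hashtags, e.g. ["#KeynoteX"]
    Matching is case-insensitive substring matching, a deliberate
    simplification of real keyword tracking.
    """
    tracked = [k.lower() for k in keywords]
    return sum(1 for text in posts
               if any(k in text.lower() for k in tracked))
```

For example, `count_keyword_posts(["Loving the #KeynoteX demo!"], ["#keynotex"])` counts one matching post for the measurement period.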

The system aggregates all of this information into an overall engagement score, a heat map (to see which parts of the audience are paying attention and which are not), and a "top N" or "bottom N" video reel that shows the segments of the presentation with the highest or lowest engagement scores.
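The "top N" / "bottom N" reel selection described above can be sketched as ranking per-period scores and returning the chosen segments in chronological order for playback. This is an illustrative sketch; the segment representation (`(start_time, score)` tuples) is an assumption, not from the disclosure.

```python
def top_n_segments(segment_scores, n, lowest=False):
    """Pick the n segments with the highest (or lowest) engagement.

    segment_scores -- list of (start_time, score) tuples (assumed format)
    Returns the selected segments sorted by start time, so the
    video reel plays them in presentation order.
    """
    ranked = sorted(segment_scores, key=lambda s: s[1], reverse=not lowest)
    return sorted(ranked[:n])  # chronological order for the reel
```

Passing `lowest=True` yields the "bottom N" reel showing where the audience disengaged.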

The system receives the following inputs:

• Gaze detection (prior art), including time-based measures (how often the user looks up at the presenter vs. down at a device)
• Detection of active mobile device screens
• Measurement of social media engagement (e.g., posts) or other online engagement (e.g., visits to a website mentioned in the presentation)
• Detection of people taking pictures

The system can be tuned to apply different weights to different inputs and can apply presets for prescribed audience types.
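The tunable weights and audience-type presets can be sketched as a default weight table overridden by a preset and then by per-event tuning. The preset names and weight values below are illustrative assumptions, not values from the disclosure.

```python
DEFAULT_WEIGHTS = {"eye_contact": 1.0, "mobile_screens": 1.0,
                   "social_media": 1.0, "photos": 0.5}

PRESETS = {
    # Hypothetical preset: at a developer conference, heavy device use is
    # expected (live-tweeting, following along), so screens count less
    # against engagement and social media counts more toward it.
    "tech_conference": {"mobile_screens": 0.3, "social_media": 1.5},
    # Hypothetical preset: in a lecture, device use more likely signals
    # distraction.
    "lecture": {"mobile_screens": 1.5, "social_media": 0.5},
}

def weights_for(audience_type=None, **overrides):
    """Resolve weights: defaults, then preset, then explicit overrides."""
    w = dict(DEFAULT_WEIGHTS)
    w.update(PRESETS.get(audience_type, {}))
    w.update(overrides)
    return w
```

For example, `weights_for("tech_conference")` keeps the defaults but down-weights visible mobile screens, while `weights_for("lecture", photos=2.0)` further overrides a single input.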

To create the engagement scores from the inputs:

1. For each measurement period (tunable; the default is 10 seconds, and special measurements may be taken at major transitions such as slide changes), the system:

A. Uses an audience-facing camera to measure:
i. gaze detection for each individual using image/face recognition


ii. the number of individuals taking pictures of the speaker using image recognition

B. Uses an overhead camera to measure the number of active mobile device screens using image recognition

C. Uses event-specific, user-configured hashtags to measure the number of outgoing social media communications over the measurement period

D. Measures the number of Uniform Resource Locator (URL) hits for any URLs referenced in the presentation

2. The system uses a weighting function (with weights tunable by the user) of the form:

W1 * (eye contact) - W2 * (mobile screens) + W3 * (tweets + URL hits) + W4 * ...
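A minimal sketch of this per-period scoring step, implementing the formula only as far as the disclosure states it (the formula is truncated after the W4 term, so only the first three terms appear here). All parameter names and default weights are illustrative assumptions.

```python
def engagement_score(eye_contact, mobile_screens, tweets, url_hits,
                     w1=1.0, w2=1.0, w3=1.0):
    """W1 * (eye contact) - W2 * (mobile screens) + W3 * (tweets + URL hits).

    eye_contact    -- count of audience members looking at the presenter
    mobile_screens -- count of active mobile device screens (subtracted,
                      since visible screens indicate distraction)
    tweets, url_hits -- social media posts and referenced-URL visits
                        during the measurement period (added)
    """
    return w1 * eye_contact - w2 * mobile_screens + w3 * (tweets + url_hits)
```

For example, a period with 50 people making eye contact, 10 active screens, 5 posts, and 3 URL hits scores 50 - 10 + 8 = 48 under the default weights.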