
Situated Augmented Reality

IP.com Disclosure Number: IPCOM000239026D
Publication Date: 2014-Oct-02
Document File: 1 page(s) / 19K

Publishing Venue

The IP.com Prior Art Database


When AR overlays a video of a human interacting with the real environment behind them, the human must be positioned very accurately so that they appear to be genuinely interacting with the machine the user sees through their camera. This idea improves the positional accuracy of the video overlay in this situation by matching transparent elements of the recorded environment to the same elements of the environment viewed through the AR user's device.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 51% of the total text.



Current methods of delivering augmented reality do not provide a continuous and realistic experience for the user. They place a video into a scene but do not allow that video to engage with its actual surroundings. The resulting disjointed view imposes extra cognitive strain, which can make augmented reality unsuitable for training or simulation.

    The example situation for this AR application is an oil rig. A user can point their device at a large machine with many buttons (e.g. a pumping unit) and see an overlaid person talk them through which buttons they need to press.

    The problem with current AR techniques is that, based on a small marker (whether a QR code or a picture), the overlay cannot be positioned accurately enough to place the person in exactly the correct place in the scene.

The solution is in a number of steps:

1. Take a video of the person in front of the actual machine, demonstrating its use.

2. Remove the parts of the video that are completely irrelevant to the scene, i.e. background that would not be the same at the site of a different machine ("removed parts").

3. Only the person needs to be visible in the overlaid scene in the AR environment ("visible part"), not the machine itself, since the machine is already in the camera view; so make the non-human background transparent ("transparent parts").

4. Instead of ignoring the transparent parts, use them as part of the marker: match the picture from the live camera feed to the machine image stored in the transparent part, in order to transform th...
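The steps above can be sketched in code. The following is a minimal illustrative sketch, not the disclosure's implementation: it reduces step 4 to a brute-force translation search (sum of squared differences between the stored "transparent part" machine image and the live frame), then reuses the matched offset to composite the "visible part" (the person) onto the frame. All function and variable names here are hypothetical; a real system would use feature matching and a full perspective transform (e.g. a homography estimated with a library such as OpenCV) rather than a translation-only search over grayscale lists.

```python
# Illustrative sketch only: pure-Python, translation-only matching on
# small grayscale images represented as nested lists of pixel values.

def locate_patch(frame, patch):
    """Find where the stored machine image ("transparent part") best
    matches the live camera frame, by minimising the sum of squared
    differences over every possible offset. Returns (row, col)."""
    fh, fw = len(frame), len(frame[0])
    ph, pw = len(patch), len(patch[0])
    best_ssd, best_pos = None, (0, 0)
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            ssd = sum(
                (frame[r + i][c + j] - patch[i][j]) ** 2
                for i in range(ph)
                for j in range(pw)
            )
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos

def place_overlay(frame, person, alpha, offset):
    """Composite the recorded person ("visible part") onto the frame
    at the matched offset. The alpha mask marks which overlay pixels
    belong to the human; transparent pixels leave the live camera
    view (the real machine) untouched."""
    r0, c0 = offset
    out = [row[:] for row in frame]  # copy; do not mutate the input
    for i, row in enumerate(person):
        for j, px in enumerate(row):
            if alpha[i][j]:  # opaque pixel: part of the human
                out[r0 + i][c0 + j] = px
    return out
```

Because the overlay's position is derived from matching the actual machine image rather than a small separate marker, the person lands in the correct place relative to the machine even when the AR user's viewpoint differs from where the marker alone would have put them.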