Method and apparatus for taking 3D pictures by mobile devices

IP.com Disclosure Number: IPCOM000234976D
Publication Date: 2014-Feb-20
Document File: 5 page(s) / 155K

Publishing Venue

The IP.com Prior Art Database

Abstract

In this disclosure, a method of capturing depth data with mobile devices is proposed. By adding an infrared emitter beside the camera and an infrared/visible light filter in front of the camera, the user can retrieve the depth data of an object with a single press of the shutter. A 3D model of the object can then be established to generate 3D pictures, print 3D objects, or serve any other purpose.

Currently, 3D or stereoscopic image capturing technologies applied to mobile phones can be categorized into three types:


i) Dual-camera stereoscopy


ii) Software/app-based stereoscopy


iii) 3D Sweep Panorama technology.

Dual-camera stereoscopy means that two cameras are precisely aligned, usually side by side, to mimic binocular human vision and recover depth information. There are two problems with this method:


1. It is more expensive, owing to the additional camera and the alignment issues it introduces.


2. Such a phone can only capture images of relatively distant objects, which must lie within the overlapping field of view of the two cameras.
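
For illustration only (not part of the original disclosure), the following sketch shows how a rectified dual-camera pair converts pixel disparity into depth by triangulation; the focal length, baseline, and disparity values are assumed, illustrative numbers.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulated depth for one matched pixel pair: Z = f * b / d."""
    if disparity_px <= 0:
        # Zero or negative disparity means the point lies outside the
        # overlapping field of view (or at infinity) and cannot be ranged.
        raise ValueError("point outside the usable stereo field of view")
    return focal_px * baseline_m / disparity_px

# Example: 800 px focal length, 2 cm baseline, 10 px disparity -> 1.6 m depth
print(depth_from_disparity(10.0, 800.0, 0.02))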

In software/app-based stereoscopy, users take two pictures with a single camera, each from a slightly different perspective. The required repositioning is calculated by the software and guided by cues shown on the screen. However, this approach requires users to reposition the phone very carefully; any movement of the object or improper repositioning will cause errors in the 3D picture.

3D Sweep Panorama utilizes two off-center bands in the CMOS sensor. While the camera is swept across a panoramic scene, these two narrow bands "scan" the scene from slightly different angles, capturing two slightly different panoramic images. Like the previous approach, this method can only be used to shoot stationary objects, since it needs to sweep across the scene.

On the other hand, motion-sensing (somatic) game devices utilize infrared light to capture 3D images. These devices work with an IR emitter, which sends out invisible infrared light, and a sensor, which receives the light reflected by the target object. Because the distances from different parts of the target object to the sensor vary, the built-in software derives the depth of each part via TOF (time of flight). This system is very complicated, with high computation cost and low resolution.
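
For illustration only, a minimal sketch of the time-of-flight relation used by such devices; the timing value in the example is an assumed, illustrative number.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface: d = c * t / 2 (half the round trip)."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a 10 ns round trip corresponds to roughly 1.5 m
print(tof_depth_m(10e-9))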

The present disclosure provides a method and apparatus for taking 3D pictures with mobile devices. By applying the method, the user can obtain a 3D photo with a single press of the mobile phone shutter. The capturing, analyzing, and processing of images can be accomplished by an ordinary mobile camera and CPU.

In this disclosure, we need to install an infrared emitter beside the rear camera of a mobile phone and a special light filter in front of the camera. The emitter projects a known light pattern onto an object or scene. The phone recovers scene geometry by analyzing distortions of the pattern, which is captured by the camera with the infrared filter. The following figure...
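
As a rough, hedged sketch of the structured-light principle described above (not the disclosure's actual algorithm), the projected pattern appears shifted in the filtered IR image by an amount that depends on depth, so per-pixel depth can be triangulated against a calibration plane, with the emitter acting much like a second camera. All names and parameters below are hypothetical.

import numpy as np

def depth_from_pattern_shift(shift_px: np.ndarray,
                             focal_px: float,
                             baseline_m: float,
                             reference_depth_m: float) -> np.ndarray:
    """Per-pixel depth from the pattern shift measured relative to a
    calibration plane at reference_depth_m, using the triangulation relation
    1/Z = 1/Z_ref + shift / (f * b), with positive shift taken as "closer"."""
    inv_depth = 1.0 / reference_depth_m + shift_px / (focal_px * baseline_m)
    return 1.0 / inv_depth

# Example: f = 800 px, baseline = 2 cm, calibration plane at 1 m.
# A 0 px shift stays at 1 m; a 5 px shift maps to about 0.76 m.
shifts = np.array([0.0, 5.0])
print(depth_from_pattern_shift(shifts, 800.0, 0.02, 1.0))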