
2020, 2(1): 1–11. Published: 2020-02-20

DOI: 10.1016/j.vrih.2019.10.004

Co-axial depth sensor with an extended depth range for AR/VR applications



Abstract:

Background
Depth sensors are essential elements in virtual and augmented reality devices for digitizing the user's environment in real time. The currently popular technologies include stereo, structured light, and time-of-flight (ToF). The stereo and structured light methods require a baseline separation between multiple sensors for depth sensing, and both suffer from a limited measurement range. ToF depth sensors offer the largest depth range but the lowest depth map resolution. To overcome these problems, we propose a co-axial depth map sensor that is potentially more compact and cost-effective than conventional structured light depth cameras. It extends the depth range while maintaining a high depth map resolution, and it provides a high-resolution 2D image along with the 3D depth map.
Methods
This depth sensor is constructed with a projection path and an imaging path, which are combined by a beamsplitter for a co-axial design. In the projection path, a cylindrical lens adds extra optical power in one direction, creating an astigmatic pattern. For depth measurement, the astigmatic pattern is projected onto the test scene, and the depth information is calculated from the contrast change of the reflected pattern image in two orthogonal directions. To extend the depth measurement range, an electronically focus-tunable lens is placed at the system stop, and its power is tuned to extend the depth range without compromising depth resolution.
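The astigmatism described above can be sketched with the thin-lens equation: the cylindrical lens adds power in one meridian only, so the two orthogonal directions come to focus at different distances, and the gap between the two line foci carries the depth cue. The numbers below (base power, cylinder power, pattern distance) are illustrative assumptions, not values from the paper.

```python
def image_distance(object_dist_m: float, power_diopters: float) -> float:
    """Thin-lens equation, 1/s' = P - 1/s, with distances in metres
    and power in diopters. Returns the image distance s'."""
    return 1.0 / (power_diopters - 1.0 / object_dist_m)

P_base = 10.0  # assumed base projection-lens power (D)
P_cyl = 0.5    # assumed extra power from the cylindrical lens in one meridian (D)
s = 0.2        # assumed pattern (object) distance, m

# Meridian without the cylinder power vs. the meridian with it:
s_no_cyl = image_distance(s, P_base)
s_cyl = image_distance(s, P_base + P_cyl)

# The meridian with added power focuses closer; the interval between
# the two foci is where the orthogonal contrasts diverge with depth.
assert s_cyl < s_no_cyl
```

Tuning the focus-tunable lens at the stop shifts both foci together, which is how the measurement sub-range is relocated without changing the astigmatic separation.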
Results
In the depth measurement simulation, we project a resolution target onto a white screen moving along the optical axis, and tune the focus-tunable lens power for three depth measurement sub-ranges: near, middle, and far. In each sub-range, as the test screen moves away from the depth sensor, the horizontal contrast in the reflected image keeps increasing while the vertical contrast keeps decreasing. Therefore, the depth information can be obtained by computing the contrast ratio between features in orthogonal directions.
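The contrast-ratio cue above can be sketched in a few lines: measure contrast independently along the two image axes and take their ratio, which the simulation shows is monotonic in depth within each sub-range. The contrast measure (RMS of finite differences) and the synthetic test image are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def directional_contrast(img: np.ndarray, axis: int) -> float:
    """RMS of finite differences along one axis as a simple contrast measure."""
    diffs = np.diff(img.astype(float), axis=axis)
    return float(np.sqrt(np.mean(diffs ** 2)))

def contrast_ratio(img: np.ndarray) -> float:
    """Horizontal-to-vertical contrast ratio, used here as the depth cue."""
    h = directional_contrast(img, axis=1)  # variation across columns (horizontal)
    v = directional_contrast(img, axis=0)  # variation across rows (vertical)
    return h / v

# Synthetic astigmatic pattern: vertical stripes (strong horizontal
# modulation) crossed with weak horizontal stripes, mimicking a scene
# position where the horizontal features are in sharper focus.
x = np.linspace(0.0, 8.0 * np.pi, 128)
strong = 0.5 + 0.5 * np.sin(x)  # high-amplitude modulation
weak = 0.5 + 0.1 * np.sin(x)    # low-amplitude modulation
img = np.outer(weak, strong)

assert contrast_ratio(img) > 1.0  # horizontal contrast dominates here
```

Mapping the ratio back to a metric depth would require the calibration curve of ratio versus distance for each focus-tunable-lens setting, which the paper obtains from its simulation.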
Conclusions
The proposed depth map sensor achieves depth measurement over an extended depth range with a co-axial design.
Keywords: Depth map sensor; 3D camera; Controlled aberration

Cite this article:

Mohan XU, Hong HUA. Co-axial depth sensor with an extended depth range for AR/VR applications. Virtual Reality & Intelligent Hardware, 2020, 2(1): 1–11. DOI: 10.1016/j.vrih.2019.10.004

