
Project Description

Lasers and laser diodes are widely used as illumination sources for optical imaging techniques. Time-of-flight (ToF) cameras with laser diodes and range imaging systems based on optical interferometry with lasers are among these techniques, with applications in fields such as metrology and machine vision. ToF cameras can have imaging ranges of several meters, but offer only centimeter-level depth resolution. Range imaging based on optical interferometry, on the other hand, achieves micrometer- or even nanometer-scale depth resolution, but over very limited (sub-millimeter) imaging ranges. In this work, we propose a range imaging system based on multi-wavelength superheterodyne interferometry that simultaneously provides sub-millimeter depth resolution and an imaging range of tens to hundreds of millimeters. The proposed setup uses two tunable III-V semiconductor lasers and offers a flexible tradeoff between imaging range and resolution. Except for the scanning head, the system is built entirely with fiber connections, which allows it to be packaged as a portable device. We believe the proposed system can benefit many fields, such as metrology and computer vision.
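
The range/resolution tradeoff comes from the synthetic (beat) wavelength formed by the two lasers. The short Python sketch below illustrates the standard synthetic-wavelength relations; the ~20 pm wavelength separation and the depth-from-phase mapping are illustrative assumptions for this sketch, not parameters taken from the paper.

```python
import math

# Minimal sketch of the synthetic-wavelength relations behind the
# range/resolution tradeoff. The ~20 pm separation below is an
# illustrative assumption, not a value from the paper.

def synthetic_wavelength(lam1, lam2):
    """Synthetic (beat) wavelength of a two-wavelength interferometer."""
    return lam1 * lam2 / abs(lam1 - lam2)

def depth_from_phase(phase, lam_synth):
    """Map a synthetic-wavelength phase (radians) to depth.
    The factor of 2 accounts for the round trip to the target."""
    return phase * lam_synth / (4 * math.pi)

lam1 = 1550.00e-9          # first tunable laser, ~1550 nm
lam2 = 1550.02e-9          # second laser detuned by ~20 pm (assumed)
lam_s = synthetic_wavelength(lam1, lam2)

print(f"synthetic wavelength: {lam_s * 1e3:.1f} mm")         # ~120 mm
print(f"unambiguous depth range: {lam_s / 2 * 1e3:.1f} mm")  # ~60 mm
print(f"depth for a 1 rad phase: {depth_from_phase(1.0, lam_s) * 1e3:.2f} mm")
```

Tuning the two lasers closer together lengthens the synthetic wavelength and the unambiguous range, while tuning them farther apart shortens it and tightens the depth resolution; a ~60 mm range, as in the simulated line plot below, corresponds to a synthetic wavelength of roughly 120 mm.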

Publications

"High-depth-resolution range imaging with multiple-wavelength superheterodyne interferometry using 1550-nm lasers"
Fengqiang Li, Joshua Yablon, Andreas Velten, Mohit Gupta, Oliver Cossairt
Applied Optics, 56 (31) H51-H56, 2017
[PDF]

Images

Schematic of our proposed setup


Schematic of our proposed setup: Two tunable lasers are
used as the illumination sources. FC, fiber coupler; EOM, electro-optic
modulator; CIR, fiber circulator; APD, avalanche photo-diode; D1,
driver for EOM 1; and D2, driver for EOM 2. All fibers are polarization
maintaining. The NI card and the two drivers are controlled
with a computer. Red lines, sample arm. Black lines, reference arm.

Signal processing flow for a single point


Signal processing flow for one point: (a) simulated APD output;
(b) spectrum of simulated APD output; (c) signal after the mixer
in the quadratic sensor; (d) spectrum of mixer output; (e) signal after
the quadratic sensor; (f) spectrum of the quadratic sensor output;
(g) analog-to-digital conversion by the DAQ card (red line) of the
quadratic sensor output (black line); and (h) inset of the signal in (g).
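
To make this processing chain concrete, here is a minimal numerical sketch of the superheterodyne idea: square-law ("quadratic") detection moves the depth-dependent phase to the difference of the two beat frequencies, where it can be digitized and demodulated. All parameters (beat frequencies, sampling rate, synthetic wavelength, noise level) are illustrative assumptions rather than values from the paper, and the analog mixer stage inside the quadratic sensor is omitted.

```python
import numpy as np

# Illustrative superheterodyne phase-to-depth sketch (assumed parameters).
f1, f2 = 110e3, 100e3   # beat frequencies of the two EOM-driven channels (Hz)
fs = 2e6                # DAQ sampling rate (Hz)
T = 10e-3               # record length (s): 100 full periods of f1 - f2
lam_s = 0.12            # synthetic wavelength (m), e.g. ~20 pm separation at 1550 nm
d_true = 0.013          # simulated target depth (m)

t = np.arange(0, T, 1 / fs)
dphi = 4 * np.pi * d_true / lam_s   # depth encoded as a synthetic-wavelength phase

# (a) simulated APD output: two beat tones; depth appears as a phase offset
apd = (np.cos(2 * np.pi * f1 * t + dphi)
       + np.cos(2 * np.pi * f2 * t)
       + 0.05 * np.random.randn(t.size))

# (e) square-law detection creates a component at f1 - f2 carrying dphi
sq = apd ** 2

# (g) after digitization, I/Q demodulation at the difference frequency
# recovers that phase
df = f1 - f2
iq = np.mean(sq * np.exp(-1j * 2 * np.pi * df * t))
phase = np.angle(iq) % (2 * np.pi)

d_est = phase * lam_s / (4 * np.pi)
print(f"true depth {d_true * 1e3:.2f} mm, estimated {d_est * 1e3:.2f} mm")
```

Repeating such a per-point phase estimate while the galvo scanner steps across the scene yields per-pixel depth maps like those shown in the simulation and experiment figures below.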

Simulation for 3D object


3D simulation engine based on the proposed setup: (a) ground-truth 3D objects; (b) simulated 3D images with no noise added in the APD and the NI card; and (c) simulated 3D images with noise added. The x axis and y axis mark the pixel number in the x and y dimensions. Colors represent depth values from 0 to 100 mm; color bar units are in meters.

Line plot across the 3D object


(a) Depth values at pixels along the line for the simulated measurement with noise, the simulated measurement without noise, and the ground truth. The imaging range is about 60 mm, as shown in the plot; (b) inset of (a) showing the zoomed-in offset between the measurements and the ground truth. The noiseless simulated measurement overlaps with the ground truth.

Experimental setup for scanning


(a) Tilted plane being measured along with the schematic of
the scan; (b) driving signals to EOM 1 and galvo scanner; and (c) inset
of (b).

Captured signals


(a) DAQ card readout showing the phase-modulated signal;
and (b) inset of (a).

Calculated depth


Results and the calculated depth: (a) phase values calculated
at different points; and (b) the corresponding depth information.

Acknowledgements

This work was supported in part by NSF CAREER grant IIS-1453192; ONR grant N00014-15-1-2735; and DARPA REVEAL grant HR0011-16-C-0028.
