March 25, 2020
New technologies aim to make 3D cameras easier to use
WEST LAFAYETTE, Ind. – A 3D camera should be as easy to use as one found on a smartphone.
That is the guiding principle for a Purdue University professor with more than two decades of experience in the 3D imaging field, who has developed new technologies aimed at making 3D cameras easier to use.
Song Zhang, a professor of mechanical engineering in Purdue’s College of Engineering, led a team to create technologies to help compress 3D camera files and automate focus and exposure settings.
“We have come a long way with high-end 3D camera technology,” Zhang said. “But using the technology still almost always requires a great deal of training. We want to create technologies to make 3D cameras easier to use for everyone from tourists to doctors to video producers.”
To obtain the best image with current high-end 3D cameras based on structured light techniques, the manufacturer must precisely calibrate the projector’s and camera’s focal lengths and other parameters, and the user must manually tune the sensor exposure time to its optimal value. As a result, users need training to operate the camera properly, and if the camera is accidentally disturbed, the manufacturer often must carry out a complicated recalibration.
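For readers unfamiliar with structured light, the sketch below illustrates the core computation in one common variant, three-step phase-shifting fringe projection: the projector displays three sinusoidal patterns shifted in phase, and the camera images are combined into a wrapped phase map that, together with the calibrated projector and camera parameters described above, is converted to depth. The three-step formula is standard; the synthetic example and function names are illustrative and not Zhang’s specific pipeline.

```python
import numpy as np

def wrapped_phase_three_step(i1, i2, i3):
    """Standard three-step phase-shifting formula: recover the wrapped
    phase from three fringe images with phase shifts of -2*pi/3, 0, +2*pi/3.
    Returns values in (-pi, pi]; converting phase to depth requires the
    calibrated projector and camera parameters discussed above."""
    i1, i2, i3 = (np.asarray(i, dtype=np.float64) for i in (i1, i2, i3))
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Purely synthetic check: three shifted cosine fringes along one line.
x = np.linspace(0.0, 4.0 * np.pi, 640)
fringes = [0.5 + 0.5 * np.cos(x + d)
           for d in (-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0)]
phase = wrapped_phase_three_step(*fringes)   # wrapped version of x
```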
Zhang’s team automated the profilometry process by developing algorithms that characterize the sensor’s intrinsic response function, which is constant for a given camera, and use it to rapidly determine the optimal exposure. The researchers also devised a method for generating highly accurate 3D images using an autofocusing feature based on electrically tunable lenses.
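As a rough illustration of the auto-exposure idea, the sketch below assumes a linear sensor response, so pixel intensities scale proportionally with exposure time; one captured image then suffices to predict an exposure that places the bright fringe pixels just below saturation. The thresholds, percentile, and function names are assumptions for this example, not values from the published algorithm.

```python
import numpy as np

def predict_exposure(image, current_exposure_us,
                     target_level=240, percentile=99.0, max_valid=250):
    """Estimate an exposure time that brings the bright fringe pixels
    close to, but below, sensor saturation for an 8-bit camera.

    Assumes the sensor responds linearly with exposure time, so pixel
    intensity scales proportionally when the exposure changes.
    target_level, percentile, and max_valid are illustrative tuning
    values, not parameters from the published method."""
    pixels = np.asarray(image, dtype=np.float64).ravel()
    usable = pixels[pixels < max_valid]            # ignore saturated pixels
    if usable.size == 0:                           # badly overexposed: back off
        return current_exposure_us * 0.5
    bright = np.percentile(usable, percentile)     # representative bright level
    if bright <= 0:                                # badly underexposed: extend
        return current_exposure_us * 2.0
    return current_exposure_us * target_level / bright

# Usage: capture one image at a known exposure, then update the camera.
# new_exposure = predict_exposure(captured_image, current_exposure_us=5000)
```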
“I believe 3D camera technology can have an even greater impact on the field than 2D camera technology ever has, provided it is easy enough for users,” Zhang said.
The Purdue team’s work is published in Optics Letters and Optics and Lasers in Engineering. Part of the research has been funded by the U.S. Department of Justice and the National Science Foundation.
The innovators have worked with the Purdue Research Foundation Office of Technology Commercialization (OTC) to patent the technologies. The office recently moved into the Convergence Center for Innovation and Collaboration in the Discovery Park District, adjacent to the Purdue campus.
Zhang’s company, Vision Express, optioned the technology through OTC. For more information on licensing a Purdue technology, contact otcip@prf.org.
About Purdue Research Foundation Office of Technology Commercialization
The Purdue Research Foundation Office of Technology Commercialization operates one of the most comprehensive technology transfer programs among leading research universities in the U.S. Services provided by this office support the economic development initiatives of Purdue University and benefit the university's academic activities through commercializing, licensing and protecting Purdue intellectual property. The office is managed by the Purdue Research Foundation, which received the 2019 Innovation and Economic Prosperity Universities Award for Place from the Association of Public and Land-grant Universities. The Purdue Research Foundation is a private, nonprofit foundation created to advance the mission of Purdue University. Visit the Office of Technology Commercialization for more information.
Writer: Chris Adam, 765-588-3341, cladam@prf.org
Source: Song Zhang, zhan2053@purdue.edu
ABSTRACT
Autofocusing method for high-resolution three-dimensional profilometry
Xiaowei Hu, Guijin Wang, Jae-Sang Hyun, Yujin Zhang, Huazhong Yang, and Song Zhang
State-of-the-art high-accuracy three-dimensional (3D) profilometry systems typically use a lens with a fixed focal length, making it difficult for them to measure scenes with large depth variations, especially dynamically changing ones. To address this need, this Letter proposes a novel, to the best of our knowledge, autofocusing method for high-resolution 3D profilometry with a digital fringe projection technique by (1) developing a novel continuous geometric parameter model for systems using electrically tunable lenses and (2) employing a focal plane detection algorithm. The validity of the proposed method is confirmed by experiments.
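The focal plane detection step can be pictured with a conventional contrast-based autofocus sweep, sketched below: step the electrically tunable lens through candidate control values, score each captured frame with a sharpness metric (here the variance of the Laplacian), and keep the setting with the highest score. The set_lens and capture callbacks and the sweep range are hypothetical placeholders; the Letter’s continuous geometric parameter model and its specific detection algorithm are not reproduced here.

```python
import numpy as np
from scipy.ndimage import laplace

def sharpness(image):
    """Variance of the Laplacian: a common contrast-based focus metric."""
    return float(np.var(laplace(np.asarray(image, dtype=np.float64))))

def find_focal_setting(set_lens, capture, settings):
    """Sweep an electrically tunable lens over candidate control values
    and return the value that maximizes image sharpness.

    set_lens(value): hypothetical callback that drives the tunable lens.
    capture():       hypothetical callback that grabs a camera frame.
    settings:        iterable of lens control values (e.g., drive currents)."""
    best_value, best_score = None, -np.inf
    for value in settings:
        set_lens(value)
        score = sharpness(capture())
        if score > best_score:
            best_value, best_score = value, score
    return best_value

# Example: coarse sweep over 21 drive-current values (illustrative range).
# best = find_focal_setting(set_lens, capture, np.linspace(0.0, 200.0, 21))
```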