By combining the information from the Kinect depth frame in (a) with polarized photographs, MIT researchers reconstructed the 3-D surface shown in (c). Polarization cues can allow coarse depth sensors like Kinect to achieve laser scan quality (b).
Courtesy of the researchers
Algorithms exploiting light’s polarization boost resolution of commercial depth sensors 1,000-fold
MIT researchers have shown that by exploiting the polarization of light — the physical phenomenon behind polarized sunglasses and most 3-D movie systems — they can increase the resolution of conventional 3-D imaging devices as much as 1,000 times.
The technique could lead to high-quality 3-D cameras built into cellphones, and perhaps to the ability to snap a photo of an object and then use a 3-D printer to produce a replica.
Further out, the work could also abet the development of driverless cars.
“Today, they can miniaturize 3-D cameras to fit on cellphones,” says Achuta Kadambi, a PhD student in the MIT Media Lab and one of the system’s developers. “But they make compromises to the 3-D sensing, leading to very coarse recovery of geometry. That’s a natural application for polarization, because you can still use a low-quality sensor, and adding a polarizing filter gives you something that’s better than many machine-shop laser scanners.”
The researchers describe the new system, which they call Polarized 3D, in a paper they’re presenting at the International Conference on Computer Vision in December. Kadambi is the first author, and he’s joined by his thesis advisor, Ramesh Raskar, associate professor of media arts and sciences in the MIT Media Lab; Boxin Shi, who was a postdoc in Raskar’s group and is now a research fellow at the Rapid-Rich Object Search Lab; and Vage Taamazyan, a master’s student at the Skolkovo Institute of Science and Technology in Russia, which MIT helped found in 2011.
When polarized light gets the bounce
If an electromagnetic wave can be thought of as an undulating squiggle, polarization refers to the squiggle’s orientation. It could be undulating up and down, or side to side, or at any angle in between.
Polarization also affects the way in which light bounces off of physical objects. If light strikes an object squarely, much of it will be absorbed, but whatever reflects back will have the same mix of polarizations that the incoming light did. At wider angles of reflection, however, light within a certain range of polarizations is more likely to be reflected.
This is why polarized sunglasses are good at cutting out glare: Light from the sun bouncing off asphalt or water at a low angle features an unusually heavy concentration of light with a particular polarization. So the polarization of reflected light carries information about the geometry of the objects it has struck.
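For dielectric surfaces, the strength of this effect can be modeled quantitatively: the degree to which reflected light is polarized grows with the zenith angle between the surface normal and the viewing direction. The short Python sketch below evaluates the diffuse-reflection model commonly used in the shape-from-polarization literature; it is an illustration rather than code from the MIT paper, and the refractive index n = 1.5 is an assumed value.

import numpy as np

def degree_of_polarization(zenith, n=1.5):
    # Degree of polarization of diffusely reflected light as a function of the
    # zenith angle (angle between the surface normal and the viewing direction),
    # for a dielectric with refractive index n. Standard shape-from-polarization
    # model; n = 1.5 is an assumption, not a value from the paper.
    s, c = np.sin(zenith), np.cos(zenith)
    num = (n - 1.0 / n) ** 2 * s ** 2
    den = (2 + 2 * n ** 2
           - (n + 1.0 / n) ** 2 * s ** 2
           + 4 * c * np.sqrt(n ** 2 - s ** 2))
    return num / den

# Glancing views are far more strongly polarized than head-on views,
# which is the geometric cue the reconstruction exploits.
for deg in (0, 30, 60, 85):
    print(f"zenith {deg:2d} deg -> degree of polarization "
          f"{degree_of_polarization(np.radians(deg)):.3f}")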
This relationship has been known for centuries, but it’s been hard to do anything with it, because of a fundamental ambiguity about polarized light. Light with a particular polarization, reflecting off of a surface with a particular orientation and passing through a polarizing lens, is indistinguishable from light with the opposite polarization reflecting off of a surface with the opposite orientation.
This means that for any surface in a visual scene, measurements based on polarized light offer two equally plausible hypotheses about its orientation. Testing every possible combination of the two candidate orientations across all the surfaces in a scene, in order to identify the one that makes the most sense geometrically, is a prohibitively time-consuming computation.
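The ambiguity can be stated precisely. In the standard measurement model (notation ours, not the paper’s), the intensity a pixel records through a polarizing filter at angle \phi_{\mathrm{pol}} varies sinusoidally with the filter angle, and the phase of that sinusoid encodes the azimuth \varphi of the surface normal:

I(\phi_{\mathrm{pol}}) \;=\; \frac{I_{\max}+I_{\min}}{2} \;+\; \frac{I_{\max}-I_{\min}}{2}\,\cos\!\bigl(2(\phi_{\mathrm{pol}}-\varphi)\bigr)

Because the cosine depends on the azimuth only through 2\varphi, the azimuths \varphi and \varphi + 180° produce exactly the same measurements, which is the two-hypothesis ambiguity described above.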
Polarization plus depth sensing
To resolve this ambiguity, the Media Lab researchers use coarse depth estimates provided by some other method, such as the time a light signal takes to reflect off of an object and return to its source. Even with this added information, calculating surface orientation from measurements of polarized light is complicated, but it can be done in real time by a graphics processing unit (GPU), the kind of special-purpose chip found in most video game consoles.
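As a rough illustration of how a coarse depth map can break the two-way tie, the sketch below picks, per pixel, whichever polarization-derived azimuth candidate agrees better with the normal implied by the depth sensor. This is a simplified per-pixel heuristic under assumed inputs, not the global optimization the paper actually solves.

import numpy as np

def disambiguate_azimuth(phi_candidates, coarse_normals):
    # phi_candidates: (H, W) azimuths in radians from the polarization fit;
    #                 the competing hypothesis at each pixel is phi + pi.
    # coarse_normals: (H, W, 3) unit normals derived from the coarse depth map.
    # Returns the azimuth at each pixel that lies closer to the coarse normal.
    coarse_azimuth = np.arctan2(coarse_normals[..., 1], coarse_normals[..., 0])

    def angular_distance(a, b):
        d = np.abs(a - b) % (2 * np.pi)
        return np.minimum(d, 2 * np.pi - d)

    flip = (angular_distance(phi_candidates + np.pi, coarse_azimuth)
            < angular_distance(phi_candidates, coarse_azimuth))
    return np.where(flip, phi_candidates + np.pi, phi_candidates)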
The researchers’ experimental setup consisted of a Microsoft Kinect — which gauges depth using reflection time — with an ordinary polarizing photographic lens placed in front of its camera. In each experiment, the researchers took three photos of an object, rotating the polarizing filter each time, and their algorithms compared the light intensities of the resulting images.
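Three filter orientations suffice because the per-pixel intensity model has only three unknowns: the mean intensity, the modulation amplitude, and the phase. Below is a minimal sketch of that fit, assuming filter angles of 0, 45, and 90 degrees (the specific angles are an assumption; the article does not state them).

import numpy as np

def fit_polarization(i0, i45, i90):
    # Recover the per-pixel polarization sinusoid from three photographs taken
    # through a polarizing filter at 0, 45 and 90 degrees (assumed angles; any
    # three distinct angles determine the three unknowns).
    # Model: I(phi_pol) = A + B * cos(2 * (phi_pol - phi))
    A = 0.5 * (i0 + i90)               # mean intensity
    a = 0.5 * (i0 - i90)               # B * cos(2 * phi)
    b = i45 - A                        # B * sin(2 * phi)
    B = np.hypot(a, b)                 # modulation amplitude
    phi = 0.5 * np.arctan2(b, a)       # normal azimuth, modulo 180 degrees
    dop = np.where(A > 0, B / A, 0.0)  # degree of polarization
    return A, dop, phi

The ratio of amplitude to mean intensity gives the degree of polarization, which constrains the zenith angle of the surface normal, while the phase gives its azimuth up to the 180-degree ambiguity noted earlier.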
On its own, at a distance of several meters, the Kinect can resolve physical features as small as a centimeter or so across. But with the addition of the polarization information, the researchers’ system could resolve features in the range of tens of micrometers, or one-thousandth the size.
Read more: Making 3-D imaging 1,000 times better