
Technique can capture a scene at multiple depths with one shutter click — no zoom lens needed.
MIT researchers have developed novel photography optics that capture images based on the timing of reflecting light inside the optics, instead of the traditional approach that relies on the arrangement of optical components. These new principles, the researchers say, open doors to new capabilities for time- or depth-sensitive cameras, which are not possible with conventional photography optics.
Specifically, the researchers designed new optics for an ultrafast sensor called a streak camera that resolves images from ultrashort pulses of light. Streak cameras and other ultrafast cameras have been used to make trillion-frame-per-second video, scan through closed books, and provide depth maps of 3-D scenes, among other applications. Such cameras have relied on conventional optics, which carry various design constraints. For example, a lens with a given focal length, measured in millimeters or centimeters, has to sit at a distance from an imaging sensor equal to or greater than that focal length to capture an image. In practice, this means the lens assemblies must be long.
In a paper published in this week’s Nature Photonics, MIT Media Lab researchers describe a technique that makes a light signal reflect back and forth off carefully positioned mirrors inside the lens system. A fast imaging sensor captures a separate image at each reflection time. The result is a sequence of images — each corresponding to a different point in time, and to a different distance from the lens. Each image can be accessed at its specific time. The researchers have coined this technique “time-folded optics.”
“When you have a fast sensor camera, to resolve light passing through optics, you can trade time for space,” says Barmak Heshmat, first author on the paper. “That’s the core concept of time folding. … You look at the optic at the right time, and that time is equal to looking at it in the right distance. You can then arrange optics in new ways that have capabilities that were not possible before.”

The new optics architecture includes a set of semireflective parallel mirrors that reduce, or “fold,” the focal length every time the light reflects between them. By placing this set of mirrors between the lens and sensor, the researchers shortened the optical arrangement by an order of magnitude while still capturing an image of the scene.
In their study, the researchers demonstrate three uses for time-folded optics for ultrafast cameras and other depth-sensitive imaging devices. These cameras, also called “time-of-flight” cameras, measure the time that it takes for a pulse of light to reflect off a scene and return to a sensor, to estimate the depth of the 3-D scene.
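The time-of-flight principle can be illustrated with a short sketch (our own back-of-envelope illustration, not the researchers’ code): depth follows directly from how long a pulse takes to travel to the scene and back.

```python
# Minimal sketch of the time-of-flight principle: depth is half the
# distance light covers during the pulse's round trip.
C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Depth = (speed of light * round-trip time) / 2."""
    return C * t_seconds / 2.0

# A pulse returning after ~6.67 nanoseconds came from a surface
# roughly one meter away.
print(depth_from_round_trip(6.67e-9))  # ~1.0 m
```

The factor of two accounts for the pulse traveling to the object and back again before reaching the sensor.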
Co-authors on the paper are Matthew Tancik, a graduate student in the MIT Computer Science and Artificial Intelligence Laboratory; Guy Satat, a PhD student in the Camera Culture Group at the Media Lab; and Ramesh Raskar, an associate professor of media arts and sciences and director of the Camera Culture Group.
Folding the optical path into time
The researchers’ system consists of a component that projects a femtosecond (quadrillionth of a second) laser pulse into a scene to illuminate target objects. Traditional photography optics change the shape of the light signal as it travels through curved glass elements, and this shape change creates an image on the sensor. With the researchers’ optics, however, instead of heading straight to the sensor, the signal first bounces back and forth between mirrors precisely arranged to trap and reflect light. Each of these reflections is called a “round trip.” At each round trip, some light is captured by the sensor, which is programmed to image at a specific time interval — for example, a 1-nanosecond snapshot every 30 nanoseconds.
A key innovation is that each round trip of light moves the focal point — where a sensor is positioned to capture an image — closer to the lens, allowing the lens system to be drastically condensed. Suppose a streak camera needs to capture an image with the long focal length of a traditional lens. With time-folded optics, the first round trip pulls the focal point closer to the lens by roughly twice the mirror separation, and each subsequent round trip brings it closer still. Depending on the number of round trips, the sensor can then be placed very near the lens.
By placing the sensor at a precise focal point, determined by the total number of round trips, the camera can capture a sharp final image, as well as different stages of the light signal, each coded at a different time, as the signal changes shape to produce the image. (The first few shots will be blurry, but after several round trips the target object comes into focus.)
In their paper, the researchers demonstrate this by imaging a femtosecond light pulse through a mask engraved with “MIT,” set 53 centimeters away from the lens aperture. To capture the image, the traditional 20-centimeter focal length lens would have to sit around 32 centimeters away from the sensor. The time-folded optics, however, pulled the image into focus after five round trips, with only a 3.1-centimeter lens-sensor distance.
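The numbers in this example are consistent with the thin-lens equation, 1/f = 1/d_object + 1/d_image. A quick sketch checks them (this is our own illustrative arithmetic, not the researchers’ code, and the implied mirror spacing is an inference from the reported figures):

```python
# Back-of-envelope check of the article's example using the
# thin-lens equation: 1/f = 1/d_obj + 1/d_img.
f = 20.0      # focal length, cm
d_obj = 53.0  # distance from lens to the "MIT" mask, cm

d_img = 1.0 / (1.0 / f - 1.0 / d_obj)
print(round(d_img, 1))  # ~32.1 cm: the conventional lens-sensor distance

# Each round trip between the semireflective mirrors shortens the
# remaining lens-sensor distance by roughly twice the mirror
# separation. Working backward from the reported 3.1 cm after
# 5 round trips implies a spacing near 2.9 cm (our inference):
round_trips = 5
d_final = 3.1  # reported lens-sensor distance, cm
spacing = (d_img - d_final) / (2 * round_trips)
print(round(spacing, 1))  # ~2.9 cm implied mirror spacing
```

The first result matches the article’s roughly 32-centimeter conventional lens-sensor distance.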
This could be useful, Heshmat says, in designing more compact telescope lenses that capture, say, ultrafast signals from space, or in designing smaller, lighter lenses for satellites that image Earth’s surface.
Multizoom and multicolor
The researchers next imaged two patterns spaced about 50 centimeters apart, each within the camera’s line of sight. An “X” pattern sat 55 centimeters from the lens, and a “II” pattern 4 centimeters from the lens. By precisely rearranging the optics — in part, by placing the lens between the two mirrors — they shaped the light so that each round trip created a new magnification in a single image acquisition. In effect, the camera zooms in with each round trip. When they shot the laser into the scene, the result was two separate, focused images created in one shot: the X pattern captured on the first round trip, and the II pattern captured on the second.
The researchers then demonstrated an ultrafast multispectral (or multicolor) camera. They designed two color-reflecting mirrors and a broadband mirror — one color-reflecting mirror tuned to reflect one color, set closer to the lens, and one tuned to reflect a second color, set farther back from the lens. They imaged a mask with an “A” and a “B,” with the A illuminated by the second color and the B by the first, both for a few tenths of a picosecond.
When the light traveled into the camera, wavelengths of the first color immediately reflected back and forth in the first cavity, and their arrival time was clocked by the sensor. Wavelengths of the second color, however, passed through the first cavity into the second, slightly delaying their arrival at the sensor. Because the researchers knew which wavelength would hit the sensor at which time, they overlaid each image with its respective color. This could be used in depth-sensing cameras, which currently record only in infrared, Heshmat says.
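The demultiplexing idea amounts to mapping arrival time to wavelength: each color’s cavity adds a known delay, so the time bin alone identifies the color. A minimal sketch, in which the gate times and labels are hypothetical placeholders rather than values from the paper:

```python
# Illustrative time-to-color demultiplexing: the sensor's time bin
# identifies which cavity (and therefore which wavelength) the light
# came from. Bin edges and labels here are hypothetical.
GATES = [
    (0.0, 0.5, "color 1"),  # reflected in the first cavity: earliest arrival
    (0.5, 1.0, "color 2"),  # passed into the second cavity: delayed arrival
]

def color_for_arrival(t_ns: float) -> str:
    """Return the color label for a photon arriving at time t_ns."""
    for start, end, label in GATES:
        if start <= t_ns < end:
            return label
    return "unassigned"

print(color_for_arrival(0.2))  # color 1
print(color_for_arrival(0.7))  # color 2
```

In the actual system the sensor’s time gating plays the role of this lookup, assigning each captured frame to its wavelength.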
One key feature of the paper, Heshmat says, is it opens doors for many different optics designs by tweaking the cavity spacing, or by using different types of cavities, sensors, and lenses. “The core message is that when you have a camera that is fast, or has a depth sensor, you don’t need to design optics the way you did for old cameras. You can do much more with the optics by looking at them at the right time,” Heshmat says.
This work “exploits the time dimension to achieve new functionalities in ultrafast cameras that utilize pulsed laser illumination. This opens up a new way to design imaging systems,” says Bahram Jalali, director of the Photonics Laboratory and a professor of electrical and computer engineering at the University of California at Los Angeles. “Ultrafast imaging makes it possible to see through diffusive media, such as tissue, and this work holds promise for improving medical imaging, in particular for intraoperative microscopes.”