Find a way to replace a large, heavy and expensive technology with an equivalent one that’s a lot smaller, lighter and cheaper and you have a shot at turning a boutique technology into a world changer. Think of the room-sized computers of the 1940s that now are outpowered by the run-of-the-mill central processing units in laptop computers. Or the miniaturized GPS components that contribute geolocation smartness in cell phones. DARPA program manager Joshua Conway has another shrinking act in mind: packing the light-catching powers of bulky lens-filled telescopes onto flat, semiconductor wafers that are saucer-sized or smaller, featherweight and cheap to make.
The primary goal of the newly announced Modular Optical Aperture Building Blocks (MOABB) program is to develop the advanced technologies it will take to build ultracompact light detection and ranging (LIDAR) systems, which use light to image objects and their motions in the same way that RADAR systems use radio waves. A LIDAR system beams light out and then precisely monitors the timing of reflections to map and track objects within its detection range. Unlike a camera, which captures a two-dimensional rendition of a three-dimensional scene, a LIDAR system essentially captures full-fledged three-dimensional reality. The basic technology already is out there—LIDAR allowed many robots at the DARPA Robotics Challenge to “see” and it enables autonomous vehicles to sense obstacles in their surroundings, for example—but those systems are too big, heavy and expensive for widespread use.
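The time-of-flight principle described above can be sketched in a few lines. This is a minimal illustration of the underlying math, not anything from the MOABB program; the function names and the 0.6-microsecond echo time are invented for the example.

```python
# Minimal sketch of the LIDAR time-of-flight principle: a pulse goes out,
# its reflection comes back, and the round-trip time gives the range.

C = 299_792_458.0  # speed of light in meters per second

def range_from_echo(round_trip_s: float) -> float:
    """Distance to a reflecting object, given the round-trip echo time.

    The light travels out and back, so the one-way distance is half
    the total path length."""
    return C * round_trip_s / 2.0

def radial_velocity(t1_s: float, t2_s: float, dt_s: float) -> float:
    """Radial velocity from two echo times measured dt_s apart.

    Positive means the object is receding from the sensor."""
    return (range_from_echo(t2_s) - range_from_echo(t1_s)) / dt_s

# An echo arriving 0.6 microseconds after the pulse left corresponds to an
# object roughly 90 meters away -- about a football field's distance.
print(round(range_from_echo(0.6e-6), 1))  # → 89.9
```

Repeating this measurement across many directions, many times per second, is what turns a single range finder into the three-dimensional, motion-tracking imager the article describes.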
The range of applications for compact LIDAR systems that can provide real-time data on even subtly changing positions and velocities of nearby objects is enormous. One of the most coveted capabilities that could emerge from the envisioned program, which could extend for five years with up to $58 million in funding, is a foliage-penetrating imager for spotting hidden threats—a breakthrough that could revolutionize situational awareness in contested areas. “You would be able to fly a MOABB-enabled helicopter or drone low over a lush forest canopy and be able to effectively peel back the leaves and see a sniper or a tank underneath,” Conway said. “It could instantaneously give you the range and velocity of everything up to a football field’s distance away with the resolution of a camera.” And with accompanying visualization tools, he added, “you would feel like you are on the ground with nothing blocking your vision.”
Other potential applications include collision avoidance systems for small unmanned aerial vehicles (UAVs) maneuvering in tight indoor spaces, precision motor control for robotic limbs and fingers, high-capacity light-based communications and data-transfer systems, and sophisticated gaming or training modules in which LIDARs would open up new worlds of immersive experience just as GPS and motion-sensing accelerometers have done in today’s systems. “Every machine that interacts with the 3D world—whether it is a manufacturing robot, UAV, car, or smartphone—could have a chip- or wafer-scale LIDAR on it,” Conway said.
To get a sense of the technology challenge MOABB poses, picture stripping a telescope of its lenses, mirrors and the interior space in which images come into focus. Jettison the mechanical parts, too, including the dials, gears and motors for focusing and steering the instrument. Now reconstitute all of the light-gathering and imaging roles these parts play in conventional telescopes in an array of 10,000 light-emitting and light-detecting semiconductor dots distributed on a disk about the size of a DVD. The result: performance equivalent to or better than today’s LIDARs that rely on bulky, telescope-like detectors.
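One way such a flat array can steer light with no moving parts is the phased-array technique used in chip-scale optical systems: each emitter is driven with a slightly different phase delay, so the combined wavefront tilts in a chosen direction. The sketch below illustrates that idea only; the wavelength, emitter spacing and function names are assumptions for the example, not MOABB specifications.

```python
import math

# Illustrative phased-array steering: a row of emitters tilts its combined
# beam electronically by advancing each element's phase a fixed step.
# Parameters here are assumed for the example, not program values.

WAVELENGTH_M = 1.55e-6   # a common near-infrared wavelength
PITCH_M = 2.0e-6         # assumed spacing between neighboring emitters

def steering_phases(n_emitters: int, angle_deg: float) -> list:
    """Per-emitter phase (radians, wrapped to [0, 2*pi)) for a 1-D row,
    chosen so the combined wavefront points angle_deg off the surface
    normal: phase_n = (2*pi / wavelength) * pitch * n * sin(angle)."""
    k = 2.0 * math.pi / WAVELENGTH_M
    sin_t = math.sin(math.radians(angle_deg))
    return [(k * PITCH_M * n * sin_t) % (2.0 * math.pi)
            for n in range(n_emitters)]

# Steering 10 emitters by 5 degrees: each element lags its neighbor by the
# same small phase step, and nothing mechanical has to move.
phases = steering_phases(10, 5.0)
print([round(p, 3) for p in phases[:3]])  # → [0.0, 0.707, 1.413]
```

Scaling this one-dimensional row up to the 10,000-element two-dimensional array the article describes is, in essence, the integration challenge MOABB is taking on.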
The first phase of the program calls for researchers to develop the fundamental devices that will underlie the new LIDAR concept: speck-sized light-emitting and light-detecting cells capable of being readily integrated into larger arrays using typical semiconductor manufacturing processes. Phase 2 and Phase 3 of the project call for the integration of these cells into a 1 cm2 array and a 10 cm2 array comprising upwards of 100 and 10,000 unit cells, respectively.
With an integration of digital, electronic, optical and radiofrequency elements on a variety of combined semiconductor materials, the final 10-cm aperture LIDAR surface has the potential to be the most complex electronic-photonic circuit ever constructed, according to an anticipated Broad Agency Announcement for the MOABB program, to be published later this month on FedBizOpps. As a first step on the way toward the new technology, DARPA will host a Proposers Day event at DARPA headquarters in Ballston, VA, on December 17, 2015, to provide potential proposers with detailed information on the objectives of the program. Details about the Proposers Day can be found in a Special Notice released today.
Image Caption: Even lush forest canopies could become effectively transparent with a future generation of light detection and ranging (LIDAR) technologies. (Source: Shutterstock)
Media with inquiries should contact DARPA Public Affairs at firstname.lastname@example.org