AEye develops advanced vision hardware, software, and algorithms that act as the eyes and visual cortex of autonomous vehicles.
"Recently, we announced the first commercial solid-state vision system capable of 360-degree coverage and real-time software configurability," says cofounder Jordan Greene, Vice President of Strategy. Those qualities are "key milestones for the delivery of high performance, compact, cost-effective advanced vision systems that enable self-driving vehicles to safely navigate roadways."
Greene says people outside the industry might be surprised that the automotive industry is moving away from traditional individual car ownership and toward a model focused on transporting individuals from point A to point B. According to Greene, three global factors are driving this transformation: vehicle utilization is below 4%, 400 billion hours are spent driving per year, and 3,500 people die in car accidents every day.
The sharing economy is disrupting many industries as each looks to optimize access and utilization through software and distributed networks, and the automotive industry is no different, he says. Safety is another factor in the rise of autonomous vehicles. "Driverless cars are predicted to reduce US auto accidents by 90 percent, making roads a whole lot safer," Greene says. "It's our job to help the automotive players get there safely."
AEye's "intelligent sensing" LiDAR (Light Detection and Ranging) system has several key differentiators, according to the company. It pre-fuses computer vision and LiDAR for intelligent data collection and rapid perception and motion planning; it uses software-configurable hardware, which enables vehicle control system software to selectively customize data collection in real-time; and it leverages edge processing to reduce control loop latency for optimal performance and safety.
The resulting intelligent, software-configurable vision system leads to smarter data collection and quicker perception while using fewer computational resources. Those are key components of a vision system that enables automotive players to safely navigate roadways and solve tough corner cases.
"Our core competency is in developing end-to-end solutions to address vehicle perception-reaction time," Greene says. "Recently, we were tasked with optimizing the distance that we could detect pedestrians. The requested distance was 200 meters so that a vehicle traveling 45 miles per hour would have approximately 10 seconds to detect and react to a pedestrian walking across the street. Accordingly, we optimized our system for range, and we exceeded the expectations of our customer."
AEye's Pleasanton office is the company's headquarters and primary facility. The staff of 40, which includes executives, engineers, and the business development team, moved into Hacienda about three years ago. The company's team is made up of executives and program managers from DARPA, Lockheed Martin, Northrop Grumman, NASA, Raytheon, the US Air Force, VLSI, and other defense and aerospace organizations. These employees bring their experience in military-grade targeting solutions to developing intelligent vision systems that enable vehicles to quickly perceive and react to their environment.
On November 15, AEye founder and CEO Luis Dussan will speak at the Pleasanton Chamber of Commerce's annual economic development luncheon. For more information about AEye, visit www.aeye.ai. For more information about the 2017 Pleasanton Chamber of Commerce economic development luncheon, visit business.pleasanton.org/Events.