#InnovizTechnologies
Smart mobility market to reach $70.5 billion by 2027
According to Allied Market Research, the implementation of on-demand transportation services and supportive government initiatives regarding Smart Cities will drive growth in the global market. #AlliedMarketResearch #Bosch #Cisco #Excelfore #Ford #InnovizTechnologies #IntelligentTransportationSystems #MAASGlobal #QuaLiXInformationSystem #Siemens #SmartCities #SmartMobility #TomTomInternational #Toyota #TransportationServices
Futuremobile: Who will prevail in the battle over the "eyes"?
The “rotating coffee cans” atop Google’s autonomous vehicles were once symbolic of the future of driving. They are now quickly becoming quaint artifacts of a bygone era as low-cost LiDAR technologies emerge and more integrated approaches to object sensing start to take priority.
The highly contentious patent-infringement lawsuit between Waymo, Alphabet's autonomous vehicle company, and Uber emphasizes what's at stake, while at the same time distracting from the rapid pace of LiDAR technology development at startups and established companies alike.
When they first emerged in 2005, Velodyne’s 3D, real-time light detection and ranging (LiDAR) sensors were a welcome answer to the shortfalls of radar and cameras. By measuring the time of flight (ToF) from the laser to an object and back to the photodetector, and knowing the speed of light, LiDAR systems can precisely calculate distance, down to the centimeter, or less (Figure 1).
Figure 1. LiDAR uses the time-of-flight of optical pulses to calculate range to an accuracy of 2 cm at up to 200 m. (Image source: Delphi)
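As a rough illustration of the time-of-flight arithmetic behind those figures, the short Python sketch below (a generic calculation, not tied to any vendor's implementation) converts a round-trip time to range and shows how tight the timing budget is for centimeter-level accuracy:

```python
# Time-of-flight ranging: range is half the round-trip time multiplied by the speed of light.
C = 299_792_458.0  # speed of light, m/s

def tof_to_distance_m(round_trip_s: float) -> float:
    """Convert a measured round-trip time (seconds) to range (meters)."""
    return C * round_trip_s / 2.0

# A pulse returning after roughly 667 nanoseconds corresponds to a target about 100 m away.
print(tof_to_distance_m(667e-9))   # ~100 m

# Conversely, 2 cm of range accuracy requires resolving the round trip to ~133 picoseconds.
print(2 * 0.02 / C)                # ~1.33e-10 s
```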
On the other hand, radar uses RF waves and makes sense of the reflections. It doesn't depend on being able to "see" in the visible spectrum, so it works regardless of weather or lighting conditions. It can also see around objects, given the right algorithms and enough processing horsepower. However, it suffers from poor resolution, so it has difficulty distinguishing between objects, and it can't detect color.
Cameras and the computer vision algorithms behind them are excellent at reading signs, seeing people, and tracking roads and lane markings, but they typically cannot detect range and are easily blinded. That said, using two or more camera lenses, 3D information can be extracted for ranging, and vision algorithms and detectors are getting better at managing lighting extremes.
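To show why ranging from cameras alone is difficult, the sketch below applies the standard stereo-disparity relationship Z = f·B/d; the focal length and baseline are hypothetical values chosen only to illustrate how quickly a small disparity error becomes a large range error at distance:

```python
# Standard two-camera (stereo) ranging: depth Z = f * B / d, where f is the focal length
# in pixels, B is the baseline between the lenses, and d is the disparity in pixels.
# The camera parameters below are hypothetical, chosen only for illustration.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

f_px = 1400.0     # focal length in pixels (hypothetical camera)
baseline = 0.30   # 30 cm between lenses (hypothetical)

print(stereo_depth_m(f_px, baseline, disparity_px=10.0))  # 42 m
print(stereo_depth_m(f_px, baseline, disparity_px=1.0))   # 420 m: at long range, a single
                                                          # pixel of disparity error is huge
```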
For its core design, Velodyne combined multiple lasers and photodetectors (up to 64 in the high-end HDL-64E) with associated optics to get a vertical field of view (FoV) that's now 26.8˚ to 40˚. However, to get a full 360˚ horizontal view of the surroundings, Velodyne developed the distinct "can" that houses the optics and electronics, sitting atop a rotating platform. The lasers, detectors, and algorithms provide accurate 3D images of up to 1.3 million data points per second, while the mechanical rotation provides the 360˚ view. The HDL-64E has a range of 100 to 120 m and a rotation (refresh) rate of up to 20 Hz.
On the downside, the system costs up to $75,000, weighs 33 lb, and consumes 60 W.
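A back-of-the-envelope calculation using only the figures quoted above gives a feel for what the HDL-64E's point rate implies at its fastest rotation rate (an estimate only; the actual per-laser firing pattern differs):

```python
# Rough HDL-64E figures derived from the numbers quoted in this article.
points_per_second = 1_300_000
rotation_hz = 20          # upper end of the quoted refresh rate
lasers = 64

points_per_rev = points_per_second / rotation_hz        # ~65,000 points per revolution
points_per_laser_per_rev = points_per_rev / lasers      # ~1,016 returns per laser per revolution
azimuth_step_deg = 360 / points_per_laser_per_rev       # ~0.35 degrees between returns at 20 Hz

print(points_per_rev, points_per_laser_per_rev, azimuth_step_deg)
```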
LiDAR gets affordable

Though the cost of Velodyne's approach continues to come down, and Velodyne is introducing smaller versions, a fundamentally more cost-effective, lighter, and more aesthetic approach to LiDAR is needed. In that regard, Velodyne is playing catch-up to a flotilla of startups, as well as to Waymo, which grew out of Google's self-driving car project and is now a subsidiary of Alphabet.
Other startups include Israel’s Innoviz Technologies, which has promised to bring a $100 solid-state LiDAR to market by 2018. In the meantime, Quanergy Systems has already introduced the $250 S3, a solid-state, phased-array LiDAR that measures 9 x 6 x 6 centimeters (Figure 2).
The overall goal is to move from a bulky, mechanical system to a smaller, more discreet version of LiDAR based on bulk solid-state or fiber-based lasers. As Quanergy has shown, these have now become powerful and efficient enough to be considered for low-cost, low-power automotive applications [1]. Moving to solid state eliminates mechanical moving parts and improves reliability; while that has always been true, it is the improved performance of solid-state devices that now makes them a viable option.
Quanergy specifies a maximum range of 150 m at 8% reflectivity, and the S3 generates 0.5 million data points per second. At 100 m, the distance accuracy is +/- 5 cm. The FoV for Quanergy's solution is 120˚, both horizontally and vertically. To get 360˚ visibility, multiple S3s will be needed. Assuming four units, one at each corner of the vehicle, the total cost is $1,000 for a system that provides more data points than Velodyne's but costs far less.
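The trade-off can be made concrete with the numbers quoted here; the sketch below compares a hypothetical four-corner S3 arrangement against a single HDL-64E (it ignores FoV overlap and integration effort, so treat it as a first-order estimate):

```python
# First-order comparison of four corner-mounted S3 units against one HDL-64E,
# using only the figures quoted in this article.
s3_unit_cost_usd = 250
s3_points_per_s = 500_000
units = 4                                     # one per corner of the vehicle

total_cost_usd = units * s3_unit_cost_usd     # $1,000
total_points_per_s = units * s3_points_per_s  # 2,000,000 points/s

hdl64e_cost_usd = 75_000
hdl64e_points_per_s = 1_300_000

print(total_points_per_s / hdl64e_points_per_s)  # ~1.5x the point rate...
print(hdl64e_cost_usd / total_cost_usd)          # ...at roughly 1/75th of the cost
```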
Quanergy has partnered with Delphi, a Tier 1 automotive supplier, which has also invested in the company. Valeo SA, another supplier, is using LeddarTech Inc.’s proprietary algorithms and ASIC to provide a LiDAR solution to Audi for the A8 in 2019 and 2020.
Unlike Quanergy, LeddarTech focuses on processing in the digital domain, both before the signal is sent out and after it's received. It uses proprietary techniques to increase the effective sampling rate and resolution of the received signal and to recover the distance of every object in the field of view. While LeddarTech does supply related optics to get a project moving, its focus is on the processing: the optics can be sourced elsewhere.
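LeddarTech's techniques are proprietary, but the general idea of recovering range in the digital domain can be sketched as accumulating many noisy acquisitions so that weak echoes rise above the noise floor, then converting the detected peaks back to distances. The simulation below is a simplified, generic illustration, not the company's actual method; the sample rate and signal levels are assumptions:

```python
import numpy as np

C = 299_792_458.0            # speed of light, m/s
SAMPLE_RATE = 1e9            # 1 GS/s ADC, assumed for illustration
rng = np.random.default_rng(0)

def simulate_return(n_samples=2048, echoes_m=(35.0, 80.0)):
    """One noisy acquisition with weak echoes at the given ranges."""
    trace = rng.normal(0.0, 1.0, n_samples)          # receiver noise floor
    for d in echoes_m:
        idx = int(round(2 * d / C * SAMPLE_RATE))    # round-trip time -> sample index
        trace[idx] += 0.5                            # echo amplitude below the noise level
    return trace

# Accumulate (average) many acquisitions: uncorrelated noise shrinks, echoes do not.
accumulated = np.mean([simulate_return() for _ in range(1024)], axis=0)

# Samples standing well above the residual noise are treated as object returns.
for idx in np.flatnonzero(accumulated > 0.25):
    print(f"object at ~{idx / SAMPLE_RATE * C / 2:.1f} m")
```

Averaging 1,024 traces cuts the noise amplitude by a factor of 32, which is what lets a return buried in noise on any single shot become unmistakable in the accumulated waveform.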
Israel’s Innoviz Technologies is a startup that has already announced the $100 InnovizOne, a high-definition solid-state LiDAR (HD SSL) measuring 5 x 5 x 5 cm. It is a full system and has key performance specifications that include a range of 200 m, a depth accuracy of <2 cm, a FoV of 100˚ horizontal and 25˚ vertical, and a resolution of >6 Mpixels/s at a frame rate of 25 fps. The company expects to be in production by 2018.
Another startup getting a lot of attention is Luminar Technologies, led by Austin Russell, a 22-year-old applied-physics prodigy. The company has already demonstrated a "hyper-accurate" LiDAR design with a range of over 200 m at <10% reflectivity (Figure 3).
The rapid pace of competition in LiDAR, particularly for autonomous vehicles, clearly took Velodyne by surprise. While it introduced its own solid-state LiDAR in April, its own website still states:
“Solid State approaches are experimental, and in the early stages of research. Especially for distance measurements above 20 meters with meaningful field of views, no proven concept has been offered as a commercial solution, and it is doubtful any will emerge in the near future due to challenges based on the fundamental laws of physics. Future sensors would be only directional, and therefore multiple sensors would be needed for a full surround view if this technology ever matures to a commercially useful level.”
LiDAR innovation is moving so fast that even marketing and site editors can't keep up. The company's solid-state Velarray LiDAR measures 125 x 50 x 55 mm, has a 120˚ x 35˚ FoV, and a range of 200 m. The intent of the system is to enable advanced driver assistance systems (ADAS), a much-needed feature now, even as cars move toward full autonomy.
The competition for Velodyne isn't just from startups. MicroVision, for example, is turning its microelectromechanical systems (MEMS) expertise toward LiDAR; a demonstration is available on YouTube. One of the engineers at Velodyne, Kevin Watson, actually left the company to join MicroVision to develop what he calls "…that Holy Grail of a sensor." Also, Continental AG is getting ready to launch a hi-res Flash LiDAR technology that relies upon a single pulse to acquire an image.
As the lawsuit between Waymo and Uber progresses, more details on Waymo’s technology have been revealed, including that it relies upon fiber laser technology, which uses the fiber itself to amplify the optical signal. While the technology itself isn’t new and is already widely used commercially for discrete ranging systems, its use in autonomous vehicles is only beginning. The case between Waymo and Uber is more about how it’s being implemented.
The paths to low-cost eyes

While Velodyne paved the path to all-seeing eyes for autonomous vehicles, it's becoming clear that the way forward is less about a single, perfect LiDAR and more about finding a lower-cost LiDAR system that can be integrated with cameras, radar, and other sensors. No single sensing technology can accomplish what's needed, and for autonomous vehicles to progress, all options must be considered and integrated to provide a complete picture.
From a strictly LiDAR point of view, the winners will be those that have the contacts and distribution channels within the very relationship-dependent automotive supply chain. The various architectures being offered by startups and established companies do offer clear distinctions, from digital-only processing to full systems. From MEMS to solid state, to Flash LiDAR and fiber lasers, the options are many, but the odds are that many startups will be bought up by established players, who will do well by focusing on the integration of LiDAR with other sensing options.
Reference:
[1] https://esto.nasa.gov/files/Lidar_TechStrategy_%202016.pdf