Arbe is developing the world’s first ultra high-resolution 4D imaging radar, a “missing link” technology, according to company CEO Kobi Marenko, in the L3 and higher autonomous vehicle evolution...
...Why is there a need for a new sensing technology for the autonomous driving industry? What is Arbe's unique selling proposition?

https://www.automotive-iq.com/powertrain/articles/arbe-robotics-and-their-new-autonomous-driving-sensing-technology
Currently, most autonomous vehicle sensing suites include two or three types of sensors: camera, radar and, in some cases, Lidar. The reason several technologies are used together is that each has strengths and weaknesses; you can't rely on any one of them independently. For example, cameras deliver 2D resolution and Lidar delivers 3D resolution, but both lose functionality in common environmental conditions such as darkness, pollution, snow, rain, or fog.
Radar, which is based on radio waves, maintains functionality across all weather and lighting conditions. However, the technology has been limited by low resolution, a disadvantage that has made radar very susceptible to false alarms and inept at identifying stationary objects. Until now, that is.
What we've been able to do at Arbe is to remove radar's resolution limitation, infusing this highly dependable technology with ultra high-resolution capabilities to sense the environment in four dimensions: distance, height, depth and speed. In the autonomous driving industry, this technological advancement effectively repositions radar from a supporting role to the backbone of the sensor suite.
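To make those four dimensions concrete, here is a minimal illustrative sketch of how a single 4D radar detection could be represented and converted to Cartesian coordinates. The field names, units, and conventions are assumptions made for this example, not Arbe's actual data format.

```python
from dataclasses import dataclass
import math

@dataclass
class RadarDetection4D:
    """One hypothetical 4D imaging-radar detection (illustrative only)."""
    range_m: float        # distance to the reflector, metres
    azimuth_rad: float    # horizontal angle, radians
    elevation_rad: float  # vertical angle, radians
    doppler_mps: float    # radial (closing) velocity, metres per second

    def to_cartesian(self):
        """Convert the spherical (range, azimuth, elevation) part to x, y, z."""
        x = self.range_m * math.cos(self.elevation_rad) * math.cos(self.azimuth_rad)
        y = self.range_m * math.cos(self.elevation_rad) * math.sin(self.azimuth_rad)
        z = self.range_m * math.sin(self.elevation_rad)
        return x, y, z

# Example: a reflector 40 m ahead, slightly left and above the sensor, closing at 5 m/s.
det = RadarDetection4D(range_m=40.0,
                       azimuth_rad=math.radians(-3),
                       elevation_rad=math.radians(2),
                       doppler_mps=-5.0)
print(det.to_cartesian())
```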
What impact can this product have on the autonomous driving industry?
OEMs are preparing to ramp up Level 2.5 and Level 3 autonomous vehicle production, a move that requires safety-critical functions to transfer from the driver to the vehicle. This advancement cannot happen without a sensor that can instantaneously respond to the full range of driving scenarios, identify and assess risk, and execute path planning, all while offering a non-irritating driving experience for both the driver and those sharing the road.
This leaves us with two problems. First, we haven’t had a sensor suite capable of that level of performance. Second, many sensor suites rely on Lidar, which is quite expensive, at least 10x more expensive than radar. So, the price point would limit ADAS availability to premium and luxury vehicles.
Arbe’s Phoenix technology solves both of these problems. We’ve managed to produce an affordable sensor robust enough for ADAS and autonomous driving. We’ve been able to do this primarily through the development of a proprietary chipset technology that delivers a highly sensitive imaging radar that can identify and track objects small to large, moving and not moving.
For example, we can identify pedestrians, bikes, and motorcycles, and separate them from vehicles and environmental objects - even when they are partially concealed by those larger objects. We believe Phoenix is a game changer.
...We are now delivering an image 100 times more detailed via higher resolution sensing. We can reduce false alarms through advanced algorithms and innovative antenna design. In addition, we separate small and large objects through high dynamic range, and provide clear boundaries between stationary and moving objects.
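To illustrate how per-detection velocity supports that separation of stationary from moving objects, here is a hedged sketch of ego-motion-compensated Doppler classification. The straight-line ego-motion model, the sign convention, and the 0.5 m/s threshold are simplifying assumptions for illustration, not Arbe's algorithm.

```python
import math

def is_moving(doppler_mps, azimuth_rad, elevation_rad, ego_speed_mps, threshold_mps=0.5):
    """Classify one detection as moving or stationary (illustrative only).

    A stationary reflector seen from a vehicle driving straight ahead at
    ego_speed_mps should produce a radial (Doppler) velocity of roughly
    -ego_speed * cos(azimuth) * cos(elevation). If the measured Doppler
    deviates from that prediction by more than the threshold, treat the
    object as moving. Real systems also handle yaw rate, sensor mounting
    offsets, and Doppler ambiguity.
    """
    expected = -ego_speed_mps * math.cos(azimuth_rad) * math.cos(elevation_rad)
    return abs(doppler_mps - expected) > threshold_mps

# A reflector dead ahead while driving at 20 m/s: Doppler near -20 m/s means it is stationary.
print(is_moving(-20.0, 0.0, 0.0, ego_speed_mps=20.0))  # False -> stationary (e.g. a parked car)
print(is_moving(-12.0, 0.0, 0.0, ego_speed_mps=20.0))  # True  -> moving (e.g. a slower lead vehicle)
```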
By using the 22nm RF CMOS process, Phoenix dramatically reduces costs per radar channel, while consuming the lowest power per channel in the industry.
One of the most significant obstacles to achieving ultra high-resolution has been the amount of processing power required for the analysis of enormous amounts of information. To counter this, Arbe made the strategic decision to develop our own proprietary radar processing on a chip...
Arbe developed the first radar that separates objects by elevation in high-resolution.
Our radar's capabilities are not compromised when a vehicle goes in and out of an underground parking lot, or up and down a hill. Further, Phoenix can identify and assess objects at various elevation ranges and plan the route accordingly. It detects and responds appropriately to non-obstructing objects such as manhole covers or overhanging signage. It identifies and brakes for stationary objects in the vehicle's lane, even if they are under bridges or in dark tunnels, scenarios that present primary challenges to current ADAS sensor suites.
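As a rough illustration of how elevation data lets a planner decide whether an in-lane detection can be driven over, driven under, or must be braked for, here is a small sketch. Reducing an object to a lower and upper extent above the road, and the specific height thresholds, are assumptions made for the example, not Arbe's parameters.

```python
def assess_obstacle(z_bottom_m, z_top_m, vehicle_height_m=1.6, ground_clearance_m=0.15):
    """Decide whether a detected in-lane object actually blocks the path.

    z_bottom_m / z_top_m are the object's estimated lower and upper extent
    above the road surface, derived from elevation measurements
    (illustrative assumption).
    """
    if z_top_m <= ground_clearance_m:
        return "drive over"    # e.g. a manhole cover or low debris
    if z_bottom_m >= vehicle_height_m:
        return "drive under"   # e.g. overhanging signage or a bridge
    return "brake / avoid"     # object occupies the vehicle's own height band

print(assess_obstacle(0.0, 0.03))  # manhole cover      -> "drive over"
print(assess_obstacle(4.5, 5.0))   # overhead sign      -> "drive under"
print(assess_obstacle(0.0, 0.8))   # stalled object     -> "brake / avoid"
```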
Also, elevation perception significantly simplifies the fusion of radar data with visual data from cameras. Because both sensors now share two dimensions - azimuth and elevation...
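As a simple illustration of why shared azimuth and elevation makes radar-camera fusion easier, the sketch below projects a radar detection's direction into an idealized pinhole camera image. The intrinsics, the assumption that the two sensors are co-located and aligned, and the function name are placeholders, not a real calibration or Arbe's fusion pipeline.

```python
import math

def radar_to_pixel(azimuth_rad, elevation_rad, fx=1000.0, fy=1000.0, cx=960.0, cy=540.0):
    """Map a radar detection's azimuth/elevation to camera pixel coordinates.

    Assumes an ideal pinhole camera co-located and aligned with the radar,
    with placeholder intrinsics (fx, fy, cx, cy). Because the radar already
    measures azimuth and elevation, the mapping is a direct angular
    projection rather than a full 3D association problem.
    """
    u = cx + fx * math.tan(azimuth_rad)   # positive azimuth -> right of image centre
    v = cy - fy * math.tan(elevation_rad) # positive elevation -> above image centre
    return u, v

# A detection 3 degrees to the right and 1 degree up lands right of and above the image centre.
print(radar_to_pixel(math.radians(3), math.radians(1)))
```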