
Mobileye Redefines Autonomous Driving Terms

MarketInsite Nasdaq Blog


 

autonomous driving

Source: Mobileye

As autonomous driving becomes more prevalent, it’s reshaping not just our commutes but also our behaviors and attitudes toward mobility. A key part of this transformation lies in how we talk about it. The language surrounding autonomous driving often emphasizes features and capabilities, while overlooking the experience of what it feels like to ride around in a driverless vehicle. Autonomous driving, after all, is a revolutionary concept—one that fundamentally redefines what driving means.

The language surrounding it, however, is highly technical: terms like Level 1, Level 1+, and so on up to Level 5, along with phrases like "conditional automation," often feel vague and inaccessible. So how can we move away from these engineer-centric terms and adopt a language that resonates with human experience?

This is exactly what Mobileye (NASDAQ: MBLY), a leading developer of advanced driver-assistance systems (ADAS), is striving to achieve. For autonomous driving to truly flourish, the conversation needs to shift from the technical to the personal, so that ordinary drivers like you and me can step into an AV and talk about it as part of our everyday lives.
 

The old jargon

Before diving into Mobileye’s new taxonomy, it’s essential to understand how the industry previously discussed self-driving technology. In 2014, the Society of Automotive Engineers (SAE) established a classification system for self-driving cars, ranging from Level 0 (no automation) to Level 5 (full automation).

However, this terminology—developed by engineers for engineers—proved overly technical and confusing for consumers. For instance, SAE Levels 3 and 4 indicate a high degree of automation, but what do these levels actually mean? Can drivers take their eyes off the road? Is it safe on all types of roads? Will the car maneuver independently?

This jargon was neither intuitive nor clear for the average person, highlighting the need for a more accessible framework that clearly defines where the driver's responsibilities end and the vehicle's begin—without requiring an advanced engineering degree.
 

On/off self-driving language
 

On/off self-driving

Source: Mobileye

Mobileye’s CEO and founder, Prof. Amnon Shashua, explained the logic behind the new taxonomy: "We were looking for a more product-oriented language—so we created our own." This new language emphasizes concepts that consumers can easily understand, clearly quantifying both the driver's level of involvement and the car's autonomous capabilities.

In developing this new vocabulary, Mobileye focused on three key questions: First, can the driver take their hands off the wheel? Second, can the driver safely look away from the road? Third, does the vehicle require the driver's involvement? The resulting taxonomy is straightforward, providing a much clearer picture of self-driving car capability.
 

Eyes-on/Hands-on: This level encompasses basic driver-assistance systems like Automatic Emergency Braking (AEB) and Lane Keep Assist (LKA). The driver is still responsible for the entire driving task while the system monitors the human driver (Levels 1-2 according to SAE).

Eyes-on/Hands-off: The next level represents more advanced driver-assistance systems where the driver's hands can be off the steering wheel while the system takes control of the driving and the driver supervises. To be effective, interventions should happen rarely, and to keep the human driver vigilant, a proper driver monitoring system (DMS) should be in place.

Eyes-off/Hands-off: The driving function is controlled within a specified Operational Design Domain (ODD), say, highways with on/off-ramp transitions, without the human driver needing to supervise the driving (hence, Eyes-off). Once the ODD ends, the driver needs to take control again. This category can be classified as either Level 3 or Level 4 according to SAE J3016.

No Driver: This category represents fully autonomous vehicles, often referred to as robotaxis. There is no driver in the car, and a remote operator can take over in rare situations.
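Since each category boils down to the answers to the three questions above, the taxonomy can be sketched as a simple decision rule. The function name and structure below are my own illustration, not an official Mobileye API:

```python
# Illustrative sketch of the consumer-facing taxonomy as a decision rule.
# Category names follow the article; the function itself is hypothetical.

def classify(hands_off: bool, eyes_off: bool, driver_present: bool) -> str:
    """Map the three yes/no questions to a taxonomy category."""
    if not driver_present:
        return "No Driver"            # fully autonomous, e.g. robotaxis
    if hands_off and eyes_off:
        return "Eyes-off/Hands-off"   # autonomous within a defined ODD
    if hands_off:
        return "Eyes-on/Hands-off"    # hands off the wheel, driver supervises
    return "Eyes-on/Hands-on"         # basic ADAS; driver does the driving

print(classify(hands_off=True, eyes_off=False, driver_present=True))
# prints "Eyes-on/Hands-off"
```

Note that the categories are ordered: each answer of "yes" to a further question moves the vehicle up one rung of autonomy.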


The new language has practical application in Mobileye’s current driver-assist and autonomous solutions. For example, Mobileye SuperVision™ (an Eyes-on/Hands-off solution) allows drivers to take their hands off the wheel across a variety of road types so the vehicle’s self-driving system can take control. Overall responsibility, however, still rests with the driver, who must keep their eyes on the road and supervise the vehicle’s operation at all times.

Mobileye Chauffeur™ adds active sensors to the computer vision, specialized crowdsourced AV maps, surround vision, and other complex system architecture that already make up the Mobileye SuperVision™ platform. These redundant active sensors allow drivers to take not only their hands off the wheel but their eyes off the road as well, within specific driving environments limited to specified ODDs.

 

OEMs

Source: Mobileye

By creating a spectrum of solutions that build upon each other, the company makes it simpler for OEMs to gradually level up the automated offerings they provide to their customers.


Beyond just language

Beyond just language, "Eyes-on/Hands-off" systems emphasize the driving experience, highlighting the core elements of autonomous driving: safety and greater comfort. Meanwhile, "Eyes-off/Hands-off" solutions take it a step further, providing a truly revolutionary benefit: giving drivers back their time for work, leisure, or other activities.

While the technologies behind these driving systems are highly complex, Mobileye’s new simplified taxonomy allows their capabilities to be expressed as clearly as possible, ensuring the conversation around vehicle autonomy is understood.
