Between now and the day autonomous vehicles are deemed safe enough to operate on the world’s highways, all of the sophisticated technologies working in concert to “drive” the vehicles will be tested rigorously, and improvements will be incorporated.

When the fully actualized robo-car gets the green light, the initial application appears to be ticketed for mobility services – a term that covers everything from ride-hailing companies such as Uber and Lyft to traditional taxis and buses.

When these various fleets are put into motion, imagine the world of opportunity that opens up for people who otherwise wouldn’t be able to get around, such as the elderly and physically impaired. In America alone, NHTSA says 53 million people have some form of disability. Imagine the positive economic impact of liberating millions of people who previously couldn’t travel on their own.

This also will open up myriad possibilities for what happens inside the vehicles, especially in connection with ride-hailing operations. Passengers can elect to be productive, entertained or even to zone out. The cabin can serve as an office, a library or a living room. But enabling those options requires an upgraded interface between the car and the passengers inside.

The vehicles will need to become fully “passenger aware” and should be capable of providing insights about what happens inside the cabin. As with the equipment outside the vehicle, this will require a sensing platform within the cabin that combines artificial intelligence with machine vision, depth perception and motion analysis. These “four-dimensional” in-cabin sensors for autonomous vehicles will provide an understanding of what is happening inside: how occupants are behaving, how many there are, their physical size, what kinds of objects are present and so on.
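To make the idea concrete, here is a minimal sketch of what one frame of such in-cabin sensing output might look like as a data structure. Everything here is an illustrative assumption — the field names, units and classes are not tied to any real sensing product’s API.

```python
from dataclasses import dataclass

# Hypothetical per-frame summary produced by an in-cabin sensing stack.
@dataclass
class Occupant:
    seat: str          # e.g. "rear-left"
    height_cm: float   # body size estimated from depth data
    is_moving: bool    # flagged by motion analysis

@dataclass
class CabinState:
    occupants: list    # detected people
    objects: list      # detected non-person items, e.g. "backpack"

    @property
    def passenger_count(self) -> int:
        return len(self.occupants)

# One sensed frame: two passengers in the rear seats, one bag.
state = CabinState(
    occupants=[Occupant("rear-left", 165.0, False),
               Occupant("rear-right", 181.0, True)],
    objects=["backpack"],
)
print(state.passenger_count)  # prints 2
```

Downstream logic — pricing, safety alerts, cleaning schedules — would consume summaries like this rather than raw camera frames.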

As a rudimentary example, this type of sensing currently is provided by the taxi driver. In addition to providing an efficient, safe trip, we sometimes forget the driver is collecting and processing “data” and making constant decisions about who should be picked up and even who should be kicked out for unruly behavior. The taxi driver also would notice whether objects were left behind in the vehicle, whether passengers had properly and completely disembarked, and whether the cabin was clean enough and ready for the next customer. For autonomous vehicles, sensing technology will make similar observations and decisions based on what is happening inside the vehicle.
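The taxi driver’s end-of-ride judgments could be approximated by simple rules over the sensed cabin state. The sketch below is a hypothetical example, assuming the vehicle can compare detected objects against an empty-cabin baseline and count remaining occupants; the function and flag names are invented for illustration.

```python
def post_ride_checks(baseline_objects, current_objects, occupant_count):
    """Hypothetical checks a fleet vehicle might run after a ride,
    mirroring what a human taxi driver does by eye."""
    issues = []
    # Anything not part of the empty-cabin baseline was left behind.
    left_behind = [o for o in current_objects if o not in baseline_objects]
    if left_behind:
        issues.append("left_behind:" + ",".join(left_behind))
    # A nonzero occupant count means someone hasn't fully disembarked.
    if occupant_count > 0:
        issues.append("passenger_still_aboard")
    return issues

# Example: the cabin's baseline contains a first-aid kit; after the ride,
# a phone is also detected and no one remains inside.
print(post_ride_checks(["first-aid kit"], ["first-aid kit", "phone"], 0))
# prints ['left_behind:phone']
```

An empty issues list would signal the vehicle is ready for its next pickup; any flagged issue could trigger a notification to the departed rider or route the car to a service depot.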

As we all know, the ability of a vehicle to be driven via technology instead of a human being depends on the collection and processing of data at speeds we cannot even comprehend. This data then is fed just as quickly into the algorithms that generate the actions or “decisions” that control the vehicle. The same concept applies to sensors inside a cabin.
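That sense-then-decide loop can be sketched for the cabin just as it exists for the road. The dictionary keys and action names below are assumptions made for illustration, not part of any real autonomous-driving stack.

```python
def decide(cabin_frame: dict) -> str:
    """Map one frame of in-cabin sensor data to an action, analogous to
    the way exterior sensor data feeds driving decisions."""
    if cabin_frame.get("unruly_behavior"):
        return "alert_operator"
    if cabin_frame.get("passenger_count", 0) == 0:
        # Empty cabin: flag any remaining items, otherwise accept a new fare.
        if cabin_frame.get("objects"):
            return "flag_left_behind_item"
        return "ready_for_next_pickup"
    return "continue_ride"

print(decide({"passenger_count": 2, "unruly_behavior": False}))
# prints continue_ride
```

In a real system this mapping would be learned and probabilistic rather than a handful of if-statements, but the shape is the same: a stream of cabin observations in, a stream of decisions out.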