Top 7 challenges that autonomous vehicles still have to overcome


Over the past few decades, autonomous vehicles have generated a lot of hype. Although we see many experiments carried out on driverless vehicles in controlled environments, under human supervision and in excellent road and weather conditions, many people still doubt that fully driverless cars are possible at all, or will be anytime soon. Why? Because autonomous driving has so many moving parameters to handle and control simultaneously, and a single failure could be catastrophic.

Before we look at some of the significant challenges that autonomous vehicles still face today, let's review the five levels commonly used to classify driving automation. Each level has a specific set of requirements a vehicle must meet before it is considered to operate at that level. Understanding these levels matters because this is the most widely accepted classification system in the industry.

Level 1: Most functions are still controlled by a human driver at this level, but the car can automatically accelerate or steer.

Level 2: At Level 2, multiple driver-assistance functions, such as adaptive cruise control and lane centering, operate together using information about the driving environment. Drivers can be physically disengaged from the vehicle, with their hands off the steering wheel and foot off the pedal. However, the driver must remain vigilant and always be prepared to take over the vehicle.

Level 3: Human drivers are still required in Level 3 cars, but under specific traffic or environmental conditions the vehicle can take over all safety-critical functions. The driver remains present and must intervene if necessary, but does not have to monitor the situation continuously as at the previous levels.

Level 4: Level 4 is what "fully autonomous" usually refers to. Level 4 vehicles are designed to carry out all safety-critical driving functions and to monitor road conditions for an entire trip. However, they are restricted to the vehicle's Operational Design Domain (ODD), which means they cannot handle every driving scenario.

Level 5: This level refers to a fully autonomous system whose performance in any driving scenario, including extreme environments such as dirt tracks, is expected to equal that of a human driver. Vehicles at this level are unlikely to appear in the near future.
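The taxonomy above can be summarized in a small sketch. The class and function names here are hypothetical, chosen only to illustrate the key practical distinction: at Levels 1 and 2 the human must continuously monitor the road, while from Level 3 upward the vehicle monitors it within its operating limits.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Driving automation levels as described above (hypothetical names)."""
    DRIVER_ASSISTANCE = 1   # car accelerates or steers; human does the rest
    PARTIAL = 2             # several assistance functions combined; human stays alert
    CONDITIONAL = 3         # car drives in specific conditions; human on standby
    HIGH = 4                # fully autonomous, but only within its ODD
    FULL = 5                # matches a human driver in any scenario

def human_must_monitor(level: AutomationLevel) -> bool:
    """True when the human driver must continuously watch the road."""
    return level <= AutomationLevel.PARTIAL
```

For example, `human_must_monitor(AutomationLevel.PARTIAL)` is `True`, while `human_must_monitor(AutomationLevel.HIGH)` is `False`.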

After more than 50 years of constant research and development, autonomous cars are becoming a reality. Still, the design of a fully autonomous system for driverless vehicles presents many challenges, including the following:

1. Unpredictable road conditions

Road conditions vary from place to place and can be extremely unpredictable. In some cases, roads are smooth and clearly marked; in others, they have deteriorated considerably. There are lane-free roads, potholes, and tunnels where signals are unreliable. Road markings also differ around the globe. Most self-driving cars rely heavily on highly detailed 3D maps that encode intersections, stop signs, ramps, and buildings for the vehicle's computer systems, which combine these maps with sensor readings to find their way around. Very few roads have been mapped to this degree, and maps can become outdated as conditions change. For now, mapping roads remains one of the main tasks facing promoters of automated vehicles.

2. Weather conditions

Autonomous vehicles should work under all weather conditions, whether sunny, rainy, or stormy. There's no room for failure or downtime. Snow, rain, fog, and other weather conditions make driving difficult for humans, and it's no different for driverless cars. They can block the view of the lane lines that vehicle cameras use to find their way. Falling snow or rain can also make it difficult for laser sensors to identify obstacles. Radar can see through the weather, but it doesn't capture the shape of an object, which computers need in order to figure out what it is. Researchers so far haven't found a way around this. They are working on laser sensors that use a different light-beam wavelength to see through snowflakes, and software is being developed so vehicles can differentiate between real obstacles and snowflakes, rain, fog, and other conditions.

3. Traffic and human drivers

Autonomous vehicles should be able to handle the highway under all traffic circumstances, sharing the road with other vehicles and many people at the same moment. When people are involved, traffic can be chaotic, because individuals breach traffic laws. Even the most sophisticated algorithm cannot forecast the messy, unexpected behavior of human drivers and pedestrians. Computer systems can help self-driving vehicles comply with road laws: stop at a red light, slow down when a signal turns yellow, and resume when the light turns green. However, this cannot regulate the behavior of other drivers. Autonomous vehicles must cope with drivers who speed, pass even when there's a double yellow line, and drive the wrong way on a one-way street. In short, autonomous vehicles will have to deal with humans who don't always play by the rules.
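The rule-following part described above is the easy half of the problem, and can be sketched in a few lines. This is a deliberately naive illustration (the function and state names are hypothetical): a real planner also weighs speed, distance to the stop line, and surrounding traffic, and, as the paragraph notes, no lookup table can predict what other drivers will do.

```python
def signal_action(light: str) -> str:
    """Map a traffic-light state to a high-level driving action.

    Naive sketch: in practice the decision also depends on speed,
    distance to the intersection, and the behavior of nearby vehicles.
    """
    actions = {
        "green": "proceed",
        "yellow": "slow_down",
        "red": "stop",
    }
    # Default to the safest action when the light state is unrecognized.
    return actions.get(light, "stop")
```

For instance, `signal_action("yellow")` returns `"slow_down"`, and an unrecognized state such as a malfunctioning signal falls back to `"stop"`.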

4. Accident liability and insurance

Accident liability and insurance are among the thorniest areas for self-driving vehicles. Who is liable for an accident caused by an autonomous vehicle? How do insurance companies handle a fender bender that occurred while the driver was not paying attention to the road? Software is the main driving component in autonomous cars and makes all the significant decisions. Although early autonomous car models kept a person physically behind the steering wheel, later models have no dashboard or steering wheel at all. In such designs, where the car has no controls such as a steering wheel, a brake pedal, or an accelerator pedal, how is the person in the car supposed to take control in the event of an incident?

5. Radar interference

Autonomous cars use navigation systems, lasers, and radars. The lasers are typically installed on the roof, while the radar sensors are mounted on the car body. Radar works by detecting radio-wave reflections from surrounding objects: on the road, the car continually emits radio-frequency waves, which are reflected by other cars and objects near the road. The time taken for a reflection to return is measured to compute the distance between the car and the object, and suitable action is then taken on the basis of the radar readings. But will a car be able to distinguish its own reflected signal from the signals (reflected or transmitted) of other vehicles when this technology is used in hundreds of vehicles on the same road? Although radar operates across several radio frequencies, the available frequency range may not be sufficient for all manufactured vehicles.
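The distance computation described above is simple time-of-flight arithmetic: radio waves travel at roughly the speed of light, and the echo covers the distance twice (out and back), so the range is d = c * t / 2. A minimal sketch (the function name is illustrative):

```python
# Speed of light in m/s; radio waves travel at approximately this speed in air.
C = 299_792_458.0

def radar_distance(round_trip_time_s: float) -> float:
    """Distance to an object from a radar echo's round-trip time.

    The wave travels to the object and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return C * round_trip_time_s / 2.0
```

An echo returning after one microsecond, for example, places the object about 150 meters away, which gives a sense of how tight the timing tolerances are. It also shows why the interference question matters: a stray signal arriving at the wrong moment produces a confidently wrong distance.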

6. Consumer acceptance

Surveys conducted after last year's fatal Uber crash near Phoenix showed that drivers are reluctant to give up control to a computer. In a March survey, 71 percent of respondents said they were afraid to ride in fully autonomous vehicles. Consumers see self-driving cars as less safe than they did two years ago, and nearly half said they would never buy a Level 5 car. Nevertheless, consumers want and expect semi-autonomous features in future cars, because they agree that collision-alert and collision-avoidance systems help people become better drivers.

7. Creating cost-effective vehicles

Sensors, radars, and communication devices used in autonomous vehicles are expensive. In 2020, a Level 4 or Level 5 car could cost an additional $75,000 to $100,000 compared to a regular car, and the total cost may exceed $100,000 given the number of sensors required to reach those autonomy levels. For consumers to buy these vehicles, the price has to drop dramatically. Right now, at this price point, it seems that only Mobility-as-a-Service (MaaS), ride-sharing, or robotaxi companies can field the first real deployments of autonomous vehicles on the road. These companies can build a business model that supports such expensive vehicles by replacing the cost of a human driver.