NTSB strongly criticizes the safety culture of Uber's self-driving division.

Posted 2024-12-16 00:00:00 +0000 UTC

On November 19, according to foreign media reports, Robert Sumwalt, chairman of the U.S. National Transportation Safety Board (NTSB), said in a speech in Washington that Uber's safety culture was not sound enough, and that there were problems both with the automated driving system and with the safety drivers in its vehicles. Last year in Tempe, Arizona, a pedestrian was killed in a crash involving an Uber self-driving test vehicle, and NTSB found that Uber's lack of a complete safety strategy ultimately led to the accident. At the board meeting held to determine the probable cause of the crash, Sumwalt said that behind the failures of the automated driving system and the safety driver lay a deeper problem: the safety culture at the time was too weak to act as an effective check.

This is the first time NTSB has investigated a fatal crash involving an automated-driving test vehicle, and its findings may affect the manufacturers, suppliers, and technology companies that have entered or are preparing to enter the field; the case is being closely watched across the autonomous-driving industry. This emerging industry is trying to upend today's transportation business and has attracted billions of dollars of investment from companies such as General Motors and Google's parent company Alphabet.

An investigation report released by the agency last week showed that the test vehicle's automated driving system detected 49-year-old Elaine Herzberg, who was walking a bicycle across the road at night, about six seconds before striking her. Uber suspended its automated-driving tests after the accident in March 2018. The investigation highlights both technical and human errors at Uber, which NTSB may cite as causes of the crash. Uber resumed automated-driving testing in Pittsburgh at the end of last year.

According to the report, the radar on the Uber test vehicle first detected Herzberg about 5.6 seconds before impact, but the automated driving system initially classified the pedestrian as a vehicle. NTSB said the system repeatedly classified Herzberg as different types of objects and failed to predict that she would cross into the test vehicle's lane. The report points out that the automated driving system was deficient at anticipating that a pedestrian might cross the road outside a crosswalk.

Uber, which was testing a modified crossover SUV at the time, lacked procedures for identifying and responding to pedestrians outside marked crosswalks, and the system did not allow the vehicle to brake automatically in the moments just before a collision. Responsibility for avoiding a crash rested with a single safety driver monitoring the vehicle's automation, whereas other companies deploy two people per vehicle to increase safety. According to police, although the company banned drivers from using mobile devices, the safety driver had been watching a television program on a phone before the crash.

NTSB also said that Uber's Advanced Technologies Group had no independent safety division, no formal safety plan, no standard operating procedures, and no manager specifically responsible for accident prevention when it was testing automated driving on the public streets of Tempe. After NTSB investigators repeatedly reviewed Uber's operations and the investigation findings, the automated driving system was substantially revised.
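The object-classification failure described above is easier to picture with a toy model. The sketch below is a hypothetical, greatly simplified tracker (not Uber's actual software) in which every re-classification of an object discards its motion history, so the tracker never accumulates enough observations to predict that the object is crossing into the vehicle's path. All class names, positions, and thresholds are invented for illustration.

# A hypothetical, greatly simplified tracker illustrating the failure mode
# NTSB describes. This is NOT Uber's software; all names, positions, and
# thresholds are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    label: str                                     # e.g. "vehicle", "bicycle", "other"
    positions: list = field(default_factory=list)  # (x, y) samples in metres

    def update(self, label: str, position: tuple) -> None:
        if label != self.label:
            # Re-classification discards the motion history: the crucial
            # simplification that prevents any path prediction.
            self.label = label
            self.positions = []
        self.positions.append(position)

    def predicts_crossing(self, min_samples: int = 3) -> bool:
        # A crossing can only be predicted from enough consistent history.
        if len(self.positions) < min_samples:
            return False
        lateral_motion = self.positions[-1][0] - self.positions[0][0]
        return abs(lateral_motion) > 1.0

# Detections of the same pedestrian, roughly half a second apart,
# with the assigned class churning between frames:
detections = [("vehicle", (6.0, 40.0)), ("other", (5.0, 35.0)),
              ("vehicle", (4.0, 30.0)), ("bicycle", (3.0, 25.0)),
              ("other", (2.0, 20.0))]
track = TrackedObject(label="unknown")
for label, pos in detections:
    track.update(label, pos)
    print(label, "-> crossing predicted:", track.predicts_crossing())
# Every label change wipes the history, so a crossing is never predicted,
# even though the object has moved about four metres toward the lane.

Had the history been preserved across re-classifications, the same five samples would show roughly four metres of lateral motion and trip the crossing check.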
NTSB said that, according to Uber, the new software will be able to correctly identify pedestrians and trigger controlled braking about four seconds before a collision (illustrated in the sketch below).

NTSB also found that current federal standards for automated-driving tests are inadequate: NHTSA provides only rough guidelines, and some states are filling the gap. As of June 2019, 29 U.S. states had adopted some type of autonomous-driving policy or guideline, and the report points out that Pennsylvania is leading the way in setting safety rules.

Even so, the report concludes that unless a driver remains alert and ready to take control at critical moments, the technology is still a long way from earning consumer trust. More importantly, there is currently no production vehicle whose autonomous driving does not require continuous driver monitoring.
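One plausible reading of the software change described above is a time-to-collision (TTC) gate that starts controlled, non-emergency braking once the predicted impact is roughly four seconds away. The sketch below is an illustration of that idea under stated assumptions, not Uber's implementation; the constants, function names, and stopping-distance formula are chosen for clarity only.

# Hypothetical time-to-collision (TTC) gate, consistent with controlled
# braking beginning about four seconds before a predicted impact. This is
# an illustrative sketch, not Uber's implementation; all numbers are assumed.
BRAKE_TTC_S = 4.0      # start controlled braking below this TTC (seconds)
COMFORT_DECEL = 3.5    # cap for "controlled" (non-emergency) braking, m/s^2

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    # Seconds until impact at the current closing speed.
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def braking_command(distance_m: float, closing_speed_mps: float) -> float:
    # Return a deceleration request in m/s^2; 0.0 means no braking yet.
    if time_to_collision(distance_m, closing_speed_mps) > BRAKE_TTC_S:
        return 0.0
    # Decelerate just hard enough to stop within the remaining distance,
    # capped at the comfort limit.
    needed = closing_speed_mps ** 2 / (2 * max(distance_m, 0.1))
    return min(needed, COMFORT_DECEL)

# A car closing on a stationary pedestrian at 17 m/s (about 61 km/h):
for d in (85, 70, 55, 40):
    print(f"{d} m ahead -> brake {braking_command(d, 17.0):.2f} m/s^2")
# Braking begins once the pedestrian is less than 4 s (68 m) ahead.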
