Death of Arizona Pedestrian Raises Questions About the Deployment of Self-Driving Technology
Why are the rules of the road not yet defined?
According to The Verge, police have released a statement that Uber's self-driving car was traveling at 40 miles per hour (64 kilometres per hour) when it struck a 49-year-old woman in Arizona on Sunday night, and showed no signs of slowing down.
In the wake of the crash, Uber suspended all of its self-driving testing in cities across the country. Volvo and Toyota, which have self-driving partnerships with Uber (or were negotiating deals), have declined to comment on the future of their relationships with Uber.
The unfortunate incident is shaping up as the first significant test of how policymakers and the public will respond to the nascent technology. It occurred at a time when companies have been pushing for regulatory clearance to offer self-driving ride services as early as next year. On Friday, Uber and Alphabet Inc's Waymo self-driving car units sent written requests to U.S. senators to approve sweeping self-driving car legislation within the next few weeks.
The pedestrian's death raises questions about self-driving technology itself, the deployment of self-driving cars, the governing policies and regulations, the policymakers behind them, and more. Is it time to enforce sensible regulatory policies, even if innovation despises regulation?
What Triggered Amendments to Existing Vehicle Safety Regulations?
Automobile manufacturers and technology companies such as Uber, General Motors Co and Toyota Motor Corp have made substantial investments in this arena. Those investments have reportedly prompted significant revisions to existing vehicle safety regulations, which were written under the assumption that a licensed human would always be in control of a vehicle.
Why Compare Robotic Systems to Impaired Human Drivers?
Automobile and technology industry officials have warned that there could be accidents and deaths involving self-driving cars, but they have said countless additional lives would be saved as robotic systems programmed to obey traffic laws took over for distracted, sleepy or impaired human drivers. Does this imply self-driving cars are only as good as human drivers who are impaired in some way?
Why Is the Former Chairman of the NTSB Insisting Self-Driving Cars Are Safe?
Mark Rosenker, a former chairman of the National Transportation Safety Board, said on Monday that the public should not overreact to the Uber incident. He noted that 6,000 pedestrians and nearly 40,000 people die on U.S. roads each year in more than 6 million crashes. "This is going to be an unfortunate obstacle that we are going to have to deal with to regain (the public's) belief that these devices are safe," Rosenker said.
The incident prompted Uber to suspend all testing of self-driving cars.
Do Self-Driving Vehicles Need Different Governing Laws and Policies?
The immediate impact of the fatality may be to further delay or change a landmark bill pending in Congress to speed the testing of self-driving cars, a bill that had already stalled over safety objections from a handful of Democrats.
Senator John Thune, a Republican who chairs the Commerce Committee, said that the tragedy emphasizes the need to adopt laws and policies tailored for self-driving vehicles.
However, two Democratic U.S. senators on Thune's committee, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, said the Uber incident demands a tough response. "This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America's roads," Blumenthal said in a statement.
Why Are States Free to Set Their Own Rules?
The Trump administration has been working to dismantle regulatory roadblocks to self-driving cars, but it has also said it is focused on ensuring safety. "The goal is to develop common sense regulations that do not hamper innovation, while preserving safety," Transportation Secretary Elaine Chao said on March 1.
Chao is reviewing a petition that General Motors filed in January with NHTSA (the National Highway Traffic Safety Administration) requesting an exemption to operate a small number of autonomous vehicles in a rideshare program without steering wheels or human drivers.
It seems states are free to set their own rules. So, when General Motors told NHTSA it wanted to test its newest generation of autonomous Chevrolet Bolt EVs without steering wheels or pedals for accelerating or braking, seven states had already said they would welcome such vehicles. But first, NHTSA must revise or eliminate current federal vehicle safety standards that require compliance tests performed by a human driver using manual controls for steering, acceleration and braking.
Two months later, GM’s request sits before NHTSA, an agency that still doesn’t have a permanent administrator.
Why Are There No Rules for Testing the Potential of Self-Driving?
This is not a question of supporting or opposing progress. Self-driving vehicles may have the potential to prevent many of the accidents caused by human error. But where are the defined rules for testing that potential? Why didn't the Trump administration ask the auto and tech companies for test cases?
To be sure, there are issues in the early testing. Since October 2014, the California DMV has received 59 collision reports involving autonomous vehicles. Most were minor, and many were caused by human drivers who did not anticipate the self-driving vehicle's maneuver.
Any contact can do damage, especially to the dozens of sensors positioned around any autonomous vehicle.
Is it Permissible to Test Unproven Technologies on Public Roads?
California, which already has issued self-driving test permits to 52 companies, is expected to allow self-driving vehicles without drivers to begin testing on public roads as early as April.
The International Brotherhood of Teamsters said in a statement on Monday that the incident demonstrated the enormous risks inherent in testing unproven technologies on public roads, and that it is critical that pedestrians and drivers be safeguarded.
Former U.S. Transportation Secretary Anthony Foxx said on Monday that the incident is a wake-up call for the entire autonomous vehicle industry and for government to put a high priority on safety.
In September, the U.S. House of Representatives unanimously passed a measure that would allow automakers to win exemptions from safety rules that require human controls. A Senate version would allow automakers, within three years, to each sell up to 80,000 self-driving vehicles annually if they could demonstrate to regulators they are as safe as current vehicles.
Concerns over the safety of autonomous vehicles flared in July 2016, when a man driving a Tesla Model S in semi-autonomous Autopilot mode died after his car struck a tractor-trailer. In January 2017, federal safety regulators concluded there was no defect in the Tesla Autopilot system and that the driver should have maintained control, a conclusion this author finds hard to defend.
Apparently, states are making the rules on an ad hoc basis as they go along. Those decisions reflect a delicate balance between the economic development benefit of encouraging a new technology and the need to ensure public safety.
Is it just me, or will technology, no matter how advanced it gets, still need the power of the human brain? Do you believe that technology can successfully replace human beings?