From the course: Ethics and Law in Data Analytics

Example autonomous cars

- What we're talking about in this video are automated or semi-automated vehicles. There is a range of automation that actually exists today. We have cars on the road right now with features like automatic braking, and that range extends all the way to fully self-driving vehicles, which we don't have yet but which are being tested by certain companies like Tesla Motors right now. This is still futuristic, but it's something we're looking toward. Now, automobiles are heavily regulated. The design and manufacture of automobiles is heavily regulated by federal statutory law, and also in some respects by common law, which provides legal remedies under products liability for defects in design and manufacture that might cause consumer harm. We may see more defective-warning claims as we move into more automation. As the facts change, the theories of products liability law will also evolve and expand.

There are two issues to know here for our purposes. One is cybersecurity, and the second is shifting liability as our vehicles become more autonomous.

First, the cybersecurity threat. This is unique to autonomous cars. It's actually possible for hackers to hack into the car and take control away from the driver, in both a semi-autonomous and a fully autonomous car. For example, in July of 2015, two hackers, Charlie Miller and Chris Valasek, developed a tool to hijack the controls of a Jeep Grand Cherokee while the driver, Andy Greenberg, a senior writer for Wired magazine, was at the wheel in St. Louis. Andy knew the experiment was coming, but he didn't know exactly what would happen or how far it would go. He was surprised when his vehicle began blasting cold air, his radio station was switched, the windshield wipers turned on, and then the transmission was cut while he was traveling at 70 miles per hour down the highway. The hackers were able to take control of Andy's vehicle through the Jeep's entertainment system. Fortunately for Andy, he knew, as I said, that this was going to happen; he just wasn't aware of the extent of it. The hackers conducting this experiment were funded by a grant from the Defense Advanced Research Projects Agency, or DARPA, as part of an effort to understand the vulnerability of vehicle systems to cyber attacks. Their findings indicated that more than 400,000 vehicles on the road today could be susceptible to cyber attacks like that one.

Cybersecurity is also an issue for us with smart cities. Self-driving cars will require smart roads to operate. That means cameras and sensors are going to be built into roadways and street signs, and those sensors will use radar and LIDAR to communicate with the vehicles. Some local governments are looking to invest in smart infrastructure that will allow communication between the roads and the vehicles traveling on them. For example, Columbus, Ohio, recently won $50 million in grants to become the first US city to integrate self-driving cars, connected vehicles, and smart sensors. Atlanta, Georgia, has built a fiber and electrical network in its downtown corridor to support roadside sensors and cameras for driverless cars, with hopes that technology companies will develop and test their products in the city. These smart city designs involve extensive communication networks that could be vulnerable to cyber attack, and an interruption in these interconnected systems could cause significant damage.
Because this technology is so new, it has not yet been put through its paces in security testing. As cities begin to replace old infrastructure with the new, the evaluation and monitoring of these cybersecurity concerns is going to be critical.

Now, our second issue is shifting liability, meaning shifting liability between the driver and the autonomous car. Right now, if you're driving, you are insured and you're responsible for any liability. But what happens when the driving is shared with, or turned over to, the technology? Who is responsible for the harm caused? Is it the car? Is it the driver? Is the car now the driver's agent? These are all questions we're going to have to address eventually as our vehicles become more autonomous.

An example of this shifting liability concern, involving product design, began in 2015, when Tesla released an autopilot mode for its Model S. The following year, a driver of the Model S was operating it in autopilot mode, which means the car was autonomously braking, steering, and lane switching. The car collided with a tractor trailer that was making a turn. The autopilot sensors failed to recognize the difference between the truck and the bright sky, causing the collision. The incident resulted in the driver's death, the first fatality from a self-driving vehicle. The driver was not controlling the vehicle when the accident occurred, so it might seem logical to assume that the autopilot system was at fault. Again, who is at fault here? Who's liable, the system itself or the driver? Interestingly, an investigation by the National Highway Traffic Safety Administration found that there were no defects in the Model S autopilot system. An investigator noted that some situations are beyond the capabilities of the autopilot system, so autopilot actually requires full driver engagement at all times. The investigation also found that the driver might not have been paying attention to the road. Tesla still may have some responsibility for this accident, even though the technology did not fail. It's unknown whether Tesla properly informed the owner of the limits of the autopilot features. This is what I was referencing earlier in this video when I was talking about defective instruction, or negligent instruction. It's an aspect of products liability law that requires manufacturers to inform us and instruct us on how to properly use the consumer product, in this case, the vehicle. If there is not proper instruction, including instruction on what might not work or what the possible risks are, then the manufacturer can be responsible under products liability law.
