Software Found Guilty of Murder

edited September 2017 in For SCIENCE
Not really, but kind of. Remember all those news stories about Toyota's Unintended Acceleration (UA) events? A lot of the trials are finally starting or wrapping up. The UA isn't in the news as much (no longer new), but people died and others were injured, so this isn't something a class-action suit will magic away.

The big thing from these cases? For the first time ever, a jury found that defective software was the cause or probable cause of injury and death. My advisor was one of the expert witnesses for the plaintiffs. He works on safety-critical embedded systems and had industry experience before he started teaching. An article discussing the trial is linked below, with juicy links at the bottom leading to (surprisingly) easy-to-read court transcripts. It gets really fun when the defense yells

:objection:

and a muted conversation takes place at the bench, outside of jury earshot. It's extra fun if you imagine the defense attorney, Mr. Bibb, as played by Jon Voight.

Enjoy.

Article
Court Transcript 1 (CTRL-F for "Koopman" to see the fun testimony)
Court Transcript 2

TL;DR:
  • Toyota was NOT following accepted safety design practices
  • Toyota had a poor safety culture, not taking safety seriously
  • If we're ever going to see self-driving cars, people need to get this stuff right

Comments

  • edited November 2013
    So you're saying we aren't quite at the beginning of a robot takeover yet... or are you?

    Also this is probably going to be a thing that comes up if and when Google tries to do more with their self-driving cars.
  • edited November 2013
    The bottom line is:
    • Adding computers to stuff makes that stuff more complex.
    • Making stuff more complex means there will be more bugs.
    • All code has bugs.
    • ALL CODE HAS BUGS.
    • Debugging will not remove all bugs.
    • The only way to make sure a safety-critical system is actually safe is to:
      1. Follow sound, methodical, accepted, and thorough design practices.
      2. Redundancy.
      3. Redundancy. (rough sketch of what I mean right after this list)
    • If you're able to remove human error (self-driving cars woo!) without introducing bugs, then you just made it safer. Grats!
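
    Since "redundancy" is doing the heavy lifting above, here's a rough sketch of the idea in C: read the same input from two independent sensors, cross-check them, and drop to a safe state if anything looks off. Purely illustrative; the names, the 5% threshold, and the idle-throttle fail-safe are made up for this example, not anything from Toyota's actual code.

      /* Illustrative only: two redundant pedal readings (0-100 percent) are
       * cross-checked, and we fall back to a safe state (idle throttle) on
       * any disagreement or out-of-range value. */
      #include <stdint.h>
      #include <stdio.h>
      #include <stdlib.h>

      #define DISAGREE_LIMIT 5   /* max percent difference we tolerate */

      /* Returns commanded throttle in percent, or 0 (idle) on any fault. */
      static uint8_t throttle_command(uint8_t sensor_a, uint8_t sensor_b)
      {
          /* Plausibility check: both readings must be in range. */
          if (sensor_a > 100 || sensor_b > 100)
              return 0;                  /* fail safe: idle throttle */

          /* Cross-check: the redundant sensors must agree. */
          if (abs((int)sensor_a - (int)sensor_b) > DISAGREE_LIMIT)
              return 0;                  /* fail safe: idle throttle */

          return sensor_a;               /* sensors agree; trust the reading */
      }

      int main(void)
      {
          printf("agree:    %u\n", (unsigned)throttle_command(40, 42));  /* 40 */
          printf("disagree: %u\n", (unsigned)throttle_command(40, 90));  /* 0  */
          return 0;
      }

    A real system would do more than this (latch the fault, warn the driver, run the checks on independent hardware, feed a watchdog), but the point is that one flaky sensor or one flipped bit shouldn't be able to command wide-open throttle on its own.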

    What I'm saying is don't buy a Toyota.
    Q: If Toyota removes every bug that caused an unsafe vehicle state, how many bugs are left?
    A: All of them.

    Toyota has shitty engineering practices, poor documentation, and THAT'S what's unsafe. Get a Honda instead.

    Also, other companies besides Google are looking into self-driving cars.

    EDIT: I don't know if that really answers your question though?
  • edited November 2013
    Close enough. Joke questions are funny like that. I'll just assume that when the robots attempt to take over they'll fail epically due to code errors.

    In more serious discussion, human error is ironically the biggest obstacle to self-driving cars in the long run, because even if the car can drive itself, the question is whether it can drive well enough on roads full of human-driven cars. Until you can get rid of all human drivers, there's always unpredictability. And if you do replace all cars with self-drivers, there will still be bugs in the system, which are themselves a product of human error.
  • edited November 2013
    I always thought we'd do the magnetic road thing from Minority Report or I, Robot.
  • edited November 2013
    I don't like all the complicated stuff cars have these days. Too many things to go wrong. I guess I'm a little paranoid, but I'll stick with my manual transmissions and crank windows.