On The Road Lending

BETTER CAR BETTER JOB BETTER LIFE

Self-driving cars: Our concerns were confirmed

Fatal crash involving Tesla in autopilot


Not too long ago we wrote a post on the innovative idea of self-driving cars. We were cautiously inquisitive about a future where cars drive themselves, asking questions like, “If we have a split second to decide between hitting a person or animal in our path and swerving off the road into a tree, how does a driverless car make that decision?” These decisions are hard (if not impossible) for us to make as drivers, so how can we expect a computer to make them?

It turns out we were right to be concerned. On May 7, there was a fatal crash in Florida between a Tesla in autopilot and a tractor trailer. Joshua Brown, an entrepreneur from Ohio, was killed when his Tesla failed to apply the brakes as the tractor trailer made a left turn across its path.

The National Transportation Safety Board and the National Highway Traffic Safety Administration are both investigating the incident to determine why this happened. “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a recent statement.

On top of all of this, Tesla is looking into a more recent crash, in July in Pennsylvania, that may also have been a result of the autopilot feature. (Tesla says there’s no evidence pointing to autopilot, but the police contend that the driver said the car was in autopilot.)

Our concern here is that since Tesla has put the autopilot feature in all of its cars, there’s no real way to test it in a controlled environment. Instead of testing in private, like most other companies, Tesla is testing with the public. And while autopilot is disabled by default — drivers who want to use it must first enable it and check a box saying they’ll “maintain control and responsibility” of the vehicle — we can’t help but feel this is a dangerous way to beta test a product. This isn’t a new iPhone app; people’s lives are at stake.

What are your thoughts on self-driving vehicle technology? We know it’s inevitable, but should it be tested privately before it goes public?