Woody
Woody GRM+ Member and MegaDork
12/8/19 7:28 a.m.

Not a criticism, just an observation.

 

 

NORWALK, CT (WFSB) - A state police cruiser was struck by a Tesla in autopilot mode Saturday morning on I-95 in Norwalk, said police. 

Police said the crash took place in the early morning hours just north of Exit 15. 

Troopers out of Troop G responded to a disabled vehicle in the left center lane of the highway, officials said. 

 

Police said the troopers had their emergency lights activated and flares behind the cruiser. 

As the troopers were waiting for a tow truck for the disabled vehicle, a Tesla Model 3 crashed into the back of one of the cruisers. 

The Tesla also hit the disabled vehicle. 

 

The driver of the Tesla told police his car was in autopilot while he was checking on his dog in the back seat just before the crash. 

Police say the driver was given a misdemeanor summons for Reckless Driving and Reckless Endangerment. 

Officials say nobody was seriously hurt in the crash. 

"According to the National Highway Traffic Safety Administration, although a number of vehicles have some automated capabilities, there are no vehicles currently for sale that are fully automated or self-driving," said State Police.

https://www.wfsb.com/news/state-police-cruiser-struck-by-tesla-in-autopilot-mode-on/article_03f661fa-18ff-11ea-968f-5bb6ecb31c72.html

pres589 (djronnebaum)
pres589 (djronnebaum) PowerDork
12/8/19 7:35 a.m.

Why is Tesla's autopilot function allowed to exist on vehicles sold to the public?  I've never understood this.  Clearly the tech isn't ready.

MrJoshua
MrJoshua UltimaDork
12/8/19 8:15 a.m.

So a car was stopped in the fast lane? Anyone here willing to calculate the stopping distance from 70 mph, plus the distance traveled during a normal human reaction time, to see how far back a car would need to start an emergency stop? 
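
A quick back-of-the-envelope version of that calculation, with assumed numbers (70 mph, a 1.5-second perception/reaction time, and roughly 0.8 g of braking on dry pavement):

```python
# Rough stopping-distance estimate; the speed, reaction time, and braking
# rate are assumptions, not measurements of any particular car or driver.
MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed = 70 * MPH_TO_MPS        # ~31 m/s
reaction_time = 1.5            # seconds before the brakes are even touched
decel = 0.8 * G                # hard braking on dry pavement

reaction_dist = speed * reaction_time      # distance covered while reacting
braking_dist = speed ** 2 / (2 * decel)    # v^2 / (2a)

print(f"reaction: {reaction_dist:.0f} m, braking: {braking_dist:.0f} m, "
      f"total: {reaction_dist + braking_dist:.0f} m")
```

Under those assumptions it comes out to roughly 110 meters (about 360 feet), before you account for wet pavement or a slower reaction.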

It does present an interesting scenario. We would notice the sky glowing in the distance from the flares, and the reflections of the red and blue lights strobing in the distance, before we actually saw either one. Those cues are obvious to our brain, but they seem like they would be odd ones for an automated system to recognize.

APEowner
APEowner GRM+ Member and Dork
12/8/19 8:23 a.m.

Not that the Tesla system doesn't need work, but people run into stopped emergency vehicles on their own with alarming regularity.

ShawnG
ShawnG PowerDork
12/8/19 9:14 a.m.

Car doesn't have a brain.

Please use yours.

No Time
No Time Dork
12/8/19 9:23 a.m.
APEowner said:

Not that the Tesla system doesn't need work, but people run into stopped emergency vehicles on their own with alarming regularity.

That is understood: people are going to misjudge, lose attention, or otherwise make mistakes. 

The problem I see is that Autopilot is software-based. Software follows a fixed set of rules and can be expected to behave the same way every time.

As a result, where most drivers would avoid hitting the first responders and the disabled vehicle, any and all Tesla Model 3s on Autopilot would have produced the same result as the one that crashed.

Right now it appears that the driving public is being used to test the Autopilot system and find bugs in the control algorithm. Until Tesla (or any other self-driving car maker) figures out how to address the issues with identifying stationary vehicles, we can expect to see more of this type of accident. 

I like technology, but I don't want to see tech launched that has known risks that haven't been mitigated. 
 

yupididit
yupididit UberDork
12/8/19 9:24 a.m.

The owner turned around to check on their dog. Guess the owner figured the dog's comfort was more important than the safety of everyone else on the road. Who does that? 

Streetwiseguy
Streetwiseguy MegaDork
12/8/19 9:29 a.m.

Was the car really on Autopilot, or is it now a good excuse for Tesla-owning E36 M3 drivers?

No Time
No Time Dork
12/8/19 9:30 a.m.

In reply to yupididit :

Dog, kids, or passengers, people are going to be distracted.  

Autopilot lets them think they can ignore the road and focus on the distractions. 

RX8driver
RX8driver Reader
12/8/19 9:36 a.m.

Part of the problem is marketing. They say "full self driving" and people think that's what it is, instead of an advanced cruise control. The world's best computer is still between your ears.

lnlogauge
lnlogauge HalfDork
12/8/19 9:42 a.m.

In reply to pres589 (djronnebaum) :

Because Tesla has logged millions of miles on Autopilot, and you hear about every single time there's an accident. Even with the accidents, Autopilot is 9 times safer than the average human driver. Also, just because accidents happen doesn't mean technology should stop. 

I'm on the side that it wasn't actually on. If Tesla wasn't able to see stopped vehicles, there would be a lot more headlines.

In reply to RX8driver :

The computer between your ears makes errors, falls asleep, and sucks at processing more than one thought at a time. Absolutely not. 

Keith Tanner
Keith Tanner GRM+ Member and MegaDork
12/8/19 10:00 a.m.
RX8driver said:

Part of the problem is marketing. They say "full self driving" and people think that's what it is, instead of an advanced cruise control. The world's best computer is still between your ears.

FYI, Full Self-Driving has not been released on Teslas other than in news stories. It's kinda like Al Gore claiming he invented the internet - that was a joke in a Wired article but it somehow became "truth".

The problem with the "world's best computer" is that it's rarely functioning at full capacity and that every single one has to be trained over years, starting from scratch. Real computers can do better. Eventually. 

_
_ Dork
12/8/19 10:03 a.m.
No Time said:

I like technology, but I don't want to see tech launched that has known risks that haven't been mitigated. 
 

What, you mean like Windows 10, every Apple product, every video game launch, GoPro, every app released, and every website? 
Sadly, it is the new normal to release trash and then fix it on the fly. The world wasn't like that twenty-five years ago. You didn't release your product unless it was perfect, because its flaws would bury your company. 

stuart in mn
stuart in mn MegaDork
12/8/19 10:57 a.m.

I wonder how people would react if the story was about a person checking on their dog in the back seat of their Toyota while it was on cruise control...either way, the driver is at fault.

BoxheadTim
BoxheadTim GRM+ Member and MegaDork
12/8/19 11:17 a.m.
lnlogauge said:

Even with the accidents, Autopilot is 9 times safer than the average human driver. Also, just because accidents happen doesn't mean technology should stop.

I'm on the side that it wasn't actually on. If Tesla wasn't able to see stopped vehicles, there would be a lot more headlines.

Actually, that's been documented a few times. There have been several instances of Teslas on Autopilot hitting stationary vehicles, including emergency vehicles. My understanding from what I've read is that there are classification issues that lead to the software occasionally misclassifying or ignoring obstacles that should trigger full-on braking or evasion. Some people seem to believe that Tesla invited additional complexity by relying on cameras alone rather than cameras plus LIDAR like most other (semi-)autonomous vehicles.
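
As a purely generic illustration of the kind of classification problem being described (not Tesla's actual code), a tracker that suppresses near-zero-speed returns to ignore roadside clutter like signs and bridges will also suppress a stopped car in the lane unless something else reclassifies it:

```python
# Toy sketch of the stationary-obstacle problem, not any manufacturer's logic.
# A naive clutter filter that drops near-zero-speed objects also drops a
# stopped cruiser sitting in the travel lane.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    distance_m: float        # range to the object
    ground_speed_mps: float  # object's speed over the ground

def relevant_targets(detections, min_speed_mps=1.0):
    """Keep only moving objects, the way a simple clutter filter might."""
    return [d for d in detections if abs(d.ground_speed_mps) > min_speed_mps]

scene = [
    Detection("stopped cruiser", 120.0, 0.0),   # in lane, not moving
    Detection("car ahead", 60.0, 28.0),         # moving with traffic
    Detection("overhead sign", 40.0, 0.0),      # harmless clutter
]

print([d.label for d in relevant_targets(scene)])  # only 'car ahead' survives
```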

kb58
kb58 SuperDork
12/8/19 11:23 a.m.

I can't imagine owning a car company selling a self-driving car because of our litigious society. There'll be endless lawsuits for every oddball crash no one considered.

Here's a riddle I heard recently: Say you write software for electric cars, creating rules for given road conditions. A rule says, if the car sees something stationary in its lane, do a quick lane change to save both the car and driver. Situation: The car's driving down a two-lane street in town, and sees something stationary in its lane. The software tells the car to do a lane change. The thing is, there's oncoming traffic, and a sidewalk to the right, with objects on it. What should your software do? (let's also assume that it's raining, so the car can't stop, it must veer left, right, or go straight.)

What if the object in the road is an obviously empty cardboard box (but your sensors can't know that)?

What if the objects on the sidewalk are parking meters (that your sensors can't judge)?

What if the object in the street is a stalled car with people around it and the objects on the sidewalk are people? Look at all the permutations of the above; how can the car always make the right choice? To me, the car becomes better than people once automated driving results in fewer accidents than people-driven cars. Unfortunately, accidents will still happen, and the software will be blamed even for cases where the equivalent human driver would make the wrong decisions. Lawyers won't care, and will sue the company anyway.
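
Writing those rules down as code makes the problem plain: every branch needs information the sensors may not have, and someone has to hard-code the tie-breakers in advance. A hypothetical sketch (none of these categories or priorities come from any real system):

```python
# Hypothetical decision rule for the scenario above -- purely illustrative.
# Every category and priority here is a judgment call made ahead of time.
def choose_action(obstacle_in_lane, oncoming_traffic, sidewalk_objects,
                  can_stop_in_time):
    if can_stop_in_time:
        return "brake"                 # the easy case: just stop
    if obstacle_in_lane == "empty cardboard box":
        return "drive through"         # but can the sensors ever know that?
    if not oncoming_traffic:
        return "swerve left"           # clear lane, take it
    if sidewalk_objects == "parking meters":
        return "swerve right"          # property damage over a head-on crash
    return "brake anyway"              # least-bad default: still a crash

print(choose_action("stalled car with people", True, "people", False))
```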

My hat is off to Tesla for being brave enough to take this on. Parity will be reached when self-driving cars have fewer accidents than people-controlled cars, but that won't stop the lawsuits.

BoxheadTim
BoxheadTim GRM+ Member and MegaDork
12/8/19 11:24 a.m.
_ said:

Sadly, it is the new normal to release trash and then fix it on the fly. The world wasn't like that twenty-five years ago. You didn't release your product unless it was perfect, because its flaws would bury your company. 

There's enough crap software from 25 years ago that might prove the counterpoint. That said, you are right that, at least in general, software tended to be tested better because it was several orders of magnitude harder to get people to upgrade and apply patches. These days most desktop software includes a way of checking for updates, because you can safely assume that most of your clients have an always-on connection. That wasn't the case 25 years ago.

The other part is that, yes, it's easier to update or patch web applications, especially if you rent those to people and control the servers. For less disciplined (cheaper?) software companies, that often means releasing whatever crap they think is ready for their hourly upgrade push and then maybe fixing the issues later on.

I'm also wondering if we no longer have the regulatory capacity to ensure certain safety levels are met before certain types of assistance systems can be released to the general public.

BoxheadTim
BoxheadTim GRM+ Member and MegaDork
12/8/19 11:25 a.m.

In reply to kb58 :

That's actually an intense field of ethics study in that corner of computer science right now.

bruceman
bruceman Reader
12/8/19 11:26 a.m.
stuart in mn said:

I wonder how people would react if the story was about a person checking on their dog in the back seat of their Toyota while it was on cruise control...either way, the driver is at fault.

Even worse if it was a Chevrolet

Vigo
Vigo MegaDork
12/8/19 11:29 a.m.

 Guess the owner figured the dog's comfort was more important than the safety of everyone else on the road.

Actually, the average person putting a dog in their car has decided that dog's safety is less important than basically everyone else on the road. Dogs are almost never secured at all, and even in the BEST case they are still less safe than a human in human-centric restraints.  

Most cars with dogs in them would have dead dogs in them (or outside them...) if they got into a serious accident. 

yupididit
yupididit UberDork
12/8/19 11:52 a.m.
Vigo said:

 Guess the owner figured the dog's comfort was more important than the safety of everyone else on the road.

Actually, the average person putting a dog in their car has decided that dog's safety is less important than basically everyone else on the road. Dogs are almost never secured at all, and even in the BEST case they are still less safe than a human in human-centric restraints.  

Most cars with dogs in them would have dead dogs in them (or outside them...) if they got into a serious accident. 

Yes, the average person doesn't secure their dogs. But I'm talking about this person in that Tesla. 

irish44j
irish44j MegaDork
12/8/19 12:13 p.m.
kb58 said:

I can't imagine owning a car company selling a self-driving car because of our litigious society. There'll be endless lawsuits for every oddball crash no one considered.

Here's a riddle I heard recently: Say you write software for electric cars, creating rules for given road conditions. A rule says, if the car sees something stationary in its lane, do a quick lane change to save both the car and driver. Situation: The car's driving down a two-lane street in town, and sees something stationary in its lane. The software tells the car to do a lane change. The thing is, there's oncoming traffic, and a sidewalk to the right, with objects on it. What should your software do? (let's also assume that it's raining, so the car can't stop, it must veer left, right, or go straight.)

What if the object in the road is an obviously empty cardboard box (but your sensors can't know that)?

What if the objects on the sidewalk are parking meters (that your sensors can't judge)?

What if the object in the street is a stalled car with people around it and the objects on the sidewalk are people? Look at all the permutations of the above; how can the car always make the right choice? To me, the car becomes better than people once automated driving results in fewer accidents than people-driven cars. Unfortunately, accidents will still happen, and the software will be blamed even for cases where the equivalent human driver would make the wrong decisions. Lawyers won't care, and will sue the company anyway.

My hat is off to Tesla for being brave enough to take this on. Parity will be reached when self-driving cars have fewer accidents than people-controlled cars, but that won't stop the lawsuits.

I can't speak for Tesla, but the very basic automatic braking on my GTI absolutely HAS engaged going about 60 mph with a traffic stoppage ahead on the highway. It was quicker than my foot, which was about to do the same. You don't need an algorithm to change lanes; you just need the simplest algorithm: engage the brakes. 

If your car "can't stop" because of rain, the algorithm should take the conditions into account and extend the estimated stopping distance. We have rain-sensing wipers, so I assume the Tesla can sense when it's raining. Or just don't allow it to be used in the rain; that's an easy solution. 
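
Extending the stopping-distance numbers from earlier in the thread, adjusting for rain is just a lower assumed friction coefficient, and the difference is substantial:

```python
# Same back-of-the-envelope math as earlier in the thread, with an assumed
# friction coefficient for dry vs. wet pavement. Illustrative numbers only.
MPH_TO_MPS = 0.44704
G = 9.81

def stopping_distance_m(speed_mph, reaction_s=1.5, friction=0.8):
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v ** 2 / (2 * friction * G)

print(f"60 mph, dry (mu=0.8): {stopping_distance_m(60, friction=0.8):.0f} m")
print(f"60 mph, wet (mu=0.5): {stopping_distance_m(60, friction=0.5):.0f} m")
```

Roughly 30 extra meters at 60 mph under those assumptions, which is the kind of margin a condition-aware system would need to build in.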

Side note: in the winter, the auto-braking sensor on the GTI (inside the VW badge on the grille) actually ices over and turns itself off (with a large, annoying warning on the dash). I certainly wouldn't trust it in the rain, snow, etc., since any of these sensors can be fooled by conditions. 

No Time
No Time Dork
12/8/19 1:25 p.m.
_ said:
No Time said:

I like technology, but I don't want to see tech launched that has known risks that haven't been mitigated. 
 

What, you mean like Windows 10, every Apple product, every video game launch, GoPro, every app released, and every website? 
Sadly, it is the new normal to release trash and then fix it on the fly. The world wasn't like that twenty-five years ago. You didn't release your product unless it was perfect, because its flaws would bury your company. 

To your point, software will always have bugs; otherwise it would never get released. 

There is a difference between rebooting your PC or losing a video clip and plowing into the back of a stopped vehicle. The consequences and severity are much different; this is more like the 737 MAX than Windows 10.

Foreseeable misuse is a real thing, and in this case it should be mitigated whenever possible. If Tesla can't foresee people treating the driver assist as "autopilot," then they may need to rethink their implementation.  

If a new Subaru can track your eyes and alert you when you look away from the road, then Tesla should be able to foresee sleeping, reading, or turning around, and should have mitigations in place to keep the driver assist from being used as an autopilot. 
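
As a rough sketch of what that kind of mitigation might look like (the thresholds and names are made up, not Subaru's or Tesla's implementation), a simple attention watchdog escalates the longer the driver's eyes stay off the road:

```python
# Toy driver-attention watchdog, purely illustrative of the point above.
# The sample rate and thresholds are assumptions, not any real system's values.
def monitor(eyes_on_road_samples, warn_after_s=2.0, disengage_after_s=5.0,
            sample_period_s=0.5):
    """Escalate from a warning to cutting the assist as eyes stay off the road."""
    off_road_s = 0.0
    for eyes_on_road in eyes_on_road_samples:
        off_road_s = 0.0 if eyes_on_road else off_road_s + sample_period_s
        if off_road_s >= disengage_after_s:
            yield "disengage assist and slow down"
        elif off_road_s >= warn_after_s:
            yield "warn driver"
        else:
            yield "ok"

# Driver looks away for 3.5 seconds (checking on the dog), then looks back.
samples = [True, True] + [False] * 7 + [True]
print(list(monitor(samples)))
```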

 

chada75
chada75 Reader
12/8/19 3:42 p.m.

In reply to yupididit :

A Sociopath.

 

ebonyandivory
ebonyandivory PowerDork
12/8/19 6:17 p.m.

When a Tesla kills someone on autopilot I'll be comforted knowing "Tesla has logged millions of miles."
