ebonyandivory
ebonyandivory PowerDork
12/8/19 7:23 a.m.

Seems to be more and more stories with the same theme.

(I beat Woody to the punch by 5 minutes!)

Incidentally, I think the "oil stains" are asphalt patch material.

https://interestingengineering.com/tesla-autopilot-confuses-oil-stain-for-lane-separator

 

https://abcnews.go.com/US/tesla-autopilot-slams-police-cruiser-driver-claims-checking/story?id=67570199

AAZCD
AAZCD HalfDork
12/8/19 7:44 a.m.

There was this one time I was cruising on I-70 West in a 1980s van on cruise control and got out of the seat to get a drink from a cooler in the back... Many of the cases I read and hear about are more about drivers not being ready for Prime Time and expecting magic from a system that is designed as an aid, and not for fully autonomous operation. 

pres589 (djronnebaum)
pres589 (djronnebaum) PowerDork
12/8/19 7:55 a.m.

The system is too close to full autonomy to stop people from treating it as such, in my opinion.

ebonyandivory
ebonyandivory PowerDork
12/8/19 8:03 a.m.

Joe Rogan has one of the most popular podcasts in the world and talks about cruising around on auto and how amazing it is. I'm sure it is, but like many things in life, if it screws up once, it can screw up again.

This system is failing to recognize objects for what they are while also being fooled by what it does see.

And yes, I understand actual humans can suck at driving too, but at least they can get cited, arrested, and imprisoned for their errors in judgment.

yupididit
yupididit UberDork
12/8/19 9:20 a.m.

Welp, there's only one way for autopilot to get better, and that's to keep testing and using it. That will provide data to the developers who are making this come to life. The more data that gets aggregated and fed back into development, the better it gets. To build any automation, AI, or machine learning, acquiring real-world data and using it in further development is required. This isn't like building a part and testing its longevity through heat cycles and stress tests; this is software development that requires many iterations and revisions to get right (not perfect).

For as many recalls and failures as all manufacturers have that cause injury and death, this is along the same lines. Except everyone expects Tesla to be perfect, and whenever there's a mishap it's a controversy. Guess that's the plight of the pioneer.

RX8driver
RX8driver Reader
12/8/19 9:44 a.m.
yupididit said:

Welp, there's only one way for autopilot to get better, and that's to keep testing and using it. That will provide data to the developers who are making this come to life. The more data that gets aggregated and fed back into development, the better it gets. To build any automation, AI, or machine learning, acquiring real-world data and using it in further development is required. This isn't like building a part and testing its longevity through heat cycles and stress tests; this is software development that requires many iterations and revisions to get right (not perfect).

For as many recalls and failures as all manufacturers have that cause injury and death, this is along the same lines. Except everyone expects Tesla to be perfect, and whenever there's a mishap it's a controversy. Guess that's the plight of the pioneer.

Except that recalls and such are due to mistakes that were made, not intentional beta testing on public roads using unsuspecting members of the public (Tesla owners) and endangering non-consenting members of the public (everyone around them).

Keith Tanner
Keith Tanner GRM+ Memberand MegaDork
12/8/19 9:56 a.m.

I have a pretty low opinion of the average driver over time. If we had worldwide headlines every time somebody dropped their phone and slammed into a tree, we'd never notice the autonomous crashes. But that's not how news stories work.

That said, autonomy is in a bad place right now. Tesla Autosteer (Autopilot is just adaptive cruise, according to my car) is like a timid teenage driver. You have to be ready at all times to take over, either because it's making bad decisions or because it's given up. But if it goes any significant period of time without a problem, you'll relax and start paying attention to other things. Even worse, you might abdicate your driving duties completely. And then when Autosteer runs out of ideas, you're not ready.

This is going to be the case until full autonomy rolls out. There's an awkward middle ground that just doesn't work for the car or the driver. Some Tesla owners are putting themselves in this middle ground while assuming they're not. I don't have a good answer for it, honestly.

Knurled.
Knurled. GRM+ Memberand MegaDork
12/8/19 10:00 a.m.
ebonyandivory said:

Joe Rogan has one of the most popular podcasts in the world and talks about cruising around on auto and how amazing it is. I'm sure it is, but like many things in life, if it screws up once, it can screw up again.

This system is failing to recognize objects for what they are while also being fooled by what it does see.

And yes, I understand actual humans can suck at driving too, but at least they can get cited, arrested, and imprisoned for their errors in judgment.

Oh, there are laws on the books: if your autopiloted car hits somebody, you are at fault, because you were supposed to be paying attention to what was going on.

Rons
Rons GRM+ Memberand Reader
12/8/19 10:09 a.m.

In reply to Keith Tanner :

Full autonomy can never be fire and forget. Inevitably there will be some system-wide or localized meltdown. Sunspots? Weather conditions? I don't know. What I do know is the berkeley-ups will be colossal.

Being from Vancouver, I know nothing about autonomous systems... no wait, I do. SkyTrain and the Canada Line are autonomous and work great right up until they don't, and then the berkeley-up is epic.

_
_ Dork
12/8/19 10:09 a.m.

And that's part of the problem. Folks get it in their head that this thing is full control. In reality, it merely means you don't have to have your hands on the wheel or feet on the pedals at all times. Nothing in the manual says you can take your eyes off the road at any time.

Knurled.
Knurled. GRM+ Memberand MegaDork
12/8/19 10:16 a.m.

In reply to _ :

That is why I don't get the appeal.  It would seem to me that just sitting there, having to constantly mind what is going on and prepared to take control, would be more fatiguing than just driving the car in the first place.

rslifkin
rslifkin UltraDork
12/8/19 10:25 a.m.
Knurled. said:

In reply to _ :

That is why I don't get the appeal.  It would seem to me that just sitting there, having to constantly mind what is going on and prepared to take control, would be more fatiguing than just driving the car in the first place.

On an average day, I don't see a whole lot of benefit.  But coming from the boating world, the idea of not having to micromanage the thing in less than ideal conditions can greatly reduce fatigue and also keep you less task saturated so you can better monitor what's going on outside.  Provided the autopilot can drive well in snow, for example, it could significantly reduce workload on a snowy highway with slush ridges everywhere by letting the driver handle the big picture while the car handles the details. 

yupididit
yupididit UberDork
12/8/19 10:26 a.m.

In reply to RX8driver :

The Tesla owners aren't unsuspecting. Nowhere has Tesla said they've released a fully functional autopilot. Owners who stop paying attention are at fault. My gf's Honda Accord will stay in its lane on its own, as well as maintain a distance from the car in front of you, and it will even hit the brakes if that distance is compromised. Will I leave it to the car to do those things while I focus on the puppy in the back seat? Never. Why? I'm not an idiot.

Also, I'll refer back to the quote below in regards to beta testing. AI and machine learning is what I do in the Air Force. If Tesla only acquired data in test environments, the development of autopilot driving would take significantly longer. Instead, they release pieces here and there as attended automation (under human attention and control, not to be left unmonitored), then take that data, aggregate it to improve on what's already there, and keep receiving live feedback. That's the best way. AI and machine learning require massive amounts of data collection and analysis, and since they're the ones heading this charge, they have to get the data somehow. These companies aren't sharing their data with each other either, which would be best for us all. Bottom line: autopilot driving is safer than human pilot driving, but to get there we have to use every instance of driving with these systems as data for further development. This is why Tesla collects data from the cars, especially when one of these systems does something off. I hope I made some kind of sense lol

To build any automation, AI, or machine learning, acquiring real-world data and using it in further development is required. This isn't like building a part and testing its longevity through heat cycles and stress tests; this is software development that requires many iterations and revisions to get right (not perfect).
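The collect-aggregate-retrain-redeploy loop described in that post could be sketched roughly like this in Python. This is a purely illustrative toy, not Tesla's actual pipeline; every function and field name here is made up for the example.

```python
# Toy sketch of a fleet "data flywheel": cars surface edge cases,
# those get folded into the training set, and a new model version ships.
# All names are hypothetical; no real Tesla API is being modeled.

def collect_fleet_data(fleet):
    """Gather reported edge cases (e.g. disengagements) from each car."""
    return [case for car in fleet for case in car["edge_cases"]]

def retrain(model, new_data):
    """Fold new real-world examples into the model and bump its version."""
    model["training_examples"] += len(new_data)
    model["version"] += 1
    return model

model = {"version": 1, "training_examples": 0}
fleet = [
    {"edge_cases": ["oil stain read as a lane line"]},
    {"edge_cases": ["stopped vehicle on the shoulder missed"]},
]

for _ in range(3):  # each iteration stands in for one release cycle
    data = collect_fleet_data(fleet)
    model = retrain(model, data)

print(model["version"], model["training_examples"])  # 4 6
```

The point of the sketch is just the shape of the loop: real-world miles feed the next software revision, which is why more cars running the system (with attentive drivers) means faster improvement.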

AAZCD
AAZCD HalfDork
12/8/19 10:28 a.m.

Tesla has self driving data for nearly 2,000,000,000 miles of driving and uses it to improve the AI.

That's quite a foundation to build on. National 5G connectivity will make a huge difference in the ability to drive autonomously and it's coming soon. It keeps learning and adapting - it will be safer than human driving at some point.

 

yupididit
yupididit UberDork
12/8/19 10:31 a.m.

I also know there will be people who are against this and criticize it based on their own perceptions and desires, no matter what the future state may look like.

I'm not a Tesla fanboy at all, but from the art and science of what they're doing, I understand. Now if idiot drivers would understand to pay attention no matter what nannies are activated, then we wouldn't be getting these incidents. But hey, at least these human mistakes will just make self-driving cars even better.

MTechnically
MTechnically Reader
12/8/19 10:34 a.m.

I think the problem comes down largely to owners who believe the system is more capable than it really is, and Tesla doing very little to counteract that idea. Especially in the social media marketing of Autopilot (hell, the name is a bit misleading on its own), Tesla seems more than happy to let people continue in the belief that their cars can "drive themselves". Tesla should be making the limitations of their systems abundantly clear.

BoxheadTim
BoxheadTim GRM+ Memberand MegaDork
12/8/19 11:40 a.m.

In reply to MTechnically :

I agree with you - the hype is far outpacing the actual system capabilities at the moment. That doesn't mean they're not continuously improving, but the problem right now is that the systems are in this somewhat unhealthy no-man's land between good enough in a lot of cases and spectacularly bad in others.

I for one would love to have a car with enough self-driving capability to take me home when I get to the airport at midnight and face a two-hour drive. Or (back when we were living in Nevada) to get me home across the mountains safely in bad weather. Unfortunately the technology isn't there yet, and nobody knows when we'll actually get there.

There were some studies I read about in an ACM journal that it takes pilots longer to react to issues that require taking over from the autopilot than it takes them to react to the same issue when they're flying the plane themselves. The explanation for that was that even though these are highly trained professionals and are monitoring the systems like they should, the autopilot lulls the brain into a different "mode" that takes a second or two to get out of. Now project that on the average car driver...

MTechnically
MTechnically Reader
12/8/19 12:11 p.m.

In reply to BoxheadTim :

Sounds like we pretty much agree entirely. I would prefer to drive 95% of the time, but it would be nice to just turn on a truly autonomous system. The real issue is that we are in the grey area between driver aids and truly autonomous systems, and the average consumer/driver is not made aware of the differences. That leads to drivers relinquishing too much attention to the road, and we see the bad results.

irish44j
irish44j MegaDork
12/8/19 12:21 p.m.
MTechnically said:

I think the problem comes down largely to owners who believe the system is more capable than it really is, and Tesla doing very little to counteract that idea. Especially in the social media marketing of Autopilot (hell, the name is a bit misleading on its own), Tesla seems more than happy to let people continue in the belief that their cars can "drive themselves". Tesla should be making the limitations of their systems abundantly clear.

I follow car things, and this is exactly what I thought: I thought "actual" autopilot was already out there and on the road, including self-steering. The average consumer can't even change a tire; you think they're going to "get" the nuance that "autopilot" isn't REALLY an autopilot, it's just a cruise control? Nope...

ebonyandivory
ebonyandivory PowerDork
12/8/19 12:34 p.m.
Knurled said:

Oh, there are laws on the books, if your autopiloted car hits somebody, you are at fault because you were supposed to be paying attention to what was going on.

While the creator of the software that failed to do what it's designed to do shrugs it off as productive research?

 

ebonyandivory
ebonyandivory PowerDork
12/8/19 12:40 p.m.

I'm wondering if Tesla tells the new owners that they're all actually Beta Testers rather than "look what I just bought" consumers.

ebonyandivory
ebonyandivory PowerDork
12/8/19 12:44 p.m.

Just wait until the first Tesla completes the Cannonball Run on "autopilot"!

irish44j
irish44j MegaDork
12/8/19 2:38 p.m.
ebonyandivory said:

Just wait until the first Tesla completes the Cannonball Run on "autopilot"!

Nobody in the car at all = nobody can lose their license or get arrested lol....

BlueInGreen - Jon
BlueInGreen - Jon SuperDork
12/8/19 2:50 p.m.

Is Tesla “Autopilot” any different than the adaptive cruise and lane-keeping assist that other manufacturers offer?

ebonyandivory
ebonyandivory PowerDork
12/8/19 4:27 p.m.
BlueInGreen - Jon said:

Is Tesla “Autopilot” any different than the adaptive cruise and lane-keeping assist that other manufacturers offer?

Our 2019 Pacifica has both. The cruise is pretty cool but the lane departure scares me. It'll nudge the wheel enough to make you notice but I've experimented on empty roads and it WILL NOT keep you from driving over the lines.

That said, driving in either mode (or both) feels like having your newly adopted pit bull guarding your newborn.  It'll probably be ok... You just can't trust it. 

