this post was submitted on 14 Aug 2023
502 points (96.7% liked)

Technology


New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[–] [email protected] 36 points 1 year ago (3 children)

You know what might work: program the car so that after the second unanswered "alert," the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, autopilot is disabled for that car for a period of time.
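That escalation could be sketched as a simple strike policy. All thresholds, action names, and the function itself are hypothetical, not any manufacturer's actual logic:

```python
# Hypothetical sketch of an escalating response to ignored attention
# alerts. The thresholds and actions are illustrative only.

def respond_to_ignored_alert(strikes: int) -> str:
    """Return the action to take after `strikes` consecutive unanswered alerts."""
    if strikes <= 1:
        return "warn"                # audible/visual alert only
    if strikes == 2:
        return "pull_over"           # reduce speed, hazards on, stop safely
    return "disable_autopilot"       # lock the feature out for a cooldown period
```

The point of the third tier is that the lockout persists, so a driver can't simply re-engage the system and repeat the cycle.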

[–] [email protected] 15 points 1 year ago (1 children)

I drive a Ford Maverick that is equipped with adaptive cruise control, and if I get 3 "keep your hands on the wheel" notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn't do something similar.

[–] [email protected] 2 points 1 year ago (1 children)

It does and did... He kept driving anyway. Drink drivers FTW.

I presume AEB kicked in, but all that can do is reduce the speed of impact. If you're determined to kill yourself, there's not much the car can do.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

This is preventable and Tesla and other auto manufacturers should respond to make it so. No consumer vehicle should under any circumstance choose to drive full speed into a barrier or allow a driver to do so. It's the reason we have driver assistance: improved safety.

[–] [email protected] 0 points 1 year ago (1 children)

The problem with this is what if the car thinks there's a barrier in front of you but there isn't? People are arguing that these systems are too intrusive while also arguing that they don't go far enough to take control away from drivers.

This situation happened because a drunk driver ran into police cars, something that has been happening for as long as cars have existed.

[–] [email protected] 0 points 1 year ago (1 children)

That's the issue with current "self driving" systems in a nutshell. We're in a terrible middle ground right now where these features let careless drivers take their attention away, but can't actually control the vehicle safely on their own. We should ban all that crap until actual self-driving is viable.

[–] [email protected] -1 points 1 year ago (1 children)

How does it become viable if you ban the technology? What we have now is advanced cruise control that protects drivers in some circumstances while having zero effect in others. Drivers were equally dumb and careless long before this technology existed. This new tech doesn't make that aspect any worse. Banning it now just means more people will crash and more people will be injured.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Here's an article referencing a UK white paper that talks about the issues with level 2 and 3 autonomous vehicles.

https://www.tu-auto.com/adas-level-2-3-avs-are-hazards-experts-warn/

“With adaptive cruise control (ACC) for instance, it takes twice the amount of time to respond to a sudden braking event than it does when you are manually driving. Drivers may believe that ACC is safer but actually taking your foot off the accelerator pedal and letting the car make the decisions leads to lower workload and can mean drivers are unprepared for an unexpected event.”

University of Sussex object recognition researcher Dr Graham Hole was also questioned for the study and dubs Levels 2 and 3 “the worst of all worlds”. He says: “Human beings are rubbish at being vigilant – vigilance declines after about 20 minutes. With semi-autonomous you are reducing the driver to monitoring the system on the off-chance something goes wrong. Most of the time nothing goes wrong, leading the driver to have massive faith in the system in all conditions, which of course isn’t always the case.”

[–] [email protected] 1 points 1 year ago (1 children)

The paper features a defense of ADAS by Thatcham Research principal automated driving engineer Colin Grover, who claims much of the tech “operates in the background, like autonomous emergency braking … not all ADAS adds distraction … it is there to help when needed.”

Your first quote refers only to ACC, which maintains your speed and the distance between you and the car in front of you but doesn't include automatic braking, something currently included on all cars with these systems.

I'll ask again: how do you achieve level 4/5 autonomy if you ban these from the road and they never get real-world testing?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Well, to answer your question, I'd say that it needs to be a coordinated national/international effort (e.g. led by the E.U. for Europe). That gives the ability to do long-term, coordinated planning with predetermined cut-off dates, where not only the technology of the cars would change, but also the infrastructure.

To me it doesn't make sense to adapt the vehicles to an infrastructure designed for humans. If we really want self-driving vehicles, we should adapt the infrastructure for them, and all the cars should talk to each other so they can work in unison (e.g. they would all start at exactly the same time after a "red light," which wouldn't even need to be one, and eliminate collisions, since everything would be predicted by the AI; what can't be predicted would still have to rely on cameras and sensors, of course).
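A toy sketch of that coordinated-start idea: an intersection controller broadcasts a single future start time to every queued vehicle, so all of them begin moving in the same instant instead of reacting one by one to the car ahead. All names and the message format here are invented for illustration:

```python
# Hypothetical sketch of a coordinated "green light" broadcast.
# Assumes all vehicles share a synchronized clock; the message
# format and function names are invented, not any real V2X protocol.

from dataclasses import dataclass

@dataclass
class GoMessage:
    intersection_id: str
    start_time: float   # shared-clock timestamp when all vehicles accelerate

def broadcast_go(intersection_id: str, now: float, lead_seconds: float = 1.0) -> GoMessage:
    # Schedule the start slightly in the future so every queued vehicle
    # receives the message before it takes effect.
    return GoMessage(intersection_id, now + lead_seconds)

def should_accelerate(msg: GoMessage, now: float) -> bool:
    # Every vehicle applies the same rule, so they all move at once.
    return now >= msg.start_time
```

Because every vehicle evaluates the same timestamp against the same clock, the whole queue starts simultaneously rather than in the usual human chain reaction.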

Meanwhile, car manufacturers could keep adding smart safety features, but nothing marketed as "autopilot" or "self-driving".

[–] [email protected] 1 points 1 year ago (1 children)

This didn't answer how a system would be fully developed without ever setting foot on a real road, with real obstacles, real weather, and real drivers.

Furthermore, if we were to follow this plan, would everyone in a participating nation receive a new car when the changeover occurs? In the US there are something like 250 million registered vehicles, which would all need to be replaced at the same time in order to be equipped with the new technology needed to work in unison with every other vehicle on the road. Frankly, this is an unworkable solution IMO.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

It would need to be a staggered thing; maybe level 2/3 would even be needed during some of the stages. The stages would need to be long enough and/or subsidized enough to let every road user get appropriate vehicles under predefined timelines.

Obviously I won't pretend I've worked out every single detail, but I just don't think leaving it up to Elon&Co to figure out while gambling with people's lives is the right way to go.

It sounds like you're from the US, so I do understand why you'd think countries making companies work together on something like this is impossible. It might be, there. It would be a colossal project, tbh, but I stand by my opinion that it needs to be a transition supervised by regulatory bodies, not a wild west with every company doing something different.

The whole system will 100% have to be unified to support full self driving.

[–] [email protected] 8 points 1 year ago (3 children)

This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

[–] [email protected] 7 points 1 year ago (1 children)

Ah, so it's just people defeating the system.

[–] [email protected] 5 points 1 year ago (2 children)

The rule with cars is that you don't distract the driver from driving; a system that takes over driving does exactly that, so the idea of the system is flawed to begin with.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

[–] [email protected] 0 points 1 year ago

It’s frustrating to see

This is why we can’t have nice things

It is also frustrating to see people whining on behalf of the technology when they should rather be thinking about dead policemen and rescuers.

You should get your priorities straight if you ever hope to be taken seriously

[–] [email protected] -1 points 1 year ago

Screenshotting this because it's so well put.


[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

The system will warn you to pay attention

... and if we have learned anything from that incident, it is that the warnings have been worthless.

The system can be tricked even by the worst drunkards! 150 times in a row.

for a few seconds before shutting down.

A few seconds are not enough. The crash was already unavoidable.

[–] [email protected] 1 points 1 year ago (1 children)

You’re misinterpreting what I said and conflating two separate scenarios in your second statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

They aren’t subtle, either: after you fail to touch the wheel for about 5-10 seconds, it starts beeping loudly and flashing an icon on the screen.

This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.
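The mechanism described above amounts to a watchdog timer on steering-wheel torque. Here is a minimal sketch; the 10-second grace period and 5-second warning window are assumptions for illustration, not Tesla's actual values:

```python
# Hypothetical hands-on-wheel watchdog. Timing values are assumed,
# not taken from any real vehicle's firmware.

GRACE_SECONDS = 10.0    # idle time before the loud warning starts
WARNING_SECONDS = 5.0   # warning period before autopilot disengages

class HandsOnWheelWatchdog:
    def __init__(self, now: float = 0.0):
        self.last_touch = now
        self.warning_active = False
        self.engaged = True

    def touch_wheel(self, now: float) -> None:
        # Driver applied torque: reset the timer and clear any warning.
        self.last_touch = now
        self.warning_active = False

    def tick(self, now: float) -> None:
        if not self.engaged:
            return
        idle = now - self.last_touch
        if idle >= GRACE_SECONDS + WARNING_SECONDS:
            self.engaged = False          # strike: autopilot shuts down
        elif idle >= GRACE_SECONDS:
            self.warning_active = True    # beep loudly, flash icon on screen
```

Note the abuse pattern this allows: a driver who touches the wheel just before the timeout resets the timer indefinitely, which is exactly what happened 150 times in this incident.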

[–] [email protected] 1 points 1 year ago (1 children)

This is not a case of autopilot causing an accident, this is a case of an impaired driver

It is both, of course. The drunkard and the autopilot both contributed their share to creating the danger that ended so badly.

Driving drunk is already forbidden.

What Tesla has brought onto the road here should be forbidden as well: lane assist combined with adaptive cruise control AND sensors this blind.

[–] [email protected] -1 points 1 year ago

The driver was in Autopilot. Autopilot is cruise control plus lane assist; it's not FSD. Tesla didn't bring that "to the road". The driver was drunk, and as with most Autopilot or FSD accidents, it's user error.

I'm still unaware of a proven FSD accident.

[–] [email protected] 3 points 1 year ago

They didn't say he didn't respond to the alerts. If you don't respond, autopilot turns off.