randycupertino
I think the key difference is the backup driver was watching a video on her phone when the accident happened. I was a summer lifeguard on the ocean for many years (SOYA summer! Sit On Your Ass Summer), and we would have been fired if we had browsed our phones while on duty.
alextheparrot
I’m in complete agreement with your assessment on that. If the processes I mentioned above weren’t in place and someone drowned while the lifeguard was watching a video or reading a magazine, they should be held to account. I’m hesitant to completely remove individual agency from scenarios like this. The processes just help make those failure cases less accessible, because you get fired before they happen.
That’s really the fundamental point, these processes exist because those failure cases are understood, acknowledged, and chosen to be managed diligently.
SOYA summer was a lot of fun, though an office job still qualifies for the acronym even if you don’t get a great tan and have a lot less fun, heh.
Hokusai
> I think the key difference is the backup driver was watching a video on her phone when the accident happened.
Blaming the individual would not solve the problem. The question that the parent comment tries to answer is "what do we do so the backup driver does not watch his phone?".
That would save lives. To blame the individual and move on will kill more people in the future.
dathinab
I think the real question is:
Do self-driving cars that need a backup driver who is always ready to react very quickly make sense?
robertlagrant
> The question that the parent comment tries to answer is "what do we do so the backup driver does not watch his phone?".
We should do the same thing we do to stop regular drivers watching their phones.
sandworm101
>> What do we do so the backup driver does not watch his phone?
You make the human drive the car. The robot can jump in and slam the brakes when the human makes a mistake. That is the safe option.
But it just won't sell. Nobody here would buy a car that prevents the driver from making mistakes. An autodrive car that obeys speed limits while letting the driver Facebook on their phone: acceptable. A car that doesn't allow the human driver to speed or run red lights: market kryptonite.
_AzMoo
We should absolutely be blaming the individual in this case. They were negligent in their stated job role which led directly to the death of a bystander. We should also be finding fault with the processes and finding solutions for them, but this does not absolve the driver of their responsibility.
Tempest1981
> a manager actively observing for engagement
Maybe this is the key.
crazygringo
> "what do we do so the backup driver does not watch his phone?"
I mean, a first step is to hold the driver criminally negligent if they get into an accident. Which already happens. That's what legal punishment is for -- to deter extremely dangerous/harmful/etc. behaviors.
But because people often think "it'll never happen to me", that's why companies generally have supervision and spot checks of employees, to try to catch bad behavior and reprimand/fire them first, before it results in damage/death.
Which, if I were Uber, would probably mean installing fish-eye cameras in the corner of each vehicle that would regularly be spot-checked to see if backup drivers were, in fact, not doing their job.
This isn't a particularly difficult problem. And I don't think privacy is a particular issue here.
dragonwriter
> Blaming the individual would not solve the problem.
Punishing the culpable lapse by the individual is necessary but not sufficient. The individual should have been monitored and supervised by the employer, and failing to do that is also a culpable lapse which should be punished.
ayewo
> The question that the parent comment tries to answer is "what do we do so the backup driver does not watch his phone?".
This is a solved problem: you fit the experimental car with an interior camera aimed at the driver. If the backup driver is observed to be distracted by the interior camera, the car can sound a warning for the driver to be alert and if they ignore the warning, the car's hazard lights would come on and it would be slowly brought to a halt.
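The escalation described above (warn, then hazards and a controlled stop) can be sketched as a small state machine. This is a hypothetical illustration only; the class name, thresholds, and action strings are all invented, and a real system would fuse a proper gaze estimator with vehicle control.

```python
# Hypothetical sketch of the escalation logic described above: a
# gaze estimate from an interior camera drives a warning, then a
# controlled stop, if the backup driver stays distracted.
# All names and thresholds here are invented for illustration.

WARN_AFTER_S = 2.0   # distracted this long -> audible warning
STOP_AFTER_S = 5.0   # still distracted -> hazards on, slow to a halt

class AttentionMonitor:
    def __init__(self):
        # timestamp when the driver was first seen distracted, or None
        self.distracted_since = None

    def update(self, eyes_on_road: bool, now: float) -> str:
        """Return the action the vehicle should take this frame."""
        if eyes_on_road:
            self.distracted_since = None
            return "drive"
        if self.distracted_since is None:
            self.distracted_since = now
        elapsed = now - self.distracted_since
        if elapsed >= STOP_AFTER_S:
            return "hazards_and_stop"
        if elapsed >= WARN_AFTER_S:
            return "warn"
        return "drive"
```

The key design choice is that the timer resets the moment the driver looks back at the road, so a brief glance away never triggers a stop, while sustained distraction always does.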
hugh-avherald
Indeed. But to me the distinction between your experience and the driver's is that you were supervised and compliance was checked. If no-one was (at least occasionally) making sure the driver was always focusing on the road ahead, it was inevitable that the driver would not think the rule was that serious, and lapses were bound to occur.
kstenerud
The point is to not place someone in a position where their attention would likely wander (for example not having active shifts over 30 minutes in the case of a lifeguard).
Safety is about proactively defending against human nature, not blaming people after the fact.
tobyhinloopen
Again, it’s not the attention wandering away. It’s watching a video while driving a 2000kg steel vehicle
dathinab
But would that really have made a difference if that person had stared straight ahead instead?
If you don't expect something to happen reaction times can be really slow.
Which is also how it differs from the lifeguard job: there you always expect that from time to time something will happen, but as far as I can tell many people would not expect this when they use a self-driving car. Because, you know, the car is already meant to be able to handle this. Or at least that is how many people think about self-driving cars.
kevinmchugh
Hopefully people hired to be the safety check on self-driving cars are being trained that the cars are not perfect and require human oversight.
schaefer
>To make sure those interventions happen when needed, pools implement processes to better manage the risk.
Thankfully, this is already happening in the self driving space as well! At least as far as research and development efforts are concerned.
I used to work right next to another self-driving tech company: Aptiv in Las Vegas. Several times a day I would see their R&D cars launching from headquarters, and there were always two backup drivers. In addition, the backup driver in the passenger seat always had a clipboard up and was actively taking notes. A "two-man rule" is common practice in critical, high-risk activities.
I’ve known the details of Uber’s murder of Elaine Herzberg for a long time. I read the original police reports early on[1]. And I don’t take that fatality lightly.
Just the same, even on a bicycle, I always felt safe sharing the road with Aptiv R&D cars because of their two-man rule and constantly attentive drivers.
[1] https://www.ntsb.gov/investigations/AccidentReports/Reports/...
f154hfds
Not to be pedantic but the indictment is 'Negligent Homicide'. I believe this is involuntary manslaughter. The reason I bring this up is because I see people misuse the term 'murder' on HN often, leading their readers to surmise the possibility of intent when that is not at all the charge.
lifeisstillgood
One technical "solution" to this might be to have the computer ask (voice) questions about the road, such as "are we approaching a right or left bend?" or "was that a red car we overtook?" These could measure attention to the road and reaction speed. It is also quite easy to see this being put in place for normally driven vehicles: one could imagine an insurance requirement to prove the driver's attention during the journey.
It will of course totally ruin the illusion of being driven smoothly by robot but I think that should be canned asap.
But I agree: the company is also liable for failing to have processes to stop the inattentive behaviour. If you were on duty watching your phone, and no manager spotted this or took away the phone, they are just as liable.
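The voice attention-check idea above boils down to comparing the driver's answer against what the perception system just recorded, and flagging answers that are wrong or too slow. A minimal sketch, with all names and the response limit invented for illustration:

```python
# Hypothetical sketch of the voice attention-check idea: the car asks
# about something it just passed, then compares the spoken answer
# against what the perception system logged. Wrong or slow answers
# count as a failed check. All names and limits are invented.

RESPONSE_LIMIT_S = 3.0  # assumed acceptable reaction time

def check_attention(expected_answer: str,
                    driver_answer: str,
                    response_time_s: float) -> bool:
    """True if the driver answered correctly and quickly enough."""
    correct = driver_answer.strip().lower() == expected_answer.lower()
    fast_enough = response_time_s <= RESPONSE_LIMIT_S
    return correct and fast_enough
```

For example, right after overtaking a car the system might ask "what colour was that car?" and pass the transcribed reply plus the measured reaction time to this check; repeated failures would be flagged, much like the insurance-audit idea in the comment.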
DharmaPolice
I think the technical solution has merit. It doesn't need to be voice, presumably you could rig something up on a steering wheel with buttons to press to record every time they go past an intersection or something. Something to force them to watch the road and keep their hands in the right area. You're right though, that does dent the image of unattended self-driving but we've already demonstrated that isn't what this was.
People are condemning this person for being on their phone (and fair enough), but if I am sitting in a comfortable vehicle with nothing to do / read then I will fall asleep soon enough. I think this is quite common.
Jugurtha
Not a lifeguard, but I've pulled several people from the water during summers. I once pulled someone to the shore in Tunisia, a few feet from where the lifeguard was standing. He neither noticed the person being in trouble nor the rescue. A couple of elderly Germans saw us arrive and hurried to alert him, but he had his attention directed elsewhere and didn't even notice them talking to him. They were outraged in an adorable elderly manner.
caf
I assume this is the reason that casino croupiers are rotated frequently off and on the floor as well, although there of course it is about protecting profits rather than safety.
jdeibele
Never worked as a lifeguard but have 3 kids who like to swim and who have survived to be teenagers.
The lifeguards at our local pools rotate about every 10 minutes. It seems like changing position and also that you have some interaction would be helpful to staying focused.
It's hard to replicate that in a car but perhaps something that was asking for data - "What's the speed right now? Are there any fire hydrants in sight?" - that could be flagged if contradicted could weed out drivers who are watching a movie or nodding off.
Cthulhu_
But don't lifeguards often work in teams of at least two, frequently (e.g. every 15 - 30 minutes) alternating simply because of that limited attention span?
fredophile
They do. That's why the OP mentioned rotations and redundancy. Uber could easily put two people in the vehicle and have them swap seats every 20-30 minutes. The person in the driver seat would be responsible for watching the road and reacting while the person in the passenger seat supervises.
kohtatsu
It's fine; self-driving cars aren't single-seaters yet.
wutbrodo
Yup, I've had experience with safety drivers in autonomous vehicles that are extremely professional and that I feel very safe with. After seeing the Uber video, the first red flag for me was the fact that she was alone; the second was (obviously) the fact that she was watching a video on her phone, which means either she was being criminally negligent (if she had been on-shift for a short period of time) or Uber was (if she had a long, isolated shift).
shajznnckfke
I think it’s useful not to label this person’s job “driver”, but instead “fall guy”. Their job is to sit around all day doing absolutely nothing. But in the rare event where the car fails, they need to suddenly become alert and fix it, or take the blame for the machine’s failure. I think it’s less accurate to describe this transaction as a form of labor, but instead as an indemnity, an assumption of liability, paid for via a wage.
For train conductors with a similar challenge, a solution has been invented requiring them to pass various visual attention challenges that detect if they aren’t alert. Such systems weren’t present here. The system wasn’t designed to work - it was designed to protect Uber by shifting the blame for failure.
Aeolun
It is hard to imagine they weren’t aware that their job involved doing nothing (but pay attention to the road).
I’d be inclined to give them a pass if they were looking at the road and just inattentive (because that’s expected, like you say), but I find it hard to sympathize with someone watching a show on their phone, explicitly ignoring their one job.
clairity
that's the fundamental attribution error again, attributing poor judgement/negligence to the driver, who's the visible actor, rather than the inherent systemic flaws designed by other unseen actors who hold greater responsibility.
ALittleLight
It's an error to attribute negligence to the safety driver of an experimental car who is watching a video rather than watching the road?
Obviously the system should improve to make errors less likely, but that doesn't absolve anyone of guilt in my view.
petra
I think that might have passed if he had only listened to the radio, a good enough way to relieve boredom. But streaming video? That's a punishable offense while driving.
Also, many people work at really boring jobs. If serious errors happen while you watch Netflix, you'll get fired, for sure.
But again, sometimes listening to music is permissible.
baddox
Would you say the same thing for a chauffeur or truck driver who killed someone because they were using their phone while driving?
robertlagrant
Saying it's an attribution error is circular. It's a conclusion, not a premise.
And I think it's wrong. If my car is on cruise control, that doesn't mean I can stop worrying about my speed. You can't always just blame the company because it's the bigger entity.
asdfasgasdgasdg
I think we can all agree that many murderers end up committing their crimes partly because of a system that gives rise to poverty and the knock on effects of poverty on humans. However, I think most folks also agree that incarceration is an important factor in discouraging the commission of more murders. Not everyone does but even the most lenient countries tend not to let the perp off scott free.
Which is all to say that "the system made me do it" really doesn't fly as an excuse for felonious behavior.
mindslight
Stepping back, I also find it hard to sympathize with someone watching a show on their phone. But there are two parties to this crime, and only one of them is being charged. Uber is responsible for creating this situation - I doubt this "backup driver" watches videos while driving her own car - yet has managed to escape liability. Hence why it's appropriate to describe her as a "fall gal".
goodluckchuck
Yeah, the problem isn't that they weren't paying attention, or that they weren't even trying, but that they were intentionally NOT paying attention.
That said, I don’t think this gets Uber off the hook. If I were on a jury, I’d likely say Uber was guilty of manslaughter (barring a real look at the evidence).
akira2501
> That said, I don’t think this gets Uber off the hook
There is exceptional negligence on their part. The fact that the vehicle was just blithely travelling at the posted speed limit, even at night, even though that speed exceeded its assured clear distance ahead, is a damning point. The vehicle will operate unsafely in its default configuration.
PopeDotNinja
Honestly if I’d been given the job of babysitting a self driving car, the same thing could have easily happened to me. I’d get bored out of my mind and unintentionally pull out my phone & start browsing Reddit before I caught myself. It’s a shit, soul sucking job to actively do nothing.
hinkley
We know that you can pay someone as much as you want but you can’t get them to be vigilant for a low probability event. Except, as you say, by giving them some continuous skin in the game, even if it’s illusory.
This person was being paid to pay attention, which is hard enough to get right. What happens when the person owns the car, and is just commuting? And there are millions of them, not just one person?
Mayhem, if it’s a variant on this system instead of something much more sophisticated.
Erlich_Bachman
> you can’t get them to be vigilant for a low probability event.
How about starting by simply removing their phone? Isn't that easy? The driver was looking at her phone at the time when she was supposed to be looking at the road. This is a simple violation. The phone should have been banned and the driver fired after a first violation.
I am pretty sure that while what you say might be hard to do for some people, there are individuals out there for whom sitting and looking at the road for hours would not be such a big problem. If she couldn't handle the job, she shouldn't have worked there.
thedrbrian
> I am pretty sure that while what you say might be hard to do for some people, there are individuals out there for whom sitting and looking at the road for hours would not be such a big problem
Those people are known as drivers and they’re doing more than just looking at the road. The driving/feedback from the controls is keeping them engaged.
ocbyc
I believe it was "his" phone at the time.
zkms
> I think it’s useful not to label this person’s job “driver”, but instead “fall guy”.
Or "sacrificial part" -- designed to break first to protect the rest of the system. The NTSB report (https://www.ntsb.gov/investigations/AccidentReports/Reports/...) is an interesting read, by the way.
TeMPOraL
Corporate equivalent of ablative armor then?
AlexandrB
This is exactly right, and probably the way that liability will shake out with self-driving vehicles in general since it's in the manufacturer's best interest for it to be that way.
"Here's your self-driving car with a 20 page EULA/ToS. Oh, but if it's in self-driving mode and something goes wrong it's not our fault, you must respond (within seconds) and fix it yourself."
derwiki
Their job is definitely not to sit around and do nothing. Self-driving cars regularly disengage from autonomy (e.g. emergency vehicle, construction zone, erratic bicyclist) and the safety driver needs to be ready at all times. Safety drivers also undergo extensive training around this.
shajznnckfke
I believe the normal behavior of self-driving cars in these situations is for the car to brake to avoid hitting the obstacle, so the safety driver has ample time to take over and navigate the situation. Didn’t Uber disable the auto-braking because it was oversensitive, choosing not to pause road tests until that issue was fixed?
sokoloff
Uber disabled the OEM (Volvo) auto-braking but not the Uber braking algorithms. IMO, that’s largely a red herring as they could have equally well chosen to base their platform on a car that didn’t originally have a factory auto-braking system.
Cthulhu_
No, it is not; there is no legislation in place that shifts the burden onto technology. The technology is assistive, and the driver bears final responsibility.
Would you blame an airplane's autopilot if it crashes? We still have two pilots even though 99% of the time the thing is on autopilot.
She's a test driver, she did not pay attention, and she killed someone. At best, her employer should help and compensate her, given how it was a workplace, on-the-clock accident.
smnrchrds
> Would you blame an airplane's autopilot if it crashes?
If the plane crashed due to an autopilot error, yes, absolutely.
_jal
This is exactly right.
This job is a wage against a (very nasty) lottery ticket, to sit there to absorb the legal fallout for decisions made far away.
slg
>Uber made a series of development decisions that contributed to the crash’s cause, the NTSB said...Uber deactivated the automatic emergency braking systems in the Volvo XC90 vehicle and precluded the use of immediate emergency braking, relying instead on the back-up driver.
If the driver committed homicide, it sure sounds like Uber is also guilty of homicide.
choppaface
Not just the Volvo system, but Uber had tuned the object detector to ignore the sparse lidar returns that the car saw 6 seconds before impact. Meyhofer, the head of ATG, fought to tune the car that way because trees cause similar sparse returns and the car had been stopping for trees while testing. The pressure to tune was there because of an impending demo with Uber CEO Dara and Meyhofer stood to gain tens of millions of dollars from the demo. It’s not just homicide but Uber also defrauded their safety drivers.
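The trade-off described in that tuning decision can be illustrated with a toy filter. This is purely a hypothetical sketch, not Uber's actual pipeline: the function name and thresholds are invented, and the point is only that raising a "minimum points per cluster" threshold suppresses false stops (trees, foliage) at the cost of discarding the sparse early returns a distant pedestrian produces.

```python
# Hypothetical illustration of the tuning trade-off described above:
# ignoring "sparse" lidar clusters (few return points) suppresses
# false stops on things like trees, but also discards the sparse,
# distant returns a real obstacle produces at first detection.
# Thresholds and names are invented for illustration.

def classify_cluster(num_points: int, min_points: int) -> str:
    """Treat clusters below the point threshold as ignorable noise."""
    return "obstacle" if num_points >= min_points else "ignored"

# A distant pedestrian may yield only a handful of returns at first,
# while nearby foliage yields many:
sparse_pedestrian = 4
dense_tree = 12
```

With an aggressive threshold (say `min_points=10`), the tree still registers as an obstacle worth braking for, but the distant pedestrian's sparse cluster is thrown away, which is exactly the failure mode the comment describes.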
bbotond
Eerily similar to how the Challenger tragedy happened
clusterfish
Deactivating this optional feature is somehow worse than buying a car without such a feature in the first place? The latter is neither illegal nor immoral.
The safety driver had an actual job to do, he wasn't there "instead" of automatic emergency braking - which is not certified for driverless operation btw. But he was distracted with a phone instead of looking at the road.
The halo effect here is unreal.
bb611
The American legal system places much more emphasis on acts you may have committed than omissions, and tends to avoid compelling action.
So yes, in an American court, disabling a proven safety feature is significantly worse than simply purchasing a vehicle without the feature.
The safety driver failed at their job, but the NTSB clearly lays significant blame for that failure on Uber, who should know well that humans are poorly suited to monitoring automated systems, and committed acts and omissions that increased the likelihood of an accident.
tomalpha
This brings to mind the classic Trolley Problem:
https://en.wikipedia.org/wiki/Trolley_problem
The scenario is notably different, but it does dig into the issues around acts vs omissions and how we perceive them.
sebws
She.
Cthulhu_
> If the driver committed homicide, it sure sounds like Uber is also guilty of homicide.
No; an automatic emergency braking system is not required by law. A capable, attentive driver on the other hand is. Working brakes are as well, but it's eventually up to the driver to engage them.
DreamSpinner
I think an interesting variation on this is that even if a safety system is not required by law but is available, then disabling it could constitute criminal negligence. Consider what happens if safety equipment on industrial machinery is disabled and injuries result. I'm fairly sure that criminal charges could result for whoever disabled the safety mechanisms (though there are likely differences between workplace safety criminal law and road safety).
jedberg
I mean this makes sense. She was watching a video on her phone while driving. It was her literal job to know that the car might make mistakes and correct for them, so she should have known that she still has to pay attention as though she were actually driving.
brudgers
It makes sense because holding users personally responsible for the inadequacies of self-driving products externalizes all the risk for companies selling self-driving systems. The Uber system detected something in the road and proceeded because it did not recognize it. That's how it is designed. Otherwise the car would never go very far without stopping because the system does not recognize most things it detects.
To put it another way, the self-driving system did not alert the driver that it had detected something and did not know what it was. It wasn't an emergency, it was the car's normal operation.
bastawhiz
The system was, by its very nature, not ready for production. That's why it had a safety driver in the first place. It's crazy to argue that the safety driver should have been alerted...the whole point they're there is to handle failures of the system, including ones where the system fails to detect an issue.
If my skydive instructor doesn't deploy the backup parachute because I, the student, didn't alert them that the primary chute failed to deploy, it's entirely their fault if we hit the ground at terminal velocity.
If a lifeguard is working at a public pool watching Netflix on their phone and a kid drowns, you can't argue that the kid should have splashed more.
liability
Production? It wasn't even ready for testing on the unsuspecting public.
Erlich_Bachman
Of course it was an "inadequate system" if you consider it full self-driving. It is inadequate in the simple sense that it was not complete. It was under development.
But how do you suppose we create FSD cars if we can't try them out before they are ready? There is just no other way to do it than with vigilant drivers who watch what these cars do.
> To put it another way, the self-driving system did not alert the driver that it had detected something
Well of course not, who would expect that? If the car could positively identify the collision before it happened, it would have simply stopped; no need for a driver at all. The driver is there to prevent exactly these kinds of accidents, and this driver failed by getting out her phone and distracting herself with streaming videos instead of doing her one job. Plain and simple.
formercoder
She wasn’t a “user”
derwiki
This was a training mission. Uber wasn't offering rides to passengers who weren't employees.
skywhopper
As mentioned by others, I think it depends strongly on what the driver's training was. Between the overblown hype about the capabilities of self-driving cars, the state government's abdication of its regulation power, and the intentional disabling of existing safety devices on the vehicle (it's unclear if this change or the risks of the change were communicated to the driver), I would be honestly surprised if the driver believed it was actually that dangerous to allow her attention to drift.
It's one thing to charge negligent homicide with a typical car. But the near-fraudulent claims of self-driving car hucksters at the time had a lot of people believing these vehicles were already far more capable than they will be for decades to come. And it's inevitable that the Koolaid drinking in this particular program within Uber was at its strongest. So I doubt the safety drivers were adequately informed of the actual capabilities of the cars or the risks involved.
tmsh
I agree. This is where torts get fun in first-year law school. If Uber's training wasn't perfectly clear about the importance of the job of monitoring outside the vehicle, then part of the liability shifts to them.
Consider two training courses:
* instructors who are gung-ho on automation being "nearly there" and encouraging people to relax in the car and let the software do its work!
* instructors who are constantly impressing upon students that they need to be vigilant.
One can see how one party is 90% liable. And the other is 10% liable, etc.
The interesting part of this case comes down to the training. In that if it was lacking, it then makes it very clear to future companies that their training needs to be more rigorous.
Cd00d
Thank you for this comment and the insight. I have a follow up question: even after training, is there a liability burden to ensure the training is followed? I know no hiring process is perfect, so you're always going to end up with the occasional employee that disregards safety training. Knowing that, is there a burden on Uber to actively monitor the driver's attention and communicate to the driver when it is not sufficient? Even on an audit basis, but more ideally constant?
cyrux004
It's literally in the name of the job title: "car safety driver".
xadhominemx
She was not brand new on the job, right? She knew how often intervention was required and chose to watch TV instead.
darkerside
The fundamental attribution error suggests that the job she was asked to perform may not make any sense.
cameldrv
The vehicle is designed to ultimately be a level 4 system. She effectively has the job of a test pilot of an experimental airplane. The job requires careful attention and is not easy.
There would be a different blame calculus if this were a production level 2 system like autopilot. In that case, it's not a paid test pilot, it would be the paid purchaser of a certified aircraft.
darkerside
And yet she was probably paid like the test pilot for a beta web application. A test pilot for experimental aircraft doesn't even bring their phone in the vehicle with them.
johnnyfaehell
But even if she wasn't watching a video, as far as I remember, there was nothing that could have been done. It was an accident that would have happened even if it had been a manually operated car with an alert driver, no?
user5994461
>>> It was an accident that would have happened if it was a manually operated car with an alert driver, no?
No, the latest articles and discussions were quite clear that this accident wouldn't have happened with a real driver. The road conditions were good, the visibility was good. A driver would have seen the woman crossing the road well in advance and slowed down.
jeroenhd
From the footage I've seen (both the footage Uber released, which portrayed the road as much darker than it actually was, and from videos of people driving around in the same area after it happened) I can only conclude that an attentive driver would've been able to prevent the crash.
However, I don't think it's reasonable at all to expect someone to remain attentive while looking at a self-driving car. This is the same problem train drivers face, which has been mitigated with all sorts of methods, least of which a dead-man switch. Some countries let their train drivers mention every signal they come across to themselves, with Japanese train drivers even pointing at signs to ensure they're paying attention.
This was a vehicle that had been modified to reduce certain safety features (because Uber couldn't get them to work properly), with someone at the wheel expected to be 100% focused on the road while being given nothing to do at the same time. You can only go through so many hours of sitting in a car doing nothing before you go crazy.
From a revenge-seeking perspective it's easy to blame the one person who could've stopped the car for her obvious disregard for safety (streaming video on the job), and I suppose a criminal justice case might be in order. However, I think Uber should be mainly responsible for the loss of life, because their flawed design not only made the car less safe but also completely disregarded human psychology when they designed how their human safeguard driver should do their job. Even human-operated cars will beep and yell at you if you don't pay attention while driving on cruise control; if such safety features were omitted in the self-driving design, then clearly the driver was set up to take the fall when something bad happened.
I strongly believe Uber only put that woman in there because local law wouldn't let them test their car without a human at the wheel, not because they wanted to ensure their car didn't kill anyone.
dllthomas
> Japanese train drivers even pointing at signs to ensure they're paying attention.
I think pointing out things in the environment is a great idea for safety drivers in this kind of setting. It helps keep them engaged, possibly helps the system notice when they're distracted, and possibly provides additional useful training data.
DevoidSimo
From what I remember, the person who was hit was fairly visible. The video Uber released was very dark, however, giving the opposite impression.
Cthulhu_
Accidents do happen, but the circumstances in this case are important; she was not paying attention to the road when it happened. If the victim came out of nowhere and she could not act, the case would pan out differently.
vaccinator
depends what she was told by the company, I guess... and if the situation was avoidable at all
kevin_thibedeau
It doesn't matter what she was told. She was in charge of the vehicle. She was the licensed operator. The presence of a broken ML autosteer doesn't abdicate responsibility.
This is part of the general attitude that rights and freedoms don't matter if a machine violates them. When you walk out of a store post purchase and the security alarm goes off you have zero obligation to disclose what is now your property to the loss prevention experts. But somehow it's acceptable to assume you're a criminal because a machine said so.
user5994461
>>> When you walk out of a store post purchase and the security alarm goes off you have zero obligation to disclose what is now your property to the loss prevention experts.
protip: You probably want to let them figure out what is the thing beeping while you're still in the shop. It's not great to get home only to realize that they forgot to take off the anti theft device on some of your beers.
mindslight
> This is part of the general attitude that rights and freedoms don't matter if a machine violates them. When you walk out of a store post purchase and the security alarm goes off you have zero obligation to disclose what is now your property to the loss prevention experts. But somehow it's acceptable to assume you're a criminal because a machine said so.
IMO this is much more applicable to Uber's share of the responsibility for this incident.
undefined
rhino369
Intent matters. The state has to prove criminal negligence, which in most states requires very risky behavior.
Driving while watching a video certainly qualifies normally, but purely hypothetically, if Uber had internally said it was totally safe to drive distracted, then maybe she has a defense.
nafey
"licensed operator" [citation needed]
I don't even know what license beyond a driving license exists for this job. Clearly not every driving-license holder is fit to hold this position.
baddox
That is, presumably, something that the grand jury considered.
vaccinator
maybe she had an attorney chosen by Uber?
vmception
mmm yeah with their expert witness that can make up stuff on the spot
lets just acknowledge inefficiencies in the system
jeffbee
How many Americans have been killed by distracted drivers, though? And how many of those drivers were indicted for homicide? Usually what happens when you run over someone, no matter the circumstances, is the cops exonerate you on live television with a statement about how "no criminality is suspected", which is suuuuuuuper weird because there's no other kind of killing where you get that treatment.
pb7
How many have videos of both the victim being clearly hit and the driver being severely negligent leading up to and including the accident? How many are pedestrians that subsequently die?
jeffbee
Well, 20 pedestrians die on American roads every day. It's incredibly common. The cops usually don't even bother looking for evidence. This idiot live-streamed himself killing a pedestrian in San Francisco, after months of filling up his Instagram with motorized jackassery, and the cops only charged him with manslaughter, not negligent homicide.
https://hoodline.com/2020/08/driver-who-killed-pedestrian-at...
renewiltord
Drivers getting away with killing pedestrians and cyclists is like the most common thing in the world.
As for "how many have videos": when Amelie Le Moullac was killed by a truck, the police didn't even check videos. There had to be a grassroots effort to go get the video.
You're kidding me.
gkop
To any curious readers like me: the charge is negligent homicide (the indictment itself is linked at bottom of article)
jedberg
Most of the time there isn't a video of both the driver and the incident with detailed telemetry data. Also, commercial drivers are held to a higher standard.
samcheng
That poor person is Uber's "moral crumple zone"
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757236
Fitting, given Uber's reputation for questionable morality...
jameslk
> After the crash, police obtained search warrants for Vasquez's cellphones as well as records from the video streaming services Netflix, YouTube, and Hulu. The investigation concluded that because the data showed she was streaming The Voice over Hulu at the time of the collision, and the driver-facing camera in the Volvo showed "her face appears to react and show a smirk or laugh at various points during the time she is looking down", Vasquez may have been distracted from her primary job of monitoring road and vehicle conditions.[0]
That poor person, whose job it was to watch the road, seems to have been watching Hulu instead. That's a pretty willful disregard for their job and for others' lives. I don't have much pity for them if this is the case.
I get your point about moral crumple zones, and I suppose there are cases one could point to that demonstrate it, but this is definitely not a very good example of it.
0. https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg#Distr...
jakear
Agreed.
> "Had the vehicle operator been attentive, she would likely have had sufficient time to detect and react to the crossing pedestrian to avoid the crash or mitigate the impact," the November 2019 NTSB report stated.
This isn’t a case of the Uber performing some split-second error that needed immediate correction then blaming the driver when that didn’t happen, the way so many in this thread seem to want to make it out to be. This is a person whose job it was to watch the road watching Hulu instead and as a result failing to react to developing conditions that, if NTSB is to be believed (and I trust them a whole lot more than the randos commenting here), could have easily been prevented by an attentive driver.
clusterfish
Guy was using his phone, not paying attention to the road, while testing experimental equipment on a public road. He was there specifically to mitigate potential malfunctions of said experimental equipment, but he was reading reddit or whatever instead of keeping lookout.
I can't see how this is not clear cut negligence.
akira2501
The vehicle was travelling at the posted speed limit at night, as designed. The NHTSA pointed out that the law may require you to travel at a "safe" speed, and simply choosing to always move at the posted limit is potentially a moving violation and a design flaw.
The vehicle was travelling 20 mph faster than its own "safe clear ahead" zone allowed for. Simply ignoring its own safety limits and travelling at a higher rate of speed is a design flaw.
Designing a large public road autonomous vehicle test and not taking into account an easily predicted human vulnerability that's been known in other automation experiments for decades is a project design flaw.
Whether or not the driver is at some fault, I think the court will be capable of determining that. Arizona's willingness to completely disregard the mountains of negligence on part of a giant corporation is disturbing.
JanSt
Isn't it the drivers job to take control whenever there is a design flaw? I thought that was the exact job description.
If the car is going over its speed limit, the driver should react.
I see that some people think Uber should have added systems to monitor the safety driver. I think that's a fair point, but it doesn't take away the driver's responsibility. They knew the car isn't perfect. It's their job to take over control whenever that happens.
clusterfish
Uber posted a safety driver specifically to make the car safe at regular speeds despite the limitations of the autonomous system. That driver just wasn't doing their job.
ardy42
> Guy was using his phone not paying attention to the road when testing experimental equipment on a public road. He was there specifically to mitigate potential malfunctions of said experimental equipment, but he was reading reddit or whatever instead of keeping lookout.
> I can't see how this is not clear cut negligence.
1) The driver is a woman.
2) While I think the driver does bear some fault here, they don't bear all the fault. Uber designed an unsafe system that relied on an unnatural amount of vigilance from a single person while simultaneously discouraging that vigilance [1]. They didn't design the car to shut down when one of its critical safety components (the driver) was not operating correctly, and they didn't even give that driver amphetamines or something to increase their vigilance to artificial levels.
[1] Basically: pay close attention to a boring process while doing absolutely nothing for hours on end. I'm pretty sure that's a classic "humans suck at this" task.
clusterfish
That "unnatural amount of vigilance" is just sitting in a comfy seat and watching the road ahead of you go by. There is nothing unnatural about that. Lots of people do similar or more boring tasks just fine.
This isn't a truck driver falling asleep from exhaustion caused by aggressive scheduling. You don't "accidentally" take out your phone when your job is looking ahead. That's deliberate negligence.
If you want to require eye sensors to detect distraction, by all means, pass a law about it. Maybe include regular non-AI cars too. #1 cause of accidents over here.
HeWhoLurksLate
I agree with you except for
> and they didn't even give that driver amphetamines
Yo, what? IIRC, most *amphetamines are either illegal in the US or like FDA Class 1 substances or something like that, and you just want to throw them at people operating > 1 ton machinery? That seems like it could have an even higher risk for danger.
Aunche
Modern cars are essentially self-driving on the highway, and it doesn't require an unnatural level of vigilance to operate them. To be fair, those driver-assistance features usually come with an attention monitor, so Uber was negligent in that respect.
Also, no one said the driver has to do nothing. They can safely listen to audiobooks or talk on the phone.
_2d30
How is the required amount of vigilance any different than that of normal driving?
Erlich_Bachman
In what world is asking someone to not pick up their phone and watch videos, and instead do their job and at least look at the road, an "unnatural amount of vigilance"?
It's the basic job description. If she couldn't do it, she shouldn't have taken the job.
brokenmachine
This is the first of many such cases to come.
It's unreasonable to hold someone accountable for a "self-driving car" that suddenly decides, at a split-second's notice, that it can't cope with driving.
Of course this is extra bad because it's an experimental car, but it's the same in my opinion with those Teslas on the road now that do the same thing.
Erlich_Bachman
> It's unreasonable to hold someone accountable for a "self-driving car"
It is not unreasonable at all. She had one job to do - look at the road. She failed it because she felt that her entertainment was more important than doing her job. She picked up her phone and started streaming videos. She failed at her one job, plain and simple. She knew everything about the job and still chose to watch some videos and risk lives.
lhorie
What I find puzzling is that I keep hearing people say that it's impossible for people to stay alert while not being the driver, when there's actually a pretty popular term to describe that very thing: a backseat driver[1].
If you go on a long multi-day road trip, you might even end up relying on a passenger's feedback in a moment of tiredness.
undefined
marcinzm
I give you a weight; you are to hold it in front of you with your arm outstretched for 12 hours straight. If you drop your arm, a random person is shot. If you drop the weight, is the death your fault, or mine for putting you in a situation that is impossible for a human?
Humans are very bad at paying attention for a rare event and having nothing else to do 99.999% of the rest of the time. This is surely well known to the people who set up this system in the self driving car.
clusterfish
Looking at the road when sitting in the driver's seat, without being absorbed by your phone, is not in any way a hard task. Truck drivers, train drivers, pilots and other equipment operators exist. Millions of people do it every day. Those that don't, and end up killing someone, are held responsible if found out.
Don't take a boring job if you can't handle it. The required basic level of continued attention is not a superhuman skill. It's not a job for everyone but it's not exceptional at all.
gamblor956
Not even remotely the same thing.
Dude was paid to pay attention. He didn't. It wasn't because he got fatigued, it was because he was deliberately doing something else, by choice
Erlich_Bachman
She could always just take a break if she felt her attention was lacking. Or you know, not take the job in the first place if this job was so hard for her personally.
It is not at all fair to make the comparison you are making.
taneq
The poor guy's a lady. Other than that, agreed.
Edit: Post fixed now, disregard. :)
samcheng
Sorry! I should know better than to assume gender these days. Fixed.
kikokikokiko
He can desire reeeeaaaallllyyyy hard to be a lady all he wants, but he'll never be one. Reality does not change just because of human desire. The guy deserves to be treated with the same amount of respect as any human being does, but he has no authority over me to obligate me to say an airplane is a submarine.
On the subject at hand, he was paid to do one job: look at the road and be the safety net between an experimental car and the other people on the road.
If he was listening to a podcast using headphones it could have been avoided, but he was looking down at his phone for the whole duration of the recorded video we have from the incident.
It's negligent homicide, just as it would be for any other machine operator, in any other industry, who did not fulfill his duty to the company that contracted him.
blackbrokkoli
This discussion reminds me an awful lot of Don Norman's example regarding human error:
> Air Force: It was pilot error - the pilot failed to take corrective action
> Inspector General: That's because the pilot was probably unconscious.
> Air Force: So you agree, the pilot failed to correct the problem.
We can go and blame the driver all day long but that will not actually solve anything. Was it neglect? Probably-maybe, I'm not a lawyer. But that is not the point. Why did the driver look at her phone? How can we prevent that? What other, similar failure modes are there?
We did this dance with pilots, truck drivers, forklift operators, life guards, factory workers and god knows how many more. It is frankly quite disappointing that HN is overwhelmingly like "this time, we'll just blame the operator!"...
tobyhinloopen
The operator was watching a video on her phone. That’s not failure to take action, that’s intentionally putting yourself in a position to not take action
TeMPOraL
It's also a well-known and completely predictable behavior for humans. In just about every other job requiring constant vigilance, there are multiple factors mitigating this failure mode (shift lengths counted in minutes, multiple observers, automated attention checking devices that shut down the machine if not reacted to, other employees ensuring the observers pay attention, etc.).
Uber should absolutely get the blame for creating this situation in the first place; letting a self-driving car out with only a backup driver as safety, and with her phone on her to boot, should not even be allowed.
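The "automated attention checking devices that shut down the machine if not reacted to" mentioned above are essentially a dead-man's switch, a pattern rail operators have used for decades (the "driver safety device"). A minimal sketch of the escalation logic, with hypothetical class/method names and interval values chosen purely for illustration:

```python
import time

class VigilanceCheck:
    """Toy dead-man's switch: the operator must acknowledge periodically;
    if they don't, the system first prompts, then commands a safe stop."""

    def __init__(self, interval_s=60.0, grace_s=5.0, clock=time.monotonic):
        self.interval_s = interval_s  # seconds between required acks
        self.grace_s = grace_s        # extra time after the prompt fires
        self.clock = clock            # injectable for testing
        self.last_ack = clock()

    def acknowledge(self):
        # Called when the operator presses the alertness button.
        self.last_ack = self.clock()

    def status(self):
        idle = self.clock() - self.last_ack
        if idle < self.interval_s:
            return "ok"
        if idle < self.interval_s + self.grace_s:
            return "prompt"      # beep / flash at the operator
        return "safe_stop"       # no response: pull over and stop
```

A real system would tie `safe_stop` into the vehicle controller and log every missed prompt, but even this bare loop removes the failure mode where nobody notices the operator has checked out.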
crazygringo
> Why did the driver look at her phone?
Because she chose to watch a TV show instead of performing her job.
Yes, I'm absolutely blaming the operator on this one.
I don't see how making that conscious choice, of your own free will, when your entire job is safety supervision, is anything but criminal negligence.
davidhyde
> "The vehicle operator’s prolonged visual distraction, a typical effect of automation complacency, led to her failure to detect the pedestrian in time to avoid the collision."
This is quite an important point. It was very dangerous of Uber to disable both the car’s built-in collision avoidance system and to have nothing to replace it but the backup driver especially when there is a non-zero risk of automation complacency. I’m not trying to cover for the driver as they were clearly at fault. The evidence seems overwhelming. However, Uber shouldn’t be cleared of fault just because the backup driver is found guilty.
SilasX
This. Keep in mind, they rotate lifeguards every 15 minutes, specifically because it's boring and hard to hold attention. It was negligent of Uber not to have some kind of countermeasure like this, even if the driver takes some responsibility.
cyrux004
For a development vehicle, I am assuming they bypassed the car's stock functionality. I am sure they had their own braking system that was supposed to stop for cars, pedestrians, cyclists, etc. It didn't function correctly at that time.
davidhyde
> "Uber had disconnected the Volvo's factory-installed crash avoidance system. While the Uber vehicle's autonomous system did detect Herzberg before the impact, the vehicle — and Uber — relied on Vasquez to take action if an emergency arose."
The Uber software can detect an imminent collision but relies on the backup driver to act on it. For everything else the car is designed to drive itself. This is why "automation complacency" should not be ignored.
throwaway0a5e
I still haven't heard a good reason for them not to have also been running Volvo's system in parallel. Considering that the system is clearly not so hair-trigger that human drivers hate it, it shouldn't hobble their AI, and it seems like all upside and no downside to me.
foobar1962
> I’m not trying to cover for the driver as they were clearly at fault.
What was the fault? Not paying attention, or driving the car into the pedestrian?
This is the case that decides the law. Glad it's not me.
yholio
The six levels of vehicle autonomy are marketing bullshit. They imply failure of automation is acceptable and the human can be left to pick up the slack.
In reality, there is no spectrum of automated driving with the human doing less and less. Driving is not an instantaneous activity like swimming; it requires planning and strategizing, and decisions made in the past influence situational awareness. I plan an overtaking maneuver based on a myriad of factors and have escape routes already planned in case things don't go well.
When the automated driver fails, a human driver cannot be expected to just drop in and correct the mistakes: by that time they are uncorrectable, and the ramp-up time of the human driver far exceeds the available time in most real-life situations. In the overtake example, if the autopilot fails when a 20-ton truck is approaching, I have no recourse; I have no idea whether an attempt to brake and regroup will be successful, because I did not plan the maneuver. Therefore, computer drivers cannot be allowed to fail in such a scenario.
In practice, there are only two real levels: 1. autonomous; and 0. non-autonomous, with various automation that helps the driver, while he remains fully in control and the sole decision maker. What can constitute a self-driving spectrum is the type of road where full automation is expected to work: from restricted and instrumented roads, where only other similar vehicles are allowed, for the dumbest self-driving modes, and up to full self-driving on general purpose roads mixed with human drivers and pedestrians.
But the idea that you can achieve level 5 self driving by incrementally improving the systems designed to aid the driver is a dangerous pipe dream.
slashink
I hadn’t considered this before but you’re absolutely right. Driving is a continuous operation so it’s almost impossible to pick up without the prior context.
bob1029
I still think general AI (i.e. AI so advanced that we would consider protecting each instance as we would a human life) is a prerequisite for truly autonomous driving on our complex roadways today. This is the kind of game theory that would make me feel more comfortable sharing the road with these kinds of cars.
I completely agree with the assertion that the grey area in between is where a bunch of people are going to wind up dead.
yholio
Maybe not a prerequisite, but certainly necessary to prevent certain types of long-tailed events. For example, I can anticipate from the look of a neighborhood that jaywalking and poor infrastructure are more likely, I can sense that the driver ahead is drunk or on drugs and keep my distance, etc.
It remains to be seen whether automated drivers can make up for the increased risk in such scenarios with good performance in the areas where they excel: reaction time, highly distributed attention, near-perfect physical predictions and decisions made on that basis, vehicle-to-vehicle coordination, vehicle-to-road communication, etc.
What is clear, though, is that any practical system in the near term (non-general AI) will approach driving quite differently than a human driver. Hence the jury is still out on lidar vs camera-only approaches (Waymo vs Tesla).
Aeolun
> I completely agree with the assertion that the grey area in between is where a bunch of people are going to wind up dead.
But will more people end up dead than in the scenario where we had no ‘self-driving’ cars at all?
Humans are much more accident prone than robots in all the research on this I’ve seen.
toast0
The thing is, most deaths from self-driving will be seen as preventable, if only a human had been driving. That's going to make it look bad, even if the numbers are lower.
If you want to reduce deaths, I think you really want to invest in things like automatic emergency braking, monitoring driver attention, and safely stopping the vehicle if the driver is incapacitated. Having the computer supervise the human means the human is engaged in driving and aware of the circumstances for the most part; a computer supervising a human can act with great speed if the situation warrants, but a human supervising a computer is likely not to react so quickly.
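The "computer supervising the human" model above is basically what automatic emergency braking already does. A toy sketch of the core time-to-collision test such systems build on (the function name and the 1.5-second threshold are illustrative assumptions, not any production system's values):

```python
def should_brake(distance_m: float, closing_speed_ms: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    """Brake when projected time-to-collision drops below a threshold.

    A real AEB stack fuses radar/camera tracks and stages audible
    warnings before braking; this shows only the core TTC check.
    """
    if closing_speed_ms <= 0:   # not closing on the obstacle
        return False
    ttc = distance_m / closing_speed_ms
    return ttc < ttc_threshold_s
```

The appeal of this division of labor is exactly what the comment describes: the check runs every frame with machine reaction time, while the human stays engaged with the ordinary driving task.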
j7ake
Calculating just deaths / DrivingDistance is easy but overly simplifies the complicated comparison between human- vs self-driving. Comparing normalized deaths is not sufficient to persuade real people to choose self-driving over human-driving. Real humans and society take into account many more factors to accurately compare the two driving modes.
Take a hypothetical scenario where self-driving cars reduce deaths by 10x compared to now, but the deaths caused by self-driving cars are all small children and people with darker skin in human-preventable ways (you can imagine if cameras are used why this could happen). I doubt society would accept this tradeoff.
The safer future is to have humans still drive, but be enhanced by computers and cameras that are installed in the car.
yholio
Truth is, we will never accept robots on the roads that are as deadly as human drivers. Especially since they will fail in specific, surprising and hard to comprehend ways, see here.
Robotic drivers will probably need to demonstrate at least an order of magnitude improvement in safety before they can be accepted by the public.
dboreham
You can probably achieve a safer-than-humans solution without strong/general AI, provided it only operates on a special kind of road (in-highway sensors, no pedestrians, all other vehicles are also self-driving).
PiRho3141
Here's the problem: the person was being paid to pay attention to the road. According to the news article below, she was watching The Voice on her mobile phone while the car was driving. That would be perfectly acceptable if the vehicle had already been rated fully autonomous, but in this case they were still testing the automation, and the car detected the bicyclists 5.6 seconds prior to the accident.
Uber should be partially at fault since they deactivated the automatic emergency braking.
Yes, I agree when vehicles become fully autonomous, people will be less attuned to the environment, but in this case, the person was hired specifically to pay attention at all times.
https://news.sky.com/story/uber-safety-driver-charged-over-d...
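A back-of-envelope check of that 5.6-second window; the 5.6 s figure is from the comment above, while the ~40 mph speed, reaction time, and deceleration are assumptions for illustration only:

```python
MPH_TO_MS = 0.44704
speed = 40 * MPH_TO_MS      # assumed travel speed, ~17.9 m/s
lead_time = 5.6             # s between detection and impact (from above)
reaction = 1.5              # s, typical alert-driver reaction time
decel = 7.0                 # m/s^2, hard braking on dry pavement

# Distance the car covers before impact vs. distance needed to stop.
distance_available = speed * lead_time
distance_needed = reaction * speed + speed**2 / (2 * decel)
print(round(distance_available, 1), round(distance_needed, 1))
```

Under these assumptions the car had roughly twice the distance an attentive driver would have needed, which is consistent with the NTSB's conclusion that an attentive operator could likely have avoided or mitigated the crash.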
runamok
FYI it was a bicyclist (singular) but they were walking their bike across a crosswalk at 10pm.
Not to victim blame but if they were not wearing lights or a reflective vest I think a human could have made the same tragic error.
As you point out the system actually DID detect them.
curiousgal
"Herzberg was crossing Mill Avenue (North) from west to east, approximately 360 feet (110 m) south of the intersection with Curry Road, outside the designated pedestrian crosswalk"
goodluckchuck
This idea needs a little more fleshing out, but is very good.
The Uber vehicle IIRC didn’t abandon control when it saw the pedestrian, it did so (if ever) when it determined that it couldn’t avoid them... So the vehicle had more and better information about the situation.
“Failing to the driver only works when it’s not helpful.“
btbuildem
Not everybody drives the same way -- from how you describe it, to you driving is an all-encompassing activity, requiring presence, planning, attention etc. Don't get me wrong, that's ideal. Unfortunately most people don't drive like that. For many people, it's an incidental activity, a means to an end, done with the bare minimum of resources required to get by. That's why there are so many "accidents" and why you see so much stupidity on the roads.
anticensor
So, you think SAE J3016 (document that defines this 6-level terminology) is pointless and we could just get away without it? You need to prove your case further.
undefined
tantalor
Your dichotomy may make sense in theory, but in practice, where the average driver makes a fatal mistake at rate X, a level-4 autonomous vehicle with a failure rate Y < X (including when human override is required) is a win.
darkerside
I think your claim is far more theoretical in nature. In a vacuum, sure, fewer deaths is better. In the real world, where people aren't satisfied unless someone can be held accountable in a clear way, we are far better off with humans driving.
Retric
The maker and or operator of a self driving car can be held accountable depending on the specifics. IMO, we have this odd social expectation that humans killing people with cars or pollution is basically acceptable and nothing else really gets that pass.
IMO, killing someone with a car or a gun is a direct equivalent.
btbuildem
This story has all the key elements that come up in discussions around autonomous vehicles (thankfully omitting that inane trolley problem). To me, it is a preview of what is likely to happen when adoption is at scale.
The "driver" (or rather "liability scapegoat") will bear the responsibility when things go wrong -- simply because auto makers or service operators have more legal resources than an individual.
Any person left to "monitor" the vehicle will not be paying attention most of the time. Even a conscientious, mindful and diligent person will tune out sooner or later; the task at hand is almost perfectly tailored to make you drift off. Maybe we can look to locomotive engineers to see how that problem is tackled?
Further development to improve the self-driving capabilities of vehicles is likely to plateau at some arbitrary milestone, because pushing further will cost more than the court battles and keeping favour of public opinion.
crazygringo
> Any person left to "monitor" the vehicle will not be paying attention most of the time.
I'm not actually convinced that's true. Heck, people drive regular cars "mostly on autopilot". Haven't you ever "stopped paying attention" and then realize you don't even remember the last 5 minutes of driving?
But it's not really a safety issue, because your brain is very capable of automatically and instantly detecting anything out of the ordinary to alert you to, like something in the road in front of you.
Literally all you have to do is just keep your eyes on the road. You don't have to be paying constant "conscious" attention... you just have to have your eyes in the right place so your brain can automatically alert you if something's not right. Which it is excellent at doing.
Now I'm not saying that somebody can do it for 36 hours straight or without occasional breaks or whatever. But the idea that this is somehow inherently a task we're not suited for is just ludicrous.
In this particular case though, it doesn't even have anything to do with her capabilities of sustained attention. She chose to watch a TV show on her phone. She actively and consciously chose to utterly neglect her safety duties, the entire purpose of her job.
LeftHandPath
I think this is the best argument for why self-driving cars are much further away than we realize. A fully autonomous vehicle that is perfectly suited to modern city roads is a long way off; a human is still required. In that case, it seems best that the human remains in total control so that they remain totally dedicated to the task. It's the same reason some aircraft autopilots shut off when they notice a problem: the pilot has to take full control and becomes more aware of issues like the plane consistently pulling left or right.
The ideal is that the car helps, but does not take total control. The car keeps you in your lane and automatically speeds up / slows down with highway traffic. It warns you when you turn on your indicator while a car is in your blind spot, and applies the brakes when it detects an imminent collision. But the car should not make turns or apply gas or allow the driver to dedicate their attention to something besides the road. It should augment, but not replace, human abilities.
Hnaomyiph
"The "driver" (or rather "liability scapegoat") will bear the responsibility when things go wrong" This was a test vehicle testing software, not a fully-fledged, mass-produced, 100% autonomous car. The backup driver's entire job was to ensure that if this test software malfunctioned, there would be someone in the seat ready to take over. We can have discussions about who's at fault when we actually build a 100% autonomous car; until then, the driver is, and should be, the one at fault.
yodon
As systems become more reliable, it becomes progressively harder for humans to stay focused and maintain the concentration required to respond in the way this driver was expected to respond.
Yes, the driver is at fault for failing to maintain vigilance, but the engineers who built this system are far more at fault for designing a system whose operating parameters (long stretches of nothing to do culminating in either more nothing to do or a sudden life or death emergency requiring full concentration to detect and process and prevent) are well outside the operating envelope of the human in the system.
The driver was hired to perform a task that was almost guaranteed to eventually result in a deadly injury while some driver was behind the wheel. She didn't have any way to know that, but the engineers did.
weaksauce
> The driver was hired to perform a task that was almost guaranteed to eventually result in a deadly injury while some driver was behind the wheel. She didn't have any way to know that, but the engineers did.
I think the culpability lies a bit more on her than you imply... she was watching a video on her phone instead of surveying the road/instruments for dangers
yodon
By virtue of being a HN reader, you are almost certainly much better educated and much more highly skilled than the person Uber hired to baby sit their vehicles during testing. Set the alarm on your watch to go off in two hours. Start looking at your monitor to see if any pixels fail in the next two hours. Ask yourself how far into that 2 hour task you were able to maintain vigilance. Now try to do it day after day. Was she at fault, yes. Were the Uber engineers who designed this testing system at fault too? Absolutely.
weaksauce
> Was she at fault, yes. Were the Uber engineers who designed this testing system at fault too?
Those two tasks are quite different. One is driving and observing new stimuli; the other is watching paint dry. I don't think anyone could do the latter without adequate breaks.
I never implied that the guilt was solely on her... she's culpable to an extent, as is Uber for not having two backup drivers and/or more breaks to stay fresh. I don't know if they had a written policy about it, but I can't imagine watching a movie on your phone was allowed.
sokoloff
> The driver was hired to perform a task that was almost guaranteed to eventually result in a deadly injury while some driver was behind the wheel.
Every airline pilot is hired to perform a task that is guaranteed to result in a deadly injury while some pilot is behind the [yoke/sidestick].
Every Registry/Department of Motor Vehicles each weekday grants licenses to civilians which are guaranteed to result in a deadly injury while some of them are behind the wheel.
Every commercial truck driver is hired to perform a task...
That can't possibly be the standard by which we object.
yodon
Airline pilots are highly trained, subject to regular physical inspections and testing, and spend much of their careers practicing for emergency procedures to maintain their vigilance.
Auto and truck drivers operating normal vehicles are in a state of at least partial task concentration while operating their vehicles. The task concentration context in which this test driver was seated was completely unlike that of a normal vehicle driver.
tunesmith
One of the other front page articles right now is the PDF about Judea Pearl's book, and it describes exactly this case, including that there was a roughly six second window. And that the car normally had a feature to stop in those cases, but that the engineers (who I take to not be the driver) turned it off due to false positives.
I'm still unclear on how they find liability here - should it all fall on the driver? You could argue "but for" the driver's failure to pay enough attention to stop in those six seconds, and you could also argue "but for" the engineers' decision to turn off the safety feature.
jbay808
False positives are dangerous, because cars stopping suddenly and unexpectedly can be dangerous. The engineers probably were right to disable that, if the false positive rate was too high.
akira2501
> The engineers probably were right to disable that, if the false positive rate was too high.
This isn't a binary. They can also choose to stop the public test of a vehicle they can't adequately control.
throwaway0a5e
I think GP is talking about the OEM system that shipped with the car. Considering that its false positive rate is low enough not to annoy human drivers, I don't see why they disabled it.
GeorgeTirebiter
IF the engineers told the driver, "Hey, we disabled pretty much all the safety features on your vehicle because we can't figure out how to make them work -- so be Extra Careful!!!" --- well, maybe. But shifting the blame onto a low-wage worker to shield Corporate Hubris seems....wrong.
true_religion
It's not any more responsibility than any other low wage work would have as a driver.
You can put someone in a car without collision avoidance or even anti-lock brakes, and it's okay to ask the driver to use their own skill and judgment instead.
ogre_codes
If this wasn't a self driving car and it was just a normal driver texting while driving at night, it would never have gone to court. The driver would have said "I didn't see her", and the police wouldn't have even charged her. This is particularly true since the victim was homeless.
onion2k
Here in the UK we have a specific law that doubles the penalties for careless driving if you're using a phone. If you ran someone over you'd be charged with 'causing death by careless driving' and you'd probably spend some time in prison. Saying "I didn't see her" would be an admission of guilt.
ogre_codes
Wasn't considering the phone at all. In a lot of places in the US, using a phone is also a distraction likely to get you jail/conviction. (In some places it's considered driving impaired.)
The accident would just never have gotten the scrutiny it did. Unless the driver came out and mentioned they were using their phone, the whole incident would have just vanished. They certainly wouldn't have bothered getting a subpoena to check the driver's cell records.
onion2k
You're suggesting that if a driver runs someone over and kills them the police wouldn't do a basic check like looking at the driver's cell phone record? It's no wonder people all over the US want to defund the police if they're that incompetent. That would not happen here in the UK.
This job reminds me a lot of when I lifeguarded in high school during the summer. You sit on a chair in the sun, waiting for something to happen. The job is not stimulating, which further increases the difficulty of recognising if someone is in trouble - try watching some rescue training videos [0]. While a lot of the interventions a lifeguard does are prophylactic (“No running!” vs administering first aid to someone who tripped and split their skull open), interventions did happen a few times a month. To make sure those interventions happen when needed, pools implement processes to better manage the risk.
These processes seem to be the key difference between that job and this one. Rotations every 30 minutes, a manager actively observing for engagement, and overlapping zones of coverage were all instrumental in achieving zero fatalities - I can recall examples where each of those saved at least one person's life. Redundancy, supervision, and engagement are what make a pool safe. While I think this woman failed to do her job and may be punished for it, it is important to question why your community pool has better safeguards than an experimental car.
[0] https://youtu.be/4sFuULOY5ik