Wife of Tesla employee who died in what may have been the first fatal Full Self-Driving crash said she and her husband were ‘guinea pigs’

Hans von Ohain, a Tesla recruiter, died on a Colorado mountain road after his Tesla Model 3 veered off the road, barreled into a tree, and burst into flames, according to a new report from The Washington Post.

Erik Rossiter, a friend of von Ohain’s who was in the vehicle at the time of the 2022 crash and survived, told first responders that von Ohain had been using the “auto-drive feature on the Tesla” and the vehicle “just ran straight off the road,” the outlet reported.

If what Rossiter said is true, the incident — which Tesla has so far refused to acknowledge publicly — would be the first known fatality linked to the car company’s Full Self-Driving technology.

While von Ohain had been intoxicated at the time of the crash, with a blood-alcohol level more than three times the legal limit, investigators found the incident was not a typical drunken-driving crash. Sgt. Robert Madden of the Colorado State Patrol told the Post there were no skid marks, which would have indicated the vehicle attempted to brake before impact, but there were “rolling tire marks,” meaning power was still being delivered to the wheels at the moment of impact.

Madden said that “given the crash dynamics and how the vehicle drove off the road with no evidence of a sudden maneuver, that fits with the [driver-assistance] feature” being engaged.

Madden also described the subsequent fire, which engulfed the car, as one of the “most intense” vehicle fires he had encountered, due largely to the lithium-ion battery cells housed in the undercarriage of the Tesla that von Ohain was driving.

Von Ohain’s cause of death was determined to be smoke inhalation and thermal injuries, per the Post, and Madden said he probably would have survived the crash had it not been for the intensity of the flames.

Keep reading

Drunks WON’T be able to get into driverless cars after a boozy evening: Ministers dash hopes of the futuristic vehicles acting as chauffeurs following a heavy night

Driverless cars promise motorists hands-off journeys, which many have hoped might allow for a couple more pints at the pub before travelling home.

But those planning to use their autonomous vehicle as a personal taxi service should beware, with the government announcing legislation to ensure that doing so while over the limit is treated like drink-driving.

Being over the limit, going on your phone or having a nap behind the wheel of the futuristic cars will be illegal, according to documents published alongside the Automated Vehicles Bill which was announced in this week’s King’s Speech.

The Law Commission has already drawn up a draft proposal for legislation around the legal use of driverless vehicles on Britain’s roads.

Motorists must ‘remain in a fit state to drive’ while their car is on the road, and there must be a ‘user in charge’ who is able to take control if the self-driving system requests that they do so.

Drivers will still need to sit in the front seat and hold a driving licence to operate their vehicles, and failing to do so could open them up to prosecution.

Keep reading

Injured person reportedly dies after Cruise cars block first responders

On Aug. 14, two stalled Cruise vehicles delayed an ambulance from leaving the scene of a crash in which a driver had struck a pedestrian, according to reports from the San Francisco Fire Department. The pedestrian later died of their injuries; first responders linked the death in part to the delay in getting them to the hospital.

“The fact that Cruise autonomous vehicles continue to block ingress and egress to critical 911 calls is unacceptable,” one emergency responder wrote in a report. Cruise spokesperson Tiffany Testo countered that one of the cars cleared the scene and that traffic to the right of it remained unblocked. “The ambulance behind the AV had a clear path to pass the AV as other vehicles, including another ambulance, proceeded to do,” she wrote in a statement to SFGATE. 

According to several reports written by first responders, first obtained by Forbes, emergency personnel arrived at Seventh Street and Harrison in SoMa and began treating a “critically injured” pedestrian who had been struck by a car. The patient was quickly loaded into an ambulance, but the ambulance driver was unable to immediately leave the scene, according to two reports written by members of the ambulance team.

Two autonomous Cruise vehicles and an empty San Francisco police vehicle were blocking the only exits from the scene, according to one of the reports, forcing the ambulance to wait while first responders attempted to manually move the Cruise vehicles or locate an officer who could move the police car. 

Collectively, these interferences “contributed to a poor patient outcome, delaying the definitive care required in severe trauma cases,” according to one of the reports. The patient reportedly died of their injuries approximately 30 minutes after arriving at San Francisco General Hospital.

SFFD representatives did not immediately respond to SFGATE’s request for comment. 

Keep reading

Armed with traffic cones, protesters are immobilizing driverless cars

Two people dressed in dark colors and wearing masks dart into a busy street on a hill in San Francisco. One of them hauls a big orange traffic cone. They sprint toward a driverless car and quickly set the cone on the hood.

The vehicle’s side lights burst on and start flashing orange. And then, it sits there immobile.

“All right, looks good,” one of them says after making sure no one is inside. “Let’s get out of here.” They hop on e-bikes and pedal off.

All it takes to render the technology-packed self-driving car inoperable is a traffic cone. If all goes according to plan, the car will stay there, frozen, until someone comes and removes the cone.

An anonymous activist group called Safe Street Rebel is responsible for this so-called coning incident and dozens of others over the past few months. The group’s goal is to incapacitate the driverless cars roaming San Francisco’s streets as a protest against the city being used as a testing ground for this emerging technology.

Over the past couple of years, driverless cars have become ubiquitous throughout San Francisco. It began with human safety drivers on board who were there to make sure everything ran smoothly. And then, many cars started operating with no humans at all.

They’re mostly run by Cruise, which is owned by GM, and Waymo, which is owned by Google parent company Alphabet. Both companies have poured billions of dollars into developing these autonomous vehicles. Neither Cruise nor Waymo responded to questions about why the cars can be disabled by traffic cones.

Keep reading

Miss Car Payment? Future Ford Vehicles Could Repossess Themselves

Ford Motor Company filed a US patent application that shows autonomous or semi-autonomous vehicles could potentially repossess themselves if their owners miss lease or loan payments.

The idea of self-driving cars repossessing themselves might sound dystopian, but it is not surprising that automakers are considering this technology to ensure payment. Repossession is a common practice, and as we’ve described recently, cracks are beginning to form in the subprime auto loan market.

While this patent application was first filed in Aug. 2021 and formally published on Feb. 23, it could be years before Ford implements such a technology.

The patent, titled “Systems and Methods to Repossess a Vehicle,” explains how a future lineup of Ford vehicles would be capable of “[disabling] a functionality of one or more components of the vehicle.”

If a driver misses a car payment, the vehicle could disable the air conditioning, radio, GPS, and cruise control to irritate the driver.
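The filing publishes no code, but a short, purely hypothetical sketch can illustrate the kind of staged escalation it describes. Everything below is invented for illustration: the feature list, the one-feature-per-missed-payment rule, and the function name; none of it reflects Ford’s actual design.

```python
# Hypothetical sketch of a staged "repossession" escalation. All names and
# thresholds here are invented; the patent application publishes no code.
COMFORT_FEATURES = ["air_conditioning", "radio", "gps", "cruise_control"]

def features_to_disable(missed_payments: int) -> list[str]:
    """Cut off one more comfort feature for each missed payment."""
    if missed_payments <= 0:
        return []
    return COMFORT_FEATURES[:min(missed_payments, len(COMFORT_FEATURES))]

print(features_to_disable(2))  # ['air_conditioning', 'radio']
```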

Keep reading

Lasers Can “Hack” Self-Driving LiDAR Sensors, Creating False “Blind Spots”, New Study Reveals

In what is likely to be another thorn in the side of Elon Musk, Tesla, and Autopilot, a report last week revealed that many self-driving features in vehicles can be “messed with” using lasers.

A new study uploaded in late October, titled “You Can’t See Me: Physical Removal Attacks on LiDAR-based Autonomous Vehicles Driving Frameworks,” made the revelation, which was also reported on by Cosmos Magazine.

Researchers in the U.S. and Japan found that vehicles could be tricked into not seeing pedestrians (or other objects in their way) using lasers. These cars, which use LiDAR to sense objects around them, send out laser pulses and use the timing of the reflected light to judge how far away objects are.

The study revealed that a perfectly timed laser shone back into a LiDAR system can create “a blind spot large enough to hide an object like a pedestrian,” according to Cosmos. 

The study’s abstract says: “While existing attacks on LiDAR-based autonomous driving architectures focus on lowering the confidence score of AV object detection models to induce obstacle misdetection, our research discovers how to leverage laser-based spoofing techniques to selectively remove the LiDAR point cloud data of genuine obstacles at the sensor level before being used as input to the AV perception. The ablation of this critical LiDAR information causes autonomous driving obstacle detectors to fail to identify and locate obstacles and, consequently, induces AVs to make dangerous automatic driving decisions.”
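For readers who want the mechanics, here is a minimal, purely illustrative Python sketch of the two ideas in play: time-of-flight ranging (a LiDAR unit converts a pulse’s round-trip time into a distance) and the effect of an attack that deletes genuine returns inside an angular window. The point-cloud format and all numbers are invented for illustration; this is not the researchers’ code.

```python
# Illustrative only: time-of-flight ranging, plus the effect of removing
# returns inside a spoofed angular window. Data and names are invented.
C = 299_792_458.0  # speed of light, m/s

def range_from_echo(round_trip_s: float) -> float:
    """Distance to a reflector, from a laser pulse's round-trip time."""
    return C * round_trip_s / 2.0

# Toy point cloud: (azimuth_deg, distance_m) returns from one sweep.
# The cluster near 0 degrees at ~8.5 m stands in for a pedestrian.
returns = [(-10.0, 42.0), (-2.0, 8.5), (0.0, 8.4), (1.5, 8.6), (12.0, 30.0)]

def remove_window(cloud, lo_deg, hi_deg):
    """Model the attack: returns inside the spoofed window simply vanish,
    leaving a blind spot where the obstacle used to be."""
    return [(az, d) for (az, d) in cloud if not (lo_deg <= az <= hi_deg)]

print(round(range_from_echo(56.7e-9), 2))  # ~8.5 m for a 56.7 ns round trip
print(remove_window(returns, -3.0, 3.0))   # the pedestrian cluster is gone
```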

Keep reading