TEMPE, Ariz. – It’s day four of the Tempe Police Department’s investigation into Uber’s self-driving car crash, the accident that resulted in the first pedestrian fatality involving a car in full autonomous mode. And there’s still no clear answer to what went wrong.

The Arizona police, along with federal investigators, have been examining the vehicle and accident site, while also gathering information about the technology in the car, the vehicle operator and the pedestrian. They’ve also been studying a video that captured the crash Sunday night from the car’s dashboard camera.

That video was released to the public on Wednesday. It shows the pedestrian, Elaine Herzberg, walking her yellow bike, loaded with bags, across a dark road, and it cuts off at the moment of impact. The video also shows the vehicle operator, Rafaela Vasquez, sitting at the wheel and repeatedly glancing down at her lap. She looks up just as the car collides with Herzberg. The video is graphic and difficult to watch.

While it’s unclear how a human driver would’ve reacted, some autonomous-vehicle experts who’ve watched the video say the driverless car’s broad array of sensors should have detected Herzberg before she was hit.

Cortica, a technology company that develops autonomous artificial intelligence, analyzed the video and provided its evaluation exclusively to CNET. Its system detected Herzberg 0.9 second before impact, when the car was about 50 feet away. Cortica’s CEO, Igal Raichelgauz, said that would have been enough time for an autonomous vehicle to react and save Herzberg’s life.

“With the advantage of machine response time and control, the right actions could be made to certainly mitigate the damage,” Raichelgauz said.

Tempe police say the car didn’t slow down or swerve as Herzberg crossed the road. It was traveling at 38 mph when it hit her.
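Those figures can be sanity-checked with a rough back-of-the-envelope calculation. The sketch below assumes an emergency braking deceleration of about 0.8 g and instantaneous brake actuation; neither number comes from the investigation, and the point is only to show what 0.9 second and 50 feet mean at 38 mph.

```python
# Back-of-the-envelope check (not from the investigation): what does a
# 0.9-second, ~50-foot warning mean for a car doing 38 mph?
# The deceleration below is an assumed figure for hard braking on dry
# pavement, not a measured value for the Uber vehicle.

MPH_TO_FPS = 5280 / 3600          # 1 mph = 1.4667 ft/s

speed_mph = 38.0
gap_ft = 50.0
decel_ftps2 = 0.8 * 32.2          # assumed ~0.8 g emergency braking

v0 = speed_mph * MPH_TO_FPS       # ~55.7 ft/s

# Distance covered in 0.9 s if the car does nothing:
print(f"Distance covered in 0.9 s at {speed_mph} mph: {v0 * 0.9:.0f} ft")

# Full stopping distance from 38 mph if braking starts immediately:
stop_ft = v0 ** 2 / (2 * decel_ftps2)
print(f"Stopping distance at {speed_mph} mph: {stop_ft:.0f} ft")

# Speed remaining after braking across the 50-ft gap (v^2 = v0^2 - 2*a*d):
v_impact = max(0.0, v0 ** 2 - 2 * decel_ftps2 * gap_ft) ** 0.5
print(f"Impact speed if braking began at {gap_ft:.0f} ft: "
      f"{v_impact / MPH_TO_FPS:.0f} mph")
```

Under those assumptions, the car covers the 50 feet in roughly 0.9 second if nothing happens, would need about 60 feet to stop completely, and would hit at roughly 16 mph rather than 38 mph if it braked hard the instant the detection fired. That is consistent with Raichelgauz’s point that braking would at least have mitigated the damage.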

“The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine’s loved ones,” an Uber spokeswoman said in an emailed statement. “Our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.”

A work in progress

Driverless cars are equipped with a system of cameras, radar and lidar sensors that allow them to “see” their surroundings and detect traffic, pedestrians, bicyclists and other obstacles. If a pedestrian is in the car’s path, a self-driving car is supposed to stop. Because lidar and radar don’t depend on ambient light, those sensors are expected to work as well at night as in daylight.
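To make that description concrete, here is a minimal, purely illustrative sketch of the “detect, classify, brake” logic such systems rely on. The data structures, labels and thresholds are invented for illustration and do not represent Uber’s software or any production autonomous-driving stack.

```python
# Illustrative sketch only: a simplified version of the "detect, classify,
# brake" logic described above. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str         # "camera", "radar" or "lidar"
    label: str          # e.g. "pedestrian", "cyclist", "vehicle", "unknown"
    distance_ft: float  # range to the object along the car's path
    in_path: bool       # does the object overlap the planned trajectory?

def should_emergency_brake(detections, speed_fps, decel_ftps2=0.8 * 32.2):
    """Brake if any in-path person, bike or unclassified object is inside
    the stopping distance plus a safety margin. Radar and lidar returns do
    not need ambient light, so this check behaves the same at night."""
    stopping_ft = speed_fps ** 2 / (2 * decel_ftps2) + 15.0  # 15 ft margin
    for d in detections:
        if d.in_path and d.label in {"pedestrian", "cyclist", "unknown"}:
            if d.distance_ft <= stopping_ft:
                return True
    return False

# Example: a lidar return classified as a cyclist 50 ft ahead at 38 mph
detections = [Detection("lidar", "cyclist", 50.0, True)]
print(should_emergency_brake(detections, speed_fps=38 * 5280 / 3600))  # True
```

Real systems fuse many detections per second from all three sensor types and weigh uncertainty before acting; the sketch only shows the basic shape of the decision the experts quoted here say should have been made.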

“Although this video isn’t the full picture, it strongly suggests a failure by Uber’s automated driving system,” said Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles. “The victim is obscured by darkness — but she is moving on an open road. Lidar and radar absolutely should have detected her and classified her as something other than a stationary object.”

For the most part, testing of autonomous technology has shown driverless cars to be safe. But it’s still a work in progress. The vast majority of vehicle tests haven’t been done on public roads, and the cars are still learning how to drive.

“Driving a car can seem like a rote process, but it is not,” said Timothy Carone, a driverless car expert and associate teaching professor at the University of Notre Dame’s Mendoza College of Business. “We make complex decisions and value judgments continually when we are behind the wheel.”

As autonomous systems mature and become more capable of handling unusual situations, they’ll get better at making those complex decisions, Carone said. But, he added, that could take years.

“As our society transitions to using more systems like driverless cars, pilotless airplanes, driverless trucks and trains, and weapons, the accidents will continue happening,” Carone said.