Early on, the software had the regrettable habit of hitting police cruisers. No one knew why, though Tesla's engineers had some good guesses: stationary objects and flashing lights seemed to trick the A.I.

The car would be driving along normally, the computer well in control, and suddenly it would veer to the right or left and, smash, it would hit something. This happened at least 10 times in just over three years.

For a company that depended on an unbounded sense of optimism among investors to maintain its high stock price, these crashes might seem like a problem.

But to Elon Musk, Tesla's chief executive, they presented an opportunity. Each collision generated data, and with enough data, the company could speed up the development of the world's first truly self-driving car. He believed in this vision so strongly that it led him to make wild predictions:

"My guess as to when we would think it is safe for anybody to essentially fall asleep and wake up at their destination: probably toward the end of next year," Musk said in 2019. "I would say I am certain of that. That is not a question mark."

The future of Tesla may rest on whether drivers knew that they were engaged in this data-gathering experiment. I wanted to hear from the victims of some of the more minor accidents, but they tended to fall into two categories:

They either loved Tesla and Musk and didn't want to say anything negative to the press, or they were suing the company and remaining silent on the advice of counsel.

Then I found Dave Key. On May 29, 2018, Key's 2015 Tesla Model S was driving him home from the dentist in Autopilot mode. It was a route that Key had followed countless times before: a two-lane highway leading up into the hills above Laguna Beach, Calif.

But on this trip, while Key was distracted, the car drifted out of the lane and slammed into the back of a parked police S.U.V., spinning the car around and pushing the S.U.V. up onto the sidewalk. No one was hurt.

Last fall, I asked Key to visit the scene of the accident with me. Key brought along a four-page memo he drafted for our interviews, listing facts about the accident.

He was particularly concerned that I understand that Autopilot and F.S.D. were saving lives:

"The data shows that their accident rate while on Beta is far less than other cars," one bullet point read, in 11-point Calibri. "Slowing down the F.S.D. Beta will result in more accidents and loss of life based on hard statistical data."

Key drew an analogy to the coronavirus vaccines, which prevented hundreds of thousands of deaths but also caused rare adverse reactions.

"As a society," he concluded, "we choose the path to save the most lives."

Three weeks before Key hit the police S.U.V., Musk wrote an email to Jim Riley, whose son Barrett died after his Tesla crashed while speeding. 

Musk sent Riley his condolences, and the grieving father wrote back to ask whether Tesla's software could be updated to allow an owner to set a maximum speed for the car.

Musk, while sympathetic, replied: "If there are a large number of settings, it will be too complex for most people to use. I want to make sure that we get this right. Most good for most number of people."

Musk is a man who simply embraces astonishing amounts of present-day risk in the rational assumption of future gains.

The Publishing continues into the future. The World Students Society thanks author Christopher Cox.

