
How Tesla can sell ‘full self-driving’ software that doesn’t really drive itself

The inside of a Tesla vehicle is viewed as it sits parked in a new Tesla showroom and service center in Red Hook, Brooklyn on July 5, 2016 in New York City. (Spencer Platt/Getty Images)

(CNN) — Tesla CEO Elon Musk has said the company will roll out the latest beta version of its “full self-driving” software to 1,000 owners this weekend.

Yet there aren’t actually any self-driving cars for sale today, according to autonomous vehicle experts and the National Highway Traffic Safety Administration, which regulates cars. Tesla’s “full self-driving” is more like an enhanced cruise control, they say.

Videos posted on the internet by people who already have the feature unlocked show that it might stop for traffic lights and turn smoothly at intersections, but it also might veer toward pedestrians or mistake the moon for a traffic signal.

Tesla says that a human driver needs to be watching and ready to take over at any moment, and the company is only allowing initial access to the system to the people it considers the safest drivers.

Despite those limits, Tesla is free to call its technology “full self-driving.” Tesla owners who download the “full self-driving” beta must check a box confirming that they understand they are responsible for remaining alert with their hands on the wheel and must be prepared to take action at any time. The disclaimer also states that “full self-driving” does not make the car autonomous.

A person buying a Tesla on the company’s website sees the technology described in big, bold letters as “full self-driving,” but the fine print below says it is a driver-assist technology. Driver-assist technologies are intended to help a human drive more safely, with features such as forward collision warning, blind spot warning and lane departure warning systems.

“The problem is that Tesla has one foot on both sides,” said Bryan Reimer, a scientist at the MIT AGE Lab whose research has looked at driver attention with Tesla’s features.

The U.S. government has no performance standards for automated driver-assist technologies, Reimer said. Tesla, or any automaker, can essentially do whatever it wants when it comes to these technologies.

“We’re at the mercy of the auto manufacturers to put in safe systems. We are reliant on drivers wanting to be risk averse,” Cathy Chase, president of Advocates for Highway and Auto Safety, told CNN Business. “The combination of those two is a perfect storm for future disasters.”

Tesla did not respond to a request for comment and generally does not engage with the professional news media.

There are signs that the NHTSA is moving toward regulating driver-assist technologies, but how fast that might happen remains to be seen. It can take years to complete a rulemaking process, and sometimes an administration may halt a process a previous administration started. NHTSA’s agenda earlier this year called for proposing a rule that would set performance standards for automatic emergency braking as well as specifying a test to determine if automakers comply.

NHTSA launched an investigation this summer into crashes in which Teslas using Autopilot rear-ended emergency vehicles. Chase said she’s concerned that the Tesla technology may play a role in other types of crashes we aren’t yet aware of.

Autonomous driving experts have long cautioned that Tesla’s description of “full self-driving,” and its more rudimentary predecessor, Autopilot, may lead to drivers putting too much trust in the technology. Tesla drivers have already died in high-profile crashes using Autopilot, drawing rebukes from the National Transportation Safety Board.

U.S. Senators Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, both Democrats, have called on the Federal Trade Commission to investigate Tesla and take enforcement action, saying the company’s marketing overstates its vehicles’ abilities.

“When drivers’ expectations exceed their vehicle’s capabilities, serious and fatal accidents can and do result,” they wrote.

The Federal Trade Commission, which is tasked with protecting consumers from deceptive or unfair business practices, declined to tell CNN Business why it has not taken any action against Tesla. The agency does not comment on whether it has open investigations, a spokesperson said.

Picking the wrong technology

NHTSA and Congress, which can push the agency to regulate specific things, missed an opportunity to focus on driver-assist technologies in recent years, according to autonomous driving and law experts.

Rather than regulate driver-assist, they focused on self-driving vehicles, the kind of cars that wouldn’t need steering wheels or pedals. The U.S. House of Representatives passed a self-driving car bill in 2017. NHTSA has released several versions of guidance for fully autonomous vehicles.

“It’s jumping ahead many, many steps while we’re not addressing what could be saving lives right now,” Chase said.

The decision to focus on the revolutionary technology followed years of hype from self-driving companies. Some of the biggest names in the industry said the technology was only years away. But self-driving software has proven incredibly difficult to develop, and companies have failed to hit their predictions. Only Waymo offers a self-driving robotaxi service, and it operates in a limited portion of the greater Phoenix area. On rainy days, Waymo’s vehicles require a human behind the wheel.

Bryant Walker Smith, a professor at the University of South Carolina law school, said that European regulators are far ahead of the United States on regulating driver-assist technologies. Smith said that technologies like pedestrian detection and automatic braking are sensible to mandate on new cars, and there should be performance standards for them.

“There’s low-hanging fruit,” Smith said. “We could save lives far earlier than we have self-driving cars.”