Over the last few months, Tesla cars have been making headlines because their autopilot feature has not quite lived up to its billing. To be fair to Tesla, the autopilot feature is not supposed to function in the same way as a self-driving car. But the feature is named in a way that leads consumers and drivers to believe they don't have to think, pay attention or keep their hands on the wheel when it is active.

The last few months have shown that assumption to be demonstrably untrue.

There have been a number of high-profile accidents in recent months involving Tesla vehicles and the autopilot feature, the most recent occurring in China. Thankfully, no one was hurt, but the story highlights two things.

The first is that autopilot and self-driving technology is still in its infancy. These programs and functions can't yet be relied upon, and even if they were reliable, a massive societal and legal shift would need to occur to accommodate such vehicles. For instance, who is liable if a self-driving car crashes into another vehicle? What happens if the GPS, guidance systems or lane sensors fail? From a legal standpoint, there are still a lot of questions that need to be answered.

The other aspect is that, in its current state, autopilot shouldn't be relied upon by drivers. It can lead to accidents for which they may be liable in some way.

Source: Washington Post, “After yet another Tesla crash, do autopilot critics have a point?,” Brian Fung, Aug. 10, 2016