NFI Decrypted Tesla Storage Code
As talked about in class, information online is not as secure as it seems. We discussed whether it's ethical for self-driving cars to exist, how they should make decisions, and WHO should make the decisions about the decisions the car makes. Lately, there have been many accidents involving Tesla's self-driving/Autopilot feature, notably with emergency vehicles. The Dutch government's forensics lab decrypted the code to Tesla's driving data-storage system. The Netherlands Forensic Institute (NFI) said there was much more data about accidents than Tesla had provided. Tesla collects data such as speed, accelerator pedal position, steering wheel angle, and brake usage, and there is constant secrecy surrounding accidents.
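To picture the kinds of data mentioned above, here is a minimal sketch in Python of what one logged driving record might look like. The field names are purely illustrative assumptions, not Tesla's actual schema, which is proprietary and was only accessible after the NFI decrypted it:

```python
from dataclasses import dataclass

@dataclass
class DrivingLogEntry:
    """One hypothetical driving-data record.

    Field names are illustrative guesses based on the data types
    reported (speed, pedal position, steering angle, brake usage),
    NOT Tesla's real storage format.
    """
    timestamp: float            # seconds since epoch
    speed_kmh: float            # vehicle speed
    accel_pedal_pct: float      # accelerator pedal position, 0-100
    steering_angle_deg: float   # steering wheel angle
    brake_applied: bool         # whether the brake was in use

# Example record: cruising with no brake applied
entry = DrivingLogEntry(
    timestamp=1_700_000_000.0,
    speed_kmh=92.5,
    accel_pedal_pct=14.0,
    steering_angle_deg=-3.2,
    brake_applied=False,
)
print(entry.speed_kmh)
```

A log like this, sampled many times per second, is exactly the kind of evidence investigators would want for reconstructing an accident.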
Tesla only handed over some of the data, and only for a small time frame. They technically complied with investigators, but they provided only what was specifically requested, and the investigators could only ask for what they knew existed.
In one instance that was evaluated, a Tesla rear-ended a car that suddenly stopped on the highway. The data showed that the driver had a normal reaction time, but Autopilot wasn't keeping a safe following distance from the car ahead.
Personally, I think that if Tesla wasn't asked for the information, then they shouldn't be required to hand it over. But I still think Tesla should be held accountable for these accidents.