By: Gu 20
Source: New Entropy (ID: Baoliao Hui)
In Robin Li's view, Chinese users are willing to trade privacy and security for convenience. But in the smart car field today, even with owners' data laid completely bare, the freedom of autonomous driving remains out of reach.
The bold autonomous-driving timelines the auto industry predicted a few years ago keep slipping. Before 2019, the consensus explanation for why autonomous driving was hard to deploy was that 4G networks could not meet its low-latency requirements.
With the buildout and deployment of 5G infrastructure in 2020, networks capable of the low latency autonomous driving demands became feasible. 2020, hailed as the first year of 5G, was therefore also hailed as the first year of autonomous driving.
To show his confidence, at Tesla's Autonomy Day in April 2019, Musk doubled down on his commitment: by the end of 2020, Tesla would upgrade Autopilot to full self-driving capability, deploy it on the Model S and Model 3, and put more than one million robotaxis on the road.
Speaking at the opening ceremony of the World Artificial Intelligence Conference in Shanghai on July 9, 2020, Musk said Tesla was very close to L5 autonomous driving and was confident of completing the basic functionality for L5 within 2020.
In Musk's view, there were no fundamental obstacles to achieving L5 autonomy, only many detail problems left to solve.
Today, Tesla's autonomous-driving timeline is being dismissed by the media as a "Musk bubble": the L5 autonomy that failed to materialize in 2020 looks almost impossible to achieve even in 2021.
The "small detail problems" Musk once shrugged off are now becoming big problems that have dragged smart cars into public controversy.
Owners' data laid bare, everywhere
According to Intel's forecast, a connected self-driving car running 8 hours a day would generate 4 TB of data in 2020. For comparison, an average Internet user generates about 1.5 GB of data per day. By this calculation, the daily data output of one connected car is roughly equivalent to the network data generated by 3,000 people.
Such a huge volume of data comes mainly from the hundreds of on-board sensors needed for autonomous driving. According to Intel's research, a camera can generate 20-40 MB of data per second, and a lidar nearly 10-70 MB per second.
Yet despite this staggering data volume, new-energy carmakers have failed to win owners' trust in how the data is collected and used.
Take the recent "Tesla surveillance gate" controversy, which prompted Musk to respond personally on Twitter. It began when Musk tweeted a reminder to trial users of the Autopilot beta, saying that, according to the in-car cameras, some owners were not paying enough attention to the road, and that trial access would be revoked for those users.
A netizen then asked whether the in-car camera could monitor the owner, and Musk answered yes. The question of whether Tesla was monitoring owners and infringing their privacy immediately sparked heated debate online.
In response, Tesla issued an official statement on March 19, saying that its vehicles do not infringe owners' privacy through the in-car camera, and stressing that the in-car cameras on all Tesla vehicles in the Chinese market are not activated.
The official answer did not reassure owners: in a recent Weibo poll by Sina Technology, 70% of respondents said they did not believe it. A likely reason is that back in 2017, Musk said the in-car camera was prepared for a future Autopilot car-sharing service, letting owners check through the camera whether anyone had deliberately dirtied their car.
Autonomous driving was later postponed indefinitely and the sharing plan never materialized, yet the in-car camera is now used to monitor the driver. Owners are left feeling that whether they are watched is entirely up to Tesla, with no way to know or control it themselves.
This erodes trust in smart cars further, because unlike conventional surveillance equipment, whose installation must be disclosed under regulations, Tesla and other smart cars may not need to tell you anything: they can switch on the in-car camera through a software update.
According to IT Times, some models from new-force carmakers such as Weilai (NIO) and Xiaopeng (Xpeng) carry cameras to monitor driver fatigue. Asked whether this infringes privacy, the manufacturers say the cameras only track drivers' eyes and issue reminders when fatigue is detected.
Beyond cameras that may leak privacy, the privacy risks of humble second-hand parts have also raised concerns.
US consumer media outlet CNBC pointed out earlier that if you are unlucky enough to wreck your Tesla, even when it is towed to the scrapyard you should not forget that it may carry a great deal of your history: the data stored in a Tesla far exceeds what you might imagine.
Moreover, in March 2019, US consumer news reported that salvaged Teslas still held stored data. Although users can wipe sensitive data with the factory-reset option, once old parts are removed from the car, users can no longer erase them.
The right to know about one's data is unclear, as owners cannot tell what state the car is in; and the data on second-hand parts is never fully erased, so owners' private information travels with those parts as they change hands.
According to statistics in the proposal "On Strengthening Data Security Supervision of Intelligent Vehicles," submitted at this year's Two Sessions by Tan Jianfeng, president of the Shanghai Information Security Industry Association, Tesla can collect more than 200 kinds of information, covering owners' personal information, vehicle environment information, vehicle driving information, and owners' mobile-phone information; more than 170 domestic manufacturers do likewise.
Vast amounts of information are collected in the name of autonomous driving, and yet autonomous driving itself remains out of reach.
Driving freedom remains elusive
The stability of autonomous driving, like the privacy and safety of car owners, is another unknown data black box.
Take Tesla, which has long publicly claimed to be self-driving. On March 6, Musk announced that the beta of Tesla's Full Self-Driving system had been released. Yet shortly after the beta launched, a 13-minute public video from a review blogger showed a Tesla in autonomous mode making frequent errors on a street in California.
On March 23, a Tesla Model 3 owner in Taiwan, China had an accident while trying out the autonomous driving function: after running a red light, the car plowed into several motorcycles crossing its path and eventually overturned. A week earlier, another Tesla Model 3 had crashed into a stationary police car in Michigan, triggering an investigation by the National Highway Traffic Safety Administration.
On March 18, the National Highway Traffic Safety Administration officially announced that it was investigating 27 safety incidents involving Tesla's autonomous driving system, at least three of which had occurred in recent weeks.
In fact, safety accidents related to autonomous driving have been occurring since 2016.
On January 20, 2016, a rear-end collision occurred on the Handan section of the Beijing-Hong Kong-Macao Expressway in Hebei Province: a Tesla slammed directly into a road sweeper at work, wrecking the Tesla on the spot and killing the driver, Gao Yaning. More than a year later, on February 27, 2018, Tesla confirmed that the autonomous driving function was engaged at the time of the accident.
On May 7, 2016, a driver in Florida was killed when the sensor and camera system failed to detect a large truck crossing the road and did not brake, raising widespread doubts about Tesla's autonomous driving technology.
On March 24, 2017, an Uber self-driving car collided with an SUV during a road test. By August 26, 2017, Google's self-driving vehicles had been involved in more than ten traffic accidents.
Autonomous driving is hard to achieve, yet the causes of autonomous-driving accidents are almost always attributed to humans.
Complaints said that when owners tried to park their Teslas in a garage or at the roadside, the vehicles suddenly accelerated. Other owners said sudden acceleration occurred in traffic or while the driver-assistance system was in use, eventually leading to crashes. The vehicles involved included Model 3, Model S, and Model X cars produced between 2013 and 2019.
After a year of detailed evaluation, the US National Highway Traffic Safety Administration concluded that the vast majority of the accidents were caused not by technical failures of Tesla vehicles but by drivers' misoperation, and that Tesla's design did not increase the likelihood of such misoperation.
On the one hand, relying on its own data, almost all of Tesla's investigation reports point the finger at the owner; on the other, by releasing quarterly safety reports, Tesla keeps arguing that autonomous driving is getting safer and safer.
Since 2018, Tesla has benchmarked its progress in autonomous driving safety by publishing such reports, comparing miles per accident with autonomous driving engaged against miles per accident without it. However, according to foreign media reports, many people doubt the authenticity of the data.
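The metric behind those quarterly reports is simple: miles driven divided by accidents recorded, computed separately for each driving mode. A minimal sketch, using made-up placeholder figures rather than Tesla's actual numbers:

```python
def miles_per_accident(miles: float, accidents: int) -> float:
    """Tesla's headline safety metric: miles driven per accident recorded.
    Higher means safer by this measure."""
    return miles / accidents

# Hypothetical illustrative figures -- NOT Tesla's published data.
with_autopilot = miles_per_accident(miles=4_000_000_000, accidents=1_000)
without_autopilot = miles_per_accident(miles=1_000_000_000, accidents=2_000)

print(with_autopilot / without_autopilot)  # claimed safety multiple: 8.0
```

One reason critics distrust such comparisons is confounding: autonomous-mode miles are driven disproportionately on highways, where accident rates are lower for any driver, so the ratio can overstate the system's own contribution.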
When Tesla provided data to the National Highway Traffic Safety Administration (NHTSA) in 2017, the US consulting firm QCS found a problem with the data points Tesla supplied: for many vehicles, the last odometer reading before the Autopilot system was installed did not match the first odometer reading after installation.
In short, Tesla's manipulation of the statistics rendered NHTSA's figures impossible to trust. According to QCS, Tesla provided NHTSA with data on 43,481 vehicles, 69% of which lacked mileage data from before Autopilot was activated.
Beyond Tesla, which has always claimed to be self-driving, its major smart-car rivals have had incidents too: Ideal (Li Auto), for one, had an assisted-driving accident in Qingdao in 2020.
Even so, putting these systems on the road is still seen as one way to accelerate autonomous driving, because in the industry's view only road driving generates data, and massive data is what advances autonomy.
What's wrong with autonomous driving
Electric vehicles tend to emphasize battery technology to ease owners' range anxiety; autonomous vehicles love to emphasize data volume, and data volume has become one of the yardsticks for measuring a company's self-driving capability.
Data is the fuel of AI, but for car owners, even after handing over their personal data, autonomous driving has been delayed again and again. Even someone as radical as Musk has seen Tesla, amid its frequent accidents, forced to pull the "autonomous driving" marketing gimmick from its official Chinese channels.
The main advantage of the vision-based camp, to which Tesla belongs, is cheap hardware; but because training the software's neural networks takes a long time and data sets are expensive, Tesla's full set of subsequent FSD updates is estimated to reach about $200,000 in value, which is why Tesla is also called a car company that will make its money selling software.
Industry analysts believe that data feeding lets Tesla keep iterating its neural-network algorithms, but data-fed training alone is far from enough. More infrastructure, vehicle-to-vehicle information exchange, and vehicle-road coordination are the future of autonomous driving in the Internet of Vehicles.
For autonomous driving, no matter how powerful the neural-network algorithm or how thorough the data-set training, one problem is unavoidable: AI can only approach correctness asymptotically. In short, it can get infinitely close to 100% correctness, but it can never be 100% machine-correct the way a binary computer's 0s and 1s are.
Humans allow themselves to make mistakes, but for autonomous driving, there is still no consensus on whether "infinitely approaching" safety is safe enough.
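The asymptotic point can be illustrated with a toy learning-curve model. Empirically, model error often falls roughly as a power law of training-set size; the constants below are arbitrary assumptions for illustration, not measurements of any real system:

```python
def error_rate(n: float, a: float = 1.0, b: float = 0.5) -> float:
    """Toy power-law learning curve: error = a * n**(-b).
    Error shrinks as training data n grows, but never reaches exactly zero."""
    return a * n ** (-b)

for n in (1e3, 1e6, 1e9):
    print(f"{n:.0e} samples -> error {error_rate(n):.2e}")
# Even at a billion samples, the modeled error is tiny but strictly positive.
```

The design question the article raises is exactly this residual: how small must the nonzero error be before society treats the system as safe.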
However, judging by traditional carmakers' experience with ISO safety certification, the benchmark used for safety is the ten-year-old child, the age group with the highest human survival rate: as long as the car's safety exceeds that of a ten-year-old, the human survival rate is considered safe enough.
Before autonomous driving arrives, we may need to improve many of its standards: not just the SAE classification of driving automation, but data privacy and safety standards as well.