David Patterson: former UC Berkeley professor, now Distinguished Engineer at Google Brain

Google has developed a dedicated chip, the TPU, for artificial intelligence and machine learning, an event that shocked Silicon Valley. Behind the R&D team stands a 69-year-old man. CNBC has lifted the veil on the core figure behind the TPU's development; what follows is a compiled version of the article:
- After 40 years working in the academic field of computer architecture, David Patterson retired from the University of California, Berkeley in the summer of 2016.
- He has now become the "brain behind" the TPU, Google's dedicated integrated chip for artificial intelligence (AI) and machine learning.
- Without this key chip, Google executives estimated, the company would have to double its data centers just to support its current, limited voice-processing capabilities.
In May 2015, the University of California, Berkeley held a retirement ceremony for Professor Patterson, who had worked in computer architecture for 40 years.

At the celebration, the organizers played a 16-minute video recounting Patterson's life: his years as a wrestler in high school and college, his math degree from the University of California, Los Angeles, a brief stint at Hughes Aircraft, and finally 40 years at Berkeley.
Patterson co-authored two books with Stanford University professor John Hennessy and also served as president of the Association for Computing Machinery. Summing up his success, Patterson said he focuses on only one big thing at a time. And his "next big thing" can be extremely big.
After retiring, Patterson did not rush off to enjoy his later years. Instead, in July 2016 he joined Google with the ambitious goal of helping design a new chip, one that runs at least 10 times faster than today's processors, fast enough to handle the intensive computation that artificial intelligence requires. The chip is called the TPU, and Patterson has become one of its main "evangelists."
A year after his retirement ceremony, Patterson gave a lecture to more than 100 students and faculty at UC Berkeley. Wearing a T-shirt printed with the Google Brain logo, he said: "Four years ago, they (Google) had this concern, and they treated it as a top priority."
Google worried that if every Android user used Google's machine-learning-based dialogue translation for just three minutes a day, the company would have to double its number of data centers.

Google's parent, Alphabet, spends as much as $10 billion a year in capital expenditure, a large part of which goes to maintaining data center operations. Today, Google is betting on a so-called "machine learning revival," and deep learning networks demand a huge breakthrough in hardware efficiency.
As data grows more complex, computers learn through modeling and become smarter over time.
Not long ago, Patterson gave the same presentation at Stanford University, to an audience that included authors of the TPU performance report released in April.

The report found that the TPU runs 15 to 30 times faster, and is 30 to 80 times more power-efficient, than contemporary Intel and NVIDIA processors. Co-authored by 75 engineers, the report will be presented at the International Symposium on Computer Architecture in Toronto in June.
The report also marked Patterson's debut at Google. Each week, he goes to Google's Mountain View headquarters once and to the San Francisco office twice. He reports to Jeff Dean, an 18-year Google veteran who heads the Google Brain team.

This is not, however, Patterson's first time working with Google. He worked there during an academic sabbatical from 2013 to 2014. This time, he has joined the TPU project as a Distinguished Engineer (roughly director level).

Just two months after Patterson's retirement, the TPU chip was deployed in Google's data centers.
Retired, but not retiring
Patterson is not a man who can sit idle.

Even during his 2013 stint at Google he kept teaching. He has also competed in weightlifting, breaking the California record for his age group.
"Looking back, there is no evidence that he (Patterson) has actually retired, even though we held a retirement celebration for him," said Mark Hill, who earned his Ph.D. under Patterson in 1987.

Hill is now chair of the Computer Sciences Department at the University of Wisconsin. He calls Patterson one of the greatest academic computer architects of the second half of the 20th century, and describes the computer architecture textbook Patterson co-authored with Hennessy as the field's most influential in 25 years.
Google says TPU chips have been extensively tested inside the company. They are used for every search query, as well as to improve Maps and navigation services.

The technology also helped DeepMind's AI program AlphaGo defeat world champion Lee Sedol last year. But do not expect Google to compete with chip giants such as Intel or NVIDIA for the semiconductor market.
Outside developers believe Google's processor may be used for the company's cloud computing services and for products such as TensorFlow, its open-source software library for machine learning workloads.

Chris Nicholson, co-founder and CEO of deep learning startup Skymind, said: "Google is better at developing tools for itself, not for others. It has never been a successful hardware company, and I haven't seen it pursue that strategy." At least one startup hopes to fill the gap Google leaves.
Several engineers from the project have formed a secretive startup called Groq, in partnership with investor Chamath Palihapitiya.
For the TPU, it is still early days. So far, the processor has proven highly efficient at so-called "inference," the second stage of deep learning.

The first stage is training, for which Google still relies on existing processors. But Google has always been known for its cutting-edge projects. Speaking at UC Berkeley, Patterson was asked about his team's next project, and he declined to disclose anything.
Patterson said one thing he learned at Google is that, unlike in academia, he is barred from discussing future products openly.

He said: "What I can say is that Google has never slowed its pace. It seems to be a very good experience. They never give up, and no one gets fired."