
Google is strong, but in the AI hardware market, it's not really an Nvidia rival

Via: 博客园 | 2017/5/27 17:08:38


Even a company as strong as Google will not, in the short term, be able to snatch the lucrative AI hardware market away from Nvidia.

Graphics chip maker Nvidia has seen its share price triple over the past year, as the GPU has become the mainstream platform for deep learning, currently the hottest field in AI. Nvidia's GPUs, originally designed to render computer game graphics, are now used to handle the intensive computation involved in training large deep learning networks on millions of data points.

Last week, at its annual developer conference, Google launched its second-generation Tensor Processing Unit, the Cloud TPU.

It is understood that Google hopes eventually to offer this AI hardware to other companies through Google Cloud, and it also plans to make 1,000 Cloud TPUs available to researchers conducting open AI research. The real intent behind Cloud TPU, however, appears to be making Google's cloud more competitive: in the cloud services market, the two giants Amazon and Microsoft have left Google far behind, and Google has had to look for ways to differentiate itself.

The impact of Cloud TPU on Nvidia is twofold. On the one hand, Nvidia still dominates the emerging AI hardware market, and Cloud TPU gives developers another choice. On the other hand, when Google announced Cloud TPU last week, it also made clear that it still relies heavily on Nvidia's GPUs, and that its cloud service will support Volta, Nvidia's next-generation GPU, as well.

On the whole, I think Google will face serious challenges in the AI hardware market, for two main reasons:

Limitations of Cloud TPU

First, Cloud TPU locks users into both the TensorFlow framework and the Google Cloud service.

A senior analyst at Bernstein Research said:

Nvidia's GPUs are available from every mainstream cloud vendor, including Google, Amazon, Microsoft, and IBM. Developers can choose freely according to their own preferences and switch cloud providers at any time. In addition, Nvidia has optimized its hardware to run a variety of deep learning frameworks, such as Caffe, Torch, and PaddlePaddle. The AI market is still young, and no one knows which framework will eventually win, so restricting yourself to Google's TensorFlow framework is a gamble for any company.
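To make that portability concrete, here is a minimal sketch (an editorial illustration, not from the original report) of the same matrix multiply running on an Nvidia GPU from two different frameworks; it assumes the 2017-era TensorFlow 1.x session API and PyTorch, both installed against CUDA. A Cloud TPU, by contrast, would accept only the TensorFlow version, since it runs TensorFlow programs exclusively.

import numpy as np
import tensorflow as tf   # TensorFlow 1.x-style API assumed
import torch              # PyTorch, standing in for "any other framework"

a = np.random.rand(1024, 1024).astype(np.float32)
b = np.random.rand(1024, 1024).astype(np.float32)

# TensorFlow on the GPU: pin the multiply to the first GPU device.
with tf.device("/gpu:0"):
    tf_c = tf.matmul(tf.constant(a), tf.constant(b))
with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    result_tf = sess.run(tf_c)

# PyTorch on the same GPU: move the tensors to CUDA and multiply.
result_pt = torch.mm(torch.from_numpy(a).cuda(),
                     torch.from_numpy(b).cuda()).cpu().numpy()

print(np.allclose(result_tf, result_pt, rtol=1e-3, atol=1e-3))

The point is not the arithmetic but the freedom it illustrates: the same GPU accepts whichever framework the developer prefers, and the code moves unchanged between cloud providers that rent out Nvidia hardware.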

Matt Zeiler, founder and CEO of AI startup Clarifai, said:

Although academics will be able to use Cloud TPUs for free, being limited to the TensorFlow framework greatly reduces their appeal. Alexei Efros, an associate professor at the University of California, Berkeley, said in an email:

Even for startups that have fully adopted TensorFlow, one obstacle remains: actually getting hold of Cloud TPUs. Mark Hammond, co-founder and CEO of AI startup Bonsai, said:

Google doesn't sell chips

Another big weakness for Google is that, unlike Nvidia, it does not sell chips directly to customers. Zeiler says many deep learning startups prefer to run on their own hardware, because storing large amounts of training data in the cloud can be very expensive. Clarifai, for example, chose to buy Nvidia's GeForce gaming graphics cards and train its neural networks in its own data center in New Jersey. Zeiler says:

Nvidia's real competition is more likely to come from other chip companies. Dozens of AI chip startups are emerging, and Intel spent more than $400 million just to acquire Nervana, one of the top chip startups.

Zeiler said that, over time, the TPU may eat into the GPU market, but the pressure will not come from Google alone; after all, other chip makers will not sit idly by.

Nvidia's biggest problem is that the GPU was designed to render graphics, not to run AI algorithms, says Andrew Feldman, co-founder and CEO of AI chip startup Cerebras Systems. Feldman says:

Nvidia is aware of this too. It has added specialized Tensor Cores to its latest GPU architecture, compute units optimized for the matrix math that dominates deep learning. With Tensor Cores, Nvidia's GPUs are becoming less a tool for generating graphics and more a dedicated AI processor, says Ian Buck, Nvidia's vice president of accelerated computing.
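As a rough illustration (again an editorial sketch, not Nvidia's own example): Tensor Cores accelerate half-precision matrix multiplies, so a framework reaches them simply by feeding FP16 operands on a Volta-class GPU. Whether a particular call is actually dispatched to Tensor Cores depends on the cuBLAS/cuDNN build and the matrix shapes, so treat the timing below as a demonstration, not a benchmark.

import torch

assert torch.cuda.is_available(), "needs an Nvidia GPU"

# FP16 operands are the precondition for Tensor Core execution on Volta.
a = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
b = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)

start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)

start.record()
c = a @ b                      # half-precision GEMM, Tensor Core eligible
end.record()
torch.cuda.synchronize()       # wait for the GPU before reading the timer

print("FP16 4096x4096 matmul: %.2f ms" % start.elapsed_time(end))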

As for the increasingly fierce competition in the AI field, Nvidia's position, as Buck puts it, is that "the market is growing so fast that every enterprise has its value."

Via: Forbes
