On May 20, according to foreign media reports, Microsoft announced at its annual developer conference, Build 2020, that it has built one of the world's top five supercomputers.
Microsoft says that, through its exclusive partnership with the AI non-profit OpenAI, it has built a top-five supercomputer to train very large artificial intelligence (AI) models on the Azure public cloud.
It is a single-system supercomputer with more than 285,000 CPU cores, 10,000 GPUs, and 400 Gbps of network bandwidth for each GPU server.
Microsoft says it is the latest result of its partnership with OpenAI.
Microsoft said that in the past, limited by computing power and training efficiency, machine learning experts typically trained smaller, separate AI models for individual tasks such as translating languages, recognizing objects, and reading text.
Super-large-scale AI models have been shown to perform better: they can grasp subtle nuances in language, grammar, knowledge, and concepts, and handle complex tasks such as summarizing long speeches, filtering out profanity in real-time game voice chat, finding relevant passages among thousands of legal documents, and even generating code by drawing on GitHub.
Microsoft's large-scale AI development
In February this year, Microsoft released Turing-NLG, a 17-billion-parameter natural language generation model that was the largest published AI language model in the world at the time.
Microsoft's goal is to open up large AI models, optimized training tools, and supercomputing resources through Azure AI services and the GitHub open-source community, so that developers, data scientists, and business users can build their own projects on this super-large-scale AI platform.
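As a rough illustration of the workflow Microsoft is describing, the sketch below shows how a developer might load a publicly available pretrained language model and generate text from it. The Hugging Face transformers library and the gpt2 checkpoint are stand-ins chosen for the example; neither is named in the announcement, and Azure-hosted Turing models would be accessed through Microsoft's own services.

# Illustrative sketch only: gpt2 is a small public stand-in for a
# much larger hosted model, not part of Microsoft's announcement.
from transformers import pipeline

# Download a pretrained text-generation model and wrap it in a pipeline.
generator = pipeline("text-generation", model="gpt2")

# Continue a prompt, the same pattern a developer would use against a
# larger model served by a cloud AI platform.
result = generator(
    "Large AI models can summarize long speeches by",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])

The point of the pattern, a pretrained model exposed behind a simple generation call, is what the announcement suggests Azure AI services and the GitHub community are meant to provide at far larger scale.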