Hyperscalers like Google, Facebook and Amazon already have their own chips for running Artificial Intelligence (AI) workloads. Now another cloud provider is entering the market with its own AI chip: Alibaba.
At its in-house conference ‘Apsara’ in Hangzhou, China, Alibaba presented the ‘Hanguang 800,’ an AI inference chip developed by the company’s research unit, Pingtouge. The chip is currently used internally for the e-commerce company’s services, but the Chinese technology giant plans to make it available on Alibaba Cloud at a later date.
“The launch of the Hanguang 800 is an important step towards next-generation technologies. It strengthens the computing capacity that will power both our current and emerging businesses while improving energy efficiency,” said Jeff Zhang, CTO of Alibaba Group and President of Alibaba Cloud Intelligence. “In the near future, we also plan to empower our customers by giving them access, through our cloud business, to the advanced computing capabilities the chip enables.”
According to Alibaba, a single chip can handle up to 78,563 inferences per second (IPS) at peak, with an energy efficiency of 500 IPS per watt, based on ResNet-50 inference tests.
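Taken together, the two published figures imply a rough peak power draw for the chip. This is a back-of-the-envelope derivation from the numbers above, not a figure Alibaba has stated:

```python
# Rough sanity check of the published Hanguang 800 benchmark figures.
# Peak throughput divided by energy efficiency gives an implied peak power draw.
peak_ips = 78_563            # inferences per second (ResNet-50, per Alibaba)
efficiency_ips_per_watt = 500  # inferences per second per watt (per Alibaba)

implied_power_watts = peak_ips / efficiency_ips_per_watt
print(f"Implied peak power draw: {implied_power_watts:.0f} W")  # ~157 W
```

By this estimate the chip would draw on the order of 157 W at peak, which is in the same range as contemporary data-center inference accelerators.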
Alibaba’s research unit, Pingtouge, has already developed several processors for Alibaba’s internal and cloud applications. Earlier this year, the company unveiled the Xuan Tie 910, an IoT processor based on the open-source RISC-V instruction set architecture.
Alibaba also plans to license the complete IP to chip manufacturers, which would open up a new revenue stream for the company. According to Alibaba, parts of the associated code will be released on GitHub to broaden the architecture’s reach.