Cloud AI chips deliver relatively strong performance and can support a large number of operations running concurrently. They can also handle a variety of workloads, such as image and voice processing. Crucially, cloud AI chip technology lets intelligent devices connect quickly to cloud servers and keeps those connections as stable as possible. For example, the online voice translation we use today relies not only on AI technology but also on cloud AI chips, which ensure that devices can retrieve cloud information in time. AI gained strong momentum in 2018; although we usually see only the words "artificial intelligence," many chips are running behind the scenes to support it.
AWS originally wanted other manufacturers to design Arm chips that could run at scale in the cloud, but after years of waiting without results, it rolled up its sleeves and designed its own. AWS kicked off its annual re:Invent cloud computing event on Monday night with a host of new services, including a new cloud chip of its own design.
The Arm-based chip, called Graviton, is available to cloud customers through AWS's EC2 cloud computing service. It was designed by Annapurna Labs, a chip developer that Amazon acquired in 2015.
The main attraction is lower-cost computing. Amazon claims that running specific workloads on its chips can cost up to 45% less than on Intel or AMD chips, and AWS offers the chips on a rental basis as instances. In other words, alongside instances built on Intel and AMD chips, cloud customers now have a third processor option.
Speaking on stage that evening, Peter DeSantis, AWS vice president of global infrastructure and customer support, said the new instance type, called EC2 A1, is designed for so-called scale-out applications that run across many machines, a pattern common to many web applications. More specifically, beyond web servers, development environments, and cache server clusters, the A1 instances also suit containerized microservices and other bundled applications, letting them run across many types of computers and software.
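As an illustration only, the minimal sketch below (using the boto3 SDK, which is not part of AWS's announcement) shows how a customer might request one of the new Arm-based A1 instances through the standard EC2 API; the AMI ID is a placeholder for an arm64-compatible image.

```python
# Minimal sketch: launching a Graviton-based A1 instance via the EC2 API.
# The AMI ID below is a hypothetical placeholder; a real request needs an
# AMI built for the arm64 architecture.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder arm64 AMI
    InstanceType="a1.medium",         # smallest Graviton-based size
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])
```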
The Graviton chip marks a milestone for Arm processors. Traditionally, Arm chips have been used mainly in low-power devices such as smartphones and, more recently, in devices such as network routers. Now they are starting to enter mainstream data-center servers, and Graviton's use in the A1 compute instances means they have arrived in the cloud.
At the same time, the move signals that Intel and AMD no longer have the data center and cloud to themselves, although Intel still holds the dominant position. For years, AMD has tried to challenge Intel's dominance with little success, but its new EPYC chips have been making progress and have been well received by server buyers and by cloud companies including AWS.
Graviton is not the only addition AWS announced. The company also unveiled other new compute instances, including one called P3dn that provides graphics-processing-unit capacity for machine learning, AI, and high-performance computing workloads. It uses NVIDIA's high-end V100 Volta chips to deliver what AWS AI general manager Matt Wood calls the largest and fastest training instance in the cloud.
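For readers curious how the GPU configuration of such an instance can be inspected, the sketch below is a hedged example, again assuming boto3; it queries the EC2 DescribeInstanceTypes API for the p3dn.24xlarge instance type and prints its NVIDIA V100 GPU details. It is offered only as a sketch, not as code from AWS.

```python
# Minimal sketch: querying the GPU configuration of the p3dn.24xlarge
# instance type through the DescribeInstanceTypes API.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

info = ec2.describe_instance_types(InstanceTypes=["p3dn.24xlarge"])
gpu_info = info["InstanceTypes"][0]["GpuInfo"]

for gpu in gpu_info["Gpus"]:
    # Expected output along the lines of: NVIDIA V100 x 8
    print(gpu["Manufacturer"], gpu["Name"], "x", gpu["Count"])
```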