How Artificial Intelligence Changes The Design of Chips
Intel, Nvidia, AMD and Qualcomm are linked in many ways. Beyond their long-running rivalry, a new battleground has emerged: artificial intelligence. This trend, apart from having become a buzzword in recent months, is influencing hardware at its core. Processor manufacturers have entered a race to position themselves advantageously in the field, and they are changing the design of their chips to meet the demands of AI.
A chip design adapted to artificial intelligence matters because such chips can deploy and train specialized algorithms more effectively. Today AI accounts for barely 0.1% of workloads in data centers, but a large share of servers is expected to be dedicated to artificial intelligence tasks in the future.
This is the scenario for the cloud, which will be the great provider of AI services, such as personal assistants. But the technology will also run on mobile devices that are not tethered to the network, such as autonomous cars. Hence the big names among processor manufacturers are keen to place themselves at the head of the movement.
Intel works hand in hand with Nervana Systems, a company it acquired last year, to adapt its Xeon and Xeon Phi processors to AI. The company has also launched its new Lake Crest chip, which is built to handle deep learning tasks. This branch of AI, which bases learning on a layered structure that simulates a neural network, is one of the approaches expected to shape the technology's path forward.
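To make the "layered structure" concrete, here is a minimal sketch of how a deep learning model stacks layers, each one a matrix multiply followed by a nonlinearity. This is illustrative only, not Intel's or Nervana's code; the layer sizes and the use of NumPy are assumptions for the example.

```python
# Minimal sketch of a layered neural network's forward pass.
# Each layer: dense matrix multiply + nonlinearity (ReLU here).
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity: keeps positive values, zeroes out negatives.
    return np.maximum(0.0, x)

# Hypothetical stack: 4 inputs -> 8 hidden -> 8 hidden -> 2 outputs.
layer_sizes = [4, 8, 8, 2]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    """Pass an input vector through each layer in turn."""
    for w in weights[:-1]:
        x = relu(x @ w)       # hidden layers
    return x @ weights[-1]    # final layer: raw output scores

x = rng.standard_normal(4)    # one example input
scores = forward(x, weights)
print(scores.shape)           # one score per output
```

The dense matrix multiplies in the loop are exactly the operations that chips like Lake Crest, and GPU tensor hardware generally, are designed to accelerate.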
The other company making the greatest effort in AI is Nvidia. The graphics processor firm still serves the video game market, but it has broadened its view: data centers and virtual reality are now among its priorities as well. And the trend for the future is clear: artificial intelligence, again.
Nvidia has some advantages in this area. First, it has been working in this field for years without making much noise. Moreover, one of the determining factors for success in applying an AI algorithm today is the graphics component. Image recognition is widely used in research, in medicine, for autonomous cars and on online platforms, and the company's graphics power has driven part of this work.
The company is working to make its chips better at the mathematical calculations and other operations suited to AI. AMD is at it too, although its effort has more to do with acceleration software for its processors. For its part, Qualcomm has lately been beefing up its Snapdragon chips with accelerators for machine learning.
And behind all these companies come startups with fresh ideas. One example is Graphcore, a Bristol-based firm that develops deep learning chips from scratch.