Intel And Facebook Collaborated To Introduce AI "Inference" Nervana Processors

As the demand for faster computing continues to rise, so does the need for better AI processors.

Here, Intel is continuing its quest to build AI-focused processors. With the help of the social giant Facebook, the semiconductor giant has unveiled a new version of its Nervana neural network chip, one that focuses on AI inference.

This chip is designed and tuned to apply existing knowledge to new data, rather than to train models with the usual deep learning algorithms. To do this, the chip runs pre-trained machine learning models, which should allow devices to use less hardware and less energy for AI tasks.
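The training/inference split described above can be sketched in a few lines of Python. This is an illustrative toy model only, not Intel's or Facebook's actual software; the function names and the nearest-centroid classifier are assumptions made for the example:

```python
# Toy sketch of training vs. inference (hypothetical example, not Intel's stack).

def train(samples):
    """Training: the expensive phase. Iterate over labeled data to fit
    model parameters -- here, a toy nearest-centroid classifier."""
    sums, counts = {}, {}
    for features, label in samples:
        if label not in sums:
            sums[label] = [0.0] * len(features)
            counts[label] = 0
        sums[label] = [s + f for s, f in zip(sums[label], features)]
        counts[label] += 1
    # The finished, "frozen" model: one centroid per label.
    return {label: [s / counts[label] for s in vec]
            for label, vec in sums.items()}

def infer(model, features):
    """Inference: the cheap phase. Apply the frozen model to new data --
    this repeated, lightweight workload is what an inference chip targets."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: sq_dist(model[label], features))

# Train once (offline, costly)...
model = train([([0.0, 0.0], "cat"), ([0.2, 0.1], "cat"),
               ([1.0, 1.0], "dog"), ([0.9, 1.1], "dog")])

# ...then run inference many times (online, cheap).
print(infer(model, [0.1, 0.0]))  # -> cat
```

The point of the sketch is the asymmetry: `train` touches the whole dataset, while `infer` only evaluates a fixed model against one input, which is why inference can be served by smaller, more power-efficient hardware.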

In turn, the chip should make it cheaper and easier for companies to use artificial intelligence.

Intel revealed this AI chip at the 2019 Consumer Electronics Show in Las Vegas.

With AI making computers smarter, the demand for such technologies is indeed rising.

As AI software and hardware become necessities for companies aiming for an edge in the development and deployment of AI, Intel wants a major role in the market as their supplier.

Here, the upgraded Nervana chips leverage "inference" AI, which should help companies (including Facebook itself) deploy machine learning technologies more efficiently and effectively.

For example, Facebook uses AI for many tasks: tagging people in images, targeting ads, translating posts from one language to another, policing the News Feed to block prohibited content, and so forth. With so many AI systems in use, the social giant requires a lot of time and energy to maintain them.

Using these inference-focused Nervana chips, Facebook can deploy those AI systems more cheaply, since they require fewer resources than generic hardware.

This advancement should help many more companies benefit from AI.

Intel Nervana NNP processor chip

This is also a step forward for a company that has been lagging behind Nvidia, the competitor that is, at the moment, the leader in AI hardware.

Nvidia specializes in creating dedicated chips that provide massively parallel computation. Because deep learning runs less efficiently on general-purpose computer chips, Nvidia, whose GPUs excel at exactly this kind of parallel workload, quickly secured its spot in the high-end AI hardware business.

Intel kickstarted its AI chip development when it acquired a startup called Nervana Systems in 2016. A year later, the company announced its first AI chip, the Intel Nervana Neural Network Processor (NNP). The upgraded Nervana chip announced at CES is called the NNP-I (the "I" stands for "inference").

With this chip, Intel faces fierce competition from Nvidia, as well as from other chip makers, including Google and Amazon, which are also developing chips to power their cloud AI services.

Intel may be late, but the company's knowledge and experience in creating integrated circuits should be a key factor in design innovations and performance improvements.

As for Facebook, the company is also developing its own AI software packages. It is also designing some silicon chips of its own for internal use.

The collaboration shows how companies are working together to make AI-focused software and hardware increasingly compatible, taking the world another step forward in the evolution of AI.

With the vast amount of data people generate on their devices, there is plenty of valuable raw material for these AIs to learn from and explore.

With time, the technology should advance further while also becoming more eco-friendly, as the components leave a smaller carbon footprint when operating.

Published: 
08/01/2019