New processors will handle AI workloads in data center environments
Intel has unveiled two new processors designed for large data centers, the chipmaker's first built specifically for artificial intelligence (AI) workloads.
The two chips are the company's first offerings from its Nervana Neural Network Processor (NNP) line: one is built for training AI systems, while the other handles inference.
The Nervana NNP-T, codenamed Spring Crest, is the training chip and comes with 24 Tensor Processing Clusters designed specifically to power neural networks. The new system on a chip (SoC) gives users everything they need to train an AI system on dedicated hardware.
The Nervana NNP-I, codenamed Spring Hill, is the company's inference SoC that uses its 10 nanometer process technology along with Ice Lake cores to help users deploy trained AI systems.
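The split between the two chips mirrors the two phases of a machine-learning workflow: training, where a model's weights are repeatedly updated against example data, and inference, where an already-trained model is simply run on new inputs. The sketch below illustrates that split using generic PyTorch code; it is an assumed minimal example of the two kinds of workload the NNP-T and NNP-I target, not Intel's Nervana software stack.

```python
import torch
import torch.nn as nn

# A small model standing in for a real workload.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# --- Training (the NNP-T's job): forward pass, loss, backward pass, weight update ---
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(32, 128)             # a batch of dummy training data
labels = torch.randint(0, 10, (32,))
optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()                           # gradient computation -- the costly step training hardware targets
optimizer.step()

# --- Inference (the NNP-I's job): forward pass only, no gradients tracked ---
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 128)).argmax(dim=1)
print(prediction)
```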
AI workloads
Intel's new AI-focused SoCs are designed to handle AI workloads in data center environments, so users no longer have to rely on its Xeon CPUs for AI and machine learning tasks. Xeon chips can handle such workloads, but far less efficiently than dedicated AI silicon.
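In application code, that division of labour typically shows up as a device-selection step: the framework runs on a dedicated accelerator when one is present and falls back to general-purpose CPU cores otherwise. The PyTorch-style sketch below shows that general pattern; the device name is purely illustrative and does not refer to any specific Nervana driver or backend.

```python
import torch

# Pick a dedicated accelerator if one is available, otherwise fall back to the CPU.
# "cuda" is used here only as a stand-in for whatever accelerator backend is installed.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(128, 10).to(device)   # move the model's weights onto the target device
batch = torch.randn(32, 128).to(device)       # move the input data alongside it

with torch.no_grad():
    output = model(batch)                     # the same code path runs on either device
```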
The Nervana NNP-T and NNP-I were designed to compete with Google's Tensor Processing Units, Nvidia's NVDLA-based hardware and Amazon's AWS Inferentia chips.
Naveen Rao, vice president and general manager of Intel's Artificial Intelligence Products Group, explained how the company's new processors will help bring about a future where AI is everywhere:
“To get to a future state of ‘AI everywhere,’ we’ll need to address the crush of data being generated and ensure enterprises are empowered to make efficient use of their data, processing it where it’s collected when it makes sense and making smarter use of their upstream resources. Data centers and the cloud need to have access to performant and scalable general purpose computing and specialised acceleration for complex AI applications. In this future vision of AI everywhere, a holistic approach is needed—from hardware to software to applications.”
Via The Inquirer