In their collective search for an alternative to Nvidia's GPUs for training large language models, Meta, Microsoft and OpenAI have announced that they will use AMD's GPUs. The move marks a significant shift in the AI hardware landscape, potentially ushering in a new era of competition and innovation.
Why the move is important for development of AI
The decision carries substantial weight, considering the prominent roles of Meta, Microsoft, and OpenAI in the generative AI space. Meta houses large language models, Microsoft boasts the impressive Copilot assistant, and OpenAI is the brains behind the ubiquitous ChatGPT. These AI systems, capable of generating text, code, images, and more, depend heavily on high-performance computing resources. Until now, Nvidia has reigned supreme in this arena, but AMD's Instinct series will give these tech companies an alternative.
According to a report by CNBC, the companies will use AMD's Instinct MI300X GPU, which competes directly with Nvidia's H100 GPU. AMD says the GPU is based on a new architecture and features a best-in-class 192 GB of HBM3 memory capacity as well as 5.3 TB/s of peak memory bandwidth, delivering the performance needed for increasingly demanding AI workloads.
“AMD Instinct MI300 Series accelerators are designed with our most advanced technologies, delivering leadership performance, and will be in large scale cloud and enterprise deployments,” said Victor Peng, president, AMD. “By leveraging our leadership hardware, software and open ecosystem approach, cloud providers, OEMs and ODMs are bringing to market technologies that empower enterprises to adopt and deploy AI-powered solutions,” he added.
How Meta, Microsoft, OpenAI intend to use AMD’s GPU
As per the report, Meta will use AMD's MI300X GPUs for “AI inference workloads such as processing AI stickers, image editing, and operating its assistant.” OpenAI, on the other hand, will use it for Triton 3.0, which is used for AI research. Meanwhile, Microsoft will deploy AMD Instinct MI300X accelerators to power its new Azure ND MI300x v5 Virtual Machine (VM) series optimised for AI workloads. Oracle also plans to include AMD Instinct MI300X accelerators in its upcoming generative AI service.