
The chip ‘war’: Microsoft may launch its own AI processor next month – Times of India


As AI takes centre stage, tech giants are looking to reduce their dependence on chipmakers and maximise their return on investment. Recently, a report said Microsoft-backed OpenAI, the company behind ChatGPT, is looking to foray into AI chipmaking, and now a separate report claims that the Windows maker plans to unveil its first chip designed to process AI models at its annual developers' conference.
Citing a person with direct knowledge of the matter, The Information reported that the launch, “a culmination of years of work, could help Microsoft lessen its reliance on Nvidia-designed AI chips, which have been in short supply as demand for them has boomed.”
The chip is reportedly codenamed "Athena" and could debut at Microsoft's annual Ignite conference, which starts November 14 in Seattle.
The Microsoft chip is said to be similar to Nvidia GPUs and is designed for data centre servers that train and run large language models, the technology behind conversational AI features such as OpenAI's ChatGPT.
Microsoft currently uses Nvidia GPUs to power LLMs for cloud customers, including OpenAI and Intuit, as well as for AI features in Microsoft’s productivity apps. The chip could allow Microsoft to reduce its reliance on Nvidia’s H100 GPU, which is said to be facing supply constraints due to surging demand.
OpenAI may make its own AI chips
The development comes at a time when a report suggested that OpenAI is also considering making its own AI chips. The company has been discussing AI chip strategies since at least last year, news agency Reuters reported. OpenAI CEO Sam Altman has also made acquiring more AI chips a top priority for the company, the report added.
Meanwhile, Google has its Tensor Processing Unit, or TPU, to train large generative AI models such as PaLM 2 and Imagen. Amazon has proprietary chips for both training (Trainium) and inference (Inferentia).


