Microsoft Azure's AI inference accelerator, Maia 200, aims to outperform Google's TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute.
Higher-than-anticipated capital spending and slower growth in cloud computing are weighing on shares.
Microsoft says the new chip is competitive with the in-house silicon from Google and Amazon, but stops short of comparing it to ...
Microsoft has unveiled its latest AI chip, Maia 200, which the company says delivers up to three times the performance of ...
Through its Go Digital framework, CLA empowers businesses to adopt technologies like Microsoft Fabric for unified analytics and AI-driven decision-making. These solutions enable clients to integrate ...
Microsoft is not only the world's biggest consumer of OpenAI models, but also still the largest partner providing compute, networking, and storage to ...
New report claims Microsoft grabbed a huge chunk of SK hynix's memory supply; the stock surged almost 9%.
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Earlier in the week, Raymond James upgraded Google owner Alphabet (NASDAQ:GOOGL) to Strong Buy, arguing the company is moving ...
Good afternoon, and thank you for joining us today. On the call with me are Satya Nadella, Chairman and Chief Executive Officer; Amy Hood, Chief Financial Officer; Alice Jolla, Chief Accounting ...
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to ...