Microsoft has been instructing engineers working on its AI infrastructure to get ready to host Elon Musk’s Grok AI model, according to a trusted source familiar with the plans. In recent weeks, Microsoft has been in discussions with xAI about hosting Grok and making it available to customers and Microsoft’s own product teams through the Azure cloud service. The move could prove controversial internally and further inflame tensions with Microsoft’s partner OpenAI.
I’m told that if the deal proceeds, Grok will be available on Azure AI Foundry, Microsoft’s AI development platform that gives developers access to AI services, tools, and pre-built models for building AI applications and agents. That would let developers tap into Grok within their own apps, and let Microsoft potentially use the model across its own apps and services. Microsoft declined to comment for this story.
Microsoft has been steadily growing its Azure AI Foundry business over the past year, and has been quick to embrace models from a variety of AI labs that compete with Microsoft’s partner OpenAI. DeepSeek, the Chinese startup that shook up the world of AI earlier this year, forced Microsoft to move quickly to embrace its supercheap R1 model. The DeepSeek deployment on Azure AI Foundry was unusually fast for Microsoft, as I reported previously in Notepad, with Microsoft CEO Satya Nadella moving with haste to get engineers to test and deploy R1 in a matter of days.
I understand Nadella has been pushing for Microsoft to host Grok, as he’s eager for Microsoft to be seen as the hosting provider for any popular or emerging AI models. Microsoft’s Azure AI teams are constantly having to onboard new models or procure hardware that unlocks even more AI capabilities, as part of the company’s bid to build an AI platform and turn AI agents into a digital workforce.
“All of the systems that we’ve built for 50 years need to apply to AI agents,” said Asha Sharma, corporate vice president of Microsoft’s AI platform, in an interview with The Verge last month. “For Azure AI Foundry we’re thinking about how we evolve to become the operating system on the backend of every single agent.”
Making Grok available to developers through Azure is part of Microsoft’s goal to become that important infrastructure and platform behind AI models and AI agents, but it doesn’t mean AI labs are turning to Microsoft for their AI model training needs. xAI chief Elon Musk reportedly canceled a potential $10 billion server deal with Oracle last year, and posted on X at the time that xAI would be moving to train its future models “internally” instead of relying on Oracle servers.
It’s not clear if Microsoft will secure an exclusive deal to host the Grok AI model, or whether competitors like Amazon will also be able to host it. I understand that Microsoft is looking at providing only the capacity to host the Grok model, not the servers for training future models.
Microsoft’s move to host Musk’s Grok AI model could create some tension internally at the company, particularly given his involvement in the controversial Department of Government Efficiency (DOGE) project. Musk has said he will step back from his work at DOGE at some point this month, and an announcement of Grok on Azure may well come at Microsoft’s Build developer conference on May 19th.
Beyond DOGE concerns, hosting Grok could also further inflame tensions in Microsoft’s partnership with OpenAI. The ChatGPT maker countersued Musk earlier this month over claims that the Tesla boss is using “bad-faith tactics to slow down OpenAI.” Elon Musk and OpenAI have been in a rather public spat, stemming from Musk’s messy breakup with the AI lab he co-founded.
At the same time, there have been multiple reports of tensions between Microsoft and OpenAI over capacity requirements and access to AI models. Just this week, The Wall Street Journal reported that Nadella and OpenAI CEO Sam Altman are “drifting apart,” and that Nadella’s hiring of Mustafa Suleyman last year was an “insurance policy against Altman and OpenAI.”
Suleyman and his Microsoft AI team have reportedly been working on building AI models that can compete directly with OpenAI, but without much success. That’s led to Microsoft continuing to rely on OpenAI for most of its AI features in Office and Copilot. I understand Microsoft had also been anticipating OpenAI’s GPT-5 model this month, but OpenAI’s schedule has been all over the place in recent weeks with delays to new model announcements and capacity issues after the success of its upgraded image generation. It’s now unlikely that GPT-5 will appear this month, I’m told.
Hosting Grok on Azure is another clear sign that Microsoft is willing to look elsewhere for AI models. Microsoft-owned GitHub Copilot already supports models from Anthropic and Google, in addition to OpenAI. So, it’s not inconceivable that one day the main Copilot will also let you pick from a variety of competing AI models, especially if it helps further Microsoft’s ambition to become the number one destination for AI developers and users.
I’m always keen to hear from readers, so please drop a comment here. If you’ve heard about any of Microsoft’s secret projects, or want to discuss anything else, you can reach me via email at notepad@theverge.com or speak to me confidentially on the Signal messaging app, where I’m tomwarren.01. I’m also tomwarren on Telegram, if you’d prefer to chat there.
Thanks for subscribing to Notepad.