Enterprise ChatGPT on Azure? For now, Satya Nadella's just a tease

Is it all just a "nothing burger"?

The big news out of Redmond involves the expansion of OpenAI services on Azure to “general availability.” We're putting that in quotes, writes Tristan Greene, because, as far as we can tell, Microsoft's keeping access limited to a select few and there's no indication as to when ChatGPT, the newest viral sensation out of the OpenAI labs, will actually arrive on the service.

Microsoft announced the news on Monday with a blog post and a tweet from CEO Satya Nadella that read “ChatGPT is coming soon to the Azure OpenAI Service, which is now generally available, as we help customers apply the world’s most advanced AI models to their own business imperatives.”

ChatGPT on Azure? Questions galore...

The OpenAI Service on Azure started out in a limited preview back in November of 2021. As of today, it includes access to OpenAI APIs built on the company’s GPT (language generation), DALL-E (image generation), and Codex (programming aid) models.
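For a sense of what that access looks like in practice, here's a minimal sketch (our illustration, not Microsoft's documentation) of calling a GPT completion deployment through the service using the openai Python package's Azure mode. The endpoint, key, API version, and deployment name are placeholders standing in for the values Azure issues once an application is approved.

# Minimal sketch of a completion call against an Azure OpenAI deployment.
# Endpoint, key, API version, and deployment name below are placeholders,
# not real values; they come from the Azure portal after approval.
import openai

openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"   # hypothetical endpoint
openai.api_version = "2022-12-01"                             # example API version
openai.api_key = "YOUR-AZURE-OPENAI-KEY"

response = openai.Completion.create(
    engine="my-gpt-deployment",   # the deployment name you create in Azure, not the raw model name
    prompt="Summarise this support ticket in one sentence: ...",
    max_tokens=60,
)
print(response["choices"][0]["text"])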

The big news here, obviously, is the impending addition of ChatGPT on Azure. GPT, or more precisely GPT-3.5, the model that underpins ChatGPT, has been part of the service since it launched. But ChatGPT is a specific iteration of the GPT model that's been fine-tuned for dialogue, and that distinction matters: it could be a game-changer for enterprises using the OpenAI Service on Azure to back their company chatbots.

Join peers following The Stack on LinkedIn

That said, it would be even better news for enterprises if Microsoft were actually opening up access to the OpenAI Service on Azure to businesses around the world. Unfortunately, that doesn't appear to be the case.

As The Register's Simon Sharwood noted on Tuesday, businesses interested in such enterprise access will still need to go through the same approval and onboarding process as before. It even appears that the same application form is still in use.

We reached out to Microsoft to confirm this, but received no immediate response.

It makes sense that Microsoft would want to individually onboard businesses for long-term accounts, but the lack of trial availability or on-demand access paints an interesting picture.

For example, you can pretty much brute-force your way into the vast majority of AWS services with nothing more than a credit card. The fact that you essentially have to pitch your data needs to the Azure team before you can add the OpenAI API suggests that Microsoft either hasn't landed on a tiered pricing structure or isn't ready to debut a separate Azure vertical just for OpenAI API customers.

This could just be part of the good old-fashioned up-sell routine Microsoft's perfected over the years — the best way to charge for Office is by giving Windows away for free. But it could also indicate that the future isn't crystal clear concerning pricing for the integration.

That might have something to do with the reported 49% stake in OpenAI Microsoft hopes to claim by way of a $10 billion investment. The terms and costs associated with the Azure/OpenAI partnership could shake out differently depending on how that ends up.

What does this mean for IT managers and CIOs who are on the fence as to whether they should consider buying into the Azure and OpenAI Services suite of tools? The only thing that's changed is the costs associated with adoption. And Microsoft hasn’t announced those yet.

See also: 119 new AWS services and features in 30 words each

Azure does list its pricing for OpenAI services online, but you'll still need a personalized quote to understand how many training hours, tokens, and other associated services your project will actually require. Here's a calculator to help give you an idea.
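If you want a rough feel for the math before chasing that quote, a back-of-the-envelope sketch like the one below does the job. To be clear, the per-thousand-token rate and usage figures here are invented placeholders, not Azure's published prices; swap in the numbers from the price list for the model and region you're actually quoted.

# Back-of-the-envelope spend estimate; every figure below is a placeholder,
# not a published Azure price.
PRICE_PER_1K_TOKENS = 0.002   # hypothetical USD rate
TOKENS_PER_REQUEST = 750      # prompt plus completion, rough guess
REQUESTS_PER_DAY = 10_000

monthly_cost = PRICE_PER_1K_TOKENS * (TOKENS_PER_REQUEST / 1000) * REQUESTS_PER_DAY * 30
print(f"Estimated monthly spend: ${monthly_cost:,.2f}")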

But the big questions for IT leaders are: when will ChatGPT actually hit Azure and what will we be able to do with it?

As to the first, Microsoft will only say it's coming “soon.” The fact of the matter is that ChatGPT is a moving target: OpenAI has been teasing GPT-4, the next iteration of the GPT model, for months now.

If GPT-4 represents as big a leap as the jump from GPT-2 to GPT-3.5 did, then there's no telling what a ChatGPT iteration running on it might be capable of. Right now, the best enterprise use cases might involve building out user interfaces with code, copy, and images in a matter of moments to accelerate design. We'll have to wait and see how the next model improves on that technology.

Perhaps a more important question is: when will GPT-4 arrive? Rumors abound that a Q1 2023 launch is possible, but OpenAI chief executive Sam Altman has been playing up the idea that the company is sitting on the model until it can be sure it's “safe” to launch.

What it all adds up to is a matter for speculation, but one view is this: OpenAI knows exactly what it's going to do, and Microsoft is hedging its bets either way by capitalizing on ChatGPT's viral popularity right now.

At the end of the day, the announcement is a complete nothing-burger. OpenAI Services on Azure still require approval, and they're not available in every region Microsoft serves. Couple that with the “news” that ChatGPT integration is still coming “soon,” and you've got a perfectly timed market announcement for nothing much.

That being said, there's probably going to be a wild dash to sign up now that ChatGPT is all but guaranteed to get an enterprise edition on Azure in the near future. You'd better move fast if you want in.

Get our newsletter with a single click on LinkedIn
