Welcome to Mostly Cloudy! Today: why Microsoft hopes OpenAI can help it build a moat around its enterprise business; Intel FINALLY ships its latest generation server chip, and IBM steps off the patent treadmill.
OpenAI CEO Sam Altman (left) knows Microsoft CEO Satya Nadella needs him more than he needs Microsoft. Photo: Microsoft
Azure thing?
Microsoft has done remarkably well under Satya Nadella transforming itself from an as-legacy-as-it-gets software company into a true powerhouse in modern enterprise computing, thanks to its willingness to embrace ideas that weren’t invented in Redmond and to stop fighting the battles of the last war. But on the product development front, it has been quiet for the last several years, rolling out incremental improvements to its cloud infrastructure and enterprise software businesses that haven’t really stood out to enterprise tech buyers.
That approach appears set to change. Microsoft is looking to put OpenAI, the AI research company behind ChatGPT, at the heart of its software strategy, according to multiple reports over the past week. It has discussed investing $10 billion in the company and has explored using ChatGPT features to jumpstart an also-ran product (Bing) and to shore up a dominant product (Office).
Like almost all platform companies of its size and reach, Microsoft has invested billions in homegrown AI research and development over the last few decades. It’s telling that it feels the need to go outside to find technology it thinks could power the next generation of one of its most important products and generate huge amounts of activity on Azure, where OpenAI’s workloads run.
Microsoft enjoys a strong position in cloud computing (peers like IBM and Oracle would trade places in a heartbeat), but it can be hard to understand what makes its infrastructure services unique compared to AWS and Google. While Azure and Office 365 have definitely seen organic growth, Microsoft doesn’t like to talk about how much of its cloud growth has come from transitioning old Windows Server and desktop Office customers to the cloud versions of those products.
AWS, on the other hand, has a pioneer’s advantage and the operating discipline that Microsoft only moved to correct in the last few years. Google is finally figuring out how to translate its world-class computing infrastructure and AI research into enterprise-friendly services, cutting into Microsoft’s ability to position itself as the AWS alternative.
Assuming ChatGPT has the eventual impact on business and consumer products that sent believers into a frenzy over the last few months (far from certain, but you have to imagine Nadella has deep insight into the future road map), it could give Microsoft the kind of unique advantage it has sought for years as it builds on its enterprise tech comeback story.
Understanding how ChatGPT (and OpenAI’s other services) use Azure could allow Microsoft to build differentiated AI services for companies that will never be able to duplicate that effort and only need a fraction of its capability. AI-powered user interfaces that actually work could also allow Office to maintain its huge advantage among modern knowledge workers, even as a generation raised on Google Docs gains power.
That’s worth $10 billion. We won’t know for some time how far a lead OpenAI has over AWS and Google’s own heavily financed efforts to develop similar products, or how long that lead might last, but it makes more sense than spending $51 billion on Pinterest.
Nothing rapid about it
Intel finally shipped the product that sent it into a tailspin over the last few years, releasing the Sapphire Rapids server processors to everyone a few months after seeding the big cloud computing companies with the chips.
Unfortunately for Intel, the chips still don’t quite match AMD’s latest efforts in the server market, according to The Next Platform. Sapphire Rapids was originally supposed to ship over two years ago, but design and manufacturing problems forced Intel to delay the launch several times, opening up a huge window for AMD to gain market share with quicker launch cycles and giving the market time to experiment with cheaper alternatives such as AWS’s Graviton processor.
This will be a very interesting year for Intel. Two years into Pat Gelsinger’s return to the company, he can no longer point to the sins of his predecessors when explaining Intel’s current and future problems.
It sounds like Intel has overhauled the way it designs and tests its chips in hopes of finding problems earlier in the process, but server-chip competition is stronger than it has been in decades, and its customers are wary.
Around the enterprise
Nothing underscores the need for patent reform more than the fact that IBM was awarded more patents than any other company in the country for almost 30 years, a period during which it fell further and further behind the companies actually driving tech innovation. It has wisely decided to focus on different things.
If you’re still running Windows Server 2008 in your own data center, please stop. After plenty of warnings, Microsoft is finally done supporting the on-premises version of the ancient product.
The AI boom is unevenly distributed: Scale AI announced plans to lay off 20% of its workers after running into the same problems that dozens of other enterprise tech companies have faced in the past few months.
Microsoft will keep pace with AWS and Google’s efforts to develop custom cloud silicon with the purchase of Fungible, a DPU startup.
Thanks for reading — see you later this week!