Welcome to Runtime! Today: Why GitHub's embrace of OpenAI's rivals marks a turning point in the generative AI era, Google Cloud revenue surged during the third quarter, and the latest funding rounds in enterprise tech.
(Was this email forwarded to you? Sign up here to get Runtime each week.)
Thanks for the billions though
GitHub Copilot is the shining star of the generative AI boom, perhaps the most widely used enterprise AI tool among businesses around the world. Microsoft and its GitHub subsidiary spent the last several years promoting their special relationship with OpenAI and its large language models as the secret sauce behind the coding assistant, but times have changed.
GitHub Copilot users will be able to swap in AI models from Anthropic and Google to generate answers when using Copilot Chat to ask questions, GitHub CEO Thomas Dohmke announced Tuesday at GitHub Universe. "It is clear the next phase of AI code generation will not only be defined by multimodel functionality, but by multimodel choice," he said in a blog post.
Anthropic's Claude 3.5 Sonnet — which has gained a lot of traction as a coding assistant this year — is available today, while Copilot users will be able to select Google's Gemini 1.5 Pro in "the coming weeks," according to GitHub.
Developers will be able to continue using several models from OpenAI, including GPT-4o, o1-preview, and o1-mini.
“We truly believe that the era of a single model is over,” Dohmke told TechCrunch, citing the tradeoffs that Copilot customers need to make between latency and accuracy.
It's not clear from Tuesday's presentation whether other models, such as Meta's Llama, will eventually make their way into Copilot, but now that GitHub has built model switching into the tool, it's not hard to imagine it offering several other choices at a later date.
But as the pace of OpenAI's model breakthroughs has slowed, rivals such as Anthropic, Google, and Meta have quickly erased much of that early advantage.
Simply providing exclusive access to OpenAI's models was enough to jumpstart Microsoft's cloud AI business, but the real competition in enterprise tech has shifted to which vendors build the best tools and platforms for customers to build their own AI apps atop those models.
Over the last year, Microsoft has made several other models available through its Azure AI service in a strategy that more closely resembles AWS's approach with Bedrock.
GitHub has always had an independent streak within the Microsoft universe, as COO Kyle Daigle told Runtime last year at AWS re:Invent. But Tuesday's announcement makes it clear that if the company behind one of the most popular generative AI tools on the planet thinks it can no longer afford to rely entirely on OpenAI, nobody can.
"We, at GitHub, believe in developer choice and that developers — for reasons of company policy, benchmarks that they have seen, different programming languages and of course, personal preference, or because they’re using that model for other scenarios already — prefer one of the competing models, and so we’re officially partnering with both Anthropic and Google," Dohmke told TechCrunch.
Mountain View multiplier
Google Cloud kicked off cloud infrastructure earnings week Tuesday with a strong report, beating Wall Street estimates thanks to a 35% increase in revenue. The division, which also includes sales of the Google Workspace productivity package, brought in $11.4 billion, and its operating profit rose more than 600%.
Believe it or not, Google CEO Sundar Pichai chalked up the results to the company's progress with AI, which is bringing in new customers, he said in the earnings release. Google spent $13 billion on capital expenditures during the quarter to support that AI growth, and new CFO Anat Ashkenazi said that fourth-quarter expenditures would be around the same.
And speaking of coding assistants, Pichai said that 25% of new code at Google is now generated by AI and put to use after reviews and tests by its employees. Presumably Google is using Gemini for a sizable portion of that code, which could bode well for its new partnership with GitHub.
Microsoft reports earnings on Wednesday, and Amazon follows on Thursday.
Sierra landed $175 million in new funding that values Bret Taylor and Clay Bavor's agentic AI startup at $4.5 billion.
GMI Cloud scored $82 million in Series A funding as it builds out a network of GPUs to compete with upstart AI cloud providers like CoreWeave and Lambda Labs.
Read AI raised $50 million in Series B funding for its office productivity AI copilots, which compete directly against similar tools available from both Microsoft and Google in their suites.
Browserbase scored $21 million in Series A funding that will allow the company to further its work developing a browser that AI agents can use to complete tasks across the web.
Microsoft introduced GitHub Copilot for Azure, a preview of a new tool that helps GitHub Copilot users better understand how their code will run on Azure.
Stack Overflow, which was a major source of coding questions and answers before Copilot, launched an extension that will let GitHub Copilot users tap into that repository directly from the service.
Tom Krazit has covered the technology industry for over 20 years, focused on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure and Protocol.