Over the last few years, Microsoft has been deepening its relationships with the AI research organization OpenAI and the “GitHub for ML” company Hugging Face in order to integrate tools such as GPT-3 and DALL-E 2 into Azure. In return, the tech giant provides the compute resources OpenAI needs to train and run its large AI models.
Here’s what that means for the industry and how it’s paving the way for other organizations to rely on AI-as-a-Service, or AIaaS.
Implementation Is Key with AI
As we’ve seen with ChatGPT’s release, the way an AI system is deployed has a huge impact on its value. While ChatGPT is built on technology that has existed for years, the web tool took off because anyone could use it, free of charge, at any time. This let the entire world explore use cases and embed AI into their workflows. Microsoft’s partnerships with OpenAI and Hugging Face dramatically shorten the road from cutting-edge AI research to implementation and profitability, providing real value for Microsoft and giving OpenAI and Hugging Face fast market feedback to accelerate iteration.
OpenAI’s powerful technology unlocks much more value when it’s delivered in convenient, user-friendly ways. The Azure integration allows GPT-3 and DALL-E 2 to provide value for non-technical users as well. For example, Microsoft has already applied GPT-3 to convert natural language queries into data formulas in Power Fx, the low-code programming language for expressing logic across the Microsoft Power Platform. In a blog post, Microsoft described another AI-powered feature that lets people building an e-commerce app state a programming goal in conversational language like “find products where the name starts with ‘kids.’”
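To make the input/output shape of that feature concrete, here is a toy Python sketch that maps Microsoft’s example query to a Power Fx formula. The real feature sends the query to GPT-3; the regex below is purely illustrative, and the function name is my own invention.

```python
import re

def nl_to_powerfx(query: str) -> str:
    """Translate one hard-coded natural-language pattern into a Power Fx
    Filter formula. The real Power Fx feature sends the query to GPT-3;
    this toy regex only illustrates the input/output shape."""
    m = re.match(r"find (\w+) where the (\w+) starts with '([^']+)'",
                 query, re.IGNORECASE)
    if not m:
        raise ValueError("unsupported query pattern")
    table, column, prefix = m.groups()
    # Filter and StartsWith are real Power Fx functions; names are
    # capitalized here only to match Power Fx conventions in the example.
    return f'Filter({table.capitalize()}, StartsWith({column.capitalize()}, "{prefix}"))'

print(nl_to_powerfx("find products where the name starts with 'kids'"))
# Filter(Products, StartsWith(Name, "kids"))
```

The point is not the pattern matching, which a language model replaces entirely, but that the output is an executable formula a non-programmer never has to write by hand.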
Implementation is also vital for organizations. Open source, APIs (application programming interfaces), and web-based chatbots might be the holy grail for freelance hackers, but they’re not ready for enterprise use. Bringing OpenAI’s technology into the Azure ecosystem offers the convenience, security, reliability, compliance, data privacy, scalability, and enterprise-grade capabilities that organizations need before adopting new systems. The offering not only adds value for existing Azure customers but also helps Microsoft differentiate itself from other cloud service providers.
A Strategic Partnership
More generally, the partnership with OpenAI positions Microsoft to dominate any markets that arise from GPT-3 and other AI tools, because the GPT-3 API, which remains open to everyone else, effectively serves as product research for Microsoft.
Whatever use case a company finds for GPT-3, Microsoft will be able to deliver it cheaper, faster, and more accurately, and deploy it immediately to an existing customer base that spans verticals. When OpenAI announced its $100M fund for AI startups, many speculated it was simply a tool for spotting potential acquisitions for Microsoft.
The Expanding AI Landscape of AWS and Google
Microsoft isn’t the only company working on AI integration and services. Amazon and Google are working on similar things. A quick ChatGPT query told me that AWS provides:
- Amazon SageMaker: a fully managed service that allows developers to build, train, and deploy machine learning models at scale
- Amazon Rekognition: a service that uses deep learning algorithms to perform image and video analysis, including object and facial recognition
- Amazon Lex: a service that allows developers to build chatbots and other conversational interfaces using natural language understanding and automatic speech recognition
- Amazon Polly: a service that converts text into lifelike speech, allowing developers to build applications that can speak in multiple languages and voices
- Amazon Comprehend: a service that uses natural language processing (NLP) to extract insights from text, including sentiment analysis, entity recognition, and language detection
While Google provides:
- Cloud AutoML: a suite of machine learning tools that enables developers with limited machine learning expertise to train high-quality models
- Cloud Natural Language: an NLP service that allows developers to extract insights from unstructured text data
- Cloud Vision: a computer vision service that enables developers to analyze and understand the content of images and videos
- Cloud Speech-to-Text: a speech recognition service that allows developers to convert audio and voice into written text
- Cloud Text-to-Speech: a text-to-speech service that enables developers to convert written text into natural-sounding speech
AI as a Service, or AIaaS, Emerges
AIaaS delivers the same kind of benefits that cloud computing itself did: just as the cloud let organizations with smaller budgets run their own web servers without buying hardware, the Azure integrations show the value of running ML systems in the cloud without setting up or managing the underlying technology directly.
Research finds that most organizations rely on outside talent to develop their AI systems from the bottom up.
Nonetheless, these vendors can be extremely expensive given the lack of available talent in AI/ML.
That means go-to-market strategies for AI tools must continue to evolve. Opening a website to everyone works for hype (see ChatGPT), but hype alone doesn’t make a great product. As is usually the case with new tech, it begins in open source, with early adopters fiddling around, until the technology matures to a point where organizations can bundle different technologies into a product the rest of us can use without a deep understanding of the tech stack.
This is where AI is today. Large organizations have already started their AI implementation, but what about small to medium-sized enterprises (SMEs)? Firms outside of tech and major metro areas? Family-owned shops? Creators or influencers?
These businesses may not have the resources to build ML systems and data pipelines from the bottom up, even if they can benefit from them. AIaaS built into customer relationship management (CRM) software, enterprise resource planning (ERP) software, and cloud ecosystems like Azure is the solution.
And here’s another big trend: increasingly powerful AI systems embedded into consumer-facing tools like monday.com, Excel, the Adobe suite, and many more. This gives end users the power of AI without ever touching the underlying tech.
Commercialization and Ethical Considerations
I always like to end with a reality check. Large language models (LLMs), transformers, AIaaS. Cool-sounding names. Great for profitability.
But this is also a story of consolidation and commercialization, each with its own drawbacks: entrenched beliefs about how AI systems should be deployed, and the profit expectations that come with turning research into product.
OpenAI’s original model, a non-profit research organization with a stated intention to develop AI for good, could change as commercial interests must be satisfied to fund its continued operations. The shift toward a business-oriented structure has potentially significant implications.
Profit is a core objective of any commercial enterprise. But if a few players control the most advanced AI systems in the world, they should also invest in responsible AI development, AI ethics, and transparency.