With so much attention focused on the new breed of large language models (LLMs) powering generative artificial intelligence (AI), it’s easy to forget another pioneering technology that, for a long time, represented the pinnacle of human/machine interaction, namely IBM’s Watson.
When it was unveiled in 2011, initially with the primary goal of winning the game show "Jeopardy!," it was by far the most impressive technology for answering questions posed in natural language.
Fast-forward more than a decade, and the market is once again bowled over by AI, this time generative AI, and IBM has quickly unveiled its own offering, watsonx, the latest incarnation of Watson. The watsonx platform comprises three core modules that collectively help enterprises accelerate their AI and data processes and maximize the scope and efficiency of the latest AI innovations.
The suite includes watsonx.ai, for training, fine-tuning, and deploying generative AI, foundation models, and machine learning functions; watsonx.data, a data store specifically designed for scalable AI workloads; and watsonx.governance, the platform's AI governance tool, due for general release in December 2023.
Recently, IBM revealed a series of enhancements to the watsonx suite. In this analysis, I’ll explain what they are, how they will benefit customers, and what they mean for IBM’s generative AI strategy.
New LLMs for watsonx.ai
IBM has introduced the first models from its watsonx Granite model series to watsonx.ai. Built on a decoder-only architecture, in which the model generates text by repeatedly predicting the next word in a sequence, the models support a variety of natural language processing tasks, such as summarization and content generation. The highly efficient 13-billion-parameter models are trained on a variety of enterprise-specific datasets and are being developed in a range of sizes to meet the specific needs of companies.
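To make the decoder-only idea concrete, here is a toy sketch of autoregressive generation. The `toy_logits` function is a hypothetical stand-in for a transformer forward pass (it is not Granite or any real model); the point is the loop, which appends one predicted token at a time.

```python
# Toy sketch of decoder-only (autoregressive) generation.
# toy_logits is a hypothetical stand-in for a real model's forward pass.
import numpy as np

VOCAB = ["<end>", "the", "cat", "sat", "on", "mat"]

def toy_logits(tokens):
    # A real LLM would run transformer layers over `tokens`;
    # here we just produce deterministic pseudo-random scores.
    rng = np.random.default_rng(sum(tokens))
    return rng.normal(size=len(VOCAB))

def generate(prompt_tokens, max_new_tokens=5):
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        # Greedy decoding: pick the highest-scoring next token.
        next_id = int(np.argmax(toy_logits(tokens)))
        if next_id == 0:  # <end> token stops generation
            break
        tokens.append(next_id)
    return tokens

print(generate([1, 2]))  # prompt tokens followed by generated tokens
```

Real systems replace greedy decoding with sampling strategies (temperature, top-p), but the one-token-at-a-time structure is the same.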
IBM also now offers Meta's 70-billion-parameter Llama 2-Chat model and the StarCoder LLM, built for code generation. Beyond this, IBM has announced the imminent release of its Tuning Studio, which lets users adapt foundation models to specific tasks using proprietary data, along with a synthetic data generator for low-risk AI model training.
Watsonx.data gets an upgrade
IBM is integrating watsonx.ai generative AI capabilities into watsonx.data, with release planned for Q4 2023. This will enable users to access and refine data for AI use cases through a self-service, natural-language interface.
Furthermore, IBM will be integrating a vector database into watsonx.data to support retrieval-augmented generation (RAG) use cases. RAG is an important AI framework that lets an LLM retrieve facts from external sources, so its responses are grounded in timely, accurate, verified data rather than in its training data alone. It also gives administrators better visibility into where the data used by the LLM comes from, so they can govern it accordingly.
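The RAG pattern can be sketched in a few lines. This is a minimal illustration, assuming a toy bag-of-words vector store; the document texts, the `embed` and `retrieve` helpers, and the prompt format are all hypothetical, and a production setup (such as the vector database planned for watsonx.data) would use learned embeddings and a real index.

```python
# Minimal RAG sketch: embed documents, retrieve the closest match for a
# query, and prepend it as context for the LLM prompt.
import numpy as np

DOCS = [
    "watsonx.ai trains and deploys foundation models",
    "watsonx.data is a data store for AI workloads",
    "watsonx.governance adds oversight to AI workflows",
]

# Toy vocabulary built from the documents themselves.
VOCAB = sorted({w for d in DOCS for w in d.split()})

def embed(text):
    # Hypothetical embedding: normalized bag-of-words counts.
    vec = np.array([text.split().count(w) for w in VOCAB], dtype=float)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def retrieve(query, k=1):
    # Rank documents by cosine similarity to the query.
    q = embed(query)
    scores = [float(q @ embed(d)) for d in DOCS]
    return [DOCS[i] for i in np.argsort(scores)[::-1][:k]]

def build_prompt(query):
    # The retrieved passage is injected as grounding context.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("what is watsonx.data"))
```

Because the retrieved passage travels with the prompt, administrators can log exactly which source document informed each answer, which is the governance benefit described above.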
Regarding watsonx.governance, IBM has announced that it will launch a tech preview of the platform. Ultimately, watsonx.governance will let users implement approval processes for AI workflows, ensuring human oversight, and will automatically document foundation model details, metrics, and risks, presented through accessible dashboards.
With its history of AI innovation through Watson, IBM is well-positioned to launch a generative AI product suite that matches the capabilities of rival releases. However, instead of just developing an LLM or launching a chat facility, IBM has drawn on its AI pedigree to create a powerful suite of tools that supports the entirety of AI-driven operations in the enterprise.
In particular, the company's decision to give each element of the watsonx platform — across AI, data, and governance — equal attention in terms of updates and capabilities demonstrates the maturity of IBM's technology.
The company is demonstrating a clear understanding that with generative AI comes a need for an AI-friendly, scalable data store as well as for comprehensive, accessible governance tools tailored to LLM monitoring. Moreover, it provides AI development tools supported by multiple LLMs to cater to the unique needs of individual enterprises.