ARMONK, N.Y. - IBM (NYSE:IBM) has expanded its artificial intelligence offerings by integrating the Mixtral-8x7B large language model (LLM) into its watsonx AI and data platform. The model, developed by Mistral AI and now optimized by IBM, is reported to reduce latency by 35-75%, depending on batch size, and to increase data-processing throughput by 50% compared with the standard version.
The improved performance is attributed to quantization, a process that shrinks a model's size and memory footprint, which can speed up processing. This enhancement is expected to lower costs and energy consumption for businesses using the model.
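To illustrate the idea behind quantization (this is a generic int8 sketch, not IBM's actual optimization pipeline), a float32 weight tensor can be mapped to 8-bit integers plus a single scale factor, cutting memory per weight from 4 bytes to 1 at the cost of a small rounding error:

```python
import numpy as np

# Illustrative post-training int8 quantization: store weights as int8
# plus one float scale, then dequantize approximately at compute time.
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4)).astype(np.float32)

scale = np.abs(weights).max() / 127.0                 # symmetric per-tensor scale
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale            # approximate reconstruction

print(weights.nbytes, q.nbytes)                       # 64 16 -> 4x smaller
print(float(np.abs(weights - dequantized).max()) <= scale / 2)  # True
```

Smaller weights mean less memory traffic per token, which is the main source of the latency and throughput gains the release describes.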
IBM's integration of Mixtral-8x7B underscores its commitment to providing a diverse range of AI models, including those developed in-house, by third parties, and open-source options. The company's multi-model strategy aims to meet clients' varying needs, offering them the flexibility to scale AI solutions across different business functions.
Mixtral-8x7B uses sparse modeling and the Mixture-of-Experts technique to process and analyze large data sets efficiently, delivering context-relevant insights. This model is part of IBM's broader initiative to provide enterprise-ready foundation models that empower clients to leverage generative AI for innovation and improved business outcomes.
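The Mixture-of-Experts idea can be sketched as follows (a minimal illustration with made-up dimensions and single-matrix "experts"; Mixtral's real experts are full MLP blocks): a gating network scores all experts per token, but only the top-scoring few actually run, so most expert parameters stay idle for any given token:

```python
import numpy as np

# Hypothetical top-2 routing over 8 experts for a single token.
# Sizes are illustrative, not Mixtral's actual configuration.
rng = np.random.default_rng(0)
n_experts, d_model, top_k = 8, 16, 2
x = rng.normal(size=(d_model,)).astype(np.float32)         # one token's hidden state
gate_w = rng.normal(size=(d_model, n_experts)).astype(np.float32)
experts = rng.normal(size=(n_experts, d_model, d_model)).astype(np.float32)

logits = x @ gate_w                                        # gate scores per expert
chosen = np.argsort(logits)[-top_k:]                       # indices of the top-2 experts
weights = np.exp(logits[chosen]) / np.exp(logits[chosen]).sum()  # renormalized gate weights

# Only the two selected experts compute; their outputs are blended.
y = sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))
print(y.shape)  # (16,)
```

This sparsity is why a model with many experts' worth of parameters can process each token at a fraction of the compute cost of a dense model of the same size.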
Kareem Yusuf, Ph.D., Senior Vice President of Product Management & Growth at IBM Software, emphasized the importance of choice and flexibility for clients deploying AI models tailored to their specific business requirements. The watsonx platform is designed to support a robust ecosystem of AI developers and business leaders across various industries.
Additionally, this week IBM announced the availability of ELYZA-japanese-Llama-2-7b, a Japanese LLM from ELYZA Corporation, on watsonx. The platform also hosts Meta (NASDAQ:META)'s open-source models and other third-party models, with more expected to be added in the coming months.
The information in this article is based on a press release statement.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.