Stability AI Introduces Stable LM 2 12B: Setting New Standards in AI Language Models

  • Editor
  • April 9, 2024

Stability AI has launched Stable LM 2 12B, a notable development in the artificial intelligence world. The release comprises two models, each with 12 billion parameters, trained on a vast multilingual dataset spanning English, Spanish, German, Italian, French, Portuguese, and Dutch.

The announcement was made through the company's official X account.

Stable LM 2 12B emerges as a dual offering: a base model alongside an instruction-tuned variant. Both models are designed to cater to a wide array of linguistic tasks, underlining Stability AI’s commitment to innovation in language technology.

In an official blog post, the Stability AI team said:

Introducing the latest additions to our Stable LM 2 language model series: a 12 billion parameter base model and an instruction-tuned variant, trained on 2 trillion tokens in seven languages: English, Spanish, German, Italian, French, Portuguese, and Dutch.

This medium-sized model balances strong performance, efficiency, memory requirements, and speed, following our established Stable LM 2 1.6B framework as detailed in our previously released technical report. With this release, we’re extending our model range, offering a transparent and powerful tool for developers to innovate in AI language technology. Soon, we plan to introduce a long-context variant of these models, which will be available on Hugging Face upon release.


These models, now accessible for both non-commercial and commercial purposes via Hugging Face, signify a leap forward in making advanced AI tools available to a broader audience.

In parallel, Stability AI has updated its Stable LM 2 1.6B model. The enhancements focus on refining conversational abilities across the supported languages, further establishing its prominence in the competitive landscape of open large language models (LLMs).


Stable LM 2 12B is tailored for efficient performance across multilingual tasks, capable of operating smoothly on widely available hardware. Its instruction-tuned version shines in applications requiring high precision in tool usage and function calling, making it an invaluable asset for developers looking to push the boundaries of AI language technology.
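Function calling of this kind generally means the model emits a structured request that the host application parses and executes. The sketch below illustrates the pattern in minimal form; the tool schema, tool names, and the simulated model reply are illustrative assumptions, not part of Stability AI's API.

```python
import json

# Hypothetical tool registry an application might expose to an
# instruction-tuned model; the names here are illustrative only.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def dispatch(model_reply: str) -> str:
    """Parse a JSON tool call emitted by the model and run the tool."""
    call = json.loads(model_reply)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Simulated model output requesting a tool invocation.
reply = '{"name": "get_weather", "arguments": {"city": "Berlin"}}'
print(dispatch(reply))  # → Sunny in Berlin
```

In practice the tool result would be fed back to the model so it can compose a final answer; the value of an instruction-tuned model here is emitting well-formed, schema-conforming calls reliably.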


When measured against contemporaries such as Mixtral, Llama 2, Qwen 1.5, Gemma, and Mistral, Stable LM 2 12B demonstrates robust performance across a variety of benchmarks. Its strength in both zero-shot and few-shot tasks showcases Stability AI's commitment to excellence and innovation.
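For readers unfamiliar with the terms, the difference between zero-shot and few-shot evaluation is simply whether the prompt includes worked examples before the task. A minimal, model-agnostic sketch (the Q/A prompt format is an assumption for illustration):

```python
def build_prompt(question, examples=()):
    """Build a zero-shot prompt (no examples) or a few-shot prompt
    (worked examples first) for a text-completion model."""
    parts = [f"Q: {q}\nA: {a}" for q, a in examples]
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

# Zero-shot: the model sees only the task itself.
zero = build_prompt("What is the capital of France?")

# Few-shot: worked examples precede the task.
few = build_prompt(
    "What is the capital of France?",
    examples=[("What is the capital of Spain?", "Madrid")],
)
```

Benchmarks report both settings because few-shot prompts test in-context learning, while zero-shot prompts test how much the model can do from instructions alone.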

With the introduction of Stable LM 2 12B, Stability AI continues to pave the way for advancements in AI language models. The planned release of a long-context variant further highlights the company’s dedication to enhancing the capabilities and accessibility of AI technologies.

About Stability AI

Stability AI stands at the forefront of AI technology, dedicated to developing cutting-edge solutions that drive progress in language understanding and generation. With a focus on creating efficient, open models, Stability AI empowers developers and businesses worldwide to harness the potential of AI for transformative outcomes.

For more of the latest AI news, visit allaboutai.com.


Dave Andre

Editor

Digital marketing enthusiast by day, nature wanderer by dusk. Dave Andre blends two decades of AI and SaaS expertise into impactful strategies for SMEs. His weekends? Lost in books on tech trends and rejuvenating on scenic trails.
