Mistral AI unveils Mistral Large, a GPT-4 competitor, alongside its new chat assistant Le Chat

Paris-based AI startup Mistral AI is gradually building an alternative to OpenAI and Anthropic, as its latest announcement shows. Founded by alums from Google’s DeepMind and Meta, the company originally positioned itself as an AI outfit with an open-source focus, but its business model increasingly resembles OpenAI’s: its new flagship model, Mistral Large, is offered through a paid API with usage-based pricing, and the company claims it ranks second only to GPT-4 on several benchmarks. A new distribution partnership with Microsoft should also help Mistral AI reach more customers.

Mistral AI is making waves in the industry with its latest announcement. The company is launching a new flagship large language model called Mistral Large, positioning it as a competitor to top-tier models from OpenAI and Anthropic. Mistral Large boasts reasoning capabilities that, according to the company, are comparable to those of GPT-4 and Claude 2.

In addition to Mistral Large, the startup is also introducing its own take on chat assistants with Le Chat. This new service, currently in beta, is positioned as an alternative to ChatGPT.

If you’re not already familiar with Mistral AI, the company has drawn attention for how quickly it has raised capital. Officially incorporated in May 2023, Mistral AI secured a $112 million seed round just a few weeks later. In December of the same year, it closed a $415 million funding round led by Andreessen Horowitz (a16z).

Founded by alumni of Google’s DeepMind and Meta, Mistral AI originally focused on open-source AI technologies. However, its larger models, including Mistral Large, are not released under open-source licenses.

Similar to OpenAI, Mistral AI offers its models through a paid API with usage-based pricing. Querying Mistral Large currently costs $8 per million input tokens and $24 per million output tokens. By default, the model supports a 32k-token context window and works in English, French, Spanish, German, and Italian.
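For developers, usage looks much like OpenAI’s chat completions API. The sketch below shows what a minimal request against Mistral’s hosted endpoint might look like; the endpoint path, the `mistral-large-latest` model identifier, and the response shape are assumptions based on Mistral’s publicly documented, OpenAI-style API and may differ from the current docs.

```python
import os
import requests

# Assumed endpoint and model name for Mistral's chat completions API;
# check the official documentation before relying on either.
API_URL = "https://api.mistral.ai/v1/chat/completions"
API_KEY = os.environ["MISTRAL_API_KEY"]

payload = {
    "model": "mistral-large-latest",  # assumed identifier for Mistral Large
    "messages": [
        {"role": "user", "content": "In one sentence, what is a context window?"}
    ],
    "max_tokens": 128,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()

# The response is assumed to follow the OpenAI-style schema.
print(response.json()["choices"][0]["message"]["content"])
```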

For comparison, accessing GPT-4 with a similar context window currently costs $60 per million input tokens and $120 per million output tokens, which makes Mistral Large 5 to 7.5 times cheaper than GPT-4-32k. Keep in mind, however, that pricing in the AI industry evolves quickly and can change at any time.
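The 5x to 7.5x figure follows directly from those list prices. As a rough illustration, using only the numbers quoted in this article (real invoices will differ), the arithmetic works out like this:

```python
# List prices cited above, in dollars per million tokens.
MISTRAL_LARGE = {"input": 8.0, "output": 24.0}
GPT_4_32K = {"input": 60.0, "output": 120.0}

# Price ratios: 7.5x on input tokens, 5x on output tokens.
for kind in ("input", "output"):
    ratio = GPT_4_32K[kind] / MISTRAL_LARGE[kind]
    print(f"{kind}: GPT-4-32k costs {ratio:.1f}x as much as Mistral Large")

def cost(prices, input_tokens, output_tokens):
    """Dollar cost of one request at the given per-million-token prices."""
    return (input_tokens * prices["input"] + output_tokens * prices["output"]) / 1_000_000

# Hypothetical workload: 20k input tokens and 2k output tokens per request.
print(f"Mistral Large: ${cost(MISTRAL_LARGE, 20_000, 2_000):.3f} per request")
print(f"GPT-4-32k:     ${cost(GPT_4_32K, 20_000, 2_000):.3f} per request")
```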

But how does Mistral Large actually perform compared to models like GPT-4 and Claude 2? It’s a difficult question to answer. Mistral AI claims that it ranks second after GPT-4 on several benchmarks, but vendor-selected benchmarks can be cherry-picked and may not reflect real-world usage. Independent testing and evaluation will be needed to assess its true capabilities.

Mistral AI is also launching Le Chat, a chat assistant now open for sign-ups at chat.mistral.ai. According to the company, this is a beta release, so there may be some “quirks”.

Access to Le Chat is currently free, and users can choose between three models: Mistral Small, Mistral Large, and Mistral Next, a prototype designed to give brief, concise answers. In the future, Mistral AI plans to release a paid version of Le Chat for enterprise clients, with central billing and moderation options.

In addition to its own API platform, Mistral AI has formed a partnership with Microsoft: Azure customers will now find Mistral models alongside the rest of Azure’s model catalog. The deal goes beyond hosting, as the two companies have also opened discussions about future collaboration.

This partnership is a major milestone for Mistral AI, as it opens up a new distribution channel and a new pool of potential customers. For Microsoft, the deal broadens its AI offerings and keeps Azure customers within its product ecosystem, and it may also help deflect some of the antitrust scrutiny the company faces.

Overall, Mistral AI’s success and growth in such a short period of time are impressive. With its advanced language models and strategic partnerships, the company is steadily establishing itself as a major player in the world of AI. Only time will tell how Mistral Large and its successors will evolve and compete with the industry’s top performers.

Max Chen

Max Chen is an AI expert and journalist with a focus on the ethical and societal implications of emerging technologies. He has a background in computer science and is known for his clear and concise writing on complex technical topics. He has also written extensively on the potential risks and benefits of AI, and is a frequent speaker on the subject at industry conferences and events.
