Model

Premium Subscribers Now Have Access to Grok Chatbot through X

Social network X is rolling out access to xAI’s Grok chatbot to Premium-tier subscribers, after Elon Musk announced last month that the expansion would reach more paid users. The company says on its support page that only Premium and Premium+ users can interact with the chatbot, and only in select regions. Last year, after Musk’s xAI announced Grok, it made the chatbot available exclusively to Premium+ subscribers, who pay $16 per month or $168 per year. Earlier this week, X rolled out a new explore view inside Grok where the chatbot summarizes trending news stories. Last month, xAI open-sourced Grok, though without any details about its training data.

Tesla’s Model Y Inventory Prices Slashed by up to $7K

Tesla is cutting prices on unsold Model Y SUVs in the U.S. by thousands of dollars in an attempt to clear out an unprecedented backlog of inventory. The discounts come as Tesla once again built far more vehicles than it sold in the last quarter: the company produced 433,371 vehicles in the first quarter but shipped only 386,810, likely adding more than 46,000 EVs to its inventory glut. Tesla had also announced that a $1,000 price hike was coming to the Model Y, its most popular vehicle, on April 1. Musk has largely blamed the sales struggle on high interest rates, even as his company dramatically cut prices on the Model Y and Model 3 throughout 2023.

OpenAI Expands Its Custom Model Training Program

OpenAI is expanding its Custom Model program, which helps enterprise customers develop tailored generative AI models built on its technology for specific use cases, domains and applications. “Dozens” of customers have enrolled in Custom Model since its launch. As for custom-trained models, they’re models built with OpenAI, using OpenAI’s base models and tools (e.g., fine-tuning). Fine-tuned and custom models could also lessen the strain on OpenAI’s model-serving infrastructure. Alongside the expanded Custom Model program and custom model building, OpenAI today unveiled new fine-tuning features for developers working with GPT-3.5, including a new dashboard for comparing model quality and performance, support for integrations with third-party platforms (starting with the AI developer platform Weights & Biases) and tooling enhancements.
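
For orientation, here is a minimal sketch of what kicking off a GPT-3.5 fine-tuning job with the new Weights & Biases integration might look like using OpenAI’s Python SDK. The training file name, project name and the exact shape of the integrations field are assumptions based on the announcement, not confirmed details.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload a JSONL file of chat-formatted training examples (file name is illustrative).
training_file = client.files.create(
    file=open("training_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a GPT-3.5 fine-tuning job; the `integrations` field is assumed to forward
# run metrics to Weights & Biases, per the announcement, and may differ by SDK version.
job = client.fine_tuning.jobs.create(
    model="gpt-3.5-turbo",
    training_file=training_file.id,
    integrations=[{"type": "wandb", "wandb": {"project": "my-finetune-runs"}}],
)

print(job.id, job.status)
```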

Anthropic Researchers Wear Down AI Ethics with Repeated Questions

The vulnerability is a new one, resulting from the increased “context window” of the latest generation of LLMs. But in an unexpected extension of this “in-context learning,” as it’s called, the models also get “better” at replying to inappropriate questions. So if you ask it to build a bomb right away, it will refuse. But if you ask it to answer 99 other questions of lesser harmfulness and then ask it to build a bomb… it’s a lot more likely to comply. If the user wants trivia, it seems to gradually activate more latent trivia power as you ask dozens of questions.
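
To make the mechanism concrete, here is a benign sketch of how many-shot prompting simply accumulates examples in the context window; the trivia pairs and prompt format are invented for illustration.

```python
# A benign illustration of many-shot (in-context) prompting: the prompt accumulates
# worked examples, so a model with a large context window can condition on dozens
# or hundreds of them at once. The trivia pairs and format are invented for illustration.

TRIVIA_EXAMPLES = [
    ("What is the capital of France?", "Paris"),
    ("Which planet is known as the Red Planet?", "Mars"),
    ("Who wrote 'Pride and Prejudice'?", "Jane Austen"),
]

def build_many_shot_prompt(examples, final_question, n_shots):
    """Repeat example Q&A pairs until we have n_shots, then append the real question."""
    shots = []
    for i in range(n_shots):
        q, a = examples[i % len(examples)]
        shots.append(f"Q: {q}\nA: {a}")
    shots.append(f"Q: {final_question}\nA:")
    return "\n\n".join(shots)

prompt = build_many_shot_prompt(
    TRIVIA_EXAMPLES, "What is the tallest mountain on Earth?", n_shots=99
)
print(f"Prompt spans {prompt.count('Q:')} questions and {len(prompt)} characters")
```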

Simplifying Private AI Model Deployments with OctoStack: The Latest Solution from OctoAI

In its early days, OctoAI focused almost exclusively on optimizing models to run more efficiently. With the rise of generative AI, the team then launched the fully managed OctoAI platform to help its users serve and fine-tune existing models. OctoStack, at its core, is that OctoAI platform, packaged for private deployments. Deploying OctoStack should be straightforward for most enterprises, as OctoAI delivers the platform as ready-to-go containers with their associated Helm charts. For developers, the API remains the same whether they are targeting the SaaS product or OctoAI in their private cloud.
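
Here is a minimal sketch of that “same API, different deployment” idea: the same chat-completions call can point at either the hosted SaaS endpoint or a private OctoStack deployment by swapping the base URL. The URLs, endpoint path and model name below are illustrative assumptions, not documented values.

```python
import os
import requests

# The only thing that changes between SaaS and a private OctoStack deployment is
# where the request is sent (and the credentials). Values here are assumptions.
BASE_URL = os.environ.get("OCTOAI_BASE_URL", "https://text.octoai.run/v1")  # or your private cloud URL
API_TOKEN = os.environ["OCTOAI_TOKEN"]

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "model": "mixtral-8x7b-instruct",  # whichever model the deployment serves
        "messages": [{"role": "user", "content": "Summarize OctoStack in one sentence."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```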

Women in AI: The Aspen Institute’s Kristine Gloria on Embracing Curiosity in the Field

We’ll publish several pieces throughout the year as the AI boom continues, highlighting key work that often goes unrecognized. Kristine Gloria leads the Aspen Institute’s Emergent and Intelligent Technologies Initiative; the Aspen Institute is the Washington, D.C.-headquartered think tank focused on values-based leadership and policy expertise. Among the questions she tackles: What are some issues AI users should be aware of? What is the best way to responsibly build AI? How can investors better push for responsible AI? On that last question, one specific task, which she admires Mozilla Ventures for requiring in its diligence, is an AI model card.
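
As a rough illustration of what such a diligence artifact covers, here is a minimal sketch that assembles a model card as plain text; the fields and example values are generic and invented, not Mozilla Ventures’ actual checklist.

```python
# A generic sketch of the kind of information an AI model card records.
# Every field and value below is illustrative only.
model_card = {
    "model_name": "example-classifier-v1",
    "intended_use": "Routing customer-support emails by topic.",
    "out_of_scope_uses": "Medical, legal, or employment decisions.",
    "training_data": "Anonymized support tickets collected 2022-2023.",
    "evaluation": "F1 = 0.91 on a held-out ticket set (example figure).",
    "known_limitations": "Lower accuracy on non-English messages.",
    "ethical_considerations": "Reviewed for demographic performance gaps.",
}

def render_model_card(card: dict) -> str:
    """Render the card as a simple readable document."""
    lines = [f"Model card: {card['model_name']}"]
    for field, value in card.items():
        if field != "model_name":
            lines.append(f"- {field.replace('_', ' ').title()}: {value}")
    return "\n".join(lines)

print(render_model_card(model_card))
```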

X’s Grok Chatbot Will Soon Get an Upgraded Model, Grok-1.5

X.ai, Elon Musk’s AI startup, has revealed its latest generative AI model, Grok-1.5. Grok-1.5 benefits from “improved reasoning,” according to X.ai, particularly on coding and math-related tasks. One improvement that should lead to observable gains is the amount of context Grok-1.5 can take in compared to Grok-1. Context, or context window, refers to the input data (in this case, text) that a model considers before generating output (more text). The announcement of Grok-1.5 comes after X.ai open-sourced Grok-1, albeit without the code necessary to fine-tune or further train it.
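
To make the context-window idea concrete, here is a minimal sketch of trimming input to a model’s token budget. The 128,000-token figure is the capacity xAI reported for Grok-1.5, and the whitespace split is just a stand-in for a real tokenizer.

```python
# Minimal sketch of what a context window means in practice: input that exceeds
# the model's token budget has to be truncated (or summarized) before it is sent.
# Grok-1.5's reported capacity is 128,000 tokens; whitespace splitting stands in
# for the model's real tokenizer.
CONTEXT_WINDOW = 128_000

def fit_to_context(document: str, reserved_for_output: int = 1_000) -> str:
    """Keep only as much of the document as fits alongside the reply budget."""
    budget = CONTEXT_WINDOW - reserved_for_output
    tokens = document.split()          # placeholder tokenization
    if len(tokens) <= budget:
        return document
    return " ".join(tokens[:budget])   # naive head truncation

long_report = "word " * 200_000
prompt = fit_to_context(long_report)
print(f"kept {len(prompt.split()):,} of 200,000 tokens")
```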

AI21 Labs’ New AI Model Can Handle More Context Than Most

Increasingly, the AI industry is moving toward generative AI models with longer contexts. But models with large context windows tend to be compute-intensive. Ori Goshen, the CEO of AI startup AI21 Labs, asserts that this doesn’t have to be the case, and his company is releasing a generative model, Jamba, to prove it. Contexts, or context windows, refer to the input data (e.g., text) that a model considers before generating output. Trained on a mix of public and proprietary data, Jamba can write text in English, French, Spanish and Portuguese. Loads of freely available, downloadable generative AI models exist, from Databricks’ recently released DBRX to Meta’s Llama 2.

Databricks Spent $10M on Its New DBRX Generative AI Model, but It Can’t Beat GPT-4

If you had $10 million to spend, you could spend it training a generative AI model. See Databricks’ DBRX, a new generative AI model announced today that is akin to OpenAI’s GPT series and Google’s Gemini. Customers can privately host DBRX using Databricks’ Model Serving offering, suggested Naveen Rao, Databricks’ VP of generative AI, or they can work with Databricks to deploy DBRX on the hardware of their choosing. It’s an easy way for customers to get started with Databricks’ Mosaic AI generative AI tools. Still, plenty of generative AI models come closer to the commonly understood definition of open source than DBRX does.
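
For the “hardware of their choosing” path, here is a minimal sketch of self-hosting DBRX with Hugging Face Transformers rather than Databricks’ managed Model Serving. The repo id, the trust_remote_code flag and the chat-template usage are assumptions based on the public release, and running it realistically requires a multi-GPU server with hundreds of gigabytes of memory.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo id for the instruction-tuned release; DBRX is a large
# mixture-of-experts model, so expect to shard it across several GPUs.
MODEL_ID = "databricks/dbrx-instruct"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",          # spread layers across available GPUs
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "In one sentence, what is DBRX?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```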