Models

Aura: Deepgram’s AI Gives Conversational Agents a Voice

Aura combines highly realistic voice models with a low-latency API to allow developers to build real-time, conversational AI agents. Backed by large language models (LLMs), these agents can then stand in for customer service agents in call centers and other customer-facing situations. Deepgram’s human-like voice models render extremely fast (typically in well under half a second) and, as Deepgram CEO Scott Stephenson noted repeatedly, do so at a low price. “Everybody now is like: ‘hey, we need real-time voice AI bots that can perceive what is being said and that can understand and generate a response — and then they can speak back,’” he said. The Aura model, like all of the company’s other models, was trained in-house.
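For a rough sense of what building against a hosted voice API like this looks like, here is a minimal Python sketch of a single text-to-speech request. It assumes Deepgram’s REST “speak” endpoint, an Aura voice name such as “aura-asteria-en”, and token-based authentication; those specifics are illustrative placeholders, so defer to Deepgram’s current documentation for the exact parameters.

```python
# Minimal sketch: synthesize one line of speech with an Aura voice model over HTTP.
# Assumptions (verify against Deepgram's docs): the /v1/speak endpoint, the
# "aura-asteria-en" model name, and "Token"-style auth are illustrative here.
import os

import requests

DEEPGRAM_API_KEY = os.environ["DEEPGRAM_API_KEY"]

response = requests.post(
    "https://api.deepgram.com/v1/speak",
    params={"model": "aura-asteria-en"},  # one of the Aura voices
    headers={
        "Authorization": f"Token {DEEPGRAM_API_KEY}",
        "Content-Type": "application/json",
    },
    json={"text": "Thanks for calling. How can I help you today?"},
    timeout=30,
)
response.raise_for_status()

# The endpoint returns encoded audio (MP3 by default in this sketch); save it to disk.
with open("reply.mp3", "wb") as f:
    f.write(response.content)
```

In a real-time agent, a call like this would sit at the end of a loop: transcribe the caller’s audio, pass the transcript to an LLM, and hand the LLM’s reply to the voice model, streaming the audio back rather than writing it to a file.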

Pienso creates AI model training tools without coding requirements

“So much of the AI conversation has been dominated by … large language models,” Pienso co-founder and CEO Birago Jones said, “but the reality is that no one model can do everything. Pienso believes that any domain expert, not just an AI engineer, should be able to do just that.” Pienso guides users through the process of annotating or labeling training data for pre-tuned open-source or custom AI models. “Pienso’s flexible, no-code interface allows teams to train models directly using their own company’s data,” Jones said. “This alleviates the privacy concerns of using … models, and also is more accurate, capturing the nuances of each individual company.” Companies pay Pienso a yearly license fee based on the number of AI models they deploy. “It’s fostering a future where we’re building smarter AI models for a specific application, by the people who are most familiar with the problems they are trying to solve,” Jones added.

Discover a Revolutionary Earning Opportunity for AI Model Makers with NFT Platform Zora

Zora co-founders Jacob Horne and Dee Goens see crypto and AI as two complementary technologies that can benefit from one another. “Crypto wants information to be on-chain so that it can be valued and add value to the system,” Goens said. “And then AI wants information to be on-chain so that it can be freely accessed and utilized by the system.” “We need systems that can help bring all of these things on-chain, and that’s what we’re trying to do at Zora,” Goens added. This means AI model creators can capture value from their models’ outputs when people mint them, with the payouts split in half automatically.

Why Most AI Benchmarks Tell Us So Little

On Tuesday, startup Anthropic released a family of generative AI models that it claims achieve best-in-class performance. But such claims are hard to assess. The reason — or rather, the problem — lies with the benchmarks AI companies use to quantify a model’s strengths and weaknesses. “Many benchmarks used for evaluation are three-plus years old, from when AI systems were mostly just used for research and didn’t have many real users. In addition, people use generative AI in many ways — they’re very creative.” It’s not that the most-used benchmarks are totally useless. However, as generative AI models are increasingly positioned as mass-market, “do-it-all” systems, old benchmarks are becoming less applicable.

Empower Your Business with Data Conversations on Numbers Station

Numbers Station, a startup that is using large language models (LLMs) to power its data analytics platform, is launching its first cloud-based product today: the aptly named Numbers Station Cloud, which is now in early access. With this service, virtually any user in an enterprise can analyze their internal data using Numbers Station’s chat interface. As Numbers Station co-founder and CEO Chris Aberger told me, he’s somewhat tired of talking about how the service allows users to “chat with their data,” because there is so much noise around that. Numbers Station’s research shows that its approach results in significantly improved precision compared to more traditional text-to-SQL pipelines. “Numbers Station is at the cutting edge of enterprise AI for structured data,” said Sharad Rastogi, CEO of Work Dynamics Technology at Jones Lang LaSalle.
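For contrast, here is a deliberately generic sketch of the “traditional” text-to-SQL pipeline that comparison refers to, not Numbers Station’s actual system: put the table schema and the user’s question into a prompt, have an LLM draft a SQL query, then run that query against the database. SQLite stands in for the warehouse here, and the model call is stubbed out with a canned completion.

```python
# Generic text-to-SQL illustration; the LLM call is replaced by a canned answer.
import sqlite3

SCHEMA = "CREATE TABLE orders (id INTEGER, region TEXT, amount REAL, placed_at TEXT);"

def draft_sql(question: str, schema: str) -> str:
    """Build the prompt a real pipeline would send to an LLM and return the drafted SQL."""
    prompt = f"Schema:\n{schema}\n\nQuestion: {question}\n\nWrite one SQL query that answers the question."
    # A real pipeline would send `prompt` to a model here; this sketch returns a canned completion.
    return "SELECT region, SUM(amount) AS total FROM orders GROUP BY region;"

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [(1, "EMEA", 120.0, "2024-03-01"), (2, "AMER", 80.5, "2024-03-02")],
)

sql = draft_sql("What are total sales by region?", SCHEMA)
for row in conn.execute(sql):
    print(row)  # e.g. ('AMER', 80.5) and ('EMEA', 120.0)
```

Pipelines like this draft the SQL in a single shot, which is the sort of baseline Numbers Station says its approach improves on for precision.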

Competition Ramps Up in AI Video Generation as Former DeepMind Members Reveal Haiper

AI-powered video generation is a hot market on the back of OpenAI unveiling its Sora model last month. Two DeepMind alums, Yishu Miao and Ziyu Wang, have publicly released their video generation tool Haiper, with its own AI model underneath. Users can go to Haiper’s site and start generating videos for free by typing in text prompts. Miao noted that it is “too early” in the startup’s journey to think about building a subscription product around video generation. While investors are looking to invest in AI-powered video generation startups, they also think the technology still has a lot of room for improvement.

Preorder the $349 2a Budget Phone from Nothing

The much-teased and oft-leaked Nothing Phone (2a) is now officially official, just under a week after it made its limited debut at the company’s MWC after party. Nothing’s third phone is the first to go directly after the mid-tier/budget space, with a starting price of $349. In the U.S., the phone is currently only available through a developer program for those looking to integrate third-party apps with the light-up “Glyphs” on the device’s back. In the London-based firm’s home market, it’s available in both 8GB/128GB and 12GB/256GB models, running £319 and £349, respectively. At 5,000 mAh, the battery is larger than the ones found in both the Phone (1) (4,500 mAh) and Phone (2) (4,700 mAh).

New Models from Anthropic Outperform GPT-4

All three models show “increased capabilities” in analysis and forecasting, Anthropic claims, as well as enhanced performance on specific benchmarks versus models like GPT-4 (but not GPT-4 Turbo) and Google’s Gemini 1.0 Ultra (but not Gemini 1.5 Pro). A model’s context, or context window, refers to the input data (e.g. text) that the model considers before generating output. In a technical whitepaper, Anthropic admits that Claude 3 isn’t immune from the issues plaguing other GenAI models, namely bias and hallucinations (i.e. fabricated information). Unlike some GenAI models, Claude 3 can’t search the web; the models can only answer questions using data from before August 2023. Here’s the pricing breakdown:
Opus: $15 per million input tokens, $75 per million output tokens
Sonnet: $3 per million input tokens, $15 per million output tokens
Haiku: $0.25 per million input tokens, $1.25 per million output tokens
So that’s Claude 3.
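To make those per-token rates concrete, here is a small back-of-the-envelope calculator in Python using the figures quoted above; the example request size (a 10,000-token prompt and a 1,000-token reply) is made up for illustration.

```python
# Cost per request for the Claude 3 tiers, using the article's USD prices per million tokens.
PRICES = {  # model: (input $/1M tokens, output $/1M tokens)
    "opus": (15.00, 75.00),
    "sonnet": (3.00, 15.00),
    "haiku": (0.25, 1.25),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single request with the given token counts."""
    in_rate, out_rate = PRICES[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

# Hypothetical request: 10,000 input tokens and 1,000 output tokens on each tier.
for name in PRICES:
    print(f"{name}: ${request_cost(name, 10_000, 1_000):.4f}")
# Roughly $0.225 for Opus, $0.045 for Sonnet, and under half a cent for Haiku.
```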

Introducing the Latest MacBook Air Models: Apple Unveils 13-inch and 15-inch Options Powered by M3 Technology

Apple announced new MacBook Air models in 13-inch and 15-inch screen sizes, powered by the M3 chip. The 13-inch model starts at $1,099 and the 15-inch model starts at $1,299. Both variants are available for pre-order in the U.S. starting today, with general availability slated for March 8. Apple unveiled the M2 MacBook Air in 2022 and added the 15-inch model to the portfolio last year. Both MacBook Air models have 18 hours of claimed battery life, a 1080p webcam, Wi-Fi 6E connectivity, and support for two external displays.

Tim Cook Makes Bold Prediction About Apple’s GenAI Advancements for the Year Ahead

Apple CEO Tim Cook is promising that Apple will “break new ground” in GenAI this year. Some of the staff on Apple’s now-canceled EV project were reassigned to work on various GenAI initiatives, according to multiple publications. Apple, unlike many of its Big Tech rivals, has been slow to invest in — and ramp up — GenAI. During the company’s Q1 earnings call, Cook said Apple was working internally with GenAI but that it was taking a slower, more deliberate approach to customer-facing incarnations of the technology. Perhaps telegraphing Apple’s intensifying GenAI focus, engineers at the company have co-authored an increasing number of GenAI-related academic and technical papers.