workloads

Google Releases Tools to Facilitate Development of AI Models

But this year, whether to foster developer goodwill or advance its ecosystem ambitions (or both), Google debuted a number of open-source tools primarily aimed at supporting generative AI projects and infrastructure. Beyond MaxDiffusion, Google is launching JetStream, a new engine for running generative AI models — specifically text-generating models (so not Stable Diffusion). “We’ve heavily optimized [the models’] performance on TPUs and also partnered closely with Nvidia to optimize performance on large GPU clusters,” Lohmeyer said. Rounding out the lineup is Optimum TPU, built with Hugging Face, whose goal, according to Google, is to reduce the barrier to entry for getting generative AI models — in particular text-generating models — onto TPU hardware. Optimum TPU doesn’t yet support training generative models on TPUs, only running them.

OpenStack Enhances AI Workload Support

Dubbed ‘Caracal,’ this new release emphasizes new features for hosting AI and high-performance computing (HPC) workloads, in addition to networking updates to better support those workloads and a slew of other changes. Indeed, as the OpenInfra Foundation announced this week, its newest Platinum Member is Okestro, a South Korean cloud provider with a heavy focus on AI. Europe, with its strong data sovereignty laws, has also been a growth market; the UK’s Dawn AI supercomputer runs OpenStack, for example. “All the things are lining up for a big upswing and open-source adoption for infrastructure,” OpenInfra Foundation COO Mark Collier told TechCrunch.

Thoras.ai Automates Resource Allocation for Kubernetes Workloads

“Thoras essentially integrates alongside a cloud-based service and it consistently monitors the usage of that service,” company CEO Nilo Rahmani told TechCrunch. The founders launched the company right after the first of the year and closed their pre-seed funding just a few weeks ago. In terms of AI, the company currently relies more on task-based machine learning than on generative AI and large language models (LLMs). “A lot of the problems that we’re facing are systemic issues, and there are a lot of numbers involved.” The team sees LLMs becoming more useful for after-the-fact troubleshooting as it fills out the product.
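Thoras hasn’t published implementation details, but the general idea — monitor a workload’s usage and use a predictive model to set its Kubernetes resource requests — can be sketched generically. In the minimal sketch below, the function names, the moving-average “model,” and the 20% headroom factor are all illustrative assumptions, not Thoras.ai’s actual approach; a production system would substitute a trained forecasting model.

```python
# Illustrative sketch of predictive rightsizing for a Kubernetes workload.
# The moving-average forecaster and the headroom factor are assumptions
# made for the example, not Thoras.ai's implementation.

def forecast_cpu_millicores(samples: list[float], window: int = 5) -> float:
    """Predict the next CPU reading (in millicores) as a simple moving
    average of the last `window` samples — a stand-in for a real
    task-based ML model."""
    recent = samples[-window:]
    return sum(recent) / len(recent)

def recommend_request(samples: list[float], headroom: float = 1.2) -> int:
    """Turn the forecast into a CPU request with headroom, so the pod
    is neither starved nor heavily over-provisioned."""
    return round(forecast_cpu_millicores(samples) * headroom)

# Example: a service whose recent CPU usage has hovered around 200m.
usage = [180.0, 210.0, 195.0, 205.0, 200.0]
print(recommend_request(usage))  # → 238 (198m forecast plus 20% headroom)
```

The recommendation would then be applied to the workload’s spec (e.g. via the Kubernetes API), closing the loop between monitoring and allocation.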