Data is the key to unlocking the full potential of AI models, but it can also be their downfall. Prejudices can seep into large data sets, leading to biased results that reflect societal patterns. Additionally, massive data sets can be chaotic, full of extraneous information and difficult for models to process effectively.
“Models are what they eat – models are a reflection of the data on which they’re trained. However, not all data are created equal, and some training data are vastly more useful than others. Training models on the right data in the right way can have a dramatic impact on the resulting model.” – Ari Morcos
In a recent survey, 40% of companies adopting AI cited data-related challenges as one of their top concerns, including the time and effort required to prepare and clean data. Another study found that data scientists spend about 45% of their time on data preparation tasks such as loading and cleaning.
Ari Morcos has been working in the AI industry for nearly a decade and has seen the struggles and challenges that come with data preparation for model training. That’s why he founded DatologyAI, a startup that aims to simplify and automate the data preparation process for AI models.
Morcos’ company offers tooling for curating data sets used to train models similar to OpenAI’s ChatGPT and Google’s Gemini. The platform can prioritize the most important data for a model’s intended purpose, as well as suggest additional data and batch sizes for more effective training.
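To make the idea of automated curation concrete, here is a minimal, hypothetical sketch of two steps such tooling typically automates at scale: exact deduplication and a simple quality filter. The function names, heuristics, and thresholds below are illustrative assumptions, not DatologyAI’s actual method.

```python
# Hypothetical sketch of automated data curation: exact deduplication
# plus a toy quality heuristic. Real systems use far more sophisticated
# techniques (near-duplicate detection, learned quality models, etc.).
import hashlib

def dedupe(documents):
    """Drop exact duplicates by hashing normalized text."""
    seen, unique = set(), []
    for doc in documents:
        digest = hashlib.sha256(doc.strip().lower().encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

def quality_score(doc, min_words=5):
    """Toy heuristic: favor documents with diverse vocabulary."""
    words = doc.split()
    if len(words) < min_words:
        return 0.0
    return len(set(words)) / len(words)

corpus = [
    "The quick brown fox jumps over the lazy dog",
    "the quick brown fox jumps over the lazy dog",  # exact duplicate
    "buy now buy now buy now buy now buy now",      # low-diversity spam
    "Transformers learn representations from large text corpora",
]
# Keep only unique documents that pass the quality threshold.
curated = [d for d in dedupe(corpus) if quality_score(d) > 0.5]
```

Even this toy pipeline halves the corpus above; at the scale of web-crawled training sets, curation of this kind is what shrinks training time and cost without sacrificing model quality.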
“With interest in GenAI at an all-time high, AI implementation costs are at the forefront of execs’ minds. Companies have collected treasure troves of data and want to train efficient, performant, specialized AI models that can maximize the benefit to their business. However, making effective use of these massive data sets is incredibly challenging and, if done incorrectly, leads to worse-performing models that take longer to train and [are larger] than necessary.” – Ari Morcos
Morcos, who holds a Ph.D. in neuroscience from Harvard, spent years at top AI companies like DeepMind and Meta before launching DatologyAI. Alongside his co-founders Matthew Leavitt and Bogdan Gaza, the trio is determined to streamline AI data set curation in all its forms.
The quality and composition of a training data set can greatly impact the resulting model’s performance, size, and domain knowledge. More efficient data sets can shorten training time and reduce compute costs, while highly diverse data sets can handle niche requests more effectively. And with AI implementation costs becoming a top concern for businesses, the demand for streamlined data preparation is higher than ever.
“Identifying the right data at scale is extremely challenging and a frontier research problem. Our approach leads to models that train dramatically faster while simultaneously increasing performance on downstream tasks.” – Ari Morcos
DatologyAI’s claims are ambitious, but does the technology truly live up to them? Some may be skeptical, citing prior examples of automated data curation gone wrong. However, Morcos argues that DatologyAI’s tooling is not meant to replace manual curation entirely, but rather to serve as a valuable resource for data scientists. With years of experience in the field and an award-winning academic paper on training models more efficiently, Morcos and his team are well-equipped to tackle the challenges of AI data preparation.
The company has received significant investments from top figures in the tech and AI industries, including Jeff Dean (Google), Yann LeCun (Meta), Adam D’Angelo (Quora), and Geoffrey Hinton, a pioneer of modern deep learning. This impressive list of investors suggests there may be truth behind Morcos’ claims and real potential for DatologyAI to revolutionize AI data preparation.
While DatologyAI currently has a modest team of ten employees, including the co-founders, they have plans for significant expansion by the end of the year. And with their advanced technology and support from top industry leaders, DatologyAI may just be the key to unlocking AI’s full potential.