Human Native AI: Revolutionizing the Marketplace for AI Training Licensing Deals

In the ever-expanding world of artificial intelligence, data is the fuel that feeds the fire. Large language models and other AI systems require massive amounts of data to be trained effectively, but it’s important that the data being used is ethically sourced and acquired with proper licensing agreements in place. This is where Human Native AI, a London-based startup, comes in.

Human Native AI is creating a marketplace to facilitate deals between AI companies and content rights holders. CEO and co-founder James Smith was inspired by his experience working at Google DeepMind, where obtaining enough high-quality data was a persistent challenge. He noticed that other AI companies were running into the same problem and saw the need for a marketplace to connect the two sides.

“It feels like we are in the Napster-era of generative AI,” Smith reflects. “Can we get to a better era? Can we make it easier to acquire content? Can we give creators some level of control and compensation? I kept thinking, why is there not a marketplace?”

Smith shared his idea with his friend Jack Galilee, an engineer at GRAIL, and after positive feedback, they decided to turn it into a reality. Human Native AI launched in April with a goal to help AI companies access training data while ensuring that rights holders are fairly compensated and have control over how their content is used.

  • Rights holders can upload their content for no charge
  • AI companies can connect with rights holders for revenue share or subscription deals
  • Human Native AI assists with content preparation, pricing, and monitoring for any copyright infringements
  • The company takes a cut of each deal and charges AI companies for transaction and monitoring services

Smith notes the positive response from both sides, with several partnerships already in place and more to be announced in the near future. This week, Human Native AI announced a £2.8 million seed round led by British micro VCs LocalGlobe and Mercuri, with plans to use the funding to expand its team.

“I’m the CEO of a two-month-old company and have been able to get meetings with CEOs of 160-year-old publishing companies,” Smith shares. “That suggests to me there is a high demand on the publishing side. Equally, every conversation with a big AI company goes exactly the same way.”

While it’s still early days for Human Native AI, its marketplace supplies a missing piece of infrastructure for the growing AI industry. As AI companies continue to seek large volumes of training data, rights holders need an easier way to work with them while retaining control over their content.

“Sony Music just sent letters to 700 AI companies asking that they cease and desist,” Smith reveals. “That is the size of the market and potential customers that could be acquiring data. The number of publishers and rights holders, it could be thousands, if not tens of thousands. We think that’s the reason we need infrastructure.”

In addition to aiding larger AI players, Human Native AI also hopes to level the playing field for smaller AI developers that may not have the resources to secure deals with major publishers like Vox or The Atlantic.

“One of the major challenges of licensing content is the large upfront costs and the restrictions on who you can work with,” Smith explains. “We want to increase the number of buyers for rights holders’ content and reduce barriers to entry. We believe this is a really exciting opportunity.”

The data Human Native AI collects has potential of its own. In the future, the company plans to provide rights holders with valuable insights and help them determine fair pricing for their content based on past deal data on the platform.

Smith believes that this is a smart time for Human Native AI to launch, with the European Union AI Act evolving and potential AI regulation in the U.S. on the horizon. As ethical sourcing of data becomes increasingly important, having proof of proper licensing agreements will be crucial for AI companies.

“We are optimistic about the future of AI and what it will do, but we have to make sure as an industry we are responsible and don’t decimate industries that have gotten us to this point,” Smith emphasizes. “That would not be good for human society. We need to make sure we find the correct ways to enable people to participate. We are AI optimists on the side of humans.”

Ava Patel

Ava Patel is a cultural critic and commentator with a focus on literature and the arts. She is known for her thought-provoking essays and reviews, and has a talent for bringing new and diverse voices to the forefront of the cultural conversation.
