Introducing the New Policy: Disclosure of AI-Generated Realistic Content on YouTube

YouTube is now requiring creators to disclose to viewers when realistic content was made with AI, the company announced on Monday. Creators will have to disclose content that alters footage of real events or places, such as making it seem as though a real building caught fire, as well as realistic depictions of fictional major events, like a tornado moving toward a real town. The policy doesn't require disclosure for content that is clearly unrealistic or animated, such as someone riding a unicorn through a fantastical world, nor for content that used generative AI only for production assistance, like generating scripts or automatic captions.

Artisse AI secures $6.7 million in funding for cutting-edge AI photography app boasting unparalleled realism

Models and influencers are among Artisse's early adopters, along with some businesses using AI photography in their ads. "I see AI photography as a new category that should be of a similar size, if not bigger than, photo editing apps," Wu says. The investment, which was inbound, made sense because the fund has an influencer marketing arm that could help market the app, Wu explains. The company is also exploring shopping from AI photos and turning them into physical prints. Artisse's AI app is available on both iOS and Android.

YouTube Enforces Stricter Measures Against AI Videos that Depict Deceased Children or Crime Victims with Realistic Simulations

YouTube is updating its harassment and cyberbullying policies to clamp down on content that "realistically simulates" deceased minors or victims of deadly or violent events describing their deaths. The policy change comes as some true crime content creators have been using AI to recreate the likenesses of deceased or missing children. In these disturbing instances, people are using AI to give the child victims of high-profile cases a childlike "voice" to describe their deaths. In recent months, creators have used AI to narrate numerous high-profile cases, including the abduction and death of British two-year-old James Bulger, as reported by the Washington Post. For comparison, TikTok's policy allows it to take down realistic AI images that aren't disclosed.