From July 15, mass-produced, inauthentic YouTube videos may lose monetisation
New YouTube rules target ‘inauthentic’ videos starting July 15
By Newsmeter Network
Hyderabad: YouTube has announced a significant policy update set to take effect from July 15, aimed at tackling the growing issue of unoriginal and inauthentic content on its platform.
The update is directed at creators enrolled in the YouTube Partner Program (YPP), the programme through which creators earn revenue from advertisements on their videos.
Identifying mass-produced and repetitive content
“YouTube is updating our guidelines to better identify mass-produced and repetitious content,” said the platform in an official statement. “This update better reflects what ‘inauthentic’ content looks like today.”
While the platform has historically required content to be original and engaging to qualify for monetisation, the upcoming policy change aims to enforce this standard more rigorously.
What counts as ‘inauthentic content’?
YouTube has not provided clear definitions of the terms ‘mass-produced,’ ‘repetitious,’ or ‘inauthentic.’ However, based on the policy direction, content that might be affected includes:
- Videos made using set templates with minimal editing effort
- AI-generated videos with synthetic voices and animations
- Recycled content that copies or mimics other creators’ work without adding significant value
This type of content is especially common in gaming channels, where creators often post faceless gameplay videos accompanied by AI-generated voiceovers.
Impact on virtual YouTubers and AI tools
The policy update has raised questions about its implications for virtual YouTubers—creators who use digital avatars instead of face cams. These channels, often featuring gameplay videos, have surged in popularity.
While many of these creators voice their own content and maintain high engagement levels, it remains unclear how YouTube will treat them under the new guidelines.
Earlier this week, some of these creators reported earning millions through their virtual avatars. However, the lack of clarity on whether AI-generated voices or characters fall under the definition of ‘inauthentic’ has left many creators in limbo.
Monetisation at risk
Although YouTube has not explicitly banned AI-generated videos, the platform’s emphasis on originality may discourage creators from relying solely on AI tools to produce content, especially if they are uploading large volumes of similar videos.
This could affect the usage and subscription rates of popular AI services that offer automated avatars, voice synthesis, and script generation.
The company has yet to specify how it will detect or enforce these standards. As of now, monetisation may be denied or revoked for content that fails to meet the authenticity bar.
What should creators expect?
The update goes into effect on July 15, at which point YouTube may release further clarification or enforcement guidelines. Until then, many creators are left waiting to see whether their current content strategies will be affected.
For now, the takeaway is clear: creators who wish to monetise must focus on producing original, high-effort, and authentic content, regardless of the tools they use.