Google has announced that it is rolling its generative artificial intelligence software, Veo 2, into YouTube Shorts, the company’s TikTok rival. Veo 2 is Google’s video-generation model, launched on December 16, which the search giant claims is less prone to hallucination.
Now the software will be put to the test as it rolls out under the “Dream Screen” feature. The idea is that you’ll be able to stitch AI-generated clips together with your own footage or, more likely, post fully AI-generated Shorts to the platform.
One example given shows the intended use: a dog growing to impossible size after eating spilled protein powder.
It isn’t available everywhere yet, with YouTube stating that a worldwide launch will come at a later date. Currently, it can be accessed from the US, Canada, Australia, and New Zealand. Presumably, the EU won’t see the feature for some time as Google “navigates” the bloc’s rules.
YouTube will label all clips made with generative AI using its “SynthID” watermarking to counter misinformation. However, Google – much like others in the AI space – has been coy about what Veo 2 was trained on.
Google opens floodgates with YouTube Shorts generative AI
Concerns around generative AI have persisted since day one, as the world has learned how these models are trained. In 2024, Nvidia was reported to have scraped large quantities of video from Netflix and YouTube, without permission, to train its internal video generators.
OpenAI is currently wrestling with multiple court cases alleging that its chatbot, ChatGPT, was trained on copyrighted works. Sam Altman, the head of OpenAI, has said in the past that this technology would be impossible to develop without using copyrighted material.
With Google opening the floodgates to anyone generating AI content, it’s only a matter of time before copyright lawsuits come out of the woodwork. This is entirely new territory for the law, which is partly why some of these cases are dragging on for so long.
Meanwhile, there are even deeper worries about the effect on national power grids and the environment. According to a 2024 paper in the ACM Digital Library, training the text-based GPT-3 pumped nearly 626,000 pounds of carbon dioxide into the air – roughly five times what the average car emits over its entire lifetime.
Amazon, Microsoft, Meta, xAI, and OpenAI are all racing to find new ways to power these massive data centers. So far, only Microsoft has managed to secure nuclear power, while the others remain tangled in red tape or have yet to line up viable power sources.
Google’s efforts to integrate AI have so far been blasted online. Complaints that Search has become less useful are commonplace, and businesses that never agreed to Gemini being added to their paid Google Workspace plans have found it a hassle to remove.