The transition to a “strategy-first” approach to Artificial Intelligence (AI) in broadcasting hinges on understanding cloud economics and system interoperability, not just generative capabilities, the Western Association of Broadcast Engineers (WABE) heard at the organization’s recent gathering in Calgary.
Guillaume Aubuchon, Vice-President of Product Management at Avid, delivered a detailed presentation on the “Strategy, Risk and Real-world Impact” of AI, arguing that the true benefit lies in deep-seated operational efficiencies.
Aubuchon stressed that the economic shift to cloud is the critical enabler for widespread AI use.
“It’s unequivocal that cloud infrastructure has matured from an economic standpoint,” Aubuchon told the audience of broadcast technicians. “It is far cheaper to run [broadcast operations] in the multi-cloud collaboration structure now. Once you consolidate your infrastructure into the cloud, your data is centralized, and that accelerates all the AI pieces around efficiency and metadata extraction.”
Aubuchon cautioned against blindly adopting every new AI feature promoted by big tech firms, emphasizing that AI should be implemented prescriptively to solve specific business problems.
“The danger is…you don’t really need everything. You still need to look at your own workload…and decide which aspects of AI you really need that for,” he advised, noting that the primary targets for AI are high-volume, commoditized tasks like speech-to-text, translation, and transcription, which are easily run at scale with high accuracy.
Real-world impact and data interoperability
The core, lesser-known benefit of AI, Aubuchon said, is improving data interoperability and content lifecycle management.
He cited use cases including script and rundown intelligence in newsrooms, where AI creates vector embeddings of a broadcaster’s own archive; combined with trending topics from external wire services, these can automatically surface relevant historical clips in real time, increasing content utilization and speeding up the news cycle.
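The idea of matching a trending topic against embedded archive clips can be illustrated with a minimal sketch. The clip IDs, descriptions, and the bag-of-words “embedding” below are all hypothetical stand-ins; a production system would use a neural embedding model over the real archive.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real system would use a
    # neural embedding model trained on the broadcaster's content.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical archive: clip ID -> descriptive metadata.
archive = {
    "clip-001": "calgary flood downtown river emergency coverage",
    "clip-002": "provincial election debate highlights",
    "clip-003": "river levels rise as flood warnings issued",
}

def surface_clips(trending_topic, top_k=2):
    """Rank archive clips by similarity to a trending wire topic."""
    query = embed(trending_topic)
    ranked = sorted(archive.items(),
                    key=lambda kv: cosine(query, embed(kv[1])),
                    reverse=True)
    return [clip_id for clip_id, _ in ranked[:top_k]]

flood_clips = surface_clips("flood warning on the river")
```

In a newsroom integration, `surface_clips` would run against the full embedded archive whenever the wire service flags a new trending topic, pushing candidate clips into the rundown for an editor to pick from.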
Intelligent tagging and integration are also being used to map metadata between disparate systems, with AI acting as a “data adaptor” that automatically translates across proprietary formats. Automatic versioning of assets against regional legal and compliance requirements can also be fed into the system, streamlining global distribution.
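The “data adaptor” shape can be sketched as a simple field-mapping layer. The field names below are invented for illustration; in practice the mapping itself is what an AI model would propose between two proprietary schemas, and this shows only the translation step.

```python
# Hypothetical mapping from one system's metadata keys to another's.
FIELD_MAP = {
    "Title": "asset_title",
    "TXDate": "air_date",
    "Rights_Region": "territory",
}

def adapt(record, field_map=FIELD_MAP):
    """Translate metadata keys; flag unknown fields instead of dropping them."""
    adapted, unmapped = {}, {}
    for key, value in record.items():
        if key in field_map:
            adapted[field_map[key]] = value
        else:
            unmapped[key] = value  # held for human or model review
    return adapted, unmapped

source_record = {"Title": "Evening News", "TXDate": "2024-05-01", "Codec": "XDCAM"}
adapted, unmapped = adapt(source_record)
```

Keeping unmapped fields visible, rather than silently discarding them, is what lets the adaptor improve over time as new proprietary fields appear.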
Mitigating risk with human oversight
The presentation also addressed necessary safety measures. While automation is key, Aubuchon insisted that human judgment remains non-negotiable, particularly for quality control.
“We need human review as part of…automation, orchestration, and AI elements,” he said. “It’s automating the processes and AI that are those repetitive processes, and then you have to stop and say, ‘okay, a human being better review all of this to be true.'”
Aubuchon said the “human-in-the-loop” model ensures the final output maintains integrity and quality, safeguarding audience trust—a critical asset in broadcasting.
He concluded by underscoring that the shift to a cloud-native environment requires moving away from traditional capital expenditures toward a SaaS (Software as a Service) consumption-based model, demanding new broadcast engineering skill sets such as cloud DevOps – combining software development and IT operations – and model building.


