Adobe is making media intelligence a core part of its video workflow strategy, treating it as the foundation for future development. Rather than bolting on one-off features, Adobe is building on technologies it has already shipped. The goal is to solve persistent workflow problems for broadcast professionals and content creators working with massive amounts of footage.
“Media intelligence is another foundation. Now that we’ve found the key to unlock that door, there’s a hallway of more doors in terms of what functionality can come from it,” said Meagan Keane, principal product marketing manager for Adobe Pro Video, at the 2025 NAB Show. “When we think big picture, all of the new features that we’ve talked about at the show this year are actually evolutions of foundations that we built years ago,” Keane added. “If you think about speech-to-text, for example: when we launched speech-to-text a few years back, we said this is a foundation to unlock more capabilities. So then you saw text-based editing, and now you see caption translation.”
Keane explained that search is just the start for media intelligence; organizing and using visual content, transcripts, and metadata are all potential next steps. “When you’re editing and you’re wanting to tell a story, you’re not necessarily wanting to go in and deal with the, like, ‘Okay, where was that?’ Or, ‘How do I find it?’ Or, ‘I know that I have this whole bin of things, but how do I navigate through it?’” she noted. This is especially useful for organizations with large archives: the “Saturday Night Live” team, for instance, used media intelligence to process “thousands of hours of footage” for its 50th anniversary.
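As a rough illustration of the kind of lookup this takes off the editor’s plate, here is a toy bin search over per-clip transcripts and visual labels. The data model, field names, and substring matching are generic assumptions for illustration, not Adobe’s implementation or API.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    transcript: str                                   # from speech-to-text
    labels: list[str] = field(default_factory=list)   # detected visual content

def search_bin(bin_clips: list[Clip], query: str) -> list[Clip]:
    """Return clips whose transcript or visual labels mention the query term."""
    q = query.lower()
    return [
        clip for clip in bin_clips
        if q in clip.transcript.lower()
        or any(q in label.lower() for label in clip.labels)
    ]

bin_clips = [
    Clip("sketch_042.mov", "live from New York, it's Saturday night", ["stage", "audience"]),
    Clip("promo_007.mov", "tune in this weekend", ["logo", "graphics"]),
]
for clip in search_bin(bin_clips, "audience"):
    print(clip.name)  # -> sketch_042.mov
```

A real system would back this with embeddings or a proper index rather than substring matching, but the point is the same: the editor asks for “the audience shot” instead of scrubbing through the bin.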
Adobe balances development across several areas. “We’re really working hard to strike a balance between core functionality, core workflows, color, sound, graphics, all of the craft editing things, along with what we’re calling quality of life,” Keane said. “Those little things that editors [find] annoying, that get in your way, that are a headache, as well as stability and performance.” AI tools like Generative Extend help address these workflow issues. “That’s a headache for editors, where you’re like, I don’t even need that much more, but it takes such a long time to solve it without this feature. Now that we have it, it really lets people continue with their creative flow versus being taken out of it,” she explained.
Adobe uses beta testing to gather feedback. “Over the last few years, [we’ve established] this practice of putting features and functionality into beta and then really listening to what the community has to say, what feedback we hear, where people are like, ‘This isn’t exactly how I feel like it should work,’” Keane said. That feedback shapes development significantly. “All of the features that we’re talking about at the show today, we launched into beta, but they’re not the same features that we launched into beta, because we’ve been able to evolve with community input and the feedback during those times,” she clarified.
Adobe’s tools support both traditional and short-form social video, with the latter presenting a common workflow challenge. “When you look at core workflows, they’re not that different,” Keane said. “One use case we see across every discipline is short-form social content. Every company, whether you’re producing sports for broadcast or episodic for television, also has some social entity.” This led to improvements in Generative Extend during beta testing. “One of the main things that happened with Generative Extend between when we put it in beta and launching it here is, it had to be 4K, and we had to support vertical video, because everybody’s doing social,” Keane noted.
Keane also pointed to Firefly services for broadcast clients. “At Adobe Summit last month, we announced Firefly services for video. So things like auto reframe, things like translate captions, enhance speech, that you can put in a whole mass of media from an enterprise environment and run those services in an API state,” she said. These services automate repetitive delivery tasks, helping teams get content “to all the different geos, to all the different platforms, to all the different framing.”
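To make that concrete, here is a minimal sketch of what driving such a service from a batch script could look like. The host, endpoint paths, payload fields, and job-polling shape below are all illustrative assumptions, not Adobe’s published Firefly Services API.

```python
import time
import requests

API_BASE = "https://example.adobe.io/v1"       # hypothetical host, not a real endpoint
HEADERS = {"Authorization": "Bearer <token>"}  # access token from your own auth setup

def reframe_clip(source_url: str, aspect: str = "9:16") -> str:
    """Submit one clip for auto-reframe and return a job ID (assumed async API)."""
    resp = requests.post(
        f"{API_BASE}/reframe",
        headers=HEADERS,
        json={"source": {"url": source_url}, "output": {"aspectRatio": aspect}},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["jobId"]

def wait_for_job(job_id: str, poll_seconds: int = 10) -> dict:
    """Poll the job until it reaches a terminal state, then return its payload."""
    while True:
        resp = requests.get(f"{API_BASE}/jobs/{job_id}", headers=HEADERS, timeout=30)
        resp.raise_for_status()
        body = resp.json()
        if body["status"] in ("succeeded", "failed"):
            return body
        time.sleep(poll_seconds)

# Fan a whole bin of clips out to vertical social cuts.
clips = ["https://cdn.example.com/ep01.mp4", "https://cdn.example.com/ep02.mp4"]
jobs = [reframe_clip(url) for url in clips]
for job_id in jobs:
    print(job_id, wait_for_job(job_id)["status"])
```

A production script would add retries and token refresh, but the pattern of submitting a job per asset, polling, and collecting outputs is the typical shape for this kind of enterprise media automation.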
Customer feedback is key. “The most important thing is what we’re hearing from our community and what we’re hearing from our customers in terms of what they need. And that’s what drives us when we talk about big picture. How we’re designing our strategies, how we’re looking to the future is really in concert with our community,” Keane concluded. This approach extends to Adobe’s work with the Firefly team. “Our job on the pro video side is, what functionality makes the most sense to actually bring into the tool? Because not all of it will necessarily be relevant to our users,” she added.