At NAB 2024, Strada’s Michael Cioni talked with Johnnie about their new AI-powered cloud workflow for content creators. Strada’s “project management module” was developed to provide a more efficient workflow by performing the more mundane production tasks. Let’s see what it can do!

CineD wrote about this venture in January (you can read about it here), so Johnnie was excited to talk to Michael in person at NAB 2024. Strada is a product currently being tested in beta (with 1,500 people on their waiting list) that has the potential to relieve us of the most mundane tasks we have to do as editors and filmmakers. “The most important thing is being able to give an editor the most amount of time to do the creative cutting and not be syncing, transcribing, importing, transcoding, or even tagging,” as Michael states. It’s about the “taskification of the boring stuff.”

The Tag Analyze module. Source: CineD

The dreaded task of tagging, for example

The problem with tagging, as Michael says, is that everyone on a production team will have different ideas about what keywords to use. Strada resolves this by analyzing each asset through multiple lenses and including all the possibilities – think “child” and “teenager”. Just click “Tag Analyze” and the program automatically takes care of the rest. All search results are displayed in a column to the left of the timeline, and the timeline clearly indicates the exact locations of the tagged items.

Strada automatically analyzes clips and then tags them. Source: CineD

Michael gave an example using the word “cat”. The module not only identified clips featuring a cat but also pinpointed each appearance of a cat on the timeline. Search for any keyword, and the software directs you to every take containing it. Additionally, you are not restricted to a single tag – you can combine multiple tags for more refined searches.
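Strada’s internals aren’t public, but the combined-tag search Michael describes can be pictured as filtering clips whose auto-generated tags cover every keyword, then returning the timecode ranges where each tag appears. The following is a minimal conceptual sketch only – the `Clip` data model, tag dictionary, and frame ranges are all hypothetical, not Strada’s actual API:

```python
# Conceptual sketch only: Strada's API is not public, so the Clip data
# model, tag names, and frame ranges below are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    # auto-generated tags mapped to the frame ranges where they appear
    tags: dict = field(default_factory=dict)

def search(clips, *keywords):
    """Return clips containing ALL keywords, with their matching ranges."""
    results = []
    for clip in clips:
        if all(k in clip.tags for k in keywords):
            results.append((clip.name, {k: clip.tags[k] for k in keywords}))
    return results

clips = [
    Clip("A001_C002", {"cat": [(120, 240)], "child": [(0, 300)]}),
    Clip("A001_C003", {"dog": [(10, 90)]}),
]

# Combining "cat" and "child" narrows the results to one clip,
# along with the ranges to highlight on the timeline.
print(search(clips, "cat", "child"))
# -> [('A001_C002', {'cat': [(120, 240)], 'child': [(0, 300)]})]
```

The point of the “multiple lenses” approach is visible here: because the analysis stores overlapping tags like “child” alongside “teenager”, a search succeeds regardless of which synonym a team member reaches for.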
Being understood – foreign languages

Another problem editors face is foreign languages (Michael said about 40% of their beta sign-ups are outside the U.S.). Strada can translate and transcribe content in 100 languages, often eliminating the need for an editor to understand the language being spoken. Simply activate the translate feature, and the transcription will automatically display on the screen.

How precise is it? Michael believes the analysis of sound will continue to improve as new models are integrated. He also pointed out that some models are better at transcribing than others, so Strada may first transcribe in one language and then translate to another. Does it work? They are still testing in beta, but so far, so good. What definitely works is that clips can be transcribed in batches of 1,000 and retain their timecode, clip name, and polyphonic wave file. As Michael pointed out, most programs don’t even know what a polyphonic wave file is, but Strada can handle it.

At the Strada booth at NAB 2024. Source: CineD

A machine-learning tool

Michael stressed that Strada is a machine-learning tool: it learns from your input and provides the output. In other words, Strada isn’t training on your data – “it learns what you shoot, your style and language, and then you can apply the appropriate machine learning models to expedite your workflow based on your productions,” according to Michael.

Price and availability

Strada will adopt a usage-based pricing model, as Michael says Strada recognizes that each production’s workflow is unique. They aim to ensure users only pay for features they are actively using. Strada is testing version 1 in beta through the summer, and their objective is to have it available sometime toward the end of the year. For more information, please see their website.

How do you feel about letting AI handle the ‘boring’ parts of editing? Would you be inclined to use it for things like tagging and translating? Let us know in the comments below!