One of the more interesting press releases to come out of CES this week was from RAMP, a company I’d not heard of before, but one with some interesting technology for automatically generated metadata, if the press release is to be believed. (I say let’s give them the benefit of the doubt!)
In among the regular PR guff is one paragraph that actually describes what MetaQ is about:
MetaQ delivers strong return on investment for content producers. It can be applied to large content collections without the need to individually craft the related content experience for each video clip, while still allowing for manual editing as required. RAMP’s powerful proprietary technology automatically identifies people, places, and things in real-time within a video and delivers a cost effective approach to video content spotlighting. MetaQ customers have experienced between 70%-200% improvement in end-user video consumption, creating advertising revenue opportunities.
That’s a good distance toward automatic generation of metadata. People and places in particular are useful inputs for smart tools that build auto-string-outs. RAMP’s application is for building related content links and for placing relevant advertising, but I tend to see things through the lens of pre-post! I believe the trend toward automatically generated metadata will take some of the pre-post burden off editors, allowing them to focus on crafting the edit, not on assembling string-outs.
Their API and technology also include Speech-to-Text transcription, which should make for a very powerful package for automatically generated metadata.
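To make the idea concrete, here’s a purely hypothetical sketch of how this kind of pipeline could work: a timed Speech-to-Text transcript plus a crude people/places lookup producing timecoded markers that a string-out tool might consume. This is not RAMP’s actual API; the transcript format, the entity lists, and the marker structure are all invented for illustration.

```python
# Hypothetical sketch only: turning a timed speech-to-text transcript
# into clip markers for people and places. Not RAMP's API; everything
# below is invented for illustration.

from dataclasses import dataclass

@dataclass
class Word:
    text: str       # the recognised word
    start: float    # offset into the clip, in seconds

# Pretend output from a speech-to-text pass over one clip.
transcript = [
    Word("interview", 1.2), Word("with", 1.6), Word("Philip", 1.9),
    Word("in", 2.3), Word("Boston", 2.5), Word("about", 2.9),
    Word("metadata", 3.2),
]

# Tiny lookup tables standing in for real entity recognition.
PEOPLE = {"philip"}
PLACES = {"boston"}

def extract_markers(words):
    """Return (seconds, category, text) markers for recognised entities."""
    markers = []
    for w in words:
        key = w.text.lower()
        if key in PEOPLE:
            markers.append((w.start, "person", w.text))
        elif key in PLACES:
            markers.append((w.start, "place", w.text))
    return markers

for start, category, text in extract_markers(transcript):
    # e.g. "    2.5s  place   Boston" -- ready to feed a string-out tool
    print(f"{start:7.1f}s  {category:<7} {text}")
```

In a real system the lookup tables would be replaced by proper named-entity recognition over the transcript, but the output is the same in spirit: timecoded people and places attached to each clip, which is exactly the raw material an auto-string-out tool needs.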
3 replies on “Automatically Generated Metadata is here already?”
RT @philiphodgetts: Automatically Generated Metadata is here already?: It seems like Boston’s RAMP may already be part way http://t.co/9F …
@philiphodgetts soon they’ll be able to generate a unique timestamp for every frame you shoot! With reel info in it!