Categories
Apple Metadata Video Technology

How serious is Apple about metadata?

During a recent thread here where I “infamously” suggested Apple should drop Log and Capture for the next version of FCP, one of the topics that came up was the use of metadata. Most commenters (all?) appeared, by my reading, to feel that reel name and TC were the “essence” of metadata.

And yet, if we look at the most recent work of Randy Ubillos, Apple’s Chief Video Architect (apparently for both pro and consumer applications), we see that Location metadata is a requirement for the application. According to Apple’s FAQ for iMovie for iPhone, if you don’t allow iMovie for iPhone to access your location metadata:

Because photos and videos recorded on iPhone 4 include location information, you must tap OK to enable iMovie to access photos and videos in the Media Library.

If you do not allow iMovie to use your location data, then the app is unable to access photos and videos in the Media Browser.

You can still record media directly from the camera to the timeline but, without the Location metadata, you’re pretty much locked out of iMovie for iPhone for all practical purposes.

There is no location metadata from tape capture! There’s not much from non-tape media right now either, although some high-end Panasonic cameras have an optional GPS board. However, P2 media (both DVCPRO HD and AVC-I) as well as AVCCAM all have metadata slots for latitude and longitude.

Now, I’m NOT saying that Apple should force people to use metadata – particularly if it’s non-existent – and this type of restriction in a Pro app would be unconscionable. I merely point out that this shows the type of thinking within Apple. In iMovie for iPhone they can create a better user (consumer) experience because they use Location metadata for automatic lower-third locations in the themes.

Where I think it’s a little relevant is in counterpoint to some of my commenters: building an app that’s reliant on metadata is a different app than one relying on simple reel name and TC numbers.

Categories
Assisted Editing Item of Interest Metadata The Technology of Production

I’ve just uploaded some computer edited videos to YouTube

As well as showing the software in action, this series of videos shows the results from the software. Each “edit” is based on a set of story keywords (logged with the clips) and a duration. Lower thirds are automatic; story arc is automatic; b-roll is automatic; audio from b-roll is faded in and out and dropped in volume. All automatically, and in seconds.
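To make that concrete, here’s a toy sketch of keyword-plus-duration clip selection in Python. To be clear, this is not First Cuts’ actual algorithm – just an illustration of the kind of selection described above, with made-up clip names and fields.

```python
# Toy illustration of keyword-and-duration driven clip selection.
# NOT First Cuts' actual algorithm -- just the general idea: pick clips
# whose log notes match the story keywords, best material first, until
# the requested duration is filled.

from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    duration: float                      # seconds
    keywords: set = field(default_factory=set)
    rating: int = 0                      # a "good take" score from logging

def select_clips(clips, story_keywords, max_duration=None):
    """Return clips matching the story keywords; tighter when limited."""
    matches = [c for c in clips if c.keywords & story_keywords]
    matches.sort(key=lambda c: c.rating, reverse=True)  # best material first
    if max_duration is None:
        return matches
    edit, total = [], 0.0
    for clip in matches:
        if total + clip.duration <= max_duration:
            edit.append(clip)
            total += clip.duration
    return edit

clips = [
    Clip("Interview 1", 90, {"growing up", "austria"}, rating=5),
    Clip("B-roll ski", 12, {"austria"}, rating=3),
    Clip("Interview 2", 200, {"sound of music"}, rating=4),
]
print([c.name for c in select_clips(clips, {"austria"}, max_duration=60)])
# -> ['B-roll ski']  (the 90-second interview won't fit the 60-second cap)
```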

The project is about a young triple threat – singer, dancer, actor – Tim Draxl, discovered in Sydney when he was just short of his 18th birthday.

He played Rolf in a professional touring production of The Sound of Music in Australia in 2000, and his career has blossomed from there, including a three-CD deal with Sony Universal: the first CD when he was 18!

Remember, these edits were done in seconds, from selects using Assisted Editing’s First Cuts software. And yes, this is my baby (along with Dr Greg Clarke).

The Sound of Music Edit

Without limits – about 13 minutes of material.

Four minute limit set. Edit is tighter and only the best material makes it to the edit.

Growing Up

Tim grew up partly in Australia and partly in Austria as his father worked as a ski instructor. This is the unlimited version of the “Growing Up” edit.

[Update: I forgot the 10 minute limit, so one of the movies was too long, and YouTube can’t distinguish between a 6 minute cut and a 4 minute cut, thinking they’re the same. Fortunately the videos are also available on our site. The Growing Up unlimited and six minute versions are available at http://assistedediting.com/FirstCuts/results.html]

And finally with a 4 minute limit.

Categories
Apple Metadata Video Technology

How is Apple using metadata in iMovie for iPhone?

I was finally watching the Steve Jobs Keynote from WWDC on June 7. (I know, but this was our second try – we get talking about stuff, what can I say?) I got to the iMovie for iPhone 4 demo and was blown away by the creative use of source metadata.

At 58 minutes into the keynote, Randy Ubillos is demonstrating adding a title to the video he’s editing in iMovie, and iMovie automatically adds the location into the title. Not magic: it’s simply reading the location metadata stored with images and videos shot with an iPhone and using that to generate part of the title. This is exactly how metadata should be used: to make life easier and to automate as much of the process as possible.

Likewise the same metadata draws a location pin on the map in one of the different themes. Exactly like the same metadata does in iPhoto.

In a professional application, that GPS data – which is coming to more and more professional and consumer video camcorders – could not only be used to add locations, but also to read what businesses are at the address. From that source and derived metadata (address and business derived from location information) we can infer a lot.
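As a back-of-the-envelope illustration, here’s a Python sketch of the first half of that chain: pulling the GPS coordinates out of a photo’s EXIF and converting them to decimal degrees, ready to hand to whatever lookup service turns coordinates into an address. It uses Pillow (and its private _getexif() helper); older Pillow versions return rationals as tuples rather than the IFDRational values this assumes, and reverse_geocode() is a hypothetical stand-in, not a real API.

```python
# Sketch: read GPS coordinates from a photo's EXIF, roughly the raw
# material iMovie is using for its automatic location titles.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def dms_to_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds to signed decimal degrees."""
    degrees, minutes, seconds = (float(v) for v in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    return -decimal if ref in ("S", "W") else decimal

def photo_location(path):
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853)                      # 34853 = GPSInfo tag
    if not gps_raw:
        return None                                # e.g. tape capture!
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
    lat = dms_to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"])
    lon = dms_to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"])
    return lat, lon

# loc = photo_location("IMG_0001.JPG")
# if loc:
#     print("Lower third:", reverse_geocode(*loc))  # hypothetical lookup
```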

Check out my original article on metadata use in post production and for a more detailed version, with some pie-in-the-sky predictions of where this is going to lead us, download the free Supermeet Magazine number 4 and look for the article (featured on the cover) The Mundane and Magic future of Metadata.

Categories
Business & Marketing Item of Interest Metadata

Seven hours from feature request to product update!

Seven hours from feature request to updated application released: Sync-N-Link now uses log notes from video *or* audio. http://bit.ly/aqcxN7

I love being a small independent software developer: it’s great to be able to respond to customer requests promptly – and it makes the software better. Incidents like this one today also make me appreciative of the communication tools we now have.

Sometime overnight (our time), a new customer bought a copy of Sync-N-Link to sync rushes for 8 episodes of a new drama series: in Belgium! A few hours later he emailed to say that it was doing everything he expected, but their sound guy had entered metadata (log notes) into the sound clips, and Sync-N-Link (like Final Cut Pro itself) discards audio metadata in favor of the video metadata. (In a merged clip there is only room for one of each type of log note/metadata.) The feature request was that the metadata from the audio could be preserved instead of that from the video.
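The logic of the request is easy to sketch. This isn’t Sync-N-Link’s actual code, and the field names are invented, but it shows the one-slot-per-note constraint and the new “prefer audio” behavior:

```python
# Sketch of the merged-clip problem: a merged clip has one slot per
# log-note field, so the merge has to choose. Field names are invented,
# not Sync-N-Link internals.

def merge_log_notes(video_notes, audio_notes, prefer_audio=False):
    """Merge two log-note dicts into the single set a merged clip holds.

    The default mimics Final Cut Pro: video wins. With prefer_audio=True
    the sound department's notes win wherever they entered one.
    """
    primary, fallback = ((audio_notes, video_notes) if prefer_audio
                         else (video_notes, audio_notes))
    merged = dict(fallback)
    merged.update({k: v for k, v in primary.items() if v})  # skip empties
    return merged

video = {"Scene": "12A", "Log Note": ""}
audio = {"Scene": "12A", "Log Note": "boom clips at 00:43", "Sound Roll": "SR04"}
print(merge_log_notes(video, audio, prefer_audio=True))
# -> {'Scene': '12A', 'Log Note': 'boom clips at 00:43', 'Sound Roll': 'SR04'}
```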

A good request. The ever-efficient Greg Clarke, after morning coffee, got to work. At around 1:30 pm (Pacific) an update was published, ready for download, with the feature added. Not quite seven hours from feature request to released software.

I love that we can do that.

If you use any of our software let us know what more you want it to do. We can be very responsive!

Categories
Metadata

Why is metadata so important?

I have recently been scanning in slides and negatives from my young adult days, and I’m really wishing I’d entered more metadata at the time. That would have been too easy. Anyway, I thought it was worthwhile examining the different metadata I found/used in indexing these old slides.

Note: Apparently I believed I’d remember the people in my pictures forever! While I remember the people, names that are not noted somewhere have evaporated over the 30 or so years in between.

The first, and most common, source of metadata was “Added” metadata: the notes made on the slide itself, a carrier, or a slide-sheet heading. Apparently in my early 20s I could write in 6 point text, which I’m having trouble reading with much older eyes. Regardless, Added metadata has proven to be the most valuable.

The brief added metadata is usually combined with some “Analytical” metadata. Analytical metadata is metadata we get from analyzing visual content in the image. For example, this picture:

was one of a group of 20 on the same sheet with the single word ‘Burbank’ in the heading. In fact only two of those images were in Burbank. But using Analytical metadata – the general location and the street sign near the traffic lights – the shot is clearly on Burbank Blvd, on the rise to the overpass over the 5 Freeway and railway lines. (The sign is for Front St.)

Coincidence number 1: this is less than a mile from where I now live.

The other very useful metadata was stamped on the slides themselves: the month and year of processing, which locks down an approximate time scale. Also useful was the fact that I’d numbered all my files sequentially from the beginning of 1973 (my year in Japan). That sequential metadata made it much easier to identify specific times, in combination with the stamped-on date.

A combination of Added and Analytical metadata led me to the discovery that most of my 1976 trip was spent in the West San Fernando Valley. I identified the location of an awards ceremony (by church name), and a street shot that was clearly looking across the SF Valley was also named. Both turn out to be incredibly close to our Tarzana office and Woodland Hills home (2001-2005, before we moved to Burbank).

This jogged my memory that we had spent a lot of time in a school hall for rehearsal/training and enjoyed a close-by Denny’s. Taft High School (Ventura Blvd at Winnetka) has a Denny’s across the road.

So it is entirely possible that during my 1976 trip to the US, I spent the majority of my time around the area that was to become home a quarter of a century later; and some of the rest of my time near my 30-years-in-the-future home in Burbank.

Crazy. And without metadata I’d have never remembered.

Categories
Interesting Technology Metadata Video Technology

What is Transcriptize and what will it do for me?

Occasionally I do some work for my day job at Intelligent Assistance, where we’re actively adding to our metadata-based workflow tools. This time we’re taking the speech transcription metadata from the Adobe suite and making it accessible to producers who want text or Excel versions, or even getting it into FCP with the transcription placed into colored markers (one color per speaker). With Transcriptize you can also name the speakers, something not possible in the Adobe tools.

Here’s my interview with Larry Jordan where we announced it, and the press release is below.

“Announcing Transcriptize on the Digital Production BuZZ”

Transcriptize expands the usefulness of Adobe Speech Transcription

Take transcriptions from Adobe Production Bundle to Media Composer, Excel and Final Cut Pro.

Burbank, CA (February 12, 2010) – Intelligent Assistance, Inc has introduced a new software tool that takes Transcription XML from Adobe Premiere Pro CS4 or Soundbooth CS4 and converts it to text, an Excel spreadsheet, or Final Cut Pro clip markers.

“Late last year, Larry Jordan asked if we could create something to make the Adobe Speech Transcription more available”, says Intelligent Assistance’s CEO Philip Hodgetts. “We thought that was a great idea and Transcriptize is the result, less than two months later.”

Transcriptize imports the transcription XML from the Adobe Production Bundle and allows editors and producers to name the speakers – something not possible in the Production Bundle. From there users have the option to:

  • Export a plain text file, suitable for the needs of a producer or to import into Media Composer’s ScriptSync engine.
  • Export an Excel spreadsheet with a variable number of words per row – perfect for a producer.
  • Open the XML from a Final Cut Pro clip and add the transcription to Markers, where:
      • there is a variable number of words per Marker (including one Marker per speaker);
      • the speaker name is placed in the Marker name;
      • the transcription appears in the clip Marker comment;
      • Marker colors are used to identify each speaker (FCP 7 onward);
      • the transcription can be searched within Final Cut Pro;
      • Markers can be easily subclipped based on transcription content.

Transcriptize is available now from www.assistedediting.com/Transcriptize/. MSRP is US$149 with an introductory offer of $99 until the end of February 2010. NFR versions for review are available; contact Philip Hodgetts, details below.
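Stepping outside the press release for a moment: the marker-grouping idea is simple to sketch. This is not Transcriptize’s actual code, and it assumes the transcription XML has already been parsed into (start time, speaker, word) tuples, but it shows how a flat word list becomes speaker-named markers:

```python
# Sketch: group transcribed words into markers, breaking on a speaker
# change or every `words_per_marker` words. Input format is assumed.

def words_to_markers(words, words_per_marker=20, speaker_names=None):
    speaker_names = speaker_names or {}
    groups, current = [], None
    for start, speaker, word in words:
        if (current is None or speaker != current["speaker"]
                or len(current["words"]) >= words_per_marker):
            current = {"start": start, "speaker": speaker, "words": []}
            groups.append(current)
        current["words"].append(word)
    return [{"start": g["start"],
             "name": speaker_names.get(g["speaker"], g["speaker"]),
             "comment": " ".join(g["words"])} for g in groups]

words = [(1.0, "S1", "Hello"), (1.4, "S1", "everyone"), (2.2, "S2", "Hi")]
for marker in words_to_markers(words, speaker_names={"S1": "Larry", "S2": "Philip"}):
    print(marker)
# {'start': 1.0, 'name': 'Larry', 'comment': 'Hello everyone'}
# {'start': 2.2, 'name': 'Philip', 'comment': 'Hi'}
```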

Categories
Interesting Technology Metadata

What is new from Intelligent Assistance?

Sorry about the little hiatus in posts. It’s certainly not because I’ve got nothing I want to talk about! (Ryan Seacrest’s $13 million deal for American Idol, and why doesn’t Robert Iger’s outrageous salary go down when Disney’s profit drops 26%? But those will either wait for later today or tomorrow.)

The pause has been caused by a couple of things, number one of which is that this week (and the next two) I’m looking after myself. Partner Greg is in Australia for a visa renewal and I’m once again realizing how much he does to make our lives easier. (Mine particularly.)

Also, we’ve been releasing new software, updating older products and revising earlier books. In fact we’ve been doing so much that I can’t announce stuff in press releases yet!

About a month back I finished completely revising Simple Encoding Recipes for the Web 2009 edition. Anyone who purchased in 2009 should have received a download link. Announcements to everyone else are coming or you can buy the update for $4.95. (It’s a complete rewrite).

Last week the revision of The HD Survival Handbook 2009-2010 was finished and, again, those who purchased in 2009 will have received an email with an update link. All other previous purchasers will have received a $4.75 upgrade offer. It’s been about 30% rewritten, with almost an additional 20 pages, so the upgrade price represents the value that’s gone into it. The “upgrade” is the full new version, not changed pages. Also this year we added Avid support – codecs, hardware and workflow. Given that it’s now a 233-page US Letter book, it’s a huge project to revise. So much has changed in a year.

In between, Greg’s been working hard to turn First Cuts into First Cuts Studio by adding in the functionality of one of our new applications, exceLogger. Have I mentioned we love customer feedback? It’s made Sync-N-Link a much stronger product. Naturally we want the same feedback from customers of our other products. Good, bad or feature request, all feedback is welcomed. (Begged for!) exceLogger was a feature request for First Cuts for Final Cut Pro, and is available as a stand-alone application for those who just want to log in Excel but merge with captured media in Final Cut Pro.

BTW this now makes First Cuts Studio great value: at $295 it includes Finisher (US$149) and exceLogger (US$69) – so the auto-edit functionality of First Cuts is just $77!

Greg also developed two additional applications that fit perfectly with our metadata focus. miniME (Metadata Explorer) came about when we discovered (just four years after Apple told us!) that the QuickTime metadata from IT-based digital video sources (non-tape) is preserved in FCP but only visible in exported XML. So Greg wrote me a simple tool to view the hidden metadata and export it to an Excel spreadsheet. (That functionality is free in the demo version.) The paid version lets you remap that metadata into visible fields in Final Cut Pro.
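If you’re curious what “view the hidden metadata and export to a spreadsheet” amounts to, here’s a rough Python sketch of the same idea run against a Final Cut Pro XML (XMEML) export, writing CSV rather than a native Excel file. The element names are assumptions about typical XMEML structure – inspect a real export and adjust:

```python
# Sketch: walk an FCP XML export, collect every leaf element under each
# <clip> as a key/value pair, and write the lot to CSV. Element names
# are assumed; check them against a real XMEML export.
import csv
import xml.etree.ElementTree as ET

def dump_clip_metadata(xmeml_path, csv_path):
    tree = ET.parse(xmeml_path)
    rows = []
    for clip in tree.iter("clip"):
        row = {"clip_name": clip.findtext("name", default="")}
        for elem in clip.iter():
            # Leaf elements with text are treated as metadata fields.
            if len(elem) == 0 and elem.text and elem.text.strip():
                row.setdefault(elem.tag, elem.text.strip())
        rows.append(row)
    fieldnames = sorted({key for row in rows for key in row})
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)

# dump_clip_metadata("project.xml", "clip_metadata.csv")
```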

Finally, the night we demonstrated miniME and exceLogger, a friend of mine again suggested an idea for software that would report clips used in a Sequence – video or audio – because he has to provide reports to his clients; it’s equally useful for music reports. Greg worked on it for a while and this week we released Sequence Clip Reporter. (Yeah, we tried to find a better name, but that one’s descriptive and it stuck.)

Now, there’s a lot of work that goes into writing software. There’s the work on the actual functions of the software, but then there are questions about interface and how functions should work. Then there’s software updating, serial number support and feedback mechanisms to be added. All beyond the actual functionality.

Me, I get to design a new logo for each piece of software, write website and postcard copy, write a press release and send it out. Plus Help files need to be written so people can actually use the software. So, around any new software there’s a lot of work that doesn’t actually involve much software writing!

And that’s why posting has been sparse.

Categories
Apple Pro Apps Metadata

What about the hidden metadata in Final Cut Pro?

We’ve been working with a few people previewing, and getting feedback on, a new addition to our First Cuts assisted editing tool – basically checking some areas of Final Cut Pro that I haven’t explored for years – and I had the most interesting conversation with Jerry Hofman.

Before I get to that though, let me ask (beg) for feedback on any of our software products. We want to keep making them better, and we love feedback, feature requests and especially problem reports. We respond quickly – this particular feature request was received on Friday the 26th, discussed briefly during a Hollywood Bowl concert on Saturday night, and was a preliminary feature by Wednesday!

Anyhow, in discussing this particular tool with Jerry (you’ll find out what it is soon enough!) I asked how much metadata from RED is imported into Final Cut Pro via Log and Transfer. Jerry, who uses RED a whole lot more than me (i.e. he uses it!), said “not very much”, which pretty much matched my understanding: working with a whole bunch of RED clips with Sync-N-Link, I’d never seen any of the color temperature, date or other information that’s in the RED metadata.

In sharing this conversation with my smart partner, and our main code writer, Greg Clarke, he commented “Oh, I do think Mr Hofman is mistaken!” (or words to that effect). It turns out Greg has been scrolling past this metadata for most of the last year. The difference is that Greg works with FCP XML exports, while Jerry and I were looking through the Final Cut Pro interface.

OMG! What a treasure-trove of metadata there is. And why didn’t we know of this? Surely someone from all the conversations we’ve had with people developing Sync-N-Link must know about this? (You’ll all come out of the woodwork into the comments and let me know you’ve known about it for years!)

So this morning Greg built me a tool for exploring this hidden (I prefer “secret” because it makes it seem more mysterious) metadata, turning it into an Excel spreadsheet. I already had XDCAM EX media and P2 media along with RED clips, and I was able to download some AVCCAM media shot with Panasonic’s HMC-150 camera.

There’s an enormous amount of Source metadata there, including a lot of fields that seem to be unused even in the camera. Clearly, the current version of Final Cut Pro doesn’t have the flexibility to display items like ‘whiteBalanceTint’ or ‘digitalGainBlue’ settings from the original file. I guess this type of metadata is going to be challenging for Apple and Avid to deal with, as they don’t (currently) have displays in their applications for the enormous amount of metadata that is generated by tapeless cameras. I’m just very thankful that it’s being retained, and that it’s available via XML (and associated with a Final Cut Pro clip ID).

There’s definitely metadata already being produced that we can use to improve First Cuts – at least for non-tape media sources. But it’s also interesting to explore fields that are available but not being used.

Show all columns and you'll be surprised at what's available, or going to become available.

BTW, you can explore for yourself using Log and Transfer. Open any type of media that Log and Transfer supports, then right-click on the column header (like you would in Final Cut Pro) and select “Show all Columns”. The columns displayed will change according to the type of media selected.

So far, Sony’s XDCAM EX has the least metadata, and the least interesting metadata – barely more than the basic video requirements and information about the device: model and serial number. RED footage has a lot of metadata, although most of it is focused on the technical aspects of the shot, as you would expect for a digital cinema camera.

But take a peek at the source metadata from P2 media! All the goodness like the date of the shoot (which FCP otherwise does not export) and time (as does RED), but also fields for ‘Reporter Name’ (awesome for a First Cuts – News product) and Latitude and Longitude. While they’ve been blank in every instance – I don’t think Panasonic is shipping any cameras with GPS built in yet – it does suggest that future Panasonic cameras are likely to contain GPS and store that data in the media file. Anyone who’s a regular reader will know that means Derived Metadata! There are also fields for ‘Location Source’, ‘Location Name’, ‘Program Name’, ‘Reporter’, ‘Purpose’ and ‘Object’ (??).

AVCCAM carries all the fields of P2, more or less, with the addition of ‘memo’ and ‘memo creator’ fields.

It’s been fun exploring this ‘secret’ metadata. Now to find a way to make some use of it, or make it practical. Would anyone be interested in a tool that would not only read and explore this metadata, but allow some of it to be mapped to existing Final Cut Pro fields?
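To make that question concrete, here’s a toy version of such a mapping in Python. The <comments>/<mastercomment1> path is my assumption about where a “visible” XMEML logging field might live, and ‘reporter’ is a hypothetical hidden tag – verify both against a real export:

```python
# Toy remap: copy a hidden source-metadata value into a visible Final Cut
# Pro logging field by rewriting the XML. Both element paths below are
# assumptions about XMEML layout, not verified against Apple's spec.
import xml.etree.ElementTree as ET

def remap_field(xmeml_path, out_path, hidden_tag, visible_tag="mastercomment1"):
    tree = ET.parse(xmeml_path)
    for clip in tree.iter("clip"):
        value = clip.findtext(".//" + hidden_tag)
        if not value:
            continue
        comments = clip.find("comments")
        if comments is None:
            comments = ET.SubElement(clip, "comments")
        target = comments.find(visible_tag)
        if target is None:
            target = ET.SubElement(comments, visible_tag)
        target.text = value
    tree.write(out_path, encoding="utf-8", xml_declaration=True)

# e.g. surface a reporter name logged by a P2 camera (hypothetical tag):
# remap_field("project.xml", "project-remapped.xml", "reporter")
```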

Categories
Metadata Random Thought

I think there’s a sixth type of metadata

When Dan Green interviewed me earlier in the week for Workflow Junkies, in part about the different types of metadata we’ve identified, Dan commented that he thought we’d get to “seven or eight” (from memory). I politely agreed but didn’t think there were going to be that many. I should have known better.

The “iPhoto disaster of May 09” is actually turning out to be good for my thinking! In earlier versions, iPhoto created a copy of the image whenever any adjustments were made. The original was stored, which explains why my iPhoto folder was almost twice the size of my actual library as reported in iPhoto. iPhoto 09 (and maybe 08, I skipped a version) does things a little differently.

When I switched images while the processor was under load, the image came up in its original form and then – a second or so later – all the corrections I’d made would be applied. It was obvious that the original image was never changed. All my color balance, brightness, contrast and even touch-up settings were being stored as metadata, not “real changes”.

The original image (or “essence” in the AAF/MXF world) is untouched, but there is metadata as to how it should be displayed. Including, as I said, metadata for correcting every image blemish. (The touch-up tool must be a Core Image filter as well, who knew?)

So, I’m thinking this is a different type of metadata from the five types previously identified. My first instinct was to call it Presentation Metadata: information on how to present the raw image. Greg (my partner) argued strongly that it should be Aesthetic Metadata, because it records decisions on how to present an image or clip or scene, but I was uncomfortable with the term because there are instances of this type of metadata that are compulsory, rather than aesthetic.

Specifically, I was thinking about Raw images (like those from most digital cameras, including RED). Raw images really need a Color Lookup Table (CLUT) before they’re viewable at all; a raw Raw file is very unappealing to view. Since not all of this type of metadata is aesthetic, I didn’t feel the title was a good fit.

Ultimately, after some discussion – yes, we really spend our evenings discussing metadata while the TV program we were nominally watching sits paused – we thought that Transform Metadata was the right name.

Specifically not “Transformative” Metadata, which would appear to be more grammatically correct, because Transformative has, to me, a connotation of the transform being completed – like when a color look is “baked” into the files, say after processing in Apple’s Color or out of Avid Symphony. Transform Metadata does not change the essence or create new essence media: the original is untouched and transformed on presentation.

Right now we’re a long way from being able to do all color correction, reframing and digital processing in real time as metadata on moving images as iPhoto does for still images, but in a very real sense an editing Project file is really Transform Metadata to be applied to the source media (a.k.a essence).

This is very true in the case of Apple’s Motion. A Motion project is simply an XML file with the metadata as to how the images should be processed. But there’s something “magic” going on because, if you take that project file and change the suffix to .mov, it will open and play in any application that plays QuickTime movies. (This is how the Project file gets used in FCP as a Clip.) The QuickTime engine does its best to interpret the project file and render it on playback. A Motion Project file is Transform Metadata. (FWIW there is a Motion QuickTime Component installed that does the work of interpreting the Motion Project as a movie. Likewise a LiveType QuickTime Component does the same for that application’s Transform Metadata, a.k.a. project file!)
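In miniature, the Transform Metadata idea looks like this – a deliberately trivial Python sketch with one pixel standing in for a whole image: the essence is never touched, and an ordered list of adjustments is stored alongside it and applied only at display time.

```python
# The Transform Metadata idea in miniature: the essence (one pixel here)
# is never modified; adjustments live in a separate ordered list and are
# only applied when the image is presented.

ORIGINAL_PIXEL = (110, 120, 130)        # essence: stays untouched

transforms = [                          # transform metadata
    {"op": "brightness", "amount": 20},
    {"op": "contrast", "factor": 1.1},
]

def apply_transforms(pixel, transforms):
    """Render-time application; nothing is 'baked' into the essence."""
    r, g, b = pixel
    for t in transforms:
        if t["op"] == "brightness":
            r, g, b = (min(255, c + t["amount"]) for c in (r, g, b))
        elif t["op"] == "contrast":
            r, g, b = (min(255, int((c - 128) * t["factor"] + 128))
                       for c in (r, g, b))
    return (r, g, b)

print(apply_transforms(ORIGINAL_PIXEL, transforms))  # the displayed version
print(ORIGINAL_PIXEL)                                # essence unchanged
```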

I think Dan might be right – there could well be seven or eight distinct types of metadata. It will be interesting to discover what they are.

Categories
Metadata Random Thought

What is the fifth type of metadata?

Right now I’m in the middle of updating and adding to my digital photo library by scanning in old photos, negatives and (eventually) slides. Of course, the photos aren’t in albums (too heavy to ship from Australia to the US) and there are no extensive notes on any, because “I’ll always remember these people and places!” Except I don’t remember a lot of the people, and getting particular events in order is tricky when they’re more than “a few” years old, or when they were before my time – a lot have been scanned in for my mother’s blog/journal.

Last time I wrote about the different types of metadata we had identified four types of metadata:

  • Source Metadata is stored in the file from the outset by the camera or capture software, such as in EXIF format. It is usually immutable.
  • Added Metadata is beyond the scope of the camera or capture software and has to come from a human. This is generally what we think about when we add log notes – people, place, etc.
  • Derived Metadata is calculated using a non-human external information source and includes location from GPS, facial recognition, or automatic transcription.
  • Inferred Metadata is metadata that can be assumed from other metadata without an external information source. It may be used to help obtain Added metadata.

See the original post for a clearer distinction between the four types of metadata. Last night I realized there is at least one additional form of metadata, which I’ll call Analytical Metadata. The other choice was Visually Obvious Invisible Metadata, but I thought that was confusing!

Analytical metadata is information encoded in the picture about the picture, probably mostly related to people, places and context. The most obvious example is a series of photos without any event information. By analyzing who was wearing what clothes, and correlating between shots, the images related to an event can be grouped together even without an overall group shot. Or there may be only one shot that clearly identifies the location, but it can be cross-correlated to the other pictures in the group by clothing.

Similarly a painting, picture, decoration or architectural element that appears in more than one shot can be used to identify the location for all the shots at that event. I’ve even used hair styles as a general time-period indicator, but that’s not a very fine-grained tool!  Heck, even the presence or absence of someone in a picture can identify a time period: that partner is in the picture so it must be between 1982 and 1987.
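That kind of cross-correlation is easy to sketch as an algorithm. In this Python toy the features are hand-logged strings (real analytical metadata would come from a human or, eventually, a vision system): link photos that share a feature, and the connected groups become candidate events.

```python
# Sketch: treat each photo's noted visual features (clothing, decor,
# signage) as a set, link photos that share a feature, and the connected
# components become candidate "events".
from collections import defaultdict

photos = {
    "img_01": {"red jacket", "church hall"},
    "img_02": {"red jacket", "street sign: Front St"},
    "img_03": {"ski slope"},
}

def group_by_shared_features(photos):
    by_feature = defaultdict(list)          # feature -> photos showing it
    for name, features in photos.items():
        for f in features:
            by_feature[f].append(name)
    groups, seen = [], set()
    for start in photos:                    # walk connected components
        if start in seen:
            continue
        group, stack = set(), [start]
        while stack:
            photo = stack.pop()
            if photo in seen:
                continue
            seen.add(photo)
            group.add(photo)
            for f in photos[photo]:
                stack.extend(by_feature[f])
        groups.append(sorted(group))
    return groups

print(group_by_shared_features(photos))
# -> [['img_01', 'img_02'], ['img_03']]  (the shared jacket links the first two)
```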

I also discovered two more sources of metadata. Another source of Source Metadata is found on negatives, which are numbered, giving a clear indication of time sequence. (Of course, digital cameras have this and more.) The other important source of metadata for this exercise has been a form of Added Metadata: notes on the back of the image! Fortunately Kodak Australia for long periods printed the month and year of processing on the back. Rest assured that has been most helpful for trying to put my lifetime of photos into some sort of order. At the rate I’m going, it will take me the last third of my life to organize the images from the first two-thirds.

Another discovery: facial recognition in iPhoto ’09 is nowhere near as good as it seems in the demonstration. Not surprising, because most facial recognition technology is still in its infancy. I also think it prefers the sharpness of digital images rather than scans of prints, but even with digital sources it seems to attempt a guess at one in five faces, and to be accurate about 30% of the time. It will get better, and it’s worth naming the identified faces and adding ones that were missed to gain the ability to sort by person. It’s also worthwhile going through and deleting the false positives – faces “recognized” in the dots of newspapers or the patterns in wallpaper, etc. – so they don’t show up when it’s attempting to match faces.

Added June 2: Apparently we won’t be getting this type of metadata from computers any time soon!