Categories
Distribution Random Thought Video Technology

A podcast is not the same media form as a video podcast a.k.a. vlog

I’m totally on board with audio podcasts. They have effectively replaced the car radio for all but the shortest drives. Perhaps that’s because I prefer "talk radio" in the car, and the podcasts I subscribe to are most akin to talk radio, on business-related subjects.

I’ll put up with inconsistent quality in an audio podcast – after all, it’s only taking up part of my attention. That is the crucial point with audio. We can listen to podcasts and drive the car, or go to the fitness center or gym (and I should listen to more podcasts there), or do housework. The audio occupies only part of my attention span.

But add video to it and you’re now demanding 100% of my attention while the video is playing. I can’t drive and watch a video; I can’t watch it while moving around the fitness center because the screen needs always to be in front of me; I can’t watch it while I go for a walk or do housework. Video assumes I’m going to give it, if not 100% of my attention, my primary attention.

If it’s not necessary to watch the video portion to get the value from a program, then dump the video and just go with audio.

I don’t normally subscribe to the highly regarded and very popular This Week in Tech a.k.a. TWiT, but a friend recommended a specific episode and suggested I get the video version. That one-hour program has sat on my hard drive for six weeks waiting for me to have time to watch it. In that time I’ve listened to 15 to 20 hours of very similar programming. I’m still wondering why my friend had me get the video version: it’s just four guys sitting around a table in an infinite black set recorded by three cameras.

The video adds nothing. Apart from a short glimpse at an iPod Nano (is it really that old?) there were no props; no visual aids; no graphics; nothing that justified all my attention. Put the four faces in the artwork for the feed and I’d have had the same benefit.

Just because we can deliver video as enclosures in an RSS file, doesn’t mean we should. Bad video is easy. Good video is hard, and consistent regular production is very hard to do. I was talking with Scott Sheppard of Inside Mac Radio and Inside Mac TV about the difference in what he’ll have to carry to a trade show to get interviews for his iPod video show, compared with his weekly and daily audio shows. For audio: Marantz solid state recorder, microphone. For video: video camera, tripod, a basic lighting kit, microphone, radio mics, receivers… Instead of fitting in a shoulder bag or backpack with his laptop he’s now wondering whether he’ll need a custom cart to lug around – or an intern. This is, of course, because he’s trying to make the show into something that uses the video well. You can find more ranting on this subject in an earlier post of mine: What makes good visuals.

There are some programs, like Tiki Bar TV – always high in the new subscribers ranking in iTunes – that really try. While it may not be scripted in detail, it has good production values and I think I see some Apple LiveType and Motion effects in there – appropriately edited in Final Cut Studio.

Bad video is easy. Good video requires considerably more equipment, effort, talent and skill and, most importantly, a genuine need for video. If we don’t meet that standard, we’re going to bore the market to sleep before it adopts any form of non-mainstream content as being valuable.

So, even though audio and video podcasts are superficially similar by their use of RSS and enclosed media files, they are not the same medium. Video requires a much higher commitment from the viewer than audio does from the listener.

Here’s a question for all the budding video podcasters out there: Is your content so valuable, compelling and well produced that I’d be happy to pay $100 an hour to watch it? If it isn’t, consider that this is exactly what you’re asking me to do. OK, I don’t pay that to watch a "Hollywood" movie (but then again, I rarely watch them), but my time has a value and you’re asking me to give up that value to watch your program. Audio, by contrast, adds value to my time – redeeming time that would otherwise be wasted.

One medium adds value to my time; the other robs me of it. That’s why they’re not the same.

Categories
Distribution Interesting Technology Random Thought

Yahoo and TiVo hook up, world shakes a little

Although it doesn’t seem like a big announcement, Yahoo users who own a TiVo can now program their TiVo from the Yahoo TV guide. No big thing, since TiVo owners have been able to do that for years; it’s only a minor additional convenience, until we read some of the fine print and the plans for the near future.

And in the coming months, possibly before the end of the year, Yahoo’s traffic and weather content, as well as its users’ photos will be viewable on televisions via TiVo’s broadband service and easy-to-use screen menu.

So now Yahoo content (pictures, traffic and weather for now) is available on the television with television/PVR/DVR ease of use. Remember, TiVo already have many components of the digital home experience, with TiVoToGo taking digital material from the TiVo drive to PCs, DVD-R and portable media players.

Where it gets interesting is to project forward. Yahoo consider themselves a media company, as evidenced by the formation of the Yahoo Media Group back in January 2005 and an orgy of hiring media executives for the division since then. The purpose of the Yahoo Media Group is “significantly strengthening our content pillar”. In April they hired Shawn Hardin who, according to ZDNet, “has previously worked in television and the Internet, holding executive positions at NBC and Snap.com.”

Although Yahoo has been vague about its plans for the Media Group, it’s built a team with a strong balance of “Hollywood” and “Web” backgrounds and it’s perfectly reasonable to expect that it has some big plans…

Oh, let’s not forget Yahoo Video Search. In competition with Google Video, Yahoo are building connections to a library of independent content. I’m sure the Yahoo Media Group are building a library of mainstream content: Santa Monica is not that far from Hollywood!

Now to my conjecture. TiVo has about 3 million subscribers. TiVo is linking with Yahoo to display Yahoo content via the TiVo device, direct to the living room. Aren’t there many people planning/desiring to bring Internet-delivered “TV” into the living room, right where the TiVo box is already sitting? Akimbo, DaveTV and Brightcove all seem to be going down that path with a dedicated box and service. Apple with its Front Row and Microsoft with its Media Center PC are approaching it from the other direction.

But none have the penetration in the living room that TiVo does right now.

If the relationship with Yahoo goes just a little further, what’s to stop Yahoo delivering video content direct to the TiVo under some form of pay-per-view or subscription model? It seems self-evident to me that this is really about a much bigger play for the living room and part-and-parcel of the Yahoo Media Group’s efforts. Yahoo could launch with 3 million subscriber boxes already in place – a number that would bring much cheer to Akimbo, DaveTV or Brightcove’s investors if they ever achieve anything close to it. Apple and Microsoft (and its partners) have to get the computer moved into the living room, although don’t be surprised to find Apple taking a page from TiVo’s book and integrating a tuner, program guide and simplified iTunes into a future “Mac mini” with video output to sit under the TV. When they’re ready, of course.

Given that TiVo are somewhat struggling right now, is it far-fetched to consider Yahoo buying TiVo? With a market capitalization of around $300 million, TiVo wouldn’t be hard for Yahoo to swallow, or even Apple (as rumored earlier in 2005). Only time will tell, but when Yahoo buys TiVo, remember, you heard it here first!

Categories
Random Thought Video Technology

When a good format “wins” for all the wrong reasons

I’m definitely in the group of people that sees Blu-ray as the undoubtedly superior technology of the two high-density optical disc competitors, and I should be happy that the tide seems to be turning toward Blu-ray “winning” the war before a shot is fired or a product released. But it seems the reason Paramount and Warner Bros “defected” to Blu-ray is that the Digital Rights Management (DRM) supported in that format is much more restrictive than HD DVD’s.

Both formats support Advanced Access Content System (AACS) as the primary DRM, and Blu-ray has two additional DRM control agents included. However, the point of difference, and the reason Bill Gates called Blu-ray “anti consumer”, is that HD DVD mandates that all discs support Managed Copy, while Blu-ray leaves the option to activate Managed Copy to individual disc authors and studios – meaning in practice that no Blu-ray disc will be allowed to be copied to a hard drive or sent around a home entertainment network. Managed Copy allows this extended use, although the amount permitted beyond a basic single copy to a hard drive is still up to the content owner.

Blu-ray’s non-requirement for compulsory Managed Copy is why 20th Century Fox came exclusively to Blu-ray, and apparently why Paramount, and then today Warner Bros, opted to support both formats, leaving Universal Studios as the only studio supporting HD DVD exclusively (while other parts of the Universal group are already in the Blu-ray camp).

Forrester Research are declaring Blu-ray the winner but, given Forrester’s previous record, that could be the best news Toshiba has heard recently! Forrester’s prediction aside, Blu-ray does seem to have the momentum now, mostly because of the draconian DRM its backers have chosen to apply.

Which is sad. Blu-ray has a longer future than HD DVD and more interactive tools in the specification. However, even with the momentum, the DRM will probably cause both formats to be declared Dead On Arrival in the face of more flexible media delivery, with reasonable DRM, from suppliers like Apple. Sure, Apple’s doing 320 x 240 now, but it could do HD “any time” it decided to and had a product to play it on.

The net result of restrictive DRM will be that more programming opportunities open for independent producers who bypass the studios and networks, going straight to their customers with new distribution models and payment methods.

Categories
Business & Marketing Distribution Interesting Technology

The power of disruptive technologies

A disruptive technology is one that most people do not see coming and yet, within a very short period, changes everything. A disruptive technology will become the dominant technology. Rarely are they accurately predicted, because predictions are generally extrapolated from the existing understanding. For example, there’s no doubt that the invention of the motor car was a disruptive technology, but Henry Ford is often quoted as saying “If we had asked the public what they wanted, they would have said faster horses.”

It’s almost impossible to predict what will become a disruptive technology (although the likelihood of being wrong isn’t going to stop me), but they are very easily recognized in hindsight. Living in Los Angeles, I can see clearly the effect that Mr Ford’s invention has had on this society. Although some would argue that it wasn’t so much the invention of the motor car that made the difference as the assembly-line technique that made the motor vehicle (relatively) affordable.

In fact I think it’s reasonable to believe that a disruptive technology will have a democratizing component to it, or a lowering (or removal) of price barriers.

Non-linear editing on the computer – Avid’s innovation – was a disruptive technology, but initially only within a relatively small community of high-end film and episodic television editors. The truly disruptive technology was DV. DV over FireWire, starting with Sony’s VX-1000 and Charles McConathy/Promax’s efforts to make it work with the Adobe Premiere of the day, paved the way for what we now call “The DV Revolution”.

Apple were able to capitalize on their serendipitous purchase of Final Cut from Macromedia, dropping the work that had been done to make Final Cut work with Targa real-time cards and concentrating on FireWire/DV support. (It was two further releases before we saw the same level of real-time effects support as was present in the Macromedia NAB 98 alpha preview.) I think, at the time, Apple saw Final Cut Pro as another way of selling new G3s with built-in FireWire. The grand plan of Pro Apps came about when the initial success of Final Cut Pro showed the potential. But that’s another blog post.

DV/FireWire was good enough at a much lower price, with all the convenience of single-wire connection. We’ve grown from an industry of under 100,000 people worldwide involved in professional production and post-production to one almost certainly over 1 million worldwide.

Disruptive technologies usually involve a confluence of technologies at the right time. Lower cost editing software wouldn’t have been that disruptive without lower cost acquisition to feed to it. Both would have been pointless without sufficient computer power to run the software adequately. Final Cut Pro on an Apple IIe wouldn’t have been that productive!

In a larger sense, DV/FireWire was part of a larger disruption affecting the computer industry – the transition from hardware-based to software-based. We are, in fact, already through this transition with digital video, although the success of AJA and Blackmagic Design might suggest otherwise. However, the big difference now is that the software is designed to do its job with hardware as the accessory. Back in the days of Media 100’s success, Media 100 would not run without the hardware installed; in fact, without the hardware it was pretty useless, as everything went through the card. When they rebuilt the application for OS X they developed it as (essentially) software-only, which paved the way to the current HD version (based on a completely different card) and a software-only version.

Ultimately, all tasks will be done in software, apart from the hardware needed to convert from format to format. In fact much of the role of today’s hardware is that of format interface rather than the basis for the NLE, as it was in the day of Media 100, Avid ABVB/Meridien and even Cinéwave. Today’s hardware takes some load off the CPU, but as an adjunct to the software, not because the task couldn’t be done without the hardware. This has contributed to the “DV Revolution” by dramatically dropping prices on hardware.

Disruptive technologies are hard to predict precisely because they are disruptive. Any attempt to predict them is almost certainly doomed to failure but, as I said, that’s not going to stop me now!

We are headed for a disruptive change in distribution of media, both audio and video content. I wish I could see clearly how this is going to shake out, so I could invest “wisely”, but it’s still too early to predict exactly what the outcome will be 5-7 years down the track. I feel strongly that it will include RSS with enclosures, in some form. It will have aspects of TiVo/DVR/PVR, where the content’s delivery and consumption will be asynchronous. Apart from news and weather, is there any need for real-time delivery as long as the content is available when it’s ready to be consumed? Delivery will, for the most part, be via broadband connections using the Internet Protocol.
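The “RSS with enclosures” mechanism mentioned above is simple enough to sketch: the feed lists each media file as an enclosure element carrying a URL, size and MIME type, and a subscribing client downloads those files in the background for later, offline consumption. Here’s a minimal illustration in Python using the standard library; the feed itself is an invented example, not a real channel.

```python
import xml.etree.ElementTree as ET

# A made-up RSS 2.0 feed with two media enclosures (one video, one audio).
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Video Channel</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="http://example.com/ep1.mp4"
                 length="52428800" type="video/mp4"/>
    </item>
    <item>
      <title>Episode 2</title>
      <enclosure url="http://example.com/ep2.mp3"
                 length="10485760" type="audio/mpeg"/>
    </item>
  </channel>
</rss>"""

def enclosures(feed_xml):
    """Return (item title, enclosure URL, MIME type) for each item in the feed."""
    root = ET.fromstring(feed_xml)
    found = []
    for item in root.iter("item"):
        enc = item.find("enclosure")
        if enc is not None:
            found.append((item.findtext("title"), enc.get("url"), enc.get("type")))
    return found

for title, url, mime in enclosures(FEED):
    print(f"{title}: {url} ({mime})")
```

A real podcast client does little more than this, plus scheduling the downloads and remembering what it has already fetched – which is exactly what makes the format attractive for asynchronous delivery.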

There is a growing trend to want to merge the “computer” and “the TV”, either by creating media center computers, by adding Internet-connected set-top boxes (like cable boxes), or by delivering video content direct to regular computers. Microsoft’s Media Center PCs haven’t exactly set the world on fire outside the college dorm, where they fit a real niche; Apple are clearly moving toward some media-centric functions in the iLife suite, where it will be interesting to see what’s announced at Macworld San Francisco in January; and there are developments like Participatory Culture’s DTV and Ant is not TV’s FireANT for delivering channels of content directly to the computer screen. Both DTV and FireANT are based on RSS with enclosures for their channels, just as audio podcasting is.

On the hardware box front, companies like Akimbo, Brightcove and DaveTV are putting Internet-connected boxes under the TV, although DaveTV is having a bet both ways with computer software or set-top box.

Whether or not any of these nascent technologies turn out to be “the future of media” their developers tout them as, however this shakes out it has important implications for our industry. No one foresaw that the Master Antenna and Community Antenna systems of the 1950s would evolve into today’s dominant distribution networks – the cable channels, which have now (collectively) moved ahead of the four major networks in total viewership. The advent of cable distribution opened up hundreds, or thousands, of new production opportunities for content creators. This time many people foresee (or hope) that using the Internet for program distribution will take down the last wall separating content creators from their audience.

In the days of four networks, any program idea had better aim to appeal to 20-30% of the available audience – young, middle-aged or old – to have any chance of success. In an age where the “family” sat down to watch TV together (and even ate meals together) that was a reasonable thing to attempt. As society fragmented we discovered that there were viable niches in the expanded cable networks. Programs have been artistically and/or financially successful that would never have been made for network TV because of the (relatively) small audiences or because the content was not acceptable under the regulations governing the networks. The development of niche channels for niche markets parallels the fragmentation of society as a whole into smaller demographic units.

Will we see, or do we need, more channels? Is 500 channels, and nothing on, going to be better when it’s 5,000 channels? Probably, because in among the 5,000 (or 50,000) channels will be content that I care enough about to watch. It won’t be current news, that’s still best done with real-time broadcasting, but for other content, why not have it delivered to my “box” (whatever takes this role) ready to be watched (on whatever device I choose to watch it)? (Some devices will be better suited to certain types of content: a “video iPod” device would be better suited to short video pieces than multi-hour movies, for example.)

If the example of audio podcasting is anything to go by, and with just one year of history to date it’s probably a little hard to be definitive, then yes, subscriber-chosen content delivered “whenever” for consumption on my own schedule is compelling. I’ve replaced the car radio newstalk station with podcasts from my iPod mini. Content I want to listen to, available when I’m ready to listen. Podcasts have replaced my consumption of radio almost completely.

Ultimately it will come down to content. Will the 5,000 or 50,000 channels be filled with something I want to watch? Sure, subscribing to the “Final Cut Pro User Group” channel is probably more appealing than (for me) many of the channels available on my satellite system. Right now, video podcasts tend to be of the “don’t criticize what the dog has to say, marvel that the dog can talk” variety. Like a lot of podcasts. Not every one of the more-than-10,000 podcasts now listed in the iTunes directory is compelling content or competently produced.

But before we can start taking advantage of new distribution channels, for more than niche applications, we need to see some common standards come to the various platforms, so that channels will be discovered on Akimbo, DaveTV, DTV and on a Media Center PC. About the only part of this prediction I feel relatively sure of is that it will involve RSS with audio and video enclosures, or a technology derived from RSS, like Atom (although RSS 2 seems to have the edge right now).

In the best-case scenario, we’ll have many more distribution channels, aggregating niche markets into big-enough channels for profitable content (particularly with lower cost production tools now in place). Direct producer-customer connection, without the intermediation of network or cable channel aggregators, improves profit potential on popular content and possibly moves content into different distribution paths. The worst-case scenario is that nothing much changes and Akimbo, DaveTV, Brightcove, and Apple or Microsoft’s media-centric computers go the way of the Apple Lisa – paving the way for the real “next big thing”.

Categories
Apple Apple Pro Apps Business & Marketing Interesting Technology Random Thought

Don’t panic! Apple adopts Intel processors

The confusion and furor surrounding Apple CEO Steve Jobs’ announcement at the Worldwide Developers Conference that future Macs, after June 2006, will use Intel processors is totally unfounded. Nothing changes now, very little changes in the next year, and longer term the future for the Mac got a little brighter. Although the decision caught me by surprise, as I thought about it, and listened to what was said in the keynote, I could see why it made sense.

If we look short term, the decision makes little sense. Right now a G5 (PowerPC, aka PPC) PowerMac has very similar performance to the best workstations on the PC/Intel platform running Windows, and the G5 will cost less than a similarly performing PC workstation. At the low end the Mac mini is competitively priced against a cheap Dell or other name brand. (Macs are not price competitive with off-brand PCs, the so-called “white box”.) So, why put the developer community, and developers within Apple, through the pain of a processor shift?

For the future (“we have to do it for the children”) and because it’s really not that painful for most developers.

Right now a G5 PowerMac is very performance competitive with the best offerings from Intel. What Apple have been privy to, that the rest of us haven’t, is the future of both Intel and PPC processors. Based on that future, Apple decided they had no choice but to make the change. In the future, the performance per watt of power of a PPC chip will be “15 units of processing” according to Mr Jobs; the same watt of energy would give 70 units of performance on an Intel processor. Without knowing exactly how those figures were derived, and what they mean for real-world processing power, it seems like a significant difference. It was enough to push Apple to make the change.

Not that there’s anything wrong with the PPC architecture: IBM continue to develop and use it at the high end, and PPC chips (triple-core “G5” chips) will power the Microsoft Xbox 360. The sales of chips to Microsoft will well and truly outweigh the loss of business from Apple. It is, however, a crazy world: next year will see a Microsoft product powered by PPC and Macintoshes powered by Intel!

Steve Jobs demonstrated how easy it will be for developers to port applications to OS X on Intel. In fact, he confirmed long-standing rumors that Apple have kept OS X running on Intel processors through every development of OS X – Mr Jobs demonstrated and ran his keynote from an Intel Macintosh. For most applications a simple recompile in the Xcode developer environment will suffice – a matter of a few hours’ work at most. Moreover, even if the developer does not recompile, Apple have a compatibility layer, called Rosetta, that will run pure PPC code on an Intel Mac. Both platforms are to be supported “well into the future”.

During the keynote Mathematica was demonstrated (huge application; 12 lines of code from 20 million needed changing; 2 hours’ work), as were office applications. Commitments to port Adobe’s creative suite and Microsoft’s Mac Business Unit software were presented. Apple have been working on Intel-compatible versions of all their internal applications, according to Mr Jobs. [Added] Luxology’s president has since noted that their 3D modelling tool modo took just 20 minutes to port, because it was already Xcode-based and built on modern Mach-O code.

Remember, these applications are for an Intel-powered OS X Macintosh. No applications are being developed for Windows. In fact, after the keynote Senior Vice President Phil Schiller addressed the issue of Windows: although it would be theoretically possible to run Windows on an Intel Macintosh, it will not be possible to run OS X on anything but an Apple Macintosh.

Apple’s professional video and audio applications might not be as trivial to port, although most of the modern suite should have no problem. LiveType, Soundtrack Pro, DVD Studio Pro and Motion are all new applications built in the Cocoa development environment and will port easily. Final Cut Pro may be less trivial to port. It has a heritage as a Carbon application, although the code has been tweaked for OS X over recent releases. More than most applications, Final Cut Pro relies on the AltiVec vector processing of the PPC chip for its performance. But even there, the improvement in processor speeds on the Intel line by the time Intel Macs are released is likely to compensate for the loss of vector processing. At worst there will be a short-term dip in performance. However, with Intel Macintoshes rolling out from June 2006, it’s likely we’ll see an optimized version of Final Cut Pro ready by the time it’s needed.

[Added] Another consideration is the move to using the GPU over the CPU. While the move to Intel chips makes no specific change to that migration – graphics card drivers for OS X still need to be written for the workstation-class cards – Final Cut Pro could migrate to OS X technologies like Core Video to compensate for the lack of AltiVec optimizations for certain functions, like compositing. Perhaps then, finally, we could have real-time composite modes!

Will the announcement kill Apple’s hardware sales in the next year? Some certainly think so, but consider this: if you need the fastest Macintosh you can get, buy now. There will always be a faster computer out in a year, whatever you buy now. If your business does not need the fastest Mac now (and many don’t), then do what you’d always do: wait until it makes sense. The G5 you buy now will remain viable long after it stops being fast enough for a professional post-production environment. It’s likely there will be speed bumps in the current G5 line over the next year, as IBM gets better performance out of its chips. Any speed improvement from the transition is waiting on a new generation of chips from Intel: if Apple magically converted their current G5 line to the best chips Intel has to offer now, there would be little speed improvement. This change is for the future, not the present.

So, I don’t think it will affect hardware sales significantly. As a laptop user I’m not likely to upgrade to a new G4 laptop, but then there will be few speed boosts available there in the next year anyway. I am, though, keen to get a faster PowerBook, and using an Intel chip will make that possible.

I have to say I initially discounted the reports late last week because, based on current chip developments, there seemed little advantage in a difficult architecture change. With the full picture revealed in the keynote as to the long-term advantages and the minimal discomfort for developers, it seems like a reasonable move that will change very little, except to give us faster Macs in the future.

How could we have any problem with that?

[Added] Good FAQ from Giles Turnbull at O’Reilly’s Developer Weblog

Categories
Interesting Technology Random Thought

E3 and what it means for the future of production

This week was my first visit ever to an Electronic Entertainment Expo, usually written as E3. Entertainment in this context means gaming – computer games. My desire to visit E3 was driven by a deep-seated feeling that I was ignoring an important cultural trend because it doesn’t intersect my life direction – I don’t play “video games” and I don’t have children, nieces or nephews nearby. But it’s hard to ignore an industry that reportedly eclipsed motion picture distribution in gross revenue last year. It hasn’t – Grumpy Gamer puts things into perspective.

That doesn’t mean that the gaming industry statistics are anything but impressive: Halo 2’s first-day sales of US$125 million eclipse the record opening-weekend box office (Spider-Man, 2002) of $114 million – a statistic the gaming industry likes to promote, but one that isn’t as impressive when you consider whole-of-life revenue. Grumpy Gamer again. Gaming is a huge business, and E3 had many booths that represent a multi-million-dollar investment in the show by the companies exhibiting. E3, like NAB, is an industry-only show (18+ only and industry affiliation required) and this year attracted 70,000 attendees (vs NAB 2005 with about 95-97,000) in a much smaller show area. This is big business.

But what has this got to do with “the present and future of production and post production”? There are three “game” developments that will ultimately impact video production, post production and distribution. This is quite aside from the fact that, right now, video production for games is a big part of the expense of each game. Most games have video/film production budgets way above a typical independent film’s total budget. This presents an opportunity for savvy producers to team up with game producers for “serious games” – games aimed at education or corporate training. Video production alone is rapidly becoming a commodity service, so working with a games company on their acquisition is a value-add and an opportunity in the short term.

Microsoft’s Steve Ballmer

Take today’s passive video content, add a little interactivity to it. Take today’s interactive content, games, and add a little bit more video sequencing to it. It gets harder and harder to tell what’s what…

In the longer term three trends will impact “video” production: graphics quality and rendering, “convergence” (yes, I hate the term too, but don’t have a good alternative), and interactive storytelling.

Graphic Powerhouse

We all owe the gaming community a debt of gratitude for constantly pushing the performance of real-time graphics, and thus the power of graphics cards and GPUs. The post-production industry benefits from real-time performance in applications like Boris Blue, Apple’s Motion, Core Video in OS X 10.4 Tiger and others. Without the mass market for these cards they’d be much more expensive and would not have advanced as quickly as they have.

The quality of real-time graphics coming from companies like Activision, with their current release F.E.A.R. as a standard-definition game, and the quality of their upcoming HD releases for next-generation gaming consoles, is outstanding. In one demonstration (actual game play) of a 2006 release, the high-definition graphics quality, including human face close-ups, was remarkable. Extrapolate just a few more years and you have to wonder how much shooting will be required. If we can recreate, or create, anything in computer simulation in close to real time (or in real time) at high definition, what’s the role of location shooting and sets? Sky Captain and the World of Tomorrow and Sin City have shown that it’s possible to create movies without ‘real’ sets, although both movies seemed to need extreme color treatments to disguise the lack of reality (or was that purely motivated by creativity?).

Actors could be safe for a little while, because of the ‘Uncanny Valley’ effect, although the soldiers in Activision’s preview were close to lifelike in closeup, as long as you didn’t look at the eyes – the dead giveaway for the moment. Longer term (five-plus years), realistic humans are almost certain to come down the line. At that point, where is the difference between a fixed path in a game and a video production?

Convergence

NAB has, until this year, long been “The Convergence Marketplace” without a lot of convergence happening. The world of gaming, however, converged with the world of movies long ago. It is standard practice for a blockbuster movie release to have a themed game available – Star Wars III: Revenge of the Sith and Madagascar had simultaneous movie and game releases. Activision’s game development team were on the Madagascar set and developed 20 hours of game play following the theme of a 105-minute movie!

Similarly, Toronto-based Spaceworks Entertainment, Inc. announced at E3 a Canada/UK co-production, Ice Planet, in which the TV series and game are to be developed together – again with the game developers on the set of the TV series shoot. Although the game and TV series can be enjoyed independently, the plan is to enhance the game via the TV show and the TV show via the game. Game-player-relevant information will be found throughout each of the 22 episodes of the series’ first season – the first of five seasons planned in the story arc.

Whether or not Spaceworks Entertainment are the folks to pull this off, interplay between television and related game play will eventually happen. Television will need something to bring gamers back to the networks (cable or broadcast) if there’s a future to be had there. (Microsoft, on the other hand, wants to bring the networks to the gamers via the Xbox 360.)

Interactive Storytelling

The logical outcome of all this is an advanced form of interactive storytelling that could supplant “television” as we know it. Or not. Traditionally television has been a lean-back, turn my mind off medium and I imagine there will continue to be a demand for this type of non-involved media consumption in the future that won’t be supplanted by a more active lean-forward medium. However, the lean-forward medium will be there to supplement and, for many people, replace the non-involved medium.

Steven Johnson’s Everything Bad Is Good for You makes some valid points suggesting that the act of gaming might be more important than imagined (and less bad for you). From one review:

The thesis of Everything Bad is Good for You is this: people who deride popular culture do so because so much of popcult’s subject matter is banal or offensive. But the beneficial elements of videogames and TV arise not from their subject matter, but from their format, which require that players and viewers winkle out complex storylines and puzzles, getting a “cognitive workout” that teaches the same kind of skills that math problems and chess games impart. As Johnson points out, no one evaluates the benefit of chess based on its storyline or monotonically militaristic subject matter.

In the same vein, and a little aside, I was amused by this comment posted in Kevin Briody’s Seattle Duck blog that hypothesises how we would have contemplated books had they been invented after the video game.

“Reading books chronically understimulates the senses…
Books are also tragically isolating…
But perhaps the most dangerous property of these books is the fact that they follow a fixed linear path. You can’t control their narratives in any fashion: you simply sit back and have the story dictated to you…”

How far will we go with non-linear paths? Most games today have fairly limited, linear paths to a single destination, with a lot of flexibility in the journey. I’m no fan of first-person shooter games but can imagine becoming more involved with another type of story. Don your 3D immersion headset, or relax in your lounge with the 60″ wall-mounted flat panel, and join me in this week’s episode of (say) Star Trek TNG. Choose your character and participate in the story appropriately. Clues as to your behavior would be an optional “cheat” track (not dissimilar to the podcasts accompanying this year’s Battlestar Galactica episodes). The rest of the characters would guide the story and respond to your participation, to whatever outcome. Whatever character role you took would control the perspective of the show you saw (when involved in a scene).

Is this a game? Is it “television”? Is it something else? Storytellers have adapted their stories for specific audiences from the first day there were stories. Roaming storytellers would adapt details for kings or commoners, for this geographic region over that (often for their own self-preservation), so adaptive (interactive) storytelling isn’t new – just new to modern electronic media. Do a search for “interactive storytelling” at google.com and you’ll find many links. I just found that my hypothesis above has a name: Mixed Reality Interactive Storytelling! The marketing people will have to massage that into something that captures the popular imagination.

None of this will have much impact on typical production this week, next year, or the five years following that, but some time in the future, at least some elements will have crossed over. In a very real sense, I went to E3 last week to get a sense of “the future of video production”.

Categories
Apple Interesting Technology Random Thought

iTunes becomes a movie management tool

iTunes has been doing movies for some time now – trailers from the Apple Movie Trailers website have been passed through iTunes for full-screen playback, leading many to believe that Apple were grooming iTunes for eventual movie distribution.

Well, iTunes 4.8 will do nothing to dispel the rumor mongers – in version 4.8 iTunes gains more movie management and playback features, including movie playlists and full-screen playback. Simply drag a movie, or folders of movies (any .mov or .mp4, whatever the size), into the iTunes interface and they become playlists.

Playback can be in the main interface (in the area otherwise occupied by album artwork); in a separate movie window (non-floating, so it will go behind the iTunes main interface); or full screen. Video can be of individual movies or of playlists – audio always plays the playlist regardless of the setting controlling the visuals.

If one had to speculate (and one does, really, in the face of Apple’s enticement) it certainly seems that Apple are evolving iTunes toward movie management. The immediate driver in version 4.8 is the inclusion of “behind the scenes/making of” videos with some albums. For example, the Dave Matthews Band “Stand Up” album in the iTunes Music Store features “an (sic) fascinating behind-the-scenes look at band’s (sic) creative process with the bonus video download.” The additional movie content gives Apple the excuse to charge an extra $2 for the album ($11.99 while most albums are $9.99).

There is a lot of “chatter in the channel” about delivery of movies to computers or to a lounge-room viewing device (derived from a computer but simplified). Robert Cringely, among others, seems to think the Mac Mini is well positioned for the role of lounge-room device. Perhaps; others, like Dave TV, think a dedicated box or network will be the way to go. Ultimately it will come down to two things: content and convenience.

Recreational television and movie watching is a “lean-back” experience – enjoyed relaxing in a comfortable chair at the end of a busy day with little active involvement of the mind. Even home theater consumption of movies is not quite the same experience as a cinema (although close enough for many people). It will take a major shift in thinking for the “TV” to become a “Media Center” outside the college dorm room. We’re still many years from digital television broadcasting being the norm, let alone HD delivery to in-home screens big enough to actually display it at useful viewing distances. (If you want the HD experience right now on a computer screen, Apple have some gorgeous examples in their H.264 HD Gallery. QuickTime 7, a big screen and a beefy system are prerequisites, but the quality is stunning.)

Apple do not have to move fast, nor be first, with the “Home Media Center” to ultimately be successful. Look at what happened with the iPod and iTunes in the first place. The iPod was neither the first “MP3 Player” nor, some would argue, “the best”, but it had a superior overall experience, aided by a huge ‘coolness’ factor. So, even if Apple are planning an “iTunes Music Store for Movies” some time down the path, it’s not something I’d expect to be announced at MacWorld January 2006 or even 2007!

In the meantime, the new movie management features in iTunes are great. This is not a professional video asset management tool – we’ll have to look elsewhere for that (something I hope the Pro Apps group is working on) – but it is a tool for organizing and playing videos. I have collected show reels and other design pieces I look to for creative inspiration, but until now there was no way of organizing them easily. Now I can import them all to iTunes and create playlists for “titles”, “3D”, “design”, “action” and so on for when I need inspiration. Movies can be in multiple playlists, just like music.

I can wait to see what Apple have planned for the future; in the meantime, I’m happy with a new tool in my toolbox.

Categories
Random Thought Video Technology

JVC confirms ProHD strategy

As reported previously in the Pro Apps Hub, JVC’s ProHD strategy is a marketing catch-all for all their HD offerings based on MPEG-2 transport streams. Included in the strategy are the HDV GY-HD100U and the HDV+ GY-HD7000U.

Available in “early summer”, the GY-HD100U is a professional-level HDV camcorder with solid, camera-operator-friendly features that justify the ProHD moniker. Three 1/3″ CCDs sit behind a removable lens; the standard lens is a Fujinon 16x developed with JVC. The GY-HD100U records 30p and 24p at 1280 x 720 resolution in HD, and 29.97 in DV. 24p is accommodated within the 60p framework using MPEG-2 “flags” that mark certain whole frames for duplication: each frame is repeated 3 times, then 2 times (whole frames, not fields as the Panasonic AG-DVX100 does with its 2:3:3:2 pulldown). This embeds 24p in a 60p video stream while recording only the 24 unique frames per second to tape – quite clever really. It allows JVC to offer 24p, which was not part of the original HDV specification, without deviating from that specification. [Thanks to Graeme Nattress for the correction.]
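The frame-repeat cadence described above can be sketched as a simple expansion: alternate source frames are flagged to display three times and two times respectively, so 24 unique frames fill a 60-frame-per-second stream. A minimal illustration (the function name and list representation are my own, not JVC’s):

```python
def expand_24p_to_60p(frames):
    """Expand 24 progressive frames into a 60-frame display stream by
    repeating whole frames in a 3:2 cadence (3, 2, 3, 2, ...).

    In the actual MPEG-2 stream only the 24 unique frames are stored;
    the repeats are signalled with flags rather than re-encoded."""
    out = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # alternate 3 and 2 repeats
        out.extend([frame] * repeats)
    return out

# One second of 24p footage, frames numbered 0..23:
stream = expand_24p_to_60p(list(range(24)))
print(len(stream))   # 60 displayed frames from 24 stored ones
print(stream[:8])    # [0, 0, 0, 1, 1, 2, 2, 2]
```

The average repeat count is 2.5, which is exactly the 60/24 ratio – that’s why the cadence works out with no remainder over a one-second span.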

See Hub news on March 10 for more details on both cameras. At US$6295 JVC have come in well “under $10,000”.

Categories
Random Thought Video Technology

Panasonic announce P2 Camcorder

Panasonic generated a lot of buzz at NAB with the announcement of the AG-HVX200 multi-format camcorder, expected to sell for US$5,949 (unconfirmed) without media. The AG-HVX200 is a small-form-factor unit with a built-in DV deck for DV25 recording and P2 solid-state media support for DVCPRO 50 and DVCPRO HD recording. Reportedly the FireWire output is also active for recording DVCPRO 50 or HD to a tethered FireWire deck, and Panasonic are talking with Focus to develop support for the FireStore HD.

The camera itself is an impressive multi-format, multiple-frame-rate device. In DVCPRO HD it supports 1080i at 30 (60 fields), 1080p at 24 frames/sec, and 720p at 60 or 24 progressive frames. In 720p mode it supports variable frame rates, VariCam-style, to the P2 media. It has three 1/3″ native 16:9 CCDs.
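For readers new to VariCam-style off-speed shooting: the apparent speed of motion is just the ratio of the playback (timeline) rate to the capture rate. A hypothetical helper makes the arithmetic explicit (the function name is illustrative, not from any camera or SDK):

```python
def apparent_speed(capture_fps, playback_fps=24):
    """Apparent on-screen motion speed when footage captured at
    capture_fps is conformed to a playback_fps timeline.

    Values below 1.0 are slow motion; above 1.0 are fast motion."""
    return playback_fps / capture_fps

print(apparent_speed(60))  # 0.4 -> 40% speed, i.e. 2.5x slow motion
print(apparent_speed(12))  # 2.0 -> double-speed fast motion
```

This is why overcranking to 60p on a 24p project gives the classic smooth 2.5x slow motion without frame blending – every captured frame is a real frame.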

Panasonic plan a bundle with two 8 GB P2 memory cards for US$9,999 – an indication of just how far we have to go before solid-state media becomes a viable proposition outside news gathering and other niche markets. While P2 media can be used directly as a source in many NLEs – Final Cut Pro adds native support for this media – it’s not viable to retain the P2 memory cards during editing. Most commonly the cards’ contents are immediately dumped to hard drive, and Panasonic recently announced a unit specifically for the purpose: the AJ-PCS060 portable hard drive with a P2 card slot. [Hub news February 14th]

Having the media on hard drive makes it instantly available for editing, but does not address the need for archive. Either the hard drives need to be permanently retained for archive or the media needs to be copied to another format for archive. This is more handling than most people are used to.

The AG-HVX200 won’t ship until some time in the fourth quarter of the year, leaving JVC and Sony a long lead time for the competing HDV format to become established. With 37,000 FX1 and Z1U units sold in just the first six months, according to the Apple presentation, that’s a huge lead for Panasonic to catch up with – particularly since JVC will be shipping their GY-HD100U nearly six months ahead.

Categories
Apple Pro Apps

Apple’s NAB announcements [updated]

Although no new applications were announced, Apple upgraded all the Pro Video Apps with new versions of Final Cut Pro, Soundtrack Pro, Compressor 2, Motion 2, LiveType 2, DVD Studio Pro 4 and Shake 4.

In their Sunday morning presentation at Paris, Las Vegas, Apple announced upgrades across their Pro Video line, consolidating the tools in the $1,299 Final Cut Studio. With Final Cut Pro alone priced at $999, the Studio becomes the purchase option of choice if you want Final Cut Pro and any of the other applications. In-depth articles will follow, but here’s the 20,000 ft view.

The suite features improved integration, with automatic asset updating from application to application, but no dramatic changes to workgroup editing.

Final Cut Pro 5
Key new features are multicam editing, multichannel audio input, and native support for HDV and P2 media. Multicam allows up to 128 angles to be switched in a Multiclip; 4, 9 or 16 angles can be displayed and switched at a time. Final Cut Pro 5 supports tapeless media from Panasonic’s P2 and adds native IMX support (and keep an eye out for Panasonic’s new camera – P2 media and DV tape for the best of both worlds). MXF media from XDCAM is supported with a 3rd-party plug-in from Flip4Mac (Telestream). Final Cut Pro 5 works seamlessly with almost any type of media, and HDV is supported natively. It’s not clear whether media can be mixed in a Sequence without rendering; since it’s not featured, probably not.
RTExtreme has been extended with a new Dynamic RT architecture that adjusts the amount of real-time according to the processor and graphics card speeds – as speeds increase, more real-time will become available. During playback Dynamic RT looks ahead in the timeline and dynamically adapts rather than suddenly stopping playback. Real-time speed change with frame blending is new to version 5.
Final Cut Pro now allows simultaneous input of up to 24 channels of audio, and audio can be controlled from any control surface that supports the Mackie Control Protocol – meaning that Final Cut Pro mixing can be done on a hardware mixer.
Motion 2
Motion had the most dramatic update, with new features that bring it up to a truly mature application for motion graphics design. New interaction techniques arrive – including controlling parameters with a MIDI controller (did anyone say VJ?) – along with the Replicator for building patterns of repeating objects, like flocks of birds. Replicator gives more control than a particle generator and comes with 150 patterns with controllable parameters.
Rendering depth has been beefed up to 16 and 32 bits per channel float for those who need it. 32 bit processing is done on the CPU. Motion on Tiger supports more than 4 GB of RAM.
Motion also gains the third dimension with a new 3D distortion filter that allows pseudo 3D with beautiful transparency and effects in real time. A new GPU accelerated architecture lets 3rd parties access the GPU acceleration so Boris, Zaxwerks and DV Garage plug-ins now display in real time.
Soundtrack Pro
Although it shares part of a name with Soundtrack, Soundtrack Pro is positioned far more as a “regular editor’s” replacement for Pro Tools than simply a tool for scoring music to video. Soundtrack Pro retains the loop-editing functionality of Soundtrack but adds waveform editing, sound design (including a library of sound effects) and more than 50 effects from Logic.
Soundtrack Pro comes complete with “search and destroy” tools for the most common audio flaws – clicks and pops, AC hum, DC offset, phase problems and clipped signal – plus tools for ambient noise reduction and for automatically filling gaps with natural sound.
DVD Studio Pro
With an upgrade to version 4, DVD Studio Pro is HD ready, with built-in support for H.264 encoding (adopted by both the Blu-ray and HD DVD camps) and direct encoding from HDV without intermediate format conversion. Distributed processing using Qmaster for encoding, built-in AC3 encoding (no need to use A.Pack) and enhanced transition support headline DVD Studio Pro’s other new features.
On the technical side, DVD Studio Pro 4 supports VTS editing for greater playback performance by allocating menus throughout VTS folders to overcome 1GB menu limitations. GPRM partitioning enhances the scripting options for highly interactive DVDs, for example jumps to motion menu loops to avoid repeating introduction transitions.
LiveType remains part of the Final Cut Pro package and is at version 2. Visually the interface does not appear to have changed. Most of the changes are under the hood with changes to the LiveFont format to support Unicode and vector fonts.

More soon.