Categories: Studio 2.0, The Business of Production

Cringely on Disrupting Hollywood: Steal the future!

As I’ve noted, Robert Cringely is writing a series of articles on how “Hollywood” (the studios, network broadcasters and cable companies) might be disrupted by “Silicon Valley” (i.e. Apple, Amazon, Google, Netflix, et al). He notes the exact same problem that Intel (and everyone else) has faced in trying to disrupt Hollywood: incumbent content controllers don’t want to disrupt their own very profitable existing ecosystem for an uncertain future.

Categories: Business & Marketing, Studio 2.0, The Business of Production

One advantage of crowdsourced projects: they return investment!

One of the great problems with the traditional Studio System is the trouble studios have turning a profit – on paper – on highly profitable movies. Of course, it’s just an accounting trick to line the pockets of the studios at the expense of the talent who created the film.

Well, the partly crowdsourced Iron Sky is now returning money to its crowd funders out of the $10 million in revenue the movie has already made.

Categories: Business & Marketing, Studio 2.0, The Business of Production

Monetizing BitTorrent helps Australian newspaper keep down video costs

“Sydney Morning Herald publisher Fairfax was spooked by the escalating cost of licensing video for its new TV site. So now it’s adding cheaper content by legalising BitTorrent videos on producers’ behalf.”

Categories: Item of Interest, Studio 2.0, The Business of Production

Arrested Development on Netflix

Arrested Development on Netflix – how does it change the business model for TV production? http://t.co/t3cQpMDD

A good look at the business behind Netflix’s original productions:

Categories: Item of Interest, Studio 2.0, The Business of Production

Episode 19: Starting over in a “Green Field” revisited

Episode 19: Starting over in a “Green Field” revisited. http://tinyurl.com/4suc5bu A new episode of The Terence and Philip Show: a clean start.

Categories: Interesting Technology, Item of Interest, Metadata, Studio 2.0, The Technology of Production, Video Technology

‘Interoperable Master Format’ for file-based workflow

‘Interoperable Master Format’ Aims to Take Industry Into a File-based World http://bit.ly/bvF6Vk

A group working under the guidance of the Entertainment Technology Center at USC is trying to create specifications for an interoperable set of master files and associated metadata. This will help interchange, and automate downstream distribution based on metadata carried in the file. The first draft of the specification is now complete and is based on (no surprise) the MXF wrapper. (Not good for FCP fans, as Apple has no native support for MXF without third-party help.)

The main new items are dynamic metadata for hinting pan-and-scan downstream, and the “Output Profile List” (OPL):

“The IMF is the source if you will, and the OPL would be an XML script that would tell a transcoder or any other downstream device how to set up for what an output is on the other side,” Lukk explained.

The intention is to bring this draft spec to SMPTE, but first ETC@USC is urging the industry to get involved. “We need industry feedback and input on the work that the group has done thus far,” said ETC CEO and executive director David Wertheimer. “Anyone who has interest in this topic should download the draft specification and provide feedback to the group.”
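To make the OPL concept a little more concrete, here’s a minimal sketch in Python of how a downstream device might read an OPL-style XML document and turn it into transcoder settings. The element and attribute names below are invented for illustration only; the actual draft specification defines its own schema, so treat this as a sketch of the idea rather than the format.

```python
# Hypothetical sketch: reading an OPL-style XML document and turning it
# into transcoder settings. Element/attribute names are invented for
# illustration only; the real IMF/OPL draft defines its own schema.
import xml.etree.ElementTree as ET

SAMPLE_OPL = """
<OutputProfileList name="Web HD deliverable">
  <OutputProfile id="web-720p">
    <Video codec="h264" width="1280" height="720" bitrate="5000000"/>
    <Audio codec="aac" channels="2" bitrate="192000"/>
    <AspectRatioHint mode="pan-scan" target="16:9"/>
  </OutputProfile>
</OutputProfileList>
"""

def transcoder_settings(opl_xml: str):
    """Map each OutputProfile element to a flat dict of encoder settings."""
    root = ET.fromstring(opl_xml)
    for profile in root.findall("OutputProfile"):
        hint = profile.find("AspectRatioHint")
        yield {
            "profile_id": profile.get("id"),
            "video": profile.find("Video").attrib,
            "audio": profile.find("Audio").attrib,
            "reframe": hint.attrib if hint is not None else None,
        }

if __name__ == "__main__":
    for settings in transcoder_settings(SAMPLE_OPL):
        print(settings)
```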

Categories: Apple, Business & Marketing, Distribution, New Media, Studio 2.0

How do you get Disney to fund your next production?

It seems like an odd idea at first: could you fund a production – film or ongoing series – using iAds? After all, Apple have lined up $60 million in ad spend for the second half of 2010 and that would fund a lot of independent production! But how would it work?

First off, iAds go in Apps for the iPhone/iPod Touch/iPad – or they will from early next month – and are an integral part of iOS 4. Any developer can easily add ads to their App, and 60% of the ad revenue goes to the App developer (or owner). That’s $36 million that’s going to be paid out to someone, so why not to your independent project?

I’ve long thought that the future of programming was Apps. An App, like a website, gives a single place for everything about your project: blog, previews, special content, upcoming events, merchandising etc. The advantage of wrapping your web presence in an App, rather than only having a website, is that the App will be a better fan experience, and it’s easy to add in-App purchasing of digital goods.

So, create an App for your project. This App will have:

  • An area where you can read the production blog;
  • Forums and chat around your project;
  • The Twitter feed from your project;
  • Connection into your Facebook presence;
  • Previews of scenes or trailers of movies;
  • The full project, with a little in-App purchasing (or not);
  • A calendar of screenings, parties and other events around your project, including signup (filtered to just the fan's own geography if they want, thanks to GPS on most of the devices).

Having everything to do with your project in a mobile app on iPhone or iPad makes it much easier for your fans, friends and followers to stay involved and participate. Involvement will improve. (Connecting with Fans and giving them a reason to buy is a basic tenet of independent production in the digital era.) Plus fans will likely be clicking on some of those ads if they’re well targeted, bringing revenue to the project.

Plus, there’s a minor security advantage. There’s no download function in Mobile Safari, and Apps can’t download very much; nor is there any way to get content downloaded within an App out of the App and onto a computer. That means your finished, high-quality version could be viewed on iDevices without much risk of it being distributed without authorization. (Recognizing, though, that it will get distributed unless your project just plain sucks!)

Who’s going to be the first to give it a try?

Categories: New Media, Studio 2.0, Video Technology

What are the four major trends in production?

Having just got back from a North East trip – New York, Boston and Meriden/North Haven CT – I’ve had a good opportunity to think and observe trends outside my own environment. I see four major trends happening across production and, despite the publicity and inevitable NAB push, I don’t think 3D stereoscopy is among them (at least not yet).

Stereoscopy is indeed a trend in feature film production, with an impressive percentage of last year’s box office attributable to 3D movies, but it will be a long time before it’s more than a niche for non-feature production. In fact, the supply of 3D content versus the number of theaters equipped to display it is probably going to limit 3D distribution to the major studios and their tentpole releases.

That said, this year’s NAB is likely to be full of 3D-capable rigs, cameras and workflows. For what display? Until the viewing end is more established, production in 3D won’t be that important.

Right now the trends I’m observing are: more multicamera production; extensive use of green screen even for “normal” shots; 3D sets, objects and even characters; and a definite trend toward larger sensor cameras (both DSLR and RED).

Multicamera Production

The appeal is simple: acquire two angles on any “good” take. Of course, reality television takes this to almost ridiculous levels, with up to 68 hours or more recorded for every day of the show’s shoot. Among more reasonable shows, Friday Night Lights shoots multicamera in real-world locations for a very efficient production schedule.

While it no doubt saves production time, and therefore cost, it can limit shot availability (as one camera ‘sees’ another) or force blander lighting (to make sure each camera angle is well lit). Multicamera studio shoots – the staple of the sitcom – tend to be lit very flat, but Friday Night Lights doesn’t suffer for the multicamera acquisition.

All major editing software packages support multicamera editing. We’ve also seen an increase in requests for multicamera support in our double-system synchronizing tool Sync-N-Link.
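For anyone unfamiliar with what double-system synchronizing actually involves, the core of it is matching picture and sound by shared timecode. Here’s a deliberately simplified sketch of that idea in Python – not Sync-N-Link’s actual implementation, and the clip structure, field names and frame rate are invented purely for the example:

```python
# Simplified illustration of matching double-system audio to picture by
# overlapping timecode. The clip records and field names here are invented;
# a real tool works from the actual clip metadata.
from dataclasses import dataclass

FPS = 24  # assumed frame rate for the example

@dataclass
class Clip:
    name: str
    start: int      # start timecode, in frames since midnight
    duration: int   # length in frames

def overlaps(a: Clip, b: Clip) -> bool:
    """True if the two clips share any timecode."""
    return a.start < b.start + b.duration and b.start < a.start + a.duration

def match_audio_to_video(video_clips, audio_clips):
    """Pair each video clip with the audio clips whose timecode overlaps it."""
    return {
        v.name: [a.name for a in audio_clips if overlaps(v, a)]
        for v in video_clips
    }

if __name__ == "__main__":
    video = [Clip("A001_C001", start=10 * 3600 * FPS, duration=90 * FPS)]
    audio = [Clip("SD_0001.wav", start=10 * 3600 * FPS - 2 * FPS, duration=95 * FPS)]
    print(match_audio_to_video(video, audio))
```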

Part of the reason that multicamera acquisition is becoming more practical is that the cost of buying or renting camera equipment has dropped dramatically, so that three cameras on a shoot are not necessarily a budget buster.

Green Screen (Virtual sets)

If you haven’t already seen Stargate Studios’ Virtual Backlot reel, do it now. Before seeing it I had the sense that there was a lot more green screen being used out there, but I had no idea how many shows I’d watched and enjoyed employed green screen. The Times Square shot from Heroes, for example, did not feel at all composited. When simple street scenes – things that could easily be shot in the real world – are being shot green screen, then you know it has to be for budgetary reasons.

Green screen (and blue screen for film) technologies are well proven. There are good and inexpensive tools that fit within common workflows to build the green screen composite. In other words, the barriers to entry are simply the skill of the Director of Photography on the shoot, and that of the editors/compositors in post.
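If you’re curious what a keyer is actually doing under the hood, here’s a deliberately naive sketch of the principle in Python/NumPy. Real keyers handle spill, edge detail and noise far more intelligently, and nothing here reflects any particular product – it’s just the basic “how green is this pixel?” idea:

```python
# Naive illustration of what a green screen key does: build a matte from
# "how green" each pixel is, then blend foreground over background.
# Real keyers are far more sophisticated about spill, edges and noise;
# this is only a sketch of the principle.
import numpy as np

def naive_green_key(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """fg, bg: float arrays of shape (H, W, 3) with values in 0..1."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Pixels where green strongly dominates are treated as screen (alpha 0).
    greenness = g - np.maximum(r, b)
    alpha = np.clip(1.0 - greenness * 4.0, 0.0, 1.0)  # crude soft matte
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])

if __name__ == "__main__":
    fg = np.random.rand(4, 4, 3)   # stand-in for a foreground frame
    bg = np.random.rand(4, 4, 3)   # stand-in for a background plate
    print(naive_green_key(fg, bg).shape)
```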

When 70% of a show like Sanctuary uses virtual sets, the necessity for anything beyond a good green screen studio, a good lighting kit and some smarts seems less important.

3D sets and enhancements

The third major trend goes hand-in-hand with the use of Virtual Sets: sets that are created in the mind of a designer and rendered “real” with 3D software. There are literally hundreds of thousands of object models available for sale (or free) online. You can hardly read a production story now that doesn’t feature 3D set or character extensions.

I should probably add motion tracking as another technology coming into its own, because it’s an essential part of the incorporation of actors into 3D sets, or the enhancement of character with 3D character extensions.

Larger Sensors

Fairly obvious to all, I would think, but the trend toward larger sensors includes the DSLR trend as well as RED Digital Cinema and the new Arri Alexa. Wherever you look the trend is toward larger sensors, with their sensitivity improvements, greater control of depth of field and drop-dead gorgeous pictures. Among other uses, they make perfect plates for backgrounds in green screen work!

All four trends (plus motion tracking) contribute to reducing production cost, making more shows viable as audiences fragment ever further.

Categories: Distribution, Media Consumption, Studio 2.0, The Technology of Production

What about the iPad and Media Production?

On October 31 last year Edo Segal wrote an article on TechCrunch with the title For The Future Of The Media Industry, Look In The App Store. The article is definitely worth a read but this jumped out at me:

But the entertainment industry has a vested interest in the success of this new type of convergence, as within it lies the secret to its continuing prosperity. The only way to block the incredible ease of pirating any content a media company can generate is to couple said experiences with extensions that live in the cloud and enhance that experience for consumers. Not just for some fancy DRM but for real value creation. They must begin to create a product that is not simply a static digital file that can be easily copied and distributed, but rather view media as a dynamic “application” with extensions via the web. This howl is the future evolution of the media industry.

It brings together some of the thinking I’ve been doing on how to counter the loss of revenue – from direct consumption or from advertising – when digital files of programming and music are so easily shared and copied. Techdirt.com like to summarize their approach as CwF + RtB = financial success: Connect with Fans and give them a Reason to Buy some scarce goods. Many musicians are already doing this, and the results are summarized in the article The Future of Music Business Models (and those who are already there).

I agree that CwF + RtB is part of the future: we can’t charge for infinitely distributable digital goods, but we can charge for scarce goods (or services) promoted by the music.

But I’m not as sure that will work in the same way for the “television” business, which I define as “television-style programming, professionally produced” even if it’s never broadcast on a network or cable. Certainly it will be possible to sell merchandising around programming, and everyone is encouraged to do that.

I’ve also written and presented – as long ago as my Nov 2006 keynote presentation for the Academy of Television Arts & Sciences – that producers and viewers have to be more connected, even to the extent of allowing fan contributions.

Well, last night I had something of an epiphany that brought together Edo Segal’s thoughts and my own as I contemplated the implications of the recently announced Apple iPad.

As a brief aside, I find the iPad to be pretty much exactly what I was expecting (although I thought it might include a webcam for video chat) and interesting. Although I don’t see where it would fit in an iPhone/laptop world, I can see plenty of uses, particularly for media consumption. (For example, a family shares an iMac but each of the older children has their own iPad for general computing, only using the iMac for essays etc.)

But the iPad doesn’t really lend itself to static media consumption as it has been: the producer sending stories, fully finished and complete, to viewers who passively consume. That’s when the import of Edo’s comment struck me: there is more of a future in media consumption for those producers who create the whole environment. This has certainly been done by many movies and shows, but usually as consumption of information about the show rather than a rich interactive experience where fans of the show are as important as the producers.

The future of independent production and media consumption is an immersive environment (a website, or better yet an iPad app) with:

  • Content
  • Community (forums, competitions)
  • Access to the wider story, side stories or “back story” in various media formats
  • Character blogs
  • Cast and crew blogs
  • Fan contributions and remixes.

Such an experience would be almost a cross between a typical television program and a video game environment. Sure, programming is part of what can be consumed on the site, but there are also competitions, games and back stories, plus additional visual material edited out of the program source (and from additional shooting) using technologies like Assisted Editing.

Any unauthorized distribution of content will only be distributing the content, not the experience of the program in its full glory.

Now, there’s no particular reason why this couldn’t largely be done on a website, but it’s as an immersive iPad app that I think it will really be fantastic. The iPad is very immersive and tactile. It presents no “border” (i.e. browser window and other computer screen elements) to distract from the programming. And it begs to be interacted with, because simply holding it in place to watch a 22- or 44-minute show doesn’t look like it’s going to be all that great.

There’s one more selling point for the iPad: it allows in-app sales, so some of the “reasons to buy” can be sold very transparently without even leaving the app’s environment. Avatars, screen savers, certain games or activities might carry a small charge. Yes, even the media itself (or some of it) could carry a small transaction charge. Smooth, frictionless sales in an environment optimized to engage people in the story of the show.

Apple’s iTunes LP format is a very small start in this direction, building a micro-site around the album artwork. It’s powerful because it supports most modern web technologies and interactive features in a tight package (all, b.t.w., without Flash but looking a lot like Flash).

Edo has some further good ideas and I recommend reading the article at the top of this post.

Categories: Random Thought, Studio 2.0, The Business of Production, Video Technology

Why are most production workflows inefficient?

In my experience few productions – be they film or television – are well planned from a workflow perspective. It seems that people do what’s apparently cheapest, or what they have done in the past. This is both dangerous – because the production workflow hasn’t been tested – and inefficient.

In a perfect world (oh *that* again!) the workflow would be thoroughly tested: shoot with the proposed camera; test the digital lab, if one is involved; test the edit all the way through to the actual output of the project. Once the proposed workflow is tested it can be examined for efficiency improvements at every step. Perhaps there are software solutions for automating parts of the process that would be extremely valuable with only small changes to the way things are done. Perhaps there are alternatives that would save a lot of time and money if they were known about.

Instead of tested and efficient workflows, people tend to do “what they’ve done before”. When there are large amounts of money at stake on a film or TV series it’s understandable that people opt for the tried and true, even if it’s not particularly efficient because “it will work”.

Part of the problem is that people simply do not test their workflows. I’ve been involved with “film projects” (both direct to DVD and back out to cinematic release) where the workflow for post was not set until shooting had started. In one example the shoot format wasn’t known until less than a week before shooting started.

Maybe there was a time when you could simply rely on “what went before” for a workflow, but with the proliferation of formats and distribution outputs, there are more choices than ever to be made.

Which brings me to the other part of the problem. Most people making workflow decisions are producers, with input from their chosen editor. Chances are, unfortunately, that neither group truly understands the technology that underpins the workflow – or even why the workflow “works”. They know enough to get by, but my experience has been that most working producers and editors do not actively invest time in learning the technology and improving their own value.

And when they’re not working, they’re working on getting more work. Again, not surprising.

But somewhere along the way we need producers to research, and to listen to advisors (like myself) who do understand workflow and have a working knowledge of the changing technology that can make a particular project much more efficient to produce. I just have no idea how to connect those producers with the people who can help.

We’ve seen, in just a little under two years, how technology can improve workflows with even our relatively minor contributions:

  • Rent a couple of LockIt boxes (or equivalent) on set and save days and days synchronizing audio and video from dual-system shoots;
  • Log your documentary material in a specific way and take weeks off post production finding the stories in the material (producers can even do a pre-edit);
  • Understand how to build a spreadsheet of your titles, make a Motion Template and automate the production of titles (and changes to same) – see the sketch after this list;
  • If you know you can recut a self-contained file into its scene components, how does that change color correction for your project?
  • Import music with full metadata.
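As an illustration of the spreadsheet-driven titles idea (the third item above), here’s a minimal sketch in Python. The column names (“name”, “role”) and the plain-text output are assumptions for the example; a real pipeline would write whatever your Motion Template or other titling step expects:

```python
# Minimal sketch: turn a spreadsheet of lower-thirds into one text file per
# title. Column names ("name", "role") and the output format are assumptions
# for illustration; a real pipeline would write whatever your title template
# expects.
import csv
from pathlib import Path

def generate_titles(csv_path: str, out_dir: str = "titles") -> None:
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for i, row in enumerate(csv.DictReader(f), start=1):
            # One file per title; re-running after spreadsheet changes
            # regenerates every title, which is the whole point.
            text = f"{row['name']}\n{row['role']}"
            (out / f"title_{i:03d}.txt").write_text(text, encoding="utf-8")

if __name__ == "__main__":
    generate_titles("lower_thirds.csv")
```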

These are all examples of currently available software tools – from my company and others – that are working to make post production more efficient. I wrote more about this in my Filling in the NLE Gaps article for DV Magazine.

My question, though, is how we encourage producers to “look around and see what’s available” and open up their workflows to a little modern technology. To this end, Intelligent Assistance is looking to work closely with a limited group of producers in 2010 to find ways to streamline, automate and make more robust their postproduction workflows. So, if you’re a producer who wants to save time and money in post, email me or post in the comments.

Have you got ideas on how to encourage producers to move toward more metadata-based workflows? How do we get the message out?