Merry Christmas!

And what could be more appropriate for this post than a radio play version of the greatest Christmas movie ever? Well, a lot of things, I guess. However, a few days ago I came across the 1947 Lux Radio Theater adaptation of “It’s a Wonderful Life,” starring much of the original cast. The script is kept almost entirely intact, with just a few changes to better suit the radio format. I did a little poking around and found a higher-quality MP3 of the 60-minute show, which is 14 MB. Merry Christmas, everyone.

Blender Releases Version 2.40

And just in time for Christmas. I’ve been fooling around with it a little bit, but I’m tied up with work so I don’t have time to do any real benchmarks or tests. Still, I can say that it feels much, much better, and there are a lot of new features, which are kind of a mixed bag. The fluid dynamics module looks absolutely fantastic. The particle strand hair tool looks abominable. The character tools are getting better, but the radiosity is still pretty crummy. Basically, I’m of two minds about this package.

It’s getting to the point where it’s usable. It can import and export all the main file formats, offers generally acceptable animation and modeling tools, and is on its way to developing a decent renderer (although you might be better off using an external engine). And I like how it’s free. I generally like free things. However, I think that, paradoxically, Blender’s greatest strengths and its greatest weaknesses both stem from its open source status.

Don’t get me wrong, I love most open source software and the whole “open source” concept, as long as it’s not degenerating into anarchic idiocy or Marxist silliness. The other day I needed to update my cell phone’s contacts, and the free open-source solution worked much better than the buggy, bloated software that came from my provider. But a distributed network of freelance programmers, all updating the same program independently, is great for some projects, like a reverse-engineered telephone tool, and not so great for others, like a professional animation program.

If I build my production pipeline around Blender and run into a bug that holds my project up, there are no guarantees that I can get it fixed immediately, since the programmers who created the problem aren’t on salary. On the other hand, I do have access to the source code, so I can hire my own guys to try to fix it, but who knows how well that’ll work? Now let’s say I wanted to integrate the rigid body system with that glorious fluid dynamics engine, which is a logical upgrade and something that most pro packages support. With Blender, those two libraries were created completely independently, so making them talk to each other will be tough.

A few years ago I worked with a medium-small animation studio on a number of television programs, primarily using Newtek’s Lightwave. Newtek and the developers of the plug-ins we were using were always quick to solve any problems we might run into, sometimes getting us new upgrades overnight. That’s the kind of support you need during most productions. However well organized, open source doesn’t really offer that.

HD Cameras Revisited

Ok, I spoke too soon! It turns out the RED camera is in fact a real, bona fide, genuine project. Oakley founder Jim Jannard, noted camera enthusiast and photographer, is behind the development, although there don’t seem to be any progress reports or official press releases yet. Nevertheless, HD for Indies got an opportunity to talk to some of the engineers, and it would seem that everything’s on the up and up – the hype-filled but vague website appears to just be a viral marketing gag.

This is great news, but I’m pretty sure this potentially very expensive camera’s release isn’t exactly imminent, so here’s a little news on equipment we can get any day now. Right now there are two cameras that the independent film community is most interested in: the Panasonic AG-HVX200 and the Canon XL H1. Both cameras offer pro lenses and audio options, as well as HDV tape decks. I have an even lower opinion of HDV than I do of DV, but the neat thing about each of these cameras is that it offers a way to get footage out without going through HDV. The Panasonic can record a less-compressed image to solid-state P2 cards, and the Canon appears to be capable of running an uncompressed(?) feed out over SDI.

Unfortunately, just how useful these options are is still a matter of some conjecture, since these cameras are still pretty new. The HVX200 can pretty much only be found at trade shows, and the Canon XL H1 has only just started shipping. At the moment there’s been a lot of back-and-forth discussion about the pros and cons of each camera based on what’s been demonstrated by the manufacturers (usually off of HDV, which isn’t that helpful), but we have yet to see exactly what they can do in the hands of folks like us. Keep watching.

New (Potential) Distribution Opportunities

Ok, I’ll take a little break from talking about post-production and discuss a bit of what happens afterwards. Scott Kirsner recently posted his notes from last week’s meeting of the Institute for International Film Financing, a group which matches investors with filmmakers and also researches new marketing and distribution opportunities. Cited in this meeting were a number of new developments and experimental technologies that could be specifically advantageous for independent films.

There are a number of internet-based sales and rental outlets, new HD cable channels, and portable media devices which are changing the way that audiences watch films. MoDV sells SD cards containing compressed movies to be watched on PDAs, cell phones, and compatible media players. TiVo and other cable PVR companies are setting up video-on-demand systems. Netflix is looking into a download-based rental scheme. All of these new options will have serious ramifications for filmmakers, but the big battle will, as usual, probably be between Apple and Microsoft.

Apple has scored a tremendous advantage with their new Video iPod. For the last few years they’ve made themselves the largest online distributor of movie trailers while building their iTunes Music Store. Now, with a dedicated video player and a few downloadable TV shows (update: as of this morning there are new shows available), they’re the first on the block. But the viPod isn’t the most comfortable movie-watching machine, and Microsoft just released the Xbox 360 as the ultimate, all-in-one, HDTV, surround-sound, living-room entertainment system. There isn’t a highly successful media store behind it yet, but Xbox Live subscribers can already download HD movie trailers and music videos.

Then there is Sony. Sony is in a unique position. Not only do they have the upcoming PlayStation 3 and the PlayStation Portable to compete with the Xbox and the viPod, they also own a huge amount of their own content as a film distribution company and a record label. They’re coming in late in the game, but they’re also providing the Blu-ray disc technology that Microsoft plans on using in the future. It’s a tough game to call at this point. Check the archives of CinemaTech for more information on all the complex factors.

Matte Paintings

Now that we’ve covered basic compositing enhancements and sky replacements, let’s move on to matte paintings, which are among the earliest visual effects ever documented and have been used almost as long as movies have existed.

The concept is simple: an image was painted onto glass, leaving clear areas through which the live-action elements could be seen. These paintings could then be mounted in front of the camera, or superimposed onto the film later. They were traditionally used to extend sets, particularly on period films. Often only the lower story of a historic building would be built, with the rest painted to carefully match the perspective and lighting of the shot.

Unfortunately, this technique required the camera to be stationary, or for the two elements to be combined first and a simple pan, tilt, or zoom added optically. Creating a painting that matches reality on film is a unique art, and it takes great technical skill to blend the painting and the live footage together seamlessly. Here’s an example of how this worked, and a few mattes painted by the old masters.

As optical printers and finally digital compositing developed, more techniques were invented to enhance matte paintings. Photographic elements like smoke and rippling water could be added to the final composite. Digital compositing allowed even more freedom, with matte paintings mapped onto multiple 3D layers and tracked to moving footage.

Below is a quick matte painting I put together. I didn’t have any footage designed for anything like this, so this is just the painting on its own. I found a few pictures of the Blue Ridge Mountains to use as reference and whipped together eight quick layers in Corel Painter 9:



And there they are. I painted the mountains pretty flat, because I planned to add some 3D-rendered mist elements in between each of the ranges. Then I tweaked the color a little, added a glow pass, and put together a camera move that separated the layers. I could have done this using After Effects’ 3D layers, as shown in this excellent tutorial by Andrew Kramer, but it was a simple horizontal truck, so I just panned each layer individually at a different speed, as sketched below. Check a video file of the pieces and finished animation here (1.4 MB, MPEG-4 AVI).
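If you’re curious what that kind of faked move boils down to, here’s a minimal sketch of the idea in Python with Pillow, assuming each painted layer has been exported as its own wider-than-frame PNG with an alpha channel. The filenames, travel distances, frame size, and frame count are all made up for illustration; the only real trick is that the distant layers slide a little while the near layers slide a lot, and the parallax sells the truck.

```python
# A faked horizontal truck: each painted layer pans at its own speed.
# Layer filenames and travel distances are invented; farther layers move less.
from PIL import Image

FRAME_W, FRAME_H = 720, 480          # output frame size
FRAMES = 96                          # 4 seconds at 24 fps

# (filename, total horizontal travel in pixels), listed back to front
LAYERS = [
    ("sky.png",         20),
    ("far_ridge.png",   60),
    ("mid_ridge.png",  140),
    ("near_trees.png", 260),
]

plates = [(Image.open(name).convert("RGBA"), travel) for name, travel in LAYERS]

for f in range(FRAMES):
    t = f / (FRAMES - 1)                          # 0.0 -> 1.0 across the move
    frame = Image.new("RGBA", (FRAME_W, FRAME_H))
    for plate, travel in plates:                  # composite back to front
        x = -int(t * travel)                      # near layers slide farther
        frame.paste(plate, (x, 0), plate)         # plate's alpha is the mask
    frame.convert("RGB").save(f"truck_{f:03d}.jpg", quality=95)
```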

Color Correction and Sky Replacement

I’d like to follow up yesterday’s post by giving a few examples of ways that simple and cheap digital effects can improve the average low-budget film. The most obvious and probably the most useful tool in post production is digital color correction. Almost every software-based non-linear editor can adjust the hue, saturation, and levels of any video clip. This is important for those filmmakers shooting on video who would like to emulate, or at least simulate, the warmth and depth of film.
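To make that concrete, here’s a rough sketch of the arithmetic behind a basic grade, written in Python with NumPy rather than in any particular editor’s controls. The black point, white point, gamma, warmth, and saturation values are invented for illustration, not a recipe.

```python
# Sketch of a basic "warm it up" grade: levels, gamma, a nudge toward red,
# and a saturation boost. All the numbers here are invented for illustration.
import numpy as np

def grade(frame, black=0.05, white=0.95, gamma=1.1, warmth=0.04, sat=1.15):
    """frame: float RGB array in 0..1, shape (height, width, 3)."""
    img = np.clip((frame - black) / (white - black), 0.0, 1.0)   # levels
    img = img ** (1.0 / gamma)                                   # gamma (midtone lift)
    img[..., 0] = np.clip(img[..., 0] + warmth, 0.0, 1.0)        # warm: more red
    img[..., 2] = np.clip(img[..., 2] - warmth, 0.0, 1.0)        #       less blue
    luma = img @ np.array([0.299, 0.587, 0.114])                 # Rec. 601 luminance
    img = luma[..., None] + sat * (img - luma[..., None])        # saturation boost
    return np.clip(img, 0.0, 1.0)
```

Run on a float RGB frame scaled to 0–1, this lifts the midtones, pushes the balance toward red, and boosts the saturation slightly – roughly the “warmth and depth” tweak described above.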

And as handy as those simple controls are, compositing programs like After Effects and Shake can accomplish much more. Individual parts of a scene can be isolated and tweaked. Sharpening or blurring can emphasize or obscure certain details. Minor camera shakes can be fixed. Glows and blooms can be added to highlights. Specific colors can even be altered, turning green spring foliage into a warm autumnal environment.

But sometimes changing what already exists isn’t enough, and new image data must be added. Video captures far less dynamic range than film, so when the camera is exposed for an actor’s face, the sky behind him will probably be completely blown out. With the right equipment and good polarizing filters, some of the sky’s color may be salvageable, but with highly compressed 8-bit DV video, it’s more likely to be pure white. This is where sky replacement comes in.

For years now, Hollywood effects houses have been adding new sky and cloud elements to locked-off plates when an image needs more dramatic kick, or when they need to simulate weather that wasn’t available on the set. Today, though, this is a very simple procedure for anyone with a basic working knowledge of almost any compositing package, and skies can now be tracked to moving and even hand-held shots. Andrew Kramer has an excellent tutorial on how this is done.
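For a sense of what the simplest version looks like under the hood, here’s a crude luminance-key sky replacement sketched in Python with NumPy. This isn’t the tutorial’s method, and it skips the tracking and edge cleanup a real shot would need; the threshold and softness values are invented, and the assumption is that anything near pure white in the locked-off frame is blown-out sky.

```python
# Crude luminance-key sky replacement: pixels near pure white are treated as
# blown-out sky and cross-faded into a new sky plate of the same size.
import numpy as np

def replace_sky(frame, sky, threshold=0.92, softness=0.05):
    """frame, sky: float RGB arrays in 0..1 with the same shape."""
    luma = frame @ np.array([0.299, 0.587, 0.114])           # Rec. 601 luminance
    # Matte ramps from 0 (keep foreground) to 1 (use sky) across the soft edge.
    matte = np.clip((luma - threshold) / softness, 0.0, 1.0)[..., None]
    return frame * (1.0 - matte) + sky * matte
```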

Adding an element as simple as a blue sky or a few clouds can instantly make an image much richer and more filmic, and with a little planning on the set, you can shoot footage that will integrate with dramatic sunsets and moving clouds, as shown below: