New (Potential) Distribution Opportunities

Ok, I’ll take a little break from talking about post-production, and discuss a little bit about what happens afterwards. Scott Kirsner recently posted his notes from last week’s meeting of the Institute for International Film Financing, a group which matches investors and filmmakers, and also researches new marketing and distribution opportunities. Cited in this meeting were a number of new developments and experimental technologies that could be specifically advantageous for independent films.

There are a number of internet-based sales and rental outlets, new HD cable channels, and portable media devices that are changing the way audiences watch films. MoDV sells SD cards containing compressed movies to be watched on PDAs, cell phones, and compatible media players. TiVo and other cable PVR companies are setting up video-on-demand systems. Netflix is looking into a download-based rental scheme. All of these new options will have serious ramifications for filmmakers, but the big battle will, as usual, probably be between Apple and Microsoft.

Apple has scored a tremendous advantage with their new Video iPod. For the last few years they’ve made themselves the largest online distributor of movie trailers while building their iTunes Music Store. Now with a dedicated video player and a few downloadable TV shows (update: as of this morning there are new shows available), they’re the first on the block. But the viPod isn’t the most comfortable movie-watching machine, and Microsoft just released the Xbox 360 as the ultimate, all-in-one, HDTV, surround-sound, living room entertainment system. There isn’t a highly successful media store behind it yet, but Xbox Live subscribers can already download HD movie trailers and music videos.

Then there is Sony. Sony is in a unique position. Not only do they have the upcoming PlayStation 3 and PlayStation Portable to compete with the Xbox and viPod, they also own a huge amount of their own content as a film distribution company and a record label. They’re coming in late in the game, but they’re also providing the Blu-ray disc technology that Microsoft plans on using in the future. It’s a tough game to call at this point. Check the archives of CinemaTech for more information on all the complex factors.

Matte Paintings

Now that we’ve covered basic compositing enhancements and sky replacements, let’s move on to matte paintings, which are among the earliest visual effects ever documented and have been used almost as long as movies have existed.

The concept is simple: an image was painted onto glass, leaving clear areas through which live-action elements could be seen. These paintings could then be attached to the front of the camera, or superimposed onto the film later. They were traditionally used to extend sets, particularly on period films. Often the lower story of a historic building would be built, with the rest painted to carefully match the perspective and lighting of the shot.

Unfortunately, this technique required the camera to be stationary, or for the two elements to be combined optically with a simple pan, tilt, or zoom added afterward. The ability to create a painting that matches reality on film is a unique art, and it takes great technical skill to blend the painting and the footage together seamlessly. Here’s an example of how this worked, and a few mattes painted by the old masters.

As optical printers and finally digital compositing developed, more techniques were invented to enhance matte paintings. Photographic elements like smoke and rippling water could be added to the final composite. Digital compositing allowed even more freedom, with matte paintings mapped onto multiple 3D layers and tracked to moving footage.

Below is a quick matte painting I put together. I didn’t have any footage designed for anything like this, so this is just the painting on its own. I found a few pictures of the Blue Ridge mountains to use as reference, and whipped together eight quick layers in Corel Painter 9:


And there they are. I painted the mountains pretty flat, because I planned to add some 3D rendered mist elements in between each of the ranges. Then I tweaked the color a little, added a glow pass, and put together a camera move that separated the layers. I could have done this using After Effects’ 3D layers, as shown in this excellent tutorial by Andrew Kramer, but it was a simple horizontal truck, so I just panned each one individually at a different speed. Check a video file of the pieces and finished animation here (1.4mb, mpeg4 avi).
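The arithmetic behind panning each layer at a different speed is just an inverse-depth scale. Here is a minimal sketch of how the per-frame offsets for a horizontal truck might be computed; the depth values are made up for illustration:

```python
# Parallax pan: layers farther from the camera slide more slowly than
# near ones, which is what sells the fake 3D truck. The depth values
# below are hypothetical; 1.0 is the nearest ridge.

def layer_offsets(depths, camera_x):
    """Horizontal pixel offset of each painted layer for a camera
    that has trucked camera_x pixels to the side."""
    return [camera_x / d for d in depths]

# Four mountain layers: each frame the camera trucks 10 px, the nearest
# layer moves the full 10 px and the farthest only a fraction of that.
offsets = layer_offsets([1.0, 2.0, 4.0, 8.0], 10.0)
```

In practice, each offset simply becomes that layer’s position keyframe on that frame.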

Color Correction and Sky Replacement

I’d like to follow up yesterday’s post by giving a few examples of ways that simple and cheap digital effects can improve the average low-budget film. The most obvious and probably the most useful tool in post production is digital color correction. Almost every software-based non-linear editor can adjust the hue, saturation, and levels of any video clip. This is important for those filmmakers shooting on video who would like to emulate, or at least simulate, the warmth and depth of film.

And as handy as those simple controls are, compositing programs like After Effects and Shake can accomplish much more. Individual parts of a scene can be isolated and tweaked. Sharpening or blurring can emphasize or obscure certain details. Minor camera shakes can be fixed. Glows and blooms can be added to highlights. Specific colors can even be altered, turning green spring foliage into a warm autumnal environment.
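To make that kind of adjustment concrete, here is a minimal sketch in Python with NumPy of a levels-plus-tint grade. The lift, gain, and tint numbers are invented for illustration, not taken from any real grading session:

```python
import numpy as np

def warm_grade(img, lift=0.02, gain=1.1, tint=(1.05, 1.0, 0.9)):
    """Apply a simple levels remap and warm tint to a float RGB image
    with values in [0, 1]. Gain stretches the levels, lift raises the
    blacks, and the tint boosts red while pulling down blue."""
    out = img * gain + lift
    out = out * np.array(tint)
    return np.clip(out, 0.0, 1.0)   # keep the result in legal range

# A flat mid-gray patch comes out slightly brighter and warmer.
graded = warm_grade(np.full((2, 2, 3), 0.5))
```

A real corrector works the same way, just with curves and per-region mattes instead of one global multiply.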

But sometimes changing what already exists isn’t enough, and new image data must be added. Video captures much less color depth than film, so when the camera is exposed for an actor’s face, the sky behind him will probably be completely blown out. With the right equipment and good polarizing filters, some of the sky’s color may be salvageable, but with highly compressed 8-bit DV video, it’s more likely to be pure white. This is where sky replacement can come in.

For years now, Hollywood effects houses have been adding new sky and cloud elements to locked-off plates when an image needs more dramatic kick, or when they need to simulate weather that wasn’t available on the set. However, this is now a very simple procedure for anyone with a basic working knowledge of almost any compositing package, and skies can now be tracked to moving and even hand-held shots. Andrew Kramer has an excellent tutorial on how this is done.
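At its most basic, the technique is a luminance key: anything bright enough to be blown-out sky gets replaced. Here is a rough sketch in Python with NumPy, assuming float RGB frames in [0, 1]; the threshold and softness values are illustrative, and a real shot would need a softened, tracked matte:

```python
import numpy as np

def replace_sky(frame, sky, threshold=0.92, softness=0.05):
    """Composite a new sky behind a frame wherever the original is
    blown out. frame and sky are float RGB arrays of the same shape."""
    # Rec. 601 luma weights give a single brightness value per pixel.
    luma = frame @ np.array([0.299, 0.587, 0.114])
    # The matte ramps from 0 to 1 as luma crosses the threshold,
    # so the edge between foreground and sky stays slightly soft.
    matte = np.clip((luma - threshold) / softness, 0.0, 1.0)[..., None]
    return frame * (1.0 - matte) + sky * matte
```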

Adding an image as simple as a blue sky or a few clouds can instantly make an image much richer and more filmic, and with a little planning on the set, you can shoot images that will integrate with dramatic sunsets and moving clouds, as shown below:

Visual Effects, News and Importance

It’s been kind of a slow week for news, and I’ve been a little too busy to write up any tutorials, reviews, or articles, so here’s a quick summary of a website I visit regularly: vfxblog, a site maintained by Australian effects expert Ian Failes. Ian combs the web daily for news and articles about visual effects in recent and upcoming feature films, and frequently posts exclusive interviews with VFX technicians.

Over the last couple of days, vfxblog has posted links to the new issue of Cinefex (including a Peter Jackson interview), a number of articles from Film and Video magazine, a podcast about using moving cameras from Scott Squires’ blog, and the latest video production diary from King Kong.

I’m going to try not to turn this website into a purely effects-and-animation blog, but that is my background, and it is something that I find interesting. And regardless of how interesting or boring the latest whiz-bang effects are to most readers, even the basic and humble independent film can benefit from a little digital manipulation. Over the last ten years, special effects have gone from an expensive way to show things that can’t be filmed to a more affordable way to present things that are merely cost-prohibitive.

And even the most technophobic of independent filmmakers should be aware of some of the new advancements that can add value to their final films without expanding their budgets too much. After all, some of these recently available technologies are the entire reason that we have a budding independent industry and affordable equipment, so have a look through some more of the links at left.

BBC Video Standards

I’m following up my last post on video scopes, signals, and levels with a link to some more information. For those of you who would like to learn more about video quality and broadcast standards, check out the BBC’s page on “policies, requirements, standards and best practice guidelines.” Available for download are PDFs and Word files on exactly what is required for BBC programming, as well as guides on widescreen and HD. Most television stations will have slightly differing lists of requirements, but BBC broadcast technicians are nothing short of legendary, and the Beeb’s rules are generally known as the highest industry standard around the world.

Admittedly, this data is less important to readers who are film snobs or hoping to avoid broadcast outlets, but anyone working with video of any form would do well to look over these lists. A rough working knowledge of what makes a quality video signal can be just as helpful as knowing what makes a quality image, and it’s easier to learn. If you can’t afford to hire a video engineer who already knows this stuff, have a quick look. It can also be helpful to compare these stats to the specs of any video hardware you are planning to purchase. You can also read some more in-depth technical details on the Society of Motion Picture and Television Engineers website, but that is much heavier going.

An Introduction to Video Scopes

There are a number of different types of video scopes, and various editing and compositing software packages will offer different diagnostic tools for analyzing images, but the underlying concepts are the same for measuring brightness and color. Below are examples of how standard NTSC color bars look when displayed on each scope.

The Waveform Monitor displays the luminance, or the black and white levels in your picture. Each white dot in the scope represents the luminance, or gray-value, of a pixel in your video image.

The display corresponds directly to the image from left to right: looking across the scope is like looking across the frame. This means you can glance at a scope and immediately tell where the dark and bright areas are, and roughly where they fall in the image itself.

Traditional waveform monitors are oscilloscopes configured for television monitoring, measuring the raw voltage of the video signal to check that all the pulses and scans of the signal are occurring at the proper times.

The main purpose of measuring the voltage was to make sure that the video levels didn’t exceed 100% or fall below 7.5%, as that would cause problems for the analog video signal. With digital video, top levels should not exceed 110%; signals that are too high will clip and blow out, and details that fall below black will disappear.
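The same idea is easy to sketch in software: for each column of the frame, plot every pixel’s luma value as a dot at the corresponding height. The Python/NumPy sketch below is a simplified illustration, not how any particular editor implements its scope:

```python
import numpy as np

def waveform(img, levels=256):
    """Build a software waveform display from a 2D float luma image
    in [0, 1]. Returns a (levels, width) array of dot counts, with
    row 0 representing 100% white, like the top of a hardware scope."""
    height, width = img.shape
    quantized = np.clip((img * (levels - 1)).astype(int), 0, levels - 1)
    scope = np.zeros((levels, width), dtype=int)
    for x in range(width):
        # Count how many pixels in this column land on each luma level.
        counts = np.bincount(quantized[:, x], minlength=levels)
        scope[:, x] = counts[::-1]   # flip so white plots at the top
    return scope
```

A bright window on frame-left shows up as a cluster of dots near the top of the scope’s left side, which is exactly how you read the hardware version.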

In Appreciation of HD for Indies

I’d like to point all my readers to the site of my good buddy Mike Curtis. When I say he’s my buddy, I should admit that we’ve really only exchanged a couple of emails, but the quality and comprehensiveness of his fantastic video and film tech website has made him an invaluable friend to me and pretty much everyone in the independent film industry. HD for Indies is the undisputed, number one, best source for news and information about the latest HD gear, specifically from a filmmaker’s perspective. In addition to the daily news posts, there’s an archive of 1600+ articles to search through, so it’s a good place to start.

I’m not a big believer in the “Apple is the Savior of Movies” gospel, but unlike a lot of tech writers, Mike actually practices what he preaches, and is currently starting up an Austin-based post facility for HD color correction and “sweetening.” For an introduction to where things currently stand with HD filmmaking, an excellent two-part interview with Mike was recently posted: here are parts one and two.

Lenses: P+S Technik Mini35 Review

One of the reasons that videotape never looks quite like film is the lenses. As important as the actual film stock is, the image will only be as good as the lens it has been shot through. This isn’t to say that all video lenses are low-quality, but they certainly handle the image differently. The reason is that a lens for a film camera must project an image onto a piece of film 35mm wide. This is a big image, and it takes a big lens. Most professional video cameras have 2/3″ CCDs, which are quite a bit smaller. Cheaper cameras may have 1/3″ or 1/6″ CCDs, which require even smaller lenses.

Small lenses have several problems. For starters, they are difficult to manipulate precisely, and imperfections can be more obvious. The most noticeable difference, however, is in the depth of field, or how much of the image is in focus. A film lens is larger and has a larger aperture, and the wider the iris is, the fuzzier the background will be when the foreground is in focus. A smaller video lens will produce a much deeper, sharper image. However, it is difficult to simply put a film lens on a video camera; a complex adapter is needed.

The Mini35, made by P+S Technik and reviewed recently, is a complete system for using 35mm lenses, matte boxes, filters, and tripod heads with a MiniDV camera such as the XL1 or the DVX100. It works quite simply: the film lens projects an image onto a ground glass plate, which the video camera then records. A variable-speed motor spins the plate so that less grain is visible. The whole setup is bolted onto a sleigh that turns the camera and lens into a complete unit.

The results are far more cinematic-looking, but there are a few drawbacks. For example, the weight and bulk of the adapter make it difficult and uncomfortable to shoot handheld, and a lot of light is needed to compensate for the ground glass. The price tag is also around $6000, but the Mini35 is at the top of its game, and cheaper, similar solutions may be available.
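The depth-of-field gap between the two formats can be put in rough numbers with the standard thin-lens formulas. The sketch below uses typical published circle-of-confusion figures and an arbitrary shooting distance, so treat the results as illustrative rather than measured:

```python
import math

def depth_of_field(focal_mm, f_stop, subject_m, coc_mm):
    """Return the (near, far) limits of acceptable focus in meters,
    using the standard hyperfocal-distance approximation."""
    s = subject_m * 1000.0                               # millimeters
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = s * (hyperfocal - focal_mm) / (hyperfocal + s - 2 * focal_mm)
    far = s * (hyperfocal - focal_mm) / (hyperfocal - s)
    return near / 1000.0, (far / 1000.0 if far > 0 else math.inf)

# A subject 3 m away at f/2.8: a 50mm lens on 35mm film (CoC ~0.030mm)
# holds only about 0.6 m in focus, while a short 10mm lens on a small
# CCD (CoC ~0.006mm) keeps roughly 4 m sharp.
film = depth_of_field(50.0, 2.8, 3.0, 0.030)    # ≈ (2.73, 3.33)
video = depth_of_field(10.0, 2.8, 3.0, 0.006)   # ≈ (2.00, 6.03)
```

That difference is exactly what the Mini35’s ground glass buys back: the video camera photographs an image that was formed by the big film lens, shallow focus and all.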

Lighting Breakdown

Following on my recommendations for lighting rigs, here is a series of tutorials on lighting. It was written by Richard Harris, an animator and painter, so most of his examples explain how to recreate realistic lighting conditions in 3D and on canvas. That might be less useful to videographers looking for practical lighting solutions, but he does a good job of explaining the terminology, and has a number of examples of exactly what different lighting setups look like.

Lighting Rigs for Film and Video

It’s been a while since I dug into my e-mailbox for questions, and I apologize. First on my list is a query from a team of brothers who are interested in investing in a lighting kit for their production company. At the moment, they are shooting most of their projects on DV, but their long-term plans involve feature projects shot on film. They wrote to ask whether the lighting gear that their video projects require would still be usable on their films.

Fortunately, film cameras and video cameras have similar technical lighting requirements. If you buy a simple lighting rig for video use, those lights will be very suitable for lighting a 35mm film shoot (depending on the film stock and lenses that you may be using). The only real difference between the two formats is that they generally call for different styles of lighting. If you have to light an entire set rather than one interviewee, you will obviously need more lights. However, the color temperature and intensity of a standard 650-watt halogen light are ideal for film.

Owning your own lighting kit is very helpful, for a number of reasons. Firstly, if you have the gear, you have the freedom to experiment with it and practice different styles. Secondly, it lets you be more flexible in shooting, and you can jump onto projects quickly without having to organize gear. Unfortunately, professional lights and light stands are not cheap. Fortunately, they are pretty sturdy, so if you can find used lights in reasonable condition, they should be more than adequate. Starting videographers shouldn’t need anything much more complex than a simple three-point lighting rig.

The first is the key light, the main light that illuminates the subject. Then there is the fill light, which is softer, and placed on the opposite side from the key light to fill in the shadows. Finally, there is the kicker, or backlight, which sits behind the subject and puts a little edge-lighting onto the subject. The key and fill lights should be either 650 watts or 1000 watts; these are the two main sizes of light used in video. Film shoots may require much larger lights for bigger sets or location shooting.

The kicker light, on the other hand, doesn’t need to be anything special. A regular hardware store clamp lamp with a 75- or 100-watt bulb is all that is needed for backlighting. As you experiment with lighting techniques, you’ll get a better idea of how to use diffusion, colored gels, natural sunlight, and reflected or “bounce” light to get the results that you want.