A few weeks ago, I wrote about the importance of building production pipelines around the workload of the studio. I then wrote in my article about Adobe’s Production Studio that it was powerful and flexible enough to be ideally suited to a number of different production environments and workflows. This prompted a few emails asking for examples of what I meant. Let’s take a moment to reflect on some recent projects and daydream about some potentially upcoming ones…
Earlier this year I completed a promotional video using the tools in the Adobe bundle. Because it was a relatively simple and short project, I whacked out the whole thing myself on two computers. I began with video shot on the Canon XL2, and edited this together with stock footage in Premiere. Some of the materials needed only minimal color tweaking to get a good match, but the majority required considerable adjustment. For example, I was combining interlaced video, 24p DV, and telecine’d 35mm into what was meant to be a 30p master.
Premiere enabled me to edit these divergent framerates in one timeline, which I could then import into After Effects for fixing on a shot-by-shot basis. After considerable deinterlacing, color adjustments, glow passes, sky replacements, and gradient overlays, it was time to change the mostly fullscreen shots into widescreen. Because the final was meant for exhibition on 16:9 widescreen displays, I used InstantHD to stretch the video into 16:9 rather than a letterboxed 4:3. At this point I also created a number of technical animations in After Effects, using Photoshop and Illustrator for some of the elements.
The music, sound effects, and narration were edited and mixed in Audition, and mastered to a full 5.1 surround track. The audio and video files were combined in Encore, and I was able to create a seemingly random set of loops that was controllable by DVD remote without using the menu. This is a non-standard approach, but it was something I specifically needed for the job spec, and so I was impressed that Encore gave me enough control over the authoring process to do this. All in all, the project was a great success, and the close integration between the apps sped the process up considerably.
Admittedly, this was a one-man show, but larger projects requiring more staff could get the same benefits. In a previous, much larger project, we used Premiere and After Effects together across several workstations seamlessly, sharing assets even in the previous, non-Dynamic-Linked versions of the software. Starting with a naming convention that allowed the interchange of DV video files, highly compressed MP3s, and transcribed text files, we created a simple proxy system to maximize productivity. With semi-automated backup and renaming tasks, untreated video files were replaced by the final versions within the edit as they were completed.
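To make that concrete, here is a minimal sketch of the kind of proxy swap I mean. The naming convention here is hypothetical (a `_proxy` suffix superseded by a matching `_final` file), not the one we actually used, but the mechanics are the same: back up the untreated file, then drop the finished version in under the name the edit already references.

```python
from pathlib import Path
import shutil

def promote_finals(edit_dir: Path, finals_dir: Path) -> list[str]:
    """Replace untreated proxy files in the edit with finished versions.

    Assumes a hypothetical convention: 'shot012_proxy.avi' in the edit
    directory is superseded by 'shot012_final.avi' from the finals
    directory, keeping the base name the edit project references.
    """
    promoted = []
    for final in sorted(finals_dir.glob("*_final.avi")):
        proxy = edit_dir / final.name.replace("_final", "_proxy")
        if proxy.exists():
            shutil.copy2(proxy, proxy.with_suffix(".bak"))  # backup pass
            shutil.copy2(final, proxy)                      # swap in place
            promoted.append(proxy.name)
    return promoted
```

Because the swapped-in file keeps the proxy's name, the edit project never knows the difference; it just looks better the next time you open it.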
With the metadata support in Adobe Bridge, this could be even more streamlined. My ideal setup for a video studio working on television programs and documentaries would consist of several Adobe seats, and possibly a video server; at the very least, distributed video storage shared over fiber. For this studio, the camera of choice would be the Canon XL H1, and post-production would either stick with HDV or a somewhat-less compressed intermediate codec for editing and effects. The backbone of the workflow would be a content management system that would allow editing and graphics stations to share assets simultaneously.
This would be even more important for film work. My ideal low-budget, post-production film studio would be slightly different. For starters, production would likely use the Silicon Imaging SI-1920HDVR camera, which looks like a fantastic machine. While it may not be as ambitious as the RED camera, it does have several advantages, one of which is that it exists today. Post-production would be based on an uncompressed or lossless codec, which would change the editing process slightly. Rather than editing full HD video, DV proxy files with identical timecode and metadata would be used. While Premiere can, with the right hardware, easily edit uncompressed HD, full-quality online editing is not required for feature work.
The content management system would pass the large, uncompressed files to the colorists and compositors for sky replacements, effects and matte paintings, and give lower-res, lower-bandwidth files to the editor and sound engineers. Weta Digital uses a similar system, but it’s a hugely complex system of virtual files and real-time processing. For my low budget films, auto-managed directories and file tracking would probably be sufficient. For final output, the Premiere edit project would be repopulated with the final full-rez video files and import the final audio mix.
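The routing logic at the heart of that system can be sketched very simply. The directory names and department roles below are illustrative assumptions, but the idea is exactly what I described: one logical shot name resolves to a different physical file depending on who is asking for it.

```python
from pathlib import Path

# Hypothetical role-based asset routing: each department pulls the
# quality tier appropriate to its bandwidth and needs.
ROLE_DIRS = {
    "colorist":   "fullres",   # uncompressed masters
    "compositor": "fullres",
    "editor":     "proxy",     # low-res, low-bandwidth proxies
    "sound":      "proxy",
}

def asset_path(root: Path, shot: str, role: str) -> Path:
    """Resolve which file a given department should pull for a shot."""
    quality = ROLE_DIRS[role]
    return root / quality / f"{shot}.mov"
```

With auto-managed directories like these, "repopulating" the edit for final output is just a matter of pointing the resolver at the full-res tier instead of the proxies.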
A step up to the next level would require more computing hardware, because HD just doesn’t cut it. If I were to start shooting a feature next month, it would be on 35mm. Next year, I’m not so sure. The film would be scanned in at 2k (some effects plates at 4k), and occupy a massive amount of storage. My proxy servers would need to be seriously upgraded and I would probably need a few more quality levels for all the post staff. At this point I would upgrade from Premiere to an Avid editing solution, and add a few Nuke seats to my compositing department.
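“Massive” is easy to quantify with a back-of-envelope calculation. The numbers below are illustrative assumptions (full-aperture 2k DPX frames, a 90-minute feature, a 10:1 shooting ratio), not figures from any real project:

```python
# Back-of-envelope storage estimate for a 2k film scan.
# Assumes 2048x1556 full-aperture DPX, 10-bit RGB packed into 32-bit words.
WIDTH, HEIGHT = 2048, 1556
BYTES_PER_PIXEL = 4      # three 10-bit channels per 32-bit word
FPS = 24
RUNTIME_MIN = 90         # assumed feature length
SHOOT_RATIO = 10         # assumed 10:1 shooting ratio

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
feature_tb = frame_bytes * FPS * RUNTIME_MIN * 60 / 1e12
print(f"{frame_bytes / 1e6:.1f} MB per frame")
print(f"{feature_tb:.1f} TB for the cut, ~{feature_tb * SHOOT_RATIO:.0f} TB of dailies")
```

Under those assumptions you are looking at well over a terabyte just for the finished cut, and an order of magnitude more once dailies are on disk; hence the seriously upgraded proxy servers.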
I would likely also delegate my final color pass to a real daVinci system, but the pipeline itself would probably be similar to the workflow of the video studio, if larger and more faceted. Logistical concerns would be different, and the size of the project would mean more redundancy in every area. The final output would probably involve a master print on film, and the output from the Avid would require a different compilation method. However, After Effects, Photoshop, and Audition would be used very heavily in pre- and post-production, and I’m sure Premiere would see plenty of action in rough on-set edits, animatics, and miscellaneous areas.
With careful planning and good research you’ll be able to build your pipeline ahead of time rather than spend your post-production budget trying to pound a square peg into a round hole. The film projects are just a couple of examples that I dreamed up to fit projects that I already have roughed out on paper. I’m sure I’d make a lot of changes if I were actually building these studios, but they are good starting points. Feel free to keep the questions coming for more specifics. And to all the people who commented on how cluttered and busy my screenshots were, I know. I have a big screen, and I like them that way.