The Artifice of Videography at 24 Frames per Second

From Rodney Graham's Torqued Chandelier Release, 2005, 35mm film loop shown on a custom 48 FPS projector

In November 2012, the last packages of light-sensitive film vanished from the racks of my local department store. The meager supply of 35mm roll film and disposable cameras disappeared, and with it came the realization that the changeover from analog to digital image acquisition is all but complete. Equally visible changes are happening in the cinemas, where video projectors have steadily replaced film projectors. However, one curious, rarely questioned holdover from the analog era persists among many motion photographers to this day.

The current trend of using digital filters to artificially age or alter one's snapshots has been criticized extensively, but this editorial is not about the artifice of premature aging or planned glitches. It is about an odd trait of motion picture film that lives on in the many digital cameras, video cameras, and smartphone apps whose superior functionality hastened the decline of film in stores and cinemas.

From an advertisement for the Canon 5D Mark II

In the nascent years of motion picture photography, there were no hard-and-fast rules on how many images should be captured per second. With both cameras and projectors hand-cranked by individual operators in those early days, the number of frames seen every second by viewers in the silent film era could vary from 12 to 26 FPS (frames per second) under typical viewing conditions. By the 1930s, after decades of wildly varying frame rates, the addition of sound dictated a constant playback speed to avoid variations in audio pitch (ever try to play a record by moving the turntable with your finger?). The standard of 24 FPS was established as the minimum rate at which professional movie cameras and projectors would record and project still images to convincingly create the illusion of motion. It was the lowest frame rate at which films could be seen without a pronounced flicker; any higher speed would be a waste of film. By this time, early television experimenters were working out their own technical guidelines, but they were bound by a different set of constraints.

TV engineers in countries with 60-hertz electrical grids adopted video transmission and recording systems that operated at 30 FPS, and engineers in 50-hertz countries chose 25 FPS. (Setting the frame rate to an even division of the mains frequency reduced interference from the power grid.) Both systems employed a bandwidth-saving technique called “interlacing” that split each frame into two alternating fields of odd and even scan lines, which further increased the perceived frame rate. When these video standards were adopted mid-century, the frame rate of professional motion picture film was already set, leaving film and video with their own distinct “look.” Film became the more expensive option for moving image recording soon after the debut of videotape in the late 1950s, but the lower quality of standard-definition video was obvious to all.

For many viewers of television and cinema, 24 FPS could easily be distinguished from 30 or even 25 FPS, and the slightly blurred action of 24 FPS became one of the most recognizable traits of the "film look." Thus, 30 FPS became associated with low-budget, shot-on-video movie productions. Film was the preferred medium of any auteur seeking high technical standards and mainstream credibility. For several decades, video was a last resort, and film was a badge of honor. For the aspiring filmmaker of the 1990s, various workarounds became available to make video recordings resemble film. 25 FPS video cameras intended for the European market were used to acquire footage, and the video was slowed down in post-production to 24 FPS (since 24/25 = 0.96, a slowdown of exactly 4%). This method also ensured a smooth transfer from video to film, should the budget-minded director be lucky enough to land a screening at a film festival, many of which were strictly film-only.

In the early 2000s, affordable video cameras appeared on the market that could acquire footage natively at 24 FPS. Filmmakers could finally bypass the work of adapting their video for film output in post-production. By this point, however, video projection systems were improving, and the need for a film print was becoming less pressing. To the delight of many, it became easy to make video content that was passably film-like, even with no prospective need for an actual film copy. In the past decade, high-definition video finally began to match or exceed the image quality of 35mm movie film as seen in a typical cinema. With the installation of HD video projectors in theaters, viewers are no longer exposed to glitches such as the wiggly vertical lines seen when film is scratched or errant pieces of thread jittering at the bottom of the frame. These artifacts are happily eliminated and forgotten by even the most ardent cheerleaders of film, yet some still insist on 24 FPS in a loop of circular reasoning.

We are indoctrinated to associate this frame rate with high-budget, sophisticated productions, but that association does not change the fact that the "film look" of 24 FPS video is ultimately a compression artifact. Continued use of this standard for contemporary purposes is little more than an Instagram-style "nostalgic filter," as garish and unexamined as a sepia-toned screenshot of an instant message conversation from a smartphone. There is no practical advantage to shooting new 24 FPS video, save for hand-drawn or stop-motion animation. With the end of film and the economic constraints that it imposed, I'm left to wonder what exactly 24 FPS video is aspiring to imitate.

Jesse England is a media artist who resides in Pittsburgh, PA.