As you may have noticed lately, it's been difficult to have any serious discussion about the present and future of the commercial movie industry without falling into conversations about the specter of artificial intelligence. Hollywood's simultaneous embrace and fear of AI fuels nearly every conversation about the commercial film world, from the macro looming threat that human artisans are en route to obsolescence to the micro suspicions that what we're watching might not be "authentic" (witness last year's whipped-up Oscar kerfuffle over The Brutalist with its Hungarian-language voiceover correction and architecture
mockups). The epistemological quandaries of what constitutes a verifiable "human" touch in our art, popular or otherwise, are an entire can of worms that we'll be rummaging through for decades to come. But questions around how the latest technologies are integrated into our moving image media, and how that affects our ideas of what that media can—or, trickier, should—be, have forever been a part of the cinematic question: its life and continually heralded death.

The essence of film has always been inseparable from the trappings of the technology that makes it possible, and nearly every new advancement has come with skepticism if not dire warnings. The advent of sound in 1927, the mainstream usage of color stock in the 1930s, the widening of aspect ratios and film gauges for shooting and projecting in the 1950s, the rise of video technologies in the 1960s and 1970s, the development and application of computer-generated special effects in the 1970s and 1980s, the mainstream expansion of IMAX in the 1990s, the digital revolution of the 21st century—these are only the most widely known and seismic examples of industry game-changers, each of which proved more or less definitive despite resistance in its day. Such industry alterations happen so gradually that it's often difficult to notice when they have become the accepted norm—but they always do.

For every major sea change, there are countless examples of smaller technological advances that have had profound if less obvious repercussions for our cinema, from Hollywood blockbusters to more avant-garde spectacles. The development of different methods of sound recording, from sound-on-disc to sound-on-film, to portable Nagras; of the multiplane camera for animation, of the infrared film camera, of lens calibration, of advancements in stop motion, of portable camera equipment, of blue and green screens; of AVID, of bullet time, of motion capture and de-aging; of miniDV, GoPros, and 360 cameras. The list goes on. The question is always how we use the tools we've been gifted or cursed with.

For this new symposium, we asked our contributors to pitch an idea for an essay centered around a film that somehow utilized or enabled a technology—relatively new or more widely available at the time of its making—that was indivisible from the experience, meaning, or aesthetics of the film itself. The film could come from any point in cinema history, from the distant past to our rapidly advancing present. How did the film itself work with or even against this technology? Is the film an example of progressive advancement or does technology uncover the medium's essential conservatism? How much should the discussion of technology be a part of our critical evaluation of a film and of the history around it?

The questions are endless, so we left them for the writers to uncover. The results are wide-ranging, from silent magic films from the turn of the twentieth century to the textures and possibilities of digital sound in the twenty-first. And there's so much in between. Check back for new articles weekly.