So you saw Billy Lynn’s Long Halftime Walk, right? The latest film from double Oscar winner Ang Lee, the man behind celebrated films such as Brokeback Mountain, Life of Pi and, er, Hulk? No? Well, you’re not alone. The film hasn’t managed to recoup its budget and was met with middling reviews.
But it featured a notable technical innovation: it was shot at 120fps (frames per second). For those of you who know that the industry standard is 24fps, you’ll understand what a big deal this is. But regardless, the format suffered alongside the film, becoming another failed cinematic innovation.
While there are many examples of successful attempts to further the medium (sound, colour and, more recently, 3D), not all innovations have set the world alight. Many have flickered briefly, then faded into obscurity: anomalies on the cinematic evolutionary timeline, genetic mutations that never quite took hold.
Film is cyclical: genres and styles of filmmaking go out of fashion and come back around, but the medium also faces recurring challenges. The rise of 3D over the last decade has been a largely successful attempt to fight off the resurgence of television and the emergence of online streaming, and this is not the first time cinema has had to combat television. Cinema used to be the sole visual medium; with no competition, it reaped the rewards. Adjusted for inflation, Gone With The Wind overtakes Avatar as the highest-grossing film ever, with a staggering $3.4 billion.
But when television became a mainstream installation in homes across the U.S. during an era of renewed post-war optimism, cinema was tasked with keeping crowds queuing around the block. Why bother leaving the house when you have a screen to watch at home? Cinema’s main answer to getting people off their sofas was the invention of widescreen. This meant increasing the aspect ratio from the Academy ratio of 1.37:1 to the much wider 2.35:1.
If those numbers mean nothing to you, think of 1.37:1 as essentially square, the shape of the screen on your old bulky TV, and 2.35:1 as what is more commonly known as ‘letterbox.’ Widescreen offered a much larger image than TV could, a simple solution and, long story short, it was a success. But this is supposed to be about failures. Well, it was another innovation of this era, one that tried to go further than widescreen, that failed: Cinerama.
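To put those ratios in concrete terms, here’s a quick back-of-the-envelope calculation (an illustrative sketch, not anything from the production history):

```python
# An aspect ratio is simply width divided by height.
academy = 1.37      # the Academy ratio
widescreen = 2.35   # CinemaScope-style widescreen

# For a screen of the same height, widescreen is roughly 72% wider.
ratio = widescreen / academy
print(f"Widescreen is {ratio:.2f}x as wide as Academy for the same height")
```

In other words, for audiences of the era the screen didn’t just get bigger, it got dramatically wider, which TV sets of the time simply couldn’t match.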
Cinerama aimed to create a larger, more immersive image. It achieved this by using three cameras shooting through a single shutter, creating a panoramic image that imitated peripheral vision. Each camera used six-perforation 35mm film rather than the standard four, with matched wide-angle lenses, to create an incredibly detailed image. The three separate images were then projected in sync on a large curved screen; basically a triptych without the joins. Still here? It may sound complex, but this was actually the streamlined version.
Inventor Fred Waller went through many development stages, initially using eleven cameras to try to capture one image; no surprise that it didn’t work out. It was even developed into a training simulator during World War II, but involvement from Kodak and eventually Mike Todd (who went on to develop his own 70mm format) began the streamlining into a three-camera process.
Cinerama aimed to create immersion by mimicking human vision, the two outer images standing in for what you would see in your periphery. With noble intentions, it certainly started well for Cinerama. The first feature, This Is Cinerama, an introductory experience to the format, opened in 1952 and, despite playing in only one cinema, became the highest grossing film of that year.
It became a phenomenon. An initial review raved “The new motion-picture projection system known as Cinerama was put on public display for the first time last night before an invited audience… with due account for the novelty of the system, it was evident that the distinguished gathering was as excited and thrilled by the spectacle presented as if it were seeing motion pictures for the first time.” But it was only playing at that one theatre for a reason. Practicality.
The complexity of the projection and sound made it difficult to install, but worst of all, any willing cinema would have to install a special screen. It may have been worth the investment had there been enough films to screen, but the troublesome nature of shooting in the format severely limited the number of films released after This Is Cinerama.
The process had many technical problems. The camera was extremely heavy and noisy – if you’ve ever heard a film camera running, multiply that by three. When a lead blimp was put over the camera, the overall weight was 800lbs. The three cameras also increased the chance of failure: if one camera’s film became damaged or was incorrectly exposed, it rendered the other two useless as well. And this is before you even get to the problems with the image itself.
Shooting three separate frames concurrently creates problems of its own. Despite Cinerama’s success at mimicking the field of human vision, there were drawbacks. Conversation was particularly troublesome: if two people were talking in different frames, they appeared to be looking in different directions, not at each other. This could be overcome by having the actors look slightly away from each other, so that when viewed on film they appeared to be looking at each other.
Also troublesome: if an actor had to walk across the set, crossing over where any of the lenses’ fields of view met, they would become slightly distorted. They could drop slightly from one frame to the next, or arrive sooner or later than logic dictates. And while this arrangement worked at a distance, the fields of view would cross over if something came too close, which meant close-ups weren’t possible. This is obviously a huge problem: you can’t shoot a film using only wide and mid shots. But Cinerama persevered regardless.
This Is Cinerama was a travelogue, using a loose narrative journey to show off the technology. The format continued in this vein for the next decade, showing off exotic locales unfamiliar to the American public. It wasn’t until 1962 that it saw its first use in narrative features, with The Wonderful World of The Brothers Grimm and How The West Was Won. The latter was certainly fitting of Cinerama’s scope; its capturing of classic American vistas and action sequences was stunning, but it still fell foul of technical problems.
The lack of close-ups is particularly telling. The film feels strangely detached, almost theatrical when characters are placed so far away from the camera, taking up so little of the screen. The lighting also seems theatrical, since the camera’s field of view means lights can’t be hidden on set, obviously problematic for interiors, which had to be mainly lit from behind the camera. Cinematographer Joe LaShelle notes, “Everyone ended up being broad lit and you couldn’t use back light or anything. If you wanted to light to fit the mood of a scene you just couldn’t do it.”
How The West Was Won would be the high point for Cinerama, and the last feature the three-camera process was used on. Its many technical problems eventually reduced it to a travelling road show, and Waller’s death in 1954 meant it never developed beyond its initial stage. There are now only three cinemas in the world equipped to project Cinerama. As Christopher Frayling notes, “Like the monorail in cities of tomorrow, it was an idea of the future that soon became history.”
“First they moved (1895)! Then they talked (1927)! Now they smell!”
These were the proclamations of advertisements for 1960’s Scent of Mystery. You would think that a logical progression: first movement, then sound, then smell. Though, as you’ve no doubt noticed, smell is a sense cinema has not mastered.
The idea of combining smell with screen had been around since the early 20th century; Walt Disney even considered it for Fantasia, but it wasn’t utilised until Scent of Mystery. Developed by Hans Laube and backed by Mike Todd Jr., who took up his father’s mantle after his death, Smell-O-Vision used a series of ordered perfume containers, cued by marks on the film print to release their scents into the auditorium via piping. The process was extremely expensive to install, costing up to $1 million, and only three theatres were equipped with Smell-O-Vision for the premiere of Scent of Mystery; more than This Is Cinerama managed, but hardly worth boasting about.
The film’s scents were used to emphasise places and objects, and to underline story points, such as a key character being identified by pipe tobacco. But getting an audience to smell a film rather than just watch it proved difficult, and despite the costly installation, the film didn’t open to favourable reviews. Bosley Crowther wrote: “Patrons sit there sniffling and snuffling like a lot of bird dogs, trying hard to catch the scent.” And perhaps more damningly, from comedian Henny Youngman: “I didn’t understand the film—I had a cold.”
So the film’s scents didn’t catch the wind (so to speak), though technical problems were not the only reason. Another scent-aided film had in fact been released just weeks ahead of Scent of Mystery, in late 1959: Behind The Great Wall, whose process was dubbed AromaRama (an obvious play on Cinerama). This process released scents through an auditorium’s air-conditioning system. Its inventor, Charles Weiss, was fairly confident of its success: “We believe, with Rudyard Kipling, that smells are surer than sounds or sights to make the heartstrings crack.”
But that film didn’t open to favourable reviews either, perhaps hampering any chance of success for Scent of Mystery. There have been many instances of films on the same subject being released in the same year; one is bound to have an impact on the other. The poor reception of both films killed scented cinema stone dead, though it was never going to be a smooth process to incorporate.
Technical problems, such as the noise made by releasing the scents and audiences getting a whiff at different times as the scents naturally dissipated, were obvious oversights, but the format failed on a creative level as well.
Crowther described its artistic benefit as nil, and despite Weiss’s lofty aspirations for the technique, you have to wonder about its creative value.
Sure, smell can be a very evocative sense, but that’s usually because of its association with memory, which could easily backfire. The release of a certain scent may have a deeper resonance for some people and become distracting; you don’t want your means of immersion to become an aid to distraction. And then there’s the decision of which scents to include: you can’t add smell to every frame, and there are some things in films you don’t want to smell. Smell-O-Vision would certainly make the triumphant escape in The Shawshank Redemption memorable for the wrong reasons.
Scent-aided films have cropped up every now and then since; John Waters parodied the past failures with ‘Odorama’ for Polyester in 1982, and Robert Rodriguez used it for Spy Kids: All The Time in The World in 2011, though both films used scratch-and-sniff cards, which create their own technical problems, namely trying to scratch a designated space on the card in the dark. But the technique has clearly never taken off; Smell-O-Vision got up people’s noses for all the wrong reasons.
You’ve no doubt heard of THX, the main name in cinematic aural innovation, but it had a precursor: Sensurround. The process came to light in the 1970s, most famous for its use on 1974’s Earthquake. Sensurround used low frequencies to create a rumbling effect, an obvious enhancement for the film’s actual earthquake scenes. However, this was another system with drawbacks.
Sensurround was a success for Earthquake, but such low frequencies created problems. The following warning to viewers was part of a promotional poster: “Please be aware that you will feel as well as see and hear realistic effects such as might be experienced in an actual earthquake. The management assumes no responsibility for the physical or emotional reactions of the individual viewer.”
There were many reports of Sensurround’s negative impact; the effect, intended to recreate the feeling of an earthquake, was often too realistic. Earthquake’s premiere theatre, the Chinese Theatre in Hollywood, was damaged by the vibrations and had to install a safety net above the auditorium to protect the audience from falling debris. That would be a health and safety nightmare nowadays.
There were even reports of sickness and nosebleeds among audience members. Mark Collins reveals: “It wasn’t unusual for plaster, pictures and draperies to be shaken off the walls. All the speakers were right, left and lined up down-front, so they would cause a great sound pressure and oscillation that would directly affect the audience.” But the power of Sensurround was so great that its problems weren’t restricted to the auditorium.
In one location, the vibrations were strong enough to kill the goldfish in a pet shop in the same mall as the theatre. At another, the next-door hotel was evacuated because guests believed a real earthquake was occurring, and it was even reported that Liza Minnelli had to stop rehearsing due to a screening of the film in a theatre next door. It’s not hard to see why Sensurround didn’t catch on.
Sensurround did accompany a few more films (Midway in ’76 and Rollercoaster in ’77) but never took off. Despite being a technical success, able to amplify aspects of a film to great effect, Sensurround’s often dangerous side effects were clearly too big a drawback, and it was overtaken by other, safer systems like THX.
What? 3D isn’t a failure, right? Well, 3D had a life before Avatar reignited the format. While 3D may seem new, it in fact pre-dates cinema, with stereoscopy being a popular photographic technique as early as the 1850s. But it was first used theatrically in the 1950s, like widescreen, to combat television.
The first full theatrical releases in 3D were Bwana Devil in ’52 and House of Wax in ’53 (yes, the Paris Hilton version is a remake). These films were box office hits, the enticement of 3D clearly paying off. But this was another technique with technical problems, both in shooting and exhibition.
3D’s sudden popularity led to its use on Hitchcock’s Dial M for Murder in ’54, and there were struggles with the process early on. Hitchcock was so disappointed with the early footage that he destroyed the initial rushes and started again, stating: “They looked so odd–skimpy, un-finished–.” He was also not a fan of the 3D camera: “It’s a big, gross, hulking monster. It’s heavy and immobile and frightening.”
Hitchcock, technical master that he was, was able to overcome these early niggles, but it was exhibition that created more problems.
Though reviews for the film were good, projection was beset by technical issues, and of course the audience had to wear glasses, still a problem today.
When audiences were given the choice of 3D or 2D screenings, the verdict was pretty unanimous. Mildred Martin noted: “The first audiences proved to be a jury that could not only make up its mind, but could make it up in a hurry. In exhibitors’ own terms, DIAL M literally died.”
Even in the short time between Bwana Devil and Dial M, 3D’s popularity had significantly dwindled. Bob Furmanek and Greg Kintz elaborate: “3D movies were on their last legs. The studios had stopped shooting in the process in late October (’53) and the few films still on the shelf were released flat or had very limited 3D bookings.”
Technical problems were, again, 3D’s downfall, with projection issues and glasses largely at fault. It wasn’t until the process was digitised that its popularity began to rise again. James Cameron’s new, easier-to-exhibit 3D has seen the format succeed, though again its popularity is starting to wane. It may require Cameron’s help to revive interest when his Avatar sequels are finally released, reportedly with glasses-free 3D.
Billy Lynn isn’t the first film to be shot in HFR (high frame rate). That honour goes to The Hobbit: An Unexpected Journey, captured at 48fps. The idea behind HFR is to reduce motion blur and increase detail, both particularly problematic in 3D releases. Sean Kelly notes in American Cinematographer: “The smooth motion effect of 48-fps-3D provides for a much more comfortable and appealing viewing experience; the lack of strobing and artifacts helps the viewer relax into a more immersive experience…The 3D eye fatigue is minimized without sacrificing depth.”
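Those frame rates translate directly into shorter per-frame exposure times, which is where the reduced blur comes from. A rough sketch of the arithmetic (the 180-degree shutter here is an assumption, albeit a long-standing filmmaking convention):

```python
# With a rotary shutter, exposure per frame = (shutter_angle / 360) / fps.
def exposure_time(fps, shutter_angle=180):
    return (shutter_angle / 360) / fps

# Each jump in frame rate shortens the window in which motion
# can smear across a single frame.
for fps in (24, 48, 120):
    print(f"{fps} fps -> 1/{round(1 / exposure_time(fps))} s per frame")
```

At 24fps each frame is exposed for 1/48th of a second; at 120fps that drops to 1/240th, which is why fast motion looks so much crisper, for better or worse.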
But HFR creates obstacles, as cinematographer Andrew Lesnie notes: “The increased picture clarity that comes with shooting 5K images at 48 fps brings joys and horrors simultaneously. The need for attention to detail pervades every aesthetic aspect.”
Despite its intention to create a better viewing experience, David Bordwell notes: “Response was mostly sceptical, with critics complaining of the hypersharp, “soap-opera” effect onscreen.” This may be attributable to digital itself, which at high resolutions creates a different aesthetic to film: digital presents a stable image, while film is in constant motion. Shooting HFR on film, meanwhile, would be costly: The Hobbit shot over a petabyte of data, equal to 26 million feet of film.
There have been examples of HFR on film, though. One was Douglas Trumbull’s Showscan, which Roger Ebert described: “I have witnessed a believable 3D illusion… 70mm film at 60fps. It created the illusion of depth not by leaving the screen but by seeming to recede within it”. But he also noted: “It was too expensive for theatrical films”. You may recognise Trumbull’s name: he was behind many of the visual effects on 2001 and directed Silent Running and Brainstorm, the latter effectively ending his Hollywood career.
He has since spent his time on technical innovations such as Showscan, and his switch to digital led to a new process called MAGI cinema. This process was used for Billy Lynn, and Trumbull outlines its intentions: “What we try to do here is to create a whole new viewing experience. An experience where people are not just viewing the movie but being a part of it.”
MAGI goes further than The Hobbit attempted, capturing at 120fps, in 4K and 3D. After Life of Pi, Lee thought this was the way forward: “I thought pursuing a higher frame would help me find answers…. I thought that the media, the higher frame rate, served the content and that presented a great opportunity.”
But MAGI also requires special installation for exhibition, so like many formats before it, the film saw an extremely limited release in its intended format. It also opened to polarising reviews; the higher frame rate proved extremely divisive, attracting as much derision as praise.
In the pro column, Owen Gleiberman said: “you could almost reach into the screen and touch the person there. I have no doubt that’s how people felt about movies a hundred years ago, when they first saw them, but our eyes (to put it mildly) have adjusted to the facsimile of reality that movies create, and “Billy Lynn,” kicking the facsimile up three notches, springs that primal magic trick all over again.”
Alongside the polarising reviews, the film didn’t even recoup its $40 million budget, with so few theatres equipped to show it properly. MAGI has certainly suffered from Billy Lynn’s disappointment. Perhaps if it had been used on Life of Pi, it would have been met with widespread acclaim, but it seems destined to be forgotten.
Cameron plans to use HFR on the Avatar sequels, but for now the format isn’t winning any suitors. If you look at Billy Lynn’s faults, it’s easy to see how past examples predicted its failure. The failed cinematic innovations all share common flaws: they were too expensive or difficult to produce, and too expensive or difficult to exhibit. In short, they didn’t make enough profit.
Successful innovations like sound, colour and 3D’s resurgence have been box office hits. Perhaps if Billy Lynn had succeeded like Life of Pi, MAGI would be a new sensation. Instead, these innovations seem doomed to be consigned to footnotes in cinematic history, flashes in the pan that never quite took off, relegated to museum curios.