I was planning on rebooting my blog, which I've not touched in two years, with a clever piece about the overuse of the term "reboot." But Ridley Scott won't let me keep my mouth shut. It's opening weekend. I just saw Prometheus in IMAX 3D. I can't keep quiet.
So, yeah, it looks cool. It's well designed. It's scary at times. It's got what is easily the best automated abdominal surgery scene in film history. Big whoop. I didn't like it. I have problems. Spoiler alert.
This will not be an Alien Easter egg hunt. Although the Alien fan in me wanted more prequel satisfaction than I got, that's immaterial. Prometheus isn't a prequel to Alien; it's a prequel to a prequel. Fine. I won't judge Prometheus based on that criterion of hopeful fandom. It's its own movie, for better or worse, and I'll engage it as such. I won't even talk about how the first 45 minutes is a near carbon-copy of the utterly mediocre Alien Versus Predator. Nope, I'll just talk about what's not working here. Now, where to start, then? How about 200 years ago?
Since Mary Shelley brought us Frankenstein (subtitled "The Modern Prometheus," mind you) in 1818 (!), one of the central tenets of science fiction has been the wayward scientist. How often have we seen it in films? The scientist(s) either makes a misstep or miscalculation, practices science unethically, is exploited by a more sinister overseer, is evil him/herself or, in the case of Frankenstein, simply overreaches in the pursuit of knowledge. No matter the motivation, calamity ensues. We know the drill. The last version of the scientist gone awry -- the overreacher -- is perhaps the most poetic and dramatic in nature, by virtue of the fact that it is the most morally ambiguous. Shelley's Victor Frankenstein is a virtuous, brilliant man who quests -- perhaps egotistically -- in search of knowledge for knowledge's sake. There is a secret of life, and his intellect demands that he find it. In doing so, he engages in practices that, while some might call them abhorrent, are not evil, and do no direct harm to Victor's fellow man. The result of his labors, however, is so instantly horrifying to him that he snaps to his senses and tries, in vain, to turn his back on his work. He spends most of the novel either consciously avoiding or reluctantly confronting the repercussions of -- depending on how you look at it -- his error, or egotism, or lapse in ethics, or moral inferiority... whatever you want to call it. Students, theologians, and literary critics have debated how to read Victor for nearly 200 years because Shelley gave us a rich, layered text from which we can draw. We can all agree that Victor is a genius... it's where his genius leads him that we find inner and outer conflict. That's drama, kids. That's good writing.
Compare this, then, to the dilemma of the f*%#ing idiotic protagonists in Prometheus. I feel for Victor Frankenstein because I respect him. But these dunces are so unforgivably illogical, unrealistic, and one-dimensional as to be totally uncompelling. Look, I get it: it's a slimy monster flick dressed up as a slick space opera. As a devout fan of Alien, I am more than ok with that. That doesn't mean our standards for dramaturgy have to drop. These characters shouldn't be mere fodder for fangs and claws; this isn't a straight-to-video mutant horror film. We should be "with" them, not merely observers of their demise. And yet, I call "bulls#!%" at nearly every decision by the professional scientists in this film. Don't say, "What's the big deal; you're nitpicking." In another film, this might be quibbling over minor points, but not here. "Don't go in there... He's right behind you... Turn the light on," etc. is for midnight grindhouse fare, not for a film by the man who made Alien, Blade Runner, and The Duellists. In a sweeping, major motion picture that is so fundamentally about scientific research itself -- about the big questions of our origins and existence -- yes, I will take major umbrage with the scientific method, or lack thereof, onscreen.
The film lost me in one instant fairly early on and never got me back. Here we are: we were right, we've arrived on this distant planet, we have made the most important discovery in recorded science, we are standing in the ruined halls of our extra-terrestrial progenitors. What's our next move? Let's celebrate by taking off our helmets. WHAT? Excuse me? But the air is breathable, you say? So what! Keep your f*%#ing helmet on, you assholes. You're scientists. No one, not even the nerdy biologist, says so much as, "But we don't know what kind of contaminants might be in the air." The most we get is a muffled, "That's not a good idea," or something to that effect, from the non-scientists back on the ship. I see this not only as a mistake made by the characters for which there'll be hell to pay; I see it as a lapse in judgment on the part of the filmmakers, mostly because there's no reason for this action. It's not the catalyst needed to get the biological terror unfolding (that happens through other means), so it serves no purpose other than to make me think that these chaps are borderline retarded. But what's the value in making me think that of these characters? Remember: I care about Victor Frankenstein because I respect him. These people, I don't even buy as being remotely sensible. For a film that asks such philosophical questions -- "who created us" and, more importantly, "WHY" -- I am left wondering the same on this point. WHY would you even want to take off your helmets? They weren't a hindrance. There was no macho pissing contest to impress a lady. Our handsome young archaeologist just does it, and everyone follows. Even the crude and comparatively uneducated grunts in Aliens have a better sense of protocol than these supposed egg-heads.
This scene is soon followed by some character development I just don't buy. Scientists 3 and 4 -- our nerdy biologist (skeptical, but excited to be there) and punk-rock geologist (sociopathic, in it for the money) -- very suddenly and inexplicably get spooked and decide to head back to the ship. Apart from a holographic playback of ETs running the now-peaceful corridors, and the petrified remains of one such dude, there's been no cause for alarm, and yet these two guys decide: "Most important discovery of ALL TIME? No thanks, we'll split. Mummy's curse, or something like that." Even if I bought their unlikely superstition, their convenient separation from the group is implausible. They get lost and stranded underground, despite the fact that Mr. Ian Curtis Geologist is the expert on the mapping hardware. They're isolated by the script solely for the purpose of delivering a horror scene later on, in which they're attacked by a starship cobrasnake dildo monster. Dr. Nerdster Poindexter, previously such a coward that he walked away from a monumentally historic expedition when faced with no perceivable threat whatsoever, is now happy to make puppy-love cooing sounds and reach out to touch a hissing and self-evidently hostile alien organism. BULL... S#!%. We call that bad writing.
In order to get me to buy illogical character actions, you have to convince me that the characters are not in their right mind(s), that they are strained, frightened, or driven to a point of irrationality. These scientists, however, have started off at that point, which, frankly, makes me hope they all die. I doubt that Mr. Scott would say that I'm supposed to feel that way about our "heroes." Now, beyond illogical actions, there's the issue of illogical emotions. That may sound oxymoronic, but here's what I mean: after the first surface expedition, scientists 1 and 2, our beloved couple, are depressed. They wanted to meet their makers live and in-the-flesh. I'm sorry, but I wanted to smack them. I was going with the movie for a while there, losing myself in the idea of being with that group and finding the holy grail of scientific discovery. Live aliens or not, this is, as I've mentioned, THE MOST IMPORTANT DISCOVERY IN THE HISTORY OF CIVILIZATION. One might expect the scientists, therefore, to react as scientists would: ecstatically. Yet Dr. Dragon Tattoo and Tom Hardy, Jr. have to remind themselves that it's a pretty neat find, whilst battling self-pitying ennui and drowning sorrows in a bottle of booze. Imagine if you discovered Ramses' tomb... would you be disappointed because some gold had been stolen? What if you unearthed the fossilized remains of the Missing Link... would you curse your damn luck for finding a specimen with so many partial and fragmented bones? This is a time for celebration. This is historic. This is the first day of a lifetime of research and fame beyond any scientist's wildest dreams. Revel in it. That way, we'll feel for you, we'll care about you, we'll have something at stake when all the horror ensues. Right? Nope. Instead, I just want the aliens to put you out of your misery.
We might even forgive these scientists their poor research methods and helmet blunders if they were smart enough to appreciate the magnitude of what they stumbled upon. But no, there are no character stakes in this film. There are only the mechanics of warm flesh and that which kills it. Compare this to the first Alien. The ethical question of whether or not to abandon protocol in order to save a crewman's life is repeatedly a point of contention. The mostly non-scientifically trained crew of the Nostromo are principled and/or idealistic enough to get into some heavy debates over life and death. Ripley defies Dallas' order and won't open the airlock, in the interest of quarantine. Later, Dallas says he'll take responsibility for what happens to Kane after ordering a surgery. Duty, ethics, protocol... this makes for emotional consequences. In the Director's Cut of Alien, Lambert confronts Ripley with a slap in the face because she'd have left her to die in the airlock, yet we know Ripley was right to have done so. Even without that scene, Lambert treats Ripley with passive disdain for the rest of the film. We call this good writing. It is a tense plot point. It is not entered into cavalierly, with the reckless abandon of oh, say, taking off a helmet for no apparent reason. The closest we get to this subtle treatment of ethics in Prometheus is when Snow White's Evil Mother doesn't want to let an infected man back on the ship (sound familiar?). She threatens him with a flamethrower. Will we get a tense stand-off that will affect the balance of power on the ship, as in Alien? No, the poor dude will rather suddenly ask her to burn him, with no discussion, despite his girlfriend's brief and futile protests. And will Noomi resent Charlize for the rest of the film for making s'mores of Logan? Nah, it's whatevs.
Fast-forward: a horseshoe-shaped ship crashes and rolls on its side. Normally, I'd forgive a well-made and engaging flick for having the two remaining souls run that gauntlet in a straight line, rather than veer a few yards to either side to avoid being crushed. But, in light of all the incredulity that preceded this moment near the film's end, it's just another thing that pisses me off. Ridley Scott is usually better at the mechanics of action scenes like this. Logic falters here, as it does throughout Prometheus. This failure may be most evident when exhibited by the one character who should be more logical than any other. David, the android, is easily the film's most compelling character, not least for Fassbender being awesome at everything he does. He manages to ride the line between evil and nobility beautifully, in the tradition of the franchise's artificial humans. And yet, his most consequential action in the film is neither logical nor warranted. When asked by his hyper-sleeping boss, Wrinkly Weyland, to "try harder [to get the plot moving]," he decides to spike Handsome's ennui booze with alien anthrax and let the slimy monster chips fall where they may. Later on in the film, David seems rather concerned with the preservation of human life, but, for now, he'll risk the lives of everyone on the mission, and compromise the safety of not only the ship, but his raisin of a boss as well, by playing Iron Chef with primordial soup.
This serves no rational function other than to create more horror scenes for the movie's purposes. But for Weyland's purposes, the strategic weakness of this plan surely would be evident to a being as clairvoyant and brilliant as David. In Alien, by contrast, Ash's destructive, crew-expending actions make sense for one simple reason: "Bring back lifeform, priority one." But David's objective is to arrange an audience between Guy Pearce and the Master Race, not to facilitate biological warfare. As it turns out, David's goal of awakening the gods is accomplished simply. He need only further explore the expedition site and find a slumbering, albino Yul Brynner. Done! You'd think he'd have tried that option before unnecessarily wreaking biological havoc on his fellow crewmates. Oh, but wait, the film needed an android murderer and more monsters in the middle there. We call this... you know what we call this.
Side note: speaking of logic, explain to me the decision to cast a 40-something-year-old actor as a 90-something-year-old character, when no youthful flashback scenes are required by the script (viral TED Talk promotional film notwithstanding). I'm not saying it's verboten casting (Toshiro Mifune is a revelation in Kurosawa's I Live in Fear), but it's puzzling. If casting Lance Henriksen as yet another member of the Weyland dynasty on film would have been too on-the-nose, then surely someone else might have made more sense than the dude from Memento, as good an actor as he may be.
In any event, whether it's for power or fame, for an attempt at curing cancer or the pure quest for knowledge, we know fiction's scientists overstep their bounds. It's the first step in a lot of great stories. But Dr. Otto Octavius, Moreau, Tony Stark, Peter Venkman, or Victor Frankenstein... they all would have kept their mother-loving helmets on if they'd found themselves on LV-223 in Prometheus. They'd also have exhibited a modicum -- if not a boatload -- of awe or reverence for the titanic revelations and relics discovered therein. They'd also have had the common sense to hightail it out of there as soon as they saw the predicament into which their disregard for scientific method had placed them. Alas, in this film, those sensible decisions to flee are made too late, by characters written with brains just slow enough to keep the horror unfolding a few steps ahead of them. Prometheus is a film about our genetic engineers, and the supposed ingenuity we exhibit in unraveling their code. A great film -- an Alien or a Blade Runner -- similarly can have us marveling at the engineering of such a masterpiece, decoding technique, form, and style for decades to come. Films such as those weave themselves into our cinematic mythology -- stories and characters as dear to us as the original overreacher of myth, Prometheus himself. I will, in all sincerity, put Lt. Ellen Ripley in the pantheon of great tragic heroes, alongside the Titan who stole fire from the gods, alongside Victor Frankenstein. As for Prometheus' Dr. Elizabeth Shaw? She's a well-intentioned, unthinking, solipsistic pawn in a transparent screenplay's attempt at replicating Shelley's lightning in a bottle. I call bulls#!%, I call sham, I call wasted potential.
Saving grace: I LOVED seeing a blue & tan Border Terrier nipping at the heels of Guy Pearce. She was my own Buster's doppelganger, and I'd rather watch a sequel about her and Jones the Cat than see what I can only assume will be a similarly disappointing Prometheus 2: Rise of the Alien Begins.
Get 'em while they're hot! I finally figured out the Cafe Press beta issues, so the world can finally avail itself of my Lost-frustration-induced t-shirt design, the Dharma Red Herring Station. Choose your poison.
Was this inevitable? According to the LA Times, storied producer Richard D. Zanuck has confirmed the fears of internet fanboys. From the article:
Zanuck and Spielberg spoke a few years ago about going back to [Jaws] with the digital paintbrush of CG effects to create a more horrific predator. They decided, for the time being, to leave the film alone, although Zanuck says he is intrigued by the notion of adding 3D effects to the 1975 classic for a theatrical re-release.
I don't feel the need to gripe about Hollywood adulterating, soiling, and repackaging our favorite celluloid memories just to turn a profit. In this age of remakes, we expect that, and any objection I could offer here would just be redundant cyberspace chatter. George Lucas has recut, amended, "enhanced," and digitally micturated upon -- then subsequently resold -- the Star Wars films so many times that even hard-core fans have trouble counting the number of versions in existence. Business is business. What I'd rather do is talk about the formal consequences to the work itself when films are manipulated.
The example of Star Wars is from one end of the spectrum, in which a film's primary author has been able to revise a previous work ad nauseam thanks to new technology or to the creative freedom that comes with financial success. On the other end of the spectrum, a film can pass through the lacquering hands of just about anyone before reaching an audience. In the bygone days of VHS, we watched films that had been pan & scanned (read: "butchered and mangled"). Most of us who watch silent films do so with musical accompaniment courtesy of Joe Schmoe's synthesizer, set to "organ." Then, of course, there's the abhorrent practice of colorizing monochromatic movies. But even the maestro, Ray Harryhausen, has spoken out in favor of colorization, with the caveat that he be involved, and James Cameron himself was famously a proponent of director-supervised pan & scan for standard TV displays. So cinematic revision practices run the gamut, and should never surprise us. And, as the old saying reminds us: "Only the projectionist has final cut."
Therefore, let's not waste time expressing our dismay that Zanuck & Spielberg would even consider a CG and/or 3D facelift for what is considered, with little exception, one of the greatest horror films of all time. Of course they would. If there's a market for rerelease, Hollywood will brainstorm how best to saturate it (I make no judgment call here; I'm in favor of good films being seen by new audiences). Let us instead engage the film, not as commercial property, but as film, and convince Messrs. Zanuck and Spielberg that an altered Jaws would be an inferior Jaws, not simply because we fear change in principle, but because these particular changes would do the film disservice.
And, before proceeding, let's come clean and admit it: the movie isn't without its flaws, if for only one reason. The shark always looked fake. But that's by no means an Achilles heel. Obviously, it never hurt the box office, and more advanced shark FX films (Deep Blue Sea, e.g.) haven't come close to eclipsing Jaws in the public's consciousness, so a fake shark is no more necessarily a liability than a convincing shark is necessarily an asset. In fact, the shortcomings of the titular character in Jaws may be indirectly responsible for some of the film's strengths (more on this later).
Now, in the case of converting Jaws to 3D, there are two operating principles which govern our skepticism. First of all, there is the question of contextual anachronism -- that is to say, imbuing a work with technical devices not available at the time of the work's original completion (colorization is the simplest example of this). 3D technology has advanced to a form far removed from the blue-and-red anaglyphics that would have been available to Spielberg at the time of production. Therefore, the process of converting mid-'70s film stock to contemporary digital stereoptics would necessarily give the film a look of incongruence. As when we watch Casablanca in color, or we see Buster Keaton run at the wrong speed (because the frame rate of silent era cinema is rarely compensated for in projection), the most fundamental elements of the moving picture (what makes film film) are being sullied. Even if you're not a student of film, your eyes perceive that something is off; the textures are mismatched. Some people don't mind this. Some people just want color. Some people can't stand to see black bars at the top and bottom of their standard TVs. Some people would watch every movie in 3D if they could, because they think it's neat. Yes, and some people unironically believe that a velvet Elvis is a tasteful, expressive work of art that ties any living room together.
Lucas's Special Edition Star Wars films offer a striking example of this contextual anachronism phenomenon. Our eyes know how objects appear in '70s films. Subconsciously, we recognize a palette of light, color, grain, and a vocabulary of movement that is unique to any given feature we watch, or even to any given reel of film. In short, we know that a digital Jabba the Hutt does not, in a purely mechanical sense, belong in the original Star Wars:
Even if we concede that the computer-generated elements in this sequence are of notably poor quality (especially by Industrial Light & Magic standards), the fact remains that the same FX seem less out-of-place -- at least, as concerns the visual mechanics -- in the more recent Star Wars Episode I, a film made in the digital age (check out Jabba at around 1:40):
As concerns these two films, I deliberately am not accounting for perplexing intangibilities like overall quality (cough, hack). I merely suggest that the effect of running analog film through a digital workflow -- for anything more than restorative or delivery purposes -- has the potential to create anomalous results, like running a piece of papyrus through a laser printer. It's a cinematic paper jam waiting to happen.
This anomalous effect can be put to use to create uniquely expressive effects, but then we're veering off into derivative art, not revision. By silkscreening different colored Mona Lisas onto canvas, Andy Warhol creates something which is -- love it or hate it -- a new work, one that could not be achieved with precision through the use of oils alone.
Now, if Zanuck were suggesting a mere reediting of Jaws, we could still gnash our teeth, but he would not be suggesting the use of a device that would contextually make the film seem the anomalous product of another era. Take, for example, James Cameron's Aliens: Special Edition, a mostly improved version of the original film that offers no more than a few new scenes and recut sequences here and there (Coppola gave Apocalypse Now the same treatment with Redux, to much poorer reviews). There are no "enhanced visual FX;" it's just editing. That may raise the question, "Well, why couldn't Cameron and Fox get it released right the first time?" A fair point, to be sure, but we can't claim that anything feels treated, or subtly off. The medium of the film -- its celluloid footprint -- has not been altered. Our eyes remain content, even if our hearts might yearn for the more familiar versions.
3D conversion, on the other hand, will not fool even the least trained of eyes. We know that we are watching a corruption, however exhilarating it may be. If recent 3D film conversions like Alice in Wonderland and Clash of the Titans have seen outcry at having been put through the stereoscopic wringer, then what hope does a 35-year-old film like Jaws have? Even Hollywood brass is wising up. Dreamworks CEO Jeffrey Katzenberg told Variety, "If we as an industry choose this 2D to 3D post-production conversion, it's the end. As quickly as it got here, that's how fast it will go away." So even the most perfectly executed 3D conversion, it seems, will be contested at best. The only people who enjoy it will be those who go home from the theater to admire their velvet Elvises.
But let us forgive filmmakers their toys. Let us suppose that Spielberg always wanted to make Jaws in 3D. So what? The fact is, he made a 2D film; and this leads us to the more fundamental governing principle of why any major revision of a film is dangerous. That is: it is the inherent limitations of film itself which inform the director's process, and, therefore, the end result of any cinematic undertaking. In other words: any given film is the product of all that which is and is not possible in filmmaking. To revisit the process with a new set of rules is to destroy and recreate the original work, for better or worse.
It is beside the point to assert that had he the option, Orson Welles would have shot Citizen Kane in color; the fact is, it's gorgeously shot in black and white contrasts. Chaney's Phantom of the Opera never glowered at Christine with a scream; he was always eerily silent. The original King Kong isn't a motion-captured digital construct; he was a pioneering example of stop-motion visual FX work. All Quiet on the Western Front was not given a 5.1 surround THX-certified soundtrack full of explosions; it was expertly mixed to contain an entire war in one sound channel. One might say that changing such facts would be inoffensive technical enhancement. To that, I ask, "Well, if it's just minor adjustment, then why do you need it to enjoy the film?" More importantly, that stance diminishes the work of some great filmmakers and craftsmen to mere trifling. It implies that cinematographers who shot black and white were doing something less valid, expressive, or interesting than what can be done in color. Tell that to Karl Freund or John Alton. There is no need to digress here into a seminar on color theory, in which one could easily make the case that monochromatic imagery is far more interesting and explicative than color. Suffice it to say, the consideration of outdated filmmaking techniques as lacking, in any way, is as absurd as saying that all paintings became obsolete at the invention of the photograph. Tell that to Da Vinci... or Andy Warhol, for that matter.
Every tool in the evolving bag of tricks of filmmakers -- from actors' performances and lighting setups to pancake makeup and dolly tracks -- has seen evolution over the course of cinema's history. Whether it's a lens choice or a line of dialogue, every frame of film contains within it a thousand decisions on the part of a production team. But what director, in the history of film, has ever made one of those decisions, saying, "One day I will be allowed to revisit this scene with a computer, so for now, it's ok if it's not so good?" Even an early director who saw the horizons of sound or color would not allow that hypothetical prospect to affect their decision making on a current setup. For years, Hollywood studios made some major motion pictures in color, and others in black and white; even the Oscars had separate categories for cinematography. Were the directors of photography who shot black and white during those years lighting for future color conversion? I doubt it. They lit for what they were working with.
This all comes back to limitations. Limitations, in every form, are omnipresent in filmmaking, as they are in any art. A particular film stock may only be so fast, requiring so much light that you don't have. The dame in that big musical number may be unable to hit high C. The budget doesn't allow you to recreate the Hanging Gardens of Babylon. The catering blows and the crew is about to walk. The shark looks fake. The flat screen keeps the objects in the frame from bursting into the faces of the audience. You see where I'm going with this.
In fact, the most fundamental limitation of film -- the four sides of the frame itself -- is directly related to 3D and the audience's sense of self. If we were to take away the boundary of the frame, and have a film, photograph, or painting fill our horizontal and vertical field of vision entirely, then the composition of every shot would be ruined. In fact, we would have to redefine what composition means (and painters would have to use a lot more paint). By extension, a film originally intended to be exhibited in 3D is one thing, but when we convert a 2D film -- when we add an effectively limitless Z axis to what will still have a finite X and Y -- then what are we corrupting? In The New Yorker, Anthony Lane did a thorough and entertaining piece on the technical, sensory, and social history of stereoscopic imagery in the movies. He does well to discuss Edwin S. Porter's The Great Train Robbery (from 1903, mind you!), in which a gunman points his revolver at the camera and fires. The stories say that viewers fainted or ran screaming from the theaters. That's an image with 3D aspirations; in it, the filmmaker acknowledges the presence of the audience directly, incorporating the viewer and the film into one perceived spatial continuum. Jaws has its counterpart to Porter in the monster's "comin'-at-ya" leap to the lens. Whether it's a bullet or a shark, the Z axis between screen and us is invaded. One wonders, if filmmakers ever break the 2D bonds of a four-sided frame completely -- if they achieve true virtual reality -- how such a scene would play. Avatar on a 3D IMAX screen is certainly a step in that direction, but it's still a far cry from a completely immersive experience of being in a film. But does anyone really want to experience swimming with Jaws? Perhaps, but that's not the film Spielberg made.
Taken to its logical extreme, the 3D tech boom surely leads towards that fully immersive, 360˚ virtual reality. So where does upconverting classic films end? Ruined composition aside, what will the simple experience of a cut feel like when we're surrounded by image? How hi-fi can we stretch our relatively low-fi films? And why do we even need to? Dozens of action films from all throughout the 20th century have been remixed with 5.1 surround sound for home theaters (they even gave Disney's Snow White the treatment). But these sounds were never intended to fill such a space; are they big enough for their new britches? Does a technologically "upgraded" film enhance the original experience, or merely distract us from the filmmaker's intended vision? Check out the clip below, a commercial spec piece by Joseph Kosinski, director of the upcoming TRON: Legacy, in which we literally get to walk around inside a film (Kubrick's The Shining).
I think the technical term for that piece is "f#$%ing cool as balls," but can we, in our right minds, suggest that this is how Kubrick wanted us to watch The Shining? Lane, in his New Yorker article, cites Sergei Eisenstein's clairvoyant prediction of 3D cinema and its advantages, and supposes what a number of classic moments in film history would look like if given the 3D treatment. But, he says, it doesn't matter:
There’s just one hitch. The scene works fine as it is... the posthumous application of 3D would not sharpen -- and might even vulgarize -- its moral thrust. Is that not, after all, how we have learned to read a painting since the time of Giotto? We know that perspective is a trick, and that a flat surface stands for a denser and more far-reaching world, but it is an illusion of which art... has availed itself with unstinting intelligence, relying on our instinct to decipher the code. What 3D movies say to us is: You have been fooled. You were duped, all this time, into thinking that a window was a world.
As long as films remain in that window, 3D will be just another bell or whistle -- not necessarily a gimmick, per se, but certainly just another tool in the kit. If Spielberg wants to add that tool to his Jaws kit retroactively, then he might as well go whole hog and recreate the shark digitally, because he will be asserting his freedom over the chains that bound him when he shot the film in 1974. But by doing this, he will be walking away from the fundamental battle he fought during production -- the lightning in the bottle that created such a memorable film -- namely, his difficult leading man:
Bruce, as he was affectionately named (after Spielberg's lawyer), never worked right. He went left when he was supposed to go down, up when he was supposed to go forward. He rusted. He malfunctioned. He was a diva. And even when he did work, he looked like what he was: a foam rubber shark. Frustrated, Spielberg shot around him. As Zanuck says:
In desperation, we came up with so many good ideas -- like the floating barrels, for instance, that were shown on the screen to suggest this shark beneath them, underwater -- and we did it because we didn't have the shark. In the script, the shark is on Page 1 when the girl gets eaten. It became more terrorizing than anything we could have hoped for. If we had CG then we would have had the shark in every frame.
So Zanuck admits that the limitations of the process drove them to a more effective result than they would have achieved if CG had been an option, and yet now he wants to bring digital technology into it? His own logic suggests that this would weaken the film. Jaws was made by embracing its limitations, rather than resisting them. Take the lesson from George Braque, who, along with Picasso, invented cubism: "Out of limitations, new forms emerge." On the one hand, this can be read as an excuse; Braque could no more paint like Rembrandt than Bruce the shark could accurately mimic a real great white shark. But that technical deficiency bred invention, and something new; cubism shows us something that Rembrandt can't, just as Jaws offers something that Shark Week footage does not. By the time Spielberg does allow the shark to play, we're so afraid of him that we can forgive his obvious shortcomings. Don't buy that? Then why are we still talking about Jaws 35 years later? Indeed, "don't show the shark" has become a staple tension-building device of monster films, so even if the shark had looked better, would seeing more of it have been a good thing?
The implication of revising any film with more advanced technology is that there is something to be gained in the process, i.e. that the original has room for improvement. But logic like that suggests that the 1983 sequel film, Jaws 3D, has something going for it that the original doesn't. If the film's marketing campaign is to be believed, then that something is "terror." But give me a person who finds Jaws 3D more terrifying than Jaws, and I'll give you a velvet Elvis.
This is, perhaps, unfair. Jaws 3D used technology that today's standards put to shame, and relied on chestnuts like severed arms and uncannily suspended jawbones for its thrills. Narrative stops in the name of cheap thrills. And, 3D aside, how can we even begin to compare this film to Jaws? We can't, of course. There's more to a movie than bells, whistles, and severed arms. It usually starts with something called a script, but that's another blog entry. My point is that technological revision like this is just varnish; what's underneath is far more substantial. In short: if it ain't broke, don't fix it. That's not to say that technique isn't important, or that shoddy attention to detail is okay, but, as John Waters says, "Technique is just failed style." Asserting that any film would be improved by converting it to 3D is to distract us from and discredit the work of the original film. Hitchcock's Dial M for Murder, while not his best film by a longshot, is still an effective thriller in 2D, though it was also released in anaglyphic 3D. Burton's Alice in Wonderland was converted to 3D as an afterthought, and released in two versions. It may be niftier in 3D than in 2D, but is it a better film? Avatar was composed for 3D specifically, by a filmmaker with a savant-like understanding of the minutiae of the medium. Yes, I'd argue that's a better movie in 3D than in 2D. So yes, as concerns films made and released during stereoscopic periods in the history of major motion picture film distribution, I believe that there's room to question which version is better, with the answer being different from film to film. But Jaws was made well after the 1950s anaglyphic 3D boom, and well before the new 3D wave, so how does it figure in? I put forth that it doesn't. So leave well enough alone.
When we make these sweeping revisions to films, we seem to say that our best work at the time wasn't good enough. We discredit our own film history, and the films themselves. And so I say that converting Jaws would damage it, if only by implication. Even if done well, even if it's an exhilarating experience, the very fact of the new film would be a spit in the face of the original. Of course, I have to admit that I'm no Spielberg. Who are we, the proles, to tell a filmmaker what to do with his or her baby? After all, could he not surprise us and make something better? Few would argue that George Lucas ever improved a film of his in one of his subsequent passes, but is it possible to enhance (using more than mere editing) a film at all, years after it's been released? Even Spielberg tried it, with a new cut of E.T., including CG effects where none had been before. I guarantee you, that's not the version of E.T. my children will watch.
Indeed, I can think of only one exception that proves the rule. Ridley Scott's Blade Runner: The Final Cut is more masterpiece than I thought a masterpiece could ever be. It's a cut of the film very close to the mislabeled Director's Cut, but it includes digitally cleaned up and enhanced shots, removal of continuity errors, and, briefly, some newly generated CG shots. Why does it work? Why does the new material feel so unobtrusive (so un-Lucas)? I can only credit Scott's restraint. Each and every decision he makes in the Final Cut seems so in tune with the original film's vision, so a-part-of that film's vocabulary, that we can only assume these truly were the shots he would have achieved with a little more time and money. He has only added in that which he felt the film needed in the first place. Case in point: the infamous Zhora snake dance scene. Chronicled in the making-of book, Future Noir, and on Blade Runner's DVD extras, this was to have been an elaborate stop-motion animated sequence in Taffey's club. Drawings survive, but the scene was deemed too ambitious to film, and unessential regardless. Given the opportunity to create and insert a CG version of the scene for the Final Cut, Scott passed. That was the decision he made in 1982, and he stood by it a quarter century later. While the spectre of CG Jabba may bemoan the lost opportunity to realize the snake dance, I thank the heavens that Ridley Scott is not George Lucas. I can only hope that Spielberg follows Scott's lead on the subject of digital revision, 3D conversion, and dancing serpents. Really, what's next, techno remixes of the King? Aw, nuts.
Your humble blogger's short film, Alice Jacobs is Dead, will screen this month at the Saturday Nightmares Horror Expo in Jersey City. In attendance will be our leading lady, Adrienne Barbeau. And the master of zombie cinema, the Godfather himself, George A. Romero, appears shortly after us. That's right... we're opening for George A. Romero! Come all. It's minutes from Manhattan on the PATH train. Tickets and other info here.
Have you heard about the Norway Spiral yet? Aliens or an AWOL Russian missile? You be the judge, but I can't get excited about it. I'll just be too heartbroken when it's revealed to be a boy in his father's weather balloon, sticking spinning fireworks in his a$%hole and mooning Scandinavia. When the little green men appear on camera, then I will believe.
I'm truly bad at updating regularly. So, in an effort to revive my flatlined blog, let's talk about the resurrected dead. Nice segue, huh?
What do Jesus Christ, zombies, and Frankenstein's Monster all have in common? Well, apart from their devilish good looks, they're all undead. This brilliant, if blasphemous, diagram was sent to me by RL, an old family friend:
I could teach a semester at Geek U on this diagram, and I salute its author, whoever (s)he may be. The zombie-Christ joke is an old one, and, between Mary Shelley and James Whale, Frankenstein's Monster has been linked to Biblical lore since his inception, so this is nothing new, but it's still fun to talk about. I'll eschew the pitfalls of an outright affirmation of JC's followers being called "mindless," or the notion of JC himself being "feared." By a similar token, I'm not so sure we can call zombies "followers" (wouldn't that require will?). Nevertheless, the gist of this diagram is undeniably sound, and would make Mr. Venn proud.
The central premise of the image is perhaps the most interesting/controversial: it presupposes that Christ is a monster, or, at the very least, an entity which logically can be mentioned in the same breath as three commonly accepted monster tropes (walking dead, reanimated cadaver, vampire). The devout Christian will, no doubt, initially reject such a notion. But let's not be so hasty. What, after all, is a monster? Dictionaries offer pretty concrete, if a bit pedestrian, definitions, the mean average of which seems to be something like Random House's no. 2:
any creature so ugly or monstrous as to frighten people.
Basic, sure. But does this include a sexy, genteel vampire? How about an invisible man? What about a pod person from Body Snatchers? And is "creature" only animal? Isn't the Terminator or a golem a monster? How about a triffid? We could debate all of these, but isn't "an ugly living thing" a bit narrow? Broadening the definition has its problems too. Film theorist Noël Carroll wrote a whole book, The Philosophy of Horror, in an attempt to tackle the question, and calls a "monster":
a being in violation of the natural order, where the perimeter of the natural order is determined by contemporary science.
I like this, but it means that Superman is a monster, and a great white shark isn't. So we have problems again, at least as concerns horror fiction. I'll side-track here, and mention that Yours Truly's father, a frequent commentator on NPR, just discussed movie monsters on-air, and spent much of his time wrestling with definitions. It's worth a listen:
Guess who his personal researcher was. Anyway, let's, for the sake of argument, take the Carrollian view and agree that a Monster (with a capital M) is simply the other. It could be animal, vegetable, or mineral, and it may even look human, but it is somehow not human. By that notion, The Christ surely is a Monster, even if he is a good one (Monsters, Inc, anyone? Don't forget Superman). Endowed with supernatural powers by an omnipotent force, he is humanoid (if not actually part human), but unnatural. A zombie, once human, has been transmuted by infection into something no longer human (at least, as long as zombism remains a fictional medical condition). Dracula, once human, has been transmuted by supernatural forces into something no longer human. Frank (as I'll call him, after his father), once human (or parts from various humans), has been transmuted by science fiction into something no longer human. So yes, to all you who've got religion: JC is a Monster, logically consistent with zombies, vampires, and science fair projects gone wrong.
Now, I'll ask all you vampire fiends to forgive me, because I'm most interested in the blue circle. Sorry, but "resurrected from the dead" is just the most fun unifying trait on the diagram. So I'm sorry, Count Dracula, if I don't discuss you much. The undead is just more fun for me to talk about, and Frank has been grossly ignored by today's Vampire and zombie-crazed media. Indeed, Victor Frankenstein's creation is more social castaway than are zombies (who have no will to integrate into society... only the will to eat it) or Dracula (who ostracizes himself from society, and is generally agoraphobic). Frank longs for humanity's company. He loves man. Shelley's original Monster has an aesthete's soul, an articulate tongue, and a deep sense of morality (though his rage may get the better of him). Rage aside, does that sound like anyone you know? An outcast and scapegoat during his own time, JC turned his cheek to his adversaries, tormentors, and those who just didn't like him. Frank, on the other hand, was more prone to opening his can of whoop-ass (as Berni Wrightson so beautifully illustrates):

Apart from Frank's violent tendencies, he has a poignancy and a loveability that is somehow Christ-like. Created by a force greater and wiser than himself, he is set forth into a world that does not fully understand him, by which he will be shunned. James Whale took the Christ metaphor to very obvious extremes, especially in Bride of Frankenstein (see first picture, above). Whale's Monster is even less prone to violence than is Shelley's, and only causes people harm by accident or in self-defense. Shelley likens the Monster to Adam rather than JC, but either way, Frank does seem to have a lot in common with the Judeo-Christian notion of one of God's special projects. As Colin Clive's scenery-chewing Dr. Frankenstein cries, "In the name of God, now I know what it feels like to be God!"
And what does JC have in common with the flesh-chewing living dead? Come on... he's just asking to be bitten by the masses: "Eat of this, for it is my body." We could take that further, if you like. We could consider infected zombie flesh the Eucharist. "Eat of this, and through it find redemption." In this state of [mindless] redemption, you feel no more pain, no want, no worldly woes, and you are compelled to spread your newfound freedom to as many as possible (there's your pink circle on the chart). I have no idea who generated the image below, but it seems relevant here:
How good is that? Of course, eating-JC-to-become-a-zombie isn't the main association made by the diagram. The more basic premise is that JC, like a zombie, rises from the dead. If that's all a zombie is (which, granted, is a contemporary, Romero-inspired definition), then JC certainly counts. And if that's the case, then 2004 was a great year for zombie films, between Shaun of the Dead, Snyder's Dawn of the Dead, and The Passion of the Christ. The last shot of Gibson's Passion sees the dead Christ stand up in his tomb and walk out of frame. Whether by undead infection or divine intervention, the dead comes back to life. I buy it.
So what's my point? Hasn't all this been rehashed on the web for years? Sure. I guess I just needed something to blog about. I wish I could spin some deeper connection between Frankenstein's Monster and zombies. But, apart from the general "resurrected from the dead" label, I don't really see it. If nothing else, JC and Frank are sentient beings. Zombies, on the other hand, despite Maestro Romero's recent forays into "self-aware" zombie territory in his last few Dead films, are most frightening and effective when the attack is a wave of mindless, hungry walking dead. JC, too, comes with his apostles, and seeks to create a movement. But Frank, poor Frank, is definitely a loner. He can't convert the masses as can a zombie, and no one loves him as they love JC. No respect... he's the Rodney Dangerfield of the resurrected dead.
The trailer for my film, Alice Jacobs is Dead, is now online (embedded below). In the interest of shameless self promotion, I'll point out that we just won Best Horror/Suspense Film at the San Diego Comic Con International Independent Film Festival. Hope you all enjoy. Check the main website for updates.
I recently rewatched Tod Browning's Dracula, and was displeased to find that it really isn't all that. It's slow. It's silly. It's not really scary, even, I'm guessing, by 1930s standards. I know there are stories of women fainting in the aisles, but I have to assume that such stories are apocryphal. Compare this 1931 adaptation of Bram Stoker's novel to F.W. Murnau's Nosferatu (1922). The latter holds up infinitely better, is far more vibrant a film, and is as scary as ever. Bela Lugosi's Dracula is elegant, iconic, and sexy in a Eurotrashy way, but Max Schreck's Count Orlok (Dracula) is purely horrifying. I digress. Back to Bela... I shall forever love Tod Browning for the film he made after this one: Freaks. And I will give credit where it's due; Browning creates some wonderful images -- classic, even. Much of this certainly must be due to the work of cinematographer Karl Freund, a German who shot Leni Riefenstahl's Tiefland, Murnau's The Last Laugh, and a little Fritz Lang flick called Metropolis -- ever heard of it? Perplexingly, he ended his career shooting episodes of I Love Lucy. That's right: the man who captured Metropolis also shot I Love Lucy. That's one of my favorite tidbits of film history. Freund later directed the original version of The Mummy (also a disappointment), and is widely known to have co-directed Dracula. So why, in the hands of this master, did I see the following frame when I watched Dracula?

What the hell is that slab of paper? Obviously, it's serving to shape the light, right? But it serves no narrative function, I assure you, and I didn't see it in subsequent shots. I've never seen such an obvious blunder in a studio film. Freund let me down. Tod Browning let me down. Bela Lugosi... is dead.
Did the people on the Mall, at Dr. King's feet in 1963, know that they were witnessing one of the most important moments in modern oratory? Ineffectual as it was in swaying the election, can we call Obama's 2004 Democratic National Convention address a truly great speech? Were the troops at Tilbury as moved by Elizabeth I's words as history has them? Maybe. Or maybe history sweetens the memory of speeches. Perhaps hindsight can sweeten the words of speeches themselves. Accounts of reactions to the Gettysburg Address are conflicting, but many say that Lincoln's words met dispassionate, indifferent, and bored ears (remember that the crowd at Gettysburg had, before Lincoln's brief address, just suffered through a two-hour oration by former Congressman Edward Everett). But eventually, if not immediately, the Gettysburg Address became the standard -- a compassionate and lofty tract of idealistic political philosophy, expressed through unpretentious, direct language.
On March 30th, 2009, Wynton Marsalis gave a speech at the Kennedy Center in Washington, D.C. Far from the brevity of Lincoln's meditation, but not quite the dirge of Everett's two hours, it is easily the most moving and profound piece of oratory I have heard since July of 2004. The occasion was Arts Advocacy Day, and Marsalis gave the prestigious Annual Nancy Hanks Lecture. This is not a predictable battle cry in support of arts funding. Nor is it a sermon in promotion of one cultural agenda over another (although, granted, his personal tastes and biases inevitably shine through). There is something grander, yet simpler, at work here. This is a philosophical rumination (a "ballad," as Marsalis labels it) on the interconnected nature and indivisible oneness of all artistic expression, and, more to the point, on that phenomenon as the defining basis for what makes us "who we are." The "who" in this case is all of us, but most specifically, Americans.
We do not yet have the benefit of hindsight to tell us if this speech will be remembered or replayed in perpetuity. Nor is there likely to be any quantifiable effect of this speech on American cultural policy and arts patronage. But I suspect that this speech will have a lasting, formative effect on me, and if it reaches a few more, then it's certainly doing some good. This blogger's parents were in the audience, and my mother described it as "one of the great events of [her] life."
Now that I've built it up, and heightened your expectations, how could it possibly live up? Well, relax. Is it a perfect speech? I doubt such a thing exists. You may or may not care about the issues Marsalis covers. His tour through American history and arts may do nothing for you. You may disagree with some of his implications about contemporary art. I certainly did now and then. Then again, he's the lauded, world-renowned musician, educator, and impresario, whereas I'm a fanboy blogger. So I defer to Wynton in the end. The full -- and rather long -- speech is below. Double-click for fullscreen with playback controls.
Came across this photo in the interwebnets. George Cukor, John Wayne, Myrna Loy, and Steven Spielberg. WTF? What a strange and wonderful assemblage. Any guesses on the pageboy 'do on the extreme right? Almost looks like Louise Lasser, or Ursula K. Le Guin (but what the hell would she be doing there?).