All forms of art have their origins in tribal ritual. Everyone participated. Those outside the circle of creation were energetically fused with the spectacle. Even the motionless were part of the dance. The stories being told were not personal narratives, but group histories that sometimes sang the legends of individual heroes. But there was no individual expression; neither was there individual reception.
There is a current myth about the North American church and the blues. It is told that the church considered blues the devil’s music. Those who believe this myth also believe that the church held the guitar to be an evil instrument. The truth is that the blues was a vehicle for individual expression, and had no place in the group ceremonies. The church was a place for the tribe to sing together about things that concerned them all equally. It was not the place for an individual to dominate the meeting with tales of his own personal exploits and troubles. Blues was forbidden in the church because it threatened to replace the communal ritual with a performer-audience dynamic. The organ was preferred to the guitar because it was more suitable as an instrument of group accompaniment, not because it was inherently holy.
Most music is written for the dance, not for individual listening pleasure. To sit at home listening to a Strauss waltz through headphones is as ludicrous as watching a digitized movie on a 42-inch flat screen in your bedroom. The waltz belongs to the ballroom, and the movie belongs to the movie theater. Removing either to a place of individual experience takes away its very reason for existing.
The movies were the popular art of the twentieth century. Most people born after 1990 are not likely to have experienced the art form as it existed during its flourishing years. Sure, they go to the movies, and their experience of doing so is just as valid as that of any previous generation. And I wouldn’t argue against any of those who would maintain that theirs was the superior viewing experience. But I will argue that the place movie-going has held as a communal ritual that strengthened cultural unity has been seriously diminished in this century.
And I stress the term “movie-going” as opposed to “movie-watching,” as it is evident that more people are watching more movies now than ever before. But most of this movie-watching would be more accurately described as television-watching than movie-going, even when the venue is public and the screen gigantic. Since televisions became an indispensable piece of furniture in most households, movie theaters have fought a war with television for the movie-watching audience. And in the twenty-first century, television stands as the victor in that war.
From videotape to Blu-ray discs, the spectacular increase in image quality for home viewing has given many the impression that technology has lifted film presentation to a state of perfection. But Blu-rays are not part of film technology. Unlike CinemaScope, Dolby sound, or 3D, Blu-ray was not invented to improve the movie-going experience; it was an advancement in television technology. Since the conversion of TV broadcasting to digital, the picture has been getting cleaner and crisper, until now, with Blu-ray, some movie directors even claim that the image is closer to what the cameraman saw through his camera on the set than anything ever captured through the film process. Even if this is true, the aesthetic questions remain. Each decade of film-making is easily identifiable by the film stock that was used at the time. Metrocolor looks different from DeLuxe color, and Technicolor looks different from Kodak. With Blu-ray remastering, all the movies now look like Blu-rays. I recently watched Woody Allen’s “Manhattan” and Fritz Lang’s “Metropolis.” They looked like they were made in the same studio on the same day. Now, I enjoy watching classic films on television in such immaculate condition, but these movies look nothing like they did when projected in movie theaters at the times of their release. Claims are made that film history is being preserved, when it is, on the contrary, being destroyed.
I like the sound of digitally recorded music on digital playback equipment, but when analog recordings are remastered for digital playback, they no longer have the sound of the original recordings. The entire history of recording has been sacrificed to a transient technology. I feel the same about movies shot on digital equipment. They have a transient look that is appropriate to the technology through which they were made, but to take older movies and subject them to such visual enhancement is a more brutal decimation of an art form than was the remastering for CD of music recorded on analog systems.
One of the worst directors ever to soil the silver screen is making a big deal about shooting his latest picture on 70 mm film. There have been a lot of crappy 70 mm films; now we will have one more. In my opinion, this Quentin Tarantino creature is shooting on film to draw attention, not out of love for the medium, because if he really loved the medium he never would have betrayed it by working as a video store clerk, but as a stunt to morally elevate himself above the directors who have given in to the pressure of shooting digitally. But the truth is that real directors have found ways to deal with the new technology without being defeated by it. Monte Hellman’s “Road to Nowhere,” for example, looks pretty damn good for a digital feature. Now, I am assuming that since the post-production for Tarantino’s “The Hateful Eight” was done by Digilab, all the film was digitized for the post-production work, then transferred back to film. So what the hell was the purpose of shooting on film anyway? Furthermore, most people will see it in a digital presentation on a home screen, one step further from the film-going experience.
We have lost the movies. But television is better than ever. Except, of course, for the visual diarrhea that Tarantino splashes into our living rooms.