29 04 2012

Artists, critics, and academics insist that the defining factor for any object or action to be art is intent.  Even in a postmodern mindset where anything—any act, any work of cultural production, or any object (any thing)—can be art, what makes that thing art is the intent that it is art.  This, of course, is rooted in the Modernist ideology of authority.

Modern thought places the utmost importance on authority, because it is through authoritative figures, statements, and processes that we can determine Truth.  And capital-T “Truth” is the utmost authority.  For this purpose, fields of study are singled out and highly educated experts spend their time investigating and advancing their knowledge of these fields, producing work that is True Science or True Music or True Art.  By designating himself as an Artist, a person then declares his intent to make art.  From then on, what he decides is art—what he intends art to be—is just that.  His justification is manifest in his position as an Authority on Art, an authority granted by specialization and expertise.

In the period of High Modernism (namely, the movements of Abstract Expressionism and Minimalism), the intent of being art was enough justification for a thing to be art. During the Postmodern period, however, simply being art was not enough justification for an object.  Beginning in the late 1960s, art gained (or re-gained) the requirement of meaning.  In order to have impact, the work needed to do more than just be art; it needed to mean something.

Barbara Kruger, Your Body is a Battleground, 1989

In some circles, this “meaning something” depended on shock—a tool inherited from early Modernist painters who seemed intent on forcing the advancement of society, which was another topic of Modernist importance.  High Modernists like Picasso and Pollock aimed their shock inward—the shock of non-representational painting pushing art to a more advanced, more specialized place.  However, like Realist painters such as Manet, Postmodern artists like Barbara Kruger and Edward Kienholz aimed their shock outward, putting society itself in the crosshairs.

Activist artists like Judy Chicago, Mel Chin, Guillermo Gómez-Peña, and Sue Coe made artwork with dual intent:  to be art and to disrupt.  The requirement for rupture seems to have become inherent, especially in art produced and justified in an academic setting.  Disruption may not always be readily apparent, and thus artists’ statements emerge as a way to explain what is disruptive about a particular work or a particular artist’s oeuvre.

What is peculiar about the supremacy of rupture as a requirement of art is that the intent of rupture seems to have the capability of being granted after the fact.  Artists who do not intend their work to be disruptive in the present tend to be dismissed, and artists who created rupture in the past, whether or not they set out to do so, are elevated.  A reader commented on my last post (Thomas Kinkade is Dead.  Long Live Thomas Kinkade) on Facebook, arguing against my comparison of Kinkade to Andy Warhol:

Even if he claims that he did not intend to, Warhol’s imagery (as banal as it was) at the time forced an examination of the boundaries of art (rupture). That’s pioneering. Kinkade’s imagery (although his methods of production and commercialism could be argued as similar to Warhol’s) does not hold the same power of rupture, just based on content alone.

Warhol was famously non-committal about his intentions regarding meaning in his work.  He made works with a popular appeal in a businesslike way that seemed to challenge the accepted specialized, reified nature of art. Critics, history books, and hero-worship have assigned the intent of rupture to Warhol, not Warhol himself.  If intent is all important in the status of an artist, is assigned intent just as powerful as declared intent?

It appears that this is the case.  The reader concluded her comments by writing, “I believe Kinkade’s illuminated cottage scenes are more along the lines of an allopathic art—an easy sell.”  Kinkade was about business and selling, and Warhol was about critiquing the art world and/or society. However, Warhol’s own statement on the matter was that “Being good at business is the most fascinating kind of art.”

The figure of Andy Warhol has been ascribed the role of sly critic of mass consumer culture and big-money art markets even with the facts and trappings of his fame and wealth readily apparent.  A similar statement can be made about the work and person that is Jeff Koons.  My favorite statement regarding Koons comes from Robert Hughes, “If cheap cookie jars could become treasures in the 1980s, then how much more the work of the very egregious Jeff Koons, a former bond trader, whose ambitions took him right through kitsch and out the other side into a vulgarity so syrupy, gross, and numbing, that collectors felt challenged by it.”

Hughes goes on to say, and I agree, that you will be hard-pressed to find anyone in the art world who claims to actually like Koons’ work.  But because it is ultra-kitsch and still presented as art, we assume the intent is to critique the vulgarity and simplicity of consumer culture or of the art market itself.  Koons is a businessman, and a shrewd one at that.  He makes a lot of money by “challenging” collectors while stating directly that he is not intending to critique or challenge art, beauty, or kitsch.

Of course, he is challenging them.  It is not his stated intent that is accepted as fact, but the intent we as viewers and critics have assigned to him.  In a postmodern view, the authority has shifted to the reader, to the viewer—to the end consumer of a cultural product.  We are no longer interested in a Truth of art, but instead we accept the personal truths of our own subjective views.  Saying you didn’t intend to go over the speed limit does not mean you didn’t do it, and Jeff Koons, Andy Warhol, or even Thomas Kinkade saying they don’t intend to create disruptive artwork doesn’t mean they aren’t doing it.

If rupture is the new defining characteristic of art, then intent no longer can be.  A child doesn’t intend to disrupt a funeral, but it will because it wants attention.  Attention is the intent, but rupture occurs nonetheless.  Kinkade just wanted attention and fame, but that shouldn’t stop us from viewing the work as a disruptive critique of the market.  It hasn’t stopped us from doing the same with Warhol.

Jeff Koons next to his own sculpture, Pink Panther (1988)

The reader’s comment used the word “allopathic.”  Allopathic, according to Merriam-Webster online, is “relating to or being a system of medicine that aims to combat disease by using remedies (as drugs or surgery) which produce effects that are different from or incompatible with those of the disease being treated.”  In this case, the system of art critique is allopathic.  Typically, critique is aimed at works of art that intend to be art in a certain way.  Here, we are critiquing work in a way different from or incompatible with its supposed intentions when it was produced.  In a world of relative truths, that doesn’t make the critique any less valid.


Neil deGrasse Tyson is Wrong

4 03 2012

I like Neil deGrasse Tyson.  I think he is a warm and engaging face for science on television.  He’s no Adam Savage or Jamie Hyneman—I have yet to see him blow up anything.  To my eyes, he’s no Bill Nye.  That is one titanic bowtie to try to fill.  But, as celebrities of the hard sciences go, Neil deGrasse Tyson is a shining example.

As host of Nova scienceNOW on PBS, he has proven to be engaging and photogenic.  He makes astrophysics something that at least seems accessible to a large audience.  He is the director of the Hayden Planetarium and a research associate in astrophysics at the American Museum of Natural History.  When it comes to astrophysics, Neil deGrasse Tyson knows his stuff.  However, when it comes to the cultural mindsets of the Twentieth and Twenty-first Centuries, he is mistaken.

Clip of Feb. 27 Interview on The Daily Show

I am basing my criticism on an interview he gave last week with Jon Stewart of The Daily Show, promoting his book, Space Chronicles:  Facing the Ultimate Frontier.  Stewart characterizes the book as lamenting the fact that the United States, as a culture, no longer prioritizes space exploration.  Tyson acknowledges that the Cold War, fear, and the military-industrial complex were the driving forces behind the rapid advancements in space exploration from the 1960s until 1972, the year of the last manned mission to the moon.  I will add that moon missions stopped around the same time the Vietnam War ended, drawing to a close the hot part of the Cold War.

Tyson claims that it was the space race that inspired society to “think about ‘Tomorrow’—the Homes of Tomorrow, the Cities of Tomorrow… all of this was focused on enabling people to make Tomorrow come.”  This is where he is wrong.  The space race was a symptom of this mindset, but it is the mindset of modernism he is describing, not just that of the space age.  A focus on technological progress is one of the most rudimentary tenets of modernism, with its roots in the Enlightenment.  We see it in the Industrial Revolution, we see it in the advancement of movements in Modern Art, and we see it in the development of technology for war, transportation, and communication before, during, and after the space race:  from airplanes to telephones to iPods.  Tyson even cites the World’s Fair as an example of an event geared around the space race.  While the World’s Fairs of the 1960s certainly reflected the interest in space exploration in particular, the institution itself has roots in early modernism—in the Nineteenth Century.

Chicago World's Fair, 1893: long before the space race

Despite being incorrect about its origins, Tyson is correct in pointing out that the drive for progress was the great economic engine of the Twentieth Century, and that careers in science and technology were essential for that progress.  The combined factors of fear, war, and modernist pursuit of progress meant that those careers were celebrated as important for the betterment of society.  Little Jimmy wanted to be an astronaut or a rocket scientist because it was a glamorous and important part of society, an attitude that was reflected in films, news broadcasts, and federal funding.

Stewart assumes that the diminished interest in space exploration had to do with expectations outpacing execution—that we expected to be on Mars by 1970, and since we weren’t there, we got tired of waiting.  Tyson augments his assumption, saying that the diminished interest came from no longer advancing a frontier.  “The Space Shuttle boldly went where hundreds had gone before.”  This is not the frontier exploration that gains headlines in a world looking for better, faster, stronger, bolder, and further.

Aside from being wrong about the societal motivation behind the space race and the connected advancements in technology, Neil deGrasse Tyson clings to that modernist mindset.  His solution for society is to increase funding for NASA in order to mount a manned mission to Mars, which he believes will excite the populace to value the activity of scientists and technologists, thus fueling the economies of the Twenty-first Century.

Maybe Tyson just wants to revive the careers of Gary Sinise and Tim Robbins. It does promise to be thrilling and exhilarating.

As I have written before, I am skeptical about the notion that we are in an era outside of modernist influence.  While originality in art or even in invention is not necessarily the hallmark of progress that it used to be, advancement is nonetheless necessary for success in our creative, corporate, and governmental evaluations.  A person only needs to look at one very celebrated company—Apple—to understand that advancement and progress are still very much parts of our ideology, and that is the second instance where Tyson is wrong.

Contemporary society does value the activity of scientists.  It might not value the same kinds of scientists that made big, physical advancements like space exploration or the atom bomb, but it does value the kinds of scientific advancements that power the new economic driver: information.  According to postmodern theorist Jean-François Lyotard, the purpose of science is no longer the “pure” goal of its Enlightenment origins. “Instead of hovering above, legitimation descends to the level of practice and becomes immanent in it.”  For Lyotard, scientists are no longer trying to find an absolute “Truth” about the universe (that might come from the exploration of, say, space), but seeking to advance the commoditization of knowledge—the consumption of information.

In a way, Tyson one-ups Lyotard.  By acknowledging the driving force of fear in the space race, he acknowledges that the societal motivation for scientific advancement was outcome-based (winning the Cold War), rather than ideologically-based Truth-seeking.  Even at the height of modernism, pure science was a myth.  Nonetheless, the ideas of Lyotard underlie the entire undertaking of contemporary science.  It isn’t about an authoritative Truth, it’s about consumable truths. For scientists, those consumable truths are technological advancements—however minute, however arbitrary. We do value scientists, as long as they are working toward something we can consume.

The fact that, in this photo, the iPhone resembles the monolith from 2001: A Space Odyssey is pure coincidence.

The space race produced consumables—Tang, Velcro, the Tempur-Pedic bed—but those reached the consumer market indirectly.  Today’s advancements are aimed directly at consumers: tablet computers, smartphones, and cars that park themselves.  These advancements aren’t a byproduct of some high-minded pursuit of pure scientific exploration; they are directly researched, experimented upon, and produced for us.

I sympathize with Neil deGrasse Tyson.  He wants a modernist society where the pursuit of Truth motivates a populace and advances a culture.  But, as he acknowledges, that pure science may never have been the real motivator at all.  Science is now inextricably linked to product value in technology.  The advancements are more accessible, but they are less tangible.

Works Cited:

Tyson, Neil deGrasse. Interview by Jon Stewart. The Daily Show. Comedy Central. Comedy Partners, New York. Feb. 27, 2012. Television.

Fraser, Nancy and Nicholson, Linda.  “Social Criticism Without Philosophy:  An Encounter Between Feminism and Postmodernism,” Universal Abandon:  The Politics of Postmodernism.  Ross, Andrew, ed. Minneapolis:  University of Minnesota Press, 1988, p. 87.


25 11 2011

Cultural figures regarded as heroes often follow a path similar to that of other, mythical heroic figures.  From Superman to Hercules to Jackson Pollock to Kurt Cobain, there are components that we tend to latch onto in order to label the person as “great.”  Aside from skill in a particular field, the chief requirement is that the hero must be, in some way, separate from society.  In mythology, the hero must make a trip to the underworld.  “Real world” heroes, it seems to follow, must also take a trip to the underworld, but they don’t end up returning.  “Real world” cultural heroes must be dead.

Even Superman made a trip to the afterlife.

In the classic Western, the man with no name shows up in a seemingly sleepy town that is overrun by a criminal cattle-rustling gang (Tombstone), a corrupt sheriff (Unforgiven), or two families vying for its control (A Fistful of Dollars).  The hero is a symbol of something from outside of society, as represented in the town.  Superman is outside of the society of Earth, as Superman is from Krypton.  Spider-Man is a bit of a trick to fit into this mold—Spider-Man is a teenage boy, not necessarily something outside of the society of New York.  However, Stan Lee purposely created Spider-Man (and many of his heroes) to be a teenager—teenagers, almost without fail, feel alienated from the society of which they are a part.  Since they feel themselves to be outside of society, they see society as an outsider would—even if they really aren’t.

With real-world cultural heroes, it is a similar stretch to see how a given person may exist outside of society.  However, it is often what is glamorized about the person.  Take Vincent Van Gogh for example.  If a person on the street knows nothing else about Van Gogh, they will know that he was in some way crazy and they will certainly know the story about his cutting off of his own ear—which is a crazy thing to do.  A person afflicted with mental illness is outside of the normal boundaries of societal expectations.  This also shows up in the chemical dependency of many cultural heroes.  Ernest Hemingway was an alcoholic.  So was Jackson Pollock.  Sigmund Freud was hooked on cocaine.  For Elvis Presley it was pills, for Kurt Cobain it was heroin, for Hunter S. Thompson it was every drug under the sun.

Proof of the cultural influence of the counter-cultural.

For each of these, and for many more, we see the figure as being outside of the normal confines of expected social behavior.  They are, in some way, “other” than us.  Hunter S. Thompson might be close to the perfect example because, not only did he exist outside of society, he did it in a purposeful manner.  He plunged headfirst into Gonzo journalism and brought the rest of us along for the ride—to see the seedy underbelly of Las Vegas not as a participant, but as a mentally altered, “objective” observer.  His writing is from the point of view of alienation, and through that, we can put ourselves in the position of the hero, if only for a short while.

The real world cultural heroes I have listed here have something in common other than substance abuse.  They are all dead.  Classical Greek heroes make a trip to the underworld.  So did the Roman copy of the Greek hero, Aeneas.  So did the American version of Hercules:  Superman.  So did the basis for the Christian faith:  Jesus.

Non-mythical and non-religious figures have a difficult time returning from the dead, but figures who leave some sort of artifacts have a way to continue “existing” after they have died, even if they are not technically alive.  Van Gogh’s paintings draw crowds and high prices well into the 21st Century. The songs of Presley and Cobain continue to get airplay or to be downloaded onto iPods, and even the work of Sigmund Freud, largely abandoned in professional psychology, finds its way into literary, artistic, and academic production.

The longevity of the work of these individuals is the indication of their heroic impact. However, the impact of the works themselves is largely dependent on the fact that they are dead.  Once an artist is no longer capable of creating new work, their oeuvre is complete.  They won’t be around to create new work—so the supply is fixed (hence, with increased demand, prices can go up—see sales figures for Van Gogh’s Sunflowers or Warhol’s collection of kitsch cookie jars).  Also, the work is static—unchanging. We can think of Jackson Pollock’s work as the drip action-paintings of the 1950s and not have to worry that he may have been influenced by Minimalism or Pop or some postmodern abhorrence later in life.  He wasn’t around to be affected by those.  His work can remain pure in his death.

Among poets, artists, and musicians especially (and certainly in other professions that heroize historical figures), the pattern of substance abuse and death influences the behavioral patterns of students and young professionals in the field.  In ways, it seems that art students want to find some sort of chemical dependence in order to be like the artists they are taught to revere.  On the flip side, one might argue that the “creative mind” is already inclined toward such behavior, since to be truly creative requires an ability to think outside of the accepted confines of societal thought—to exist outside of society.

Personally, I am wary of any broad generalizations made about “creative minds,” as if they are sentenced to be artists and addicts and have no way to behave as, say, an engineer or someone with a “scientific mind.”  While some truly creative people are truly troubled mentally or chemically, many, many more are wannabe hipsters who think that if they drink enough or take enough drugs they’ll be able to be like their heroes—addicted, then dead.

To that end, I am reminded of Sid Vicious.  Sid was no great bass player and really didn’t have an ounce of musical or poetic talent in him.  He was recruited to be in the Sex Pistols because he had the punk look—he seemed to embody the attitude of a group desperately rebelling against society. Maybe that’s all that punk truly was (or is)—an all-encompassing, willful effort to exist outside of society, not necessarily to change it in any way or to contribute some “great” work of art to make general progress.  If that was the goal, Sid Vicious can certainly be seen as punk’s patron saint.

Sid Vicious: A whole lot of style, very little substance

This attitude of nihilism, however, doesn’t line up with the notion of the heroic cultural figure.  Heroes, in existing outside of society, in some way progress or protect society as a whole.  The good guy in the Western chases the corrupt officials out and the city can be civilized again.  Superman fights for “Truth, Justice, and The American Way.”  Jackson Pollock influences the direction of abstraction in art, and the reaction against abstraction, to this very day.

Kurt Cobain existed at the intersection of the outsider and the cultural paragon.  He wanted so much to be outside of the popular culture he was so much an influence on that, in the end, it killed him.  Rather, he killed himself.  True cultural heroes, whether they want it or not, are as much a part of the greater culture as anything they project themselves to be apart from.  Perhaps it is that paradox that drives them further away.  Perhaps it is the paradox itself we end up elevating as heroic.

Musings on Methods of Communication

28 10 2011

Looking out my window, there is a man with a small child—probably four or five years old—walking down the sidewalk.  The man is looking into his cell phone, probably at a text.  The child is tugging at the man’s pants, trying to get him to go the other direction—trying to get his attention to look at something fascinating like a squirrel or a dead bug.  But the man is distractedly continuing.  He’s not necessarily ignoring his child—he is tugging back as if to say, “No, we have to go this way,” but he is detached.  He is otherwise engaged in whatever is on the screen of that phone.

Distracted parents are nothing new, and we can travel back in time and see the same scene with other devices.  Ten years ago, the person would be talking on the phone.  Thirty years ago, the man may have been hurrying home to a land-line to retrieve a message on an answering machine.  Forty years ago, the man may have been engrossed in a newspaper story as he walked down the sidewalk.  While the distractedness and preoccupation are not new, overall there does seem to be a shift back to communicating via text as opposed to verbally.

Methods of communication have changed over time.  From Gutenberg to the telegraph to fax machines to smart phones, technology has facilitated grand sweeping changes to the methods we use to transmit information from one person to another.  The curmudgeon in me wants to rail at the tide of progress, lamenting the “less personal” approach taken in the present time, but surely a person in the Renaissance may have said the same thing about moveable type.  “What?  You can just mass-produce copy after copy of this manuscript?  Where’s the time spent pondering the true meaning of the text?  If you’re just blindly churning them out, you aren’t spending the hours with each letter, forcing you to ponder what is really behind it.”

I am finally getting a new cell phone plan today, and I have come to the realization that I will need to break down and allow for more text and data and fewer calls.  Texting is something that I have a hard time with.  Without the nuances of inflection and intonation, I have had many a text message received poorly.  What’s more, I think in longer sentences than the text message is designed for.  It takes me forever to type out a response to someone’s question that may be as simple as, “Where are you going for lunch?”  The straightforwardness of the language required and the expected brevity of the messages lead me to connect the text message with the telegraph.  It’s like we’re moving backwards.  The only difference between now and 1909 is that we don’t need a messenger to deliver the text to us—that messenger is in our pockets all the time.

These are more than telegram-delivery boys.  They can instantly send our messages out—not just between cell phones, but to the entire internet.  Maybe you’re even reading this blog on a smart phone.  We are no longer tied to our homes or wi-fi hotspots to post a blog, status update, or tweet to the entire world.  Everyone can see what we have to say!  And yet, we walk along sidewalks, gazing into our phones, ignoring each other as we pass by in real life.  We can communicate with everybody and yet we talk to nobody.

If we are communicating without contact, I question how real the communication is.  Through all our posts and texts and blogs, are we saying anything of consequence?  Is there any action that comes from all this information transmission?  Are those actions and consequences real, or are they hyperreal?  Of course there are real-world consequences resulting from digital communication.  Just ask Anthony Weiner.  But inadvertent results are far from intentional.  With the power of such mass communication, what more can we learn about and from each other and what can that help us learn about ourselves?

For Contemporary Critique, I sit at a computer and type essays with the intent that they will be read by many, many people.  Sixty years ago, I would have needed a publisher to do this.  Twenty years ago, I still would have needed access to a relatively cheap copy shop and a few friends to help add content for a ‘zine.  With this blog, I need no editor and no outside evaluation or affirmation; I can simply type, post, and know that out there, somewhere, at least one person has read and understood what I am saying.

As simple as they may seem, it takes at least a few people to put together an effective 'zine.

I am fond of warning artists against what I call “masturbatory” art—art that is solely made for the artist himself, disregarding its impact on any outside viewer.  Additionally, one of the chief purposes of object-based art is communication.  So it follows that I warn against masturbatory communication as well.  In text message- and internet post-based communication, we are working in a one-way fashion similar to art objects or television.  The artist makes the object with a specific intent, and the viewer is left to decipher that intent on his own.  I can send you a text message, but I can’t adjust my statement to a quizzical look or fine-tune my intent with a certain inflection.  With this one-way method of communication, it seems imperative that whoever chooses to use it put as much thought into their statements as an artist puts into his product.

Does this mean we need MFA programs for blog posts?  Editors for text messages?  Publisher approval for tweets?  Those may all be a bit extreme.  But keeping an audience in mind, whatever the method of communication, may lead to clearer choices, and clearer understanding down the road.

The Nostalgia of 9/11

9 09 2011

Here we are nearing the middle of September, a time when, once again, we start to see a buildup in cultural production—television programming, radio interviews, news commentary, etc.—centered around the topic of remembering the attacks on the World Trade Center towers and the Pentagon on September 11, 2001.  This year, marking the tenth anniversary of the event, has the familiar commemorative speeches, memorial services and monument dedications that we have come to expect.

The further away we get from the date of those attacks, and the more memorializing that happens concerning them, the less impact the events seem to have.  The iconic images are, by now, quite familiar—the video shots of planes hitting the towers, the collapse of each, almost in slow motion, the people fleeing from the onrushing cloud of dust and debris, the thousands walking across the Brooklyn Bridge, the photo of the firemen raising a flag on a damaged and twisted flagpole.  The repetition of those images, especially over time, begins to obscure our own personal memories, our own personal experiences, of that day.

Jean Baudrillard argues that the attacks, to most of the world, were in fact a non-event.  I was living in Spokane, Washington, nowhere near New York City, Pennsylvania, or the Pentagon.  My experience of that day was through the images, not in the events themselves.  The attacks did not really happen to me.  But in a hyperreal world, “factual” experience isn’t the end of the story.  While the physical attacks had no bearing on my experience, the symbol of the attacks did.  The images were repeated over and over again that day, and in the weeks and months that followed, on television, radio (if you’ll remember, all radio stations switched to whatever news format they were affiliated with for about a week), and the internet.  The images were re-born in conversations between friends, family, and acquaintances.  The violence did not happen to us, but the symbol of violence did.  As Baudrillard states, “Only symbolic violence is generative of singularity.”  Rather than having a pluralistic existence—each person with their own experience and understanding of any given topic—our collective experience is now singular.  Nine-eleven didn’t physically happen to me, so it’s not real, but it is real. It’s more real than real.  It’s hyperreal.

But in the ten years since, the hyperreality of the attacks seems to be fading into something else.  As the vicarious (for most of us) experience fades into memory, the singularity of that symbolic violence is shifting into one of nostalgia.  The events as historic fact are replaced by our contemporary ideas about that history as it reflects our own time.  Nostalgia films of, say, the 1950s aren’t about the ‘50s.  They are about how we view the ‘50s from 2011.

The 1950s scenes in Back to the Future don't show us the 1950s. They show us the 1950s as seen from the 1980s.

We’ve seen this nostalgia as early as the 2008 Presidential campaign, which included many candidates using the shorthand for the attacks (“Nine-eleven”) to invoke the sense of urgency or unity or the collective shock of that day.  The term “nine-eleven” no longer just refers to the day and attacks, but to everything that went with them and to the two resulting wars and nearly ten years of erosion of civil liberties.  What happens with this nostalgia is that details become muted and forgotten, and we end up molding whatever we are waxing nostalgic about into something we want to see—to a story we can understand and wrap our heads around.

The Daily Show clip: “Even Better Than the Real Thing”

This morning I listened to a radio interview of a man who carried a woman bound to a wheelchair down some 68 floors of one of the towers on the day of the attacks.  He was labeled a hero, but in subsequent years, slid into survivor’s (or hero’s) guilt and general cynicism.  He looked around the United States in the years after the attacks and saw the petty strife, the cultural fixation on celebrity trivialities, and the partisan political divide seemingly splitting the country in two.  He longed for the America of the time immediately following the attacks, “Where we treated each other like neighbors,” the kind of attitude, as suggested by the interviewer, that led him to offer to help this woman he did not know in the first place.

Certainly, there was the appearance of national unity after the attacks.  Signs hung from freeway overpasses expressing sympathy for those in New York.  Flags hung outside every house in sight.  People waited for hours to donate blood on September 12, just to try to do something to help.  The symbols of unity were abundant, but division abounded as well.  Many were still angry, skeptical, and suspicious of George W. Bush, who had been granted the presidency by a Supreme Court decision that, to some, bordered on illegal.  Within communities, fear and paranoia led to brutal attacks on Muslim (and presumed-Muslim) citizens.  Fear led to post offices and federal buildings being blockaded from city traffic.  In Boise, a haz-mat team was called due to suspicious white dust, feared to be anthrax, on the steps of the post office.  It turned out to be flour placed there to direct a local running club on its course.  The flags were still flying, but the supposed sense of unity and “neighborhood” was, in actuality, suspicion.

To look back at September 11th, 2001 and view it as a time of unity in comparison to the contemporary political divide is nostalgia.  The view is not of the historical time period, but of what one wants that time period to have been, which then acts as an example of what the present “should” be.  Perhaps nostalgia is inevitable.  As time passes and memories fade, the repeated symbols of any given time or event become re-purposed and gain new meaning from the reality (or hyperreality) from which they are being viewed.  The goal for many regarding the attacks is to “never forget.”  The repetition of the images keeps us from forgetting, but it also contributes to the memory changing.

Sources:  Baudrillard, Jean.  “The Spirit of Terrorism.”  Originally published in Le Monde, Nov. 3, 2001.

Here and Now (radio show).  “A Reluctant 9/11 Hero Looks Back.”  Airdate:  Sept. 9, 2011.

On Connoisseurship

2 09 2011

Connoisseur.  The word itself reeks of snobbery.  It brings to mind men in sport coats with leather elbow patches, wearing ascots while sitting in overstuffed leather chairs, smoking pipes and holding snifters of 100-year-old scotch.  Connoisseurs are experts, people who enjoy, appreciate, or critique something based on knowledge of details and subtleties.  Connoisseurs know why 100-year-old scotch is superior to others, what separates a good work of art from a bad one, and the difference between a masterwork by Tennyson and the vulgar work of a slam poet.

The Ladies Man knows a lot about wine... you might call him a "Wine-Know."

The difference between a connoisseur and a layperson is, supposedly, one of education and taste.  In theory, one must be taught to appreciate the subtleties of fine scotch—one must know the details of the production process, how to detect the smoky bouquet of flavors provided by the aging process and the charred layer inside the oak barrels, the consistency of the fluid against the roof of the mouth, blah, blah, blah.  What is really required to become a scotch connoisseur is the ability to speak eloquently enough to justify one’s opinion and, above all, access to the high-end scotch that opinion concerns.  Why is it expensive?  Because it’s good.  Why is it good?  Because it’s expensive.  It’s exclusive.  Not everyone has access to it, therefore it is rare, therefore it is something to be coveted, praised, and held in high regard.  Connoisseurs can afford it, so they drink only “good” beer and “good” whiskey.

The rest of us, in the words of poet Kristen Smith, know in our heart of hearts that “no beer or whiskey is ever bad!”  Whiskey, beer, steak, art, poetry—the common attitude of laypeople is that they like what they like.  To each his own, in matters of opinion, on what he might prefer.  This is, at heart, a pluralist attitude.  What is good to one person may not be good to another, but neither opinion carries any more cultural weight than the other.  I like the Beatles.  A former student professes to hate the Beatles, but likes jazz.  I am not going to convince him that the Beatles hold a higher cultural place than jazz, just as he isn’t going to convince me of the reverse.  So we leave each other to our own opinions and move on with our day.  What each of us prefers depends on our own personal tastes.  A connoisseur might see this statement and remark, “There’s no accounting for taste.”

While populists might not want to acknowledge it, the statement is true.  There is no accounting for personal taste—it isn’t quantified, justified, or legitimized.  Those are the key services provided by connoisseurs and institutions, which answer questions of taste with definitive categorizations.  I could argue until I’m blue in the face that Rolling Rock is just as good as Samuel Adams Boston Lager, but the continued awards won by the latter prove that it holds a higher place in American beer culture.  It is the institutions of artistic legitimation that arrange the strata of artistic output—the museums, galleries, and auction houses identify, define, and quantify which art is good and how much it is worth.  Here, it is the role of the critic, acting as publicized connoisseur, to educate the wider public on how these works fit into the overall picture of quality painted by these institutions.  Much like the Samuel Adams TV ads in which the CEO and brewers tell you how the beer is made and why you should appreciate it, the role of the critic in art is that of marketing.

Don't drink the beer to see if it's good! Shove your nose in hops! That's how you know it's good!

Clement Greenberg exemplifies this role in regard to Abstract Expressionism.  As America’s “art critic laureate,” he was not only able to see for himself the qualities that made the work of Pollock and de Kooning “good” art; he was able to write the justification for why convincingly enough that, in the end, the greater American public agreed with him.  They acknowledged the primacy of abstraction in painting, and the position of the galleries, auction houses, and museums became the accepted truth in regard to quality in art.

However, Greenberg’s formalist criticism and attachment to a universal idea of beauty in art, regardless of historical period, made him the model for the caricature of the out-of-touch, snobby art critic.  He wanted no knowledge of the person behind a work or the process of its making (or so he claimed), and would not look at a work until he could view it all at once—as if expecting it to overwhelm him with its greatness, if it indeed possessed any.  He would stand with his back to a work and wheel around to view it, or cover his eyes until he was ready to take it in, or simply have the lights off in the room so he couldn’t see it until they were turned on and, like a flash, the painting overtook him.

To see this in action, view a scene from the film Who the #$&% is Jackson Pollock?  The documentary follows the path of a painting, discovered in a second-hand store by a truck driver, that may or may not be by Jackson Pollock.  To help settle the dispute, former director of the Metropolitan Museum of Art Thomas Hoving is called in.  The painting is installed in a room, and Hoving walks in, covering his eyes.  He sits in a chair directly in front of the painting and looks at the floor for a few seconds before abruptly raising his head, eyes wide open, in order to have the presence or absence of Pollock-ness hit him square in the face.

This is Thomas Hoving.

From his actions to his dress to his manner of speech, Hoving personifies the stereotype of the connoisseur.  The film ultimately brands the art establishment as snobs and hypocrites, pitting Hoving’s and others’ refusal to acknowledge the painting for its lack of provenance against a CSI-like forensic investigation that seems to place the painting in Pollock’s Long Island workshop itself.

But to dismiss connoisseurship in favor of pluralism is problematic.  Whether it is based on marketing, public relations, or personal involvement, people have opinions, and a collective group will ultimately pass judgment on a given cultural product one way or another.  Groups that are more invested will be more passionate in their arguments, groups with more education and skill in persuasion will be more convincing, and groups with access to funds or institutions of legitimation will ultimately turn their opinions into acknowledged classifications.  Legitimation comes with the acquiescence of the greater public.

Inevitably, in discussions on connoisseurship and legitimation, an artist will eventually argue that he or she makes work for him- or herself, not for any general public or for anyone else at all.  This is a lie.  A work of visual art is made to be seen—to be seen by someone other than the artist.  If it were not, the artist could just think of the image, never execute it, and be happy with it.  A work of poetry or prose is written to be read or performed to be heard.  All art, from writing to painting to film to music, is, at its heart, communication, and communication must take place between at least two people.  This is true of traditional artworks focused on communicating beauty, and equally true of artworks based on sharing experience.  Once the work in question is in the public sphere, the general impulse is to evaluate it.  Enter the experts; enter the judgments; enter the machinery of legitimation.

The second a work is on display, the process of judgment begins.

Still, a painter or poet may argue that they don’t ever show their work to anyone, that they write it and leave it in a notebook, or paint it and put it in a closet.  Surely, this kind of masturbatory production of art occurs.  However, these artists then argue that, because they don’t exhibit to any public, their art shouldn’t be judged as “bad.”  I suppose that is valid.  I can’t say an artwork is bad if I haven’t seen it.  However, it is inconsequential.  It has no place in the greater cultural discourse that is art.  Masturbating doesn’t mean you’re good or bad in the realm of sex; it means you aren’t a part of sex.  Making work only “for yourself” doesn’t make it bad art, because it isn’t even involved in the rest of art.

Connoisseurship is ubiquitous, and it happens even in areas of cultural production deemed “low” by experts of high standing.  Slam poetry is a niche art form, widely dismissed by literary poets as too easy, too steeped in cliché, and too obvious to be considered high art.  Even so, there is connoisseurship within slam itself—audience members who go to as many shows as possible and have opinions on one poet over another, or who even rank poems by a single poet.  A certain type of “hostage poem” (one that uses topics that stir universal emotions, such as rape or cancer) is generally panned by poets but often scores well with audiences.  The structure of slam itself is geared toward evaluation:  there are scores, there is a winner.  Even for an artist outside the kind of art accepted by so-called experts, dismissing evaluation doesn’t work.  Within every kind of production—artistic, cultural, or otherwise—there are experts, and there is evaluation.

From art to poetry to metal, any cultural product has its share of connoisseurs.

A connoisseur can be Thomas Hoving, all houndstooth jacket and condescending speech.  A connoisseur can also be an expert in street art, or carpentry, or Norwegian cooking.  We see critical writing and opinion on everything from video games to symphonies.  Our cultural output seems to be built to be evaluated, and we look to experts to help us classify what is and isn’t worth our time.

Let’s Talk About Lady Gaga

26 08 2011

Originality just doesn’t seem to be all that important anymore.  Oh, sure, there seems to be a cultural drive toward innovation, but just how much innovation can we, as a society, take?  Technological innovation is not my primary target here, and surely there are examples of technological originality that drive cultural shifts in behavior, such as smartphones or iPods.  Still, some areas of technological advancement are hindered by a societal push against originality.  No matter how revolutionary the electric car developed by one automotive firm or another, they all maintain the same general “look” of the automobiles we have been used to for over eighty years.  When cars depart drastically from the typical design, with its longer front end to house an engine, they seem silly (take the BMW Isetta, for example—there’s a reason it was used as the nerd Steve Urkel’s car in the sitcom Family Matters).  There is no real purpose for this space in electric or hybrid cars, but cars that change drastically don’t sell, because the public chooses the familiar over the innovative.

Really, though. This is a ridiculous car.

Culturally, at least “mass-culturally,” we do not seek out the truly innovative, strange, or original.  As I write this, the film Fright Night is opening.  I actually had no idea it was a remake of a 1980s B-movie, though I certainly saw no reason to put it into a category of “ground-breaking films.”  From the previews, it seems like a vampire-filled version of the Scream films, which, while reflexive, were themselves rehashing a horror-movie formula that has been around since the dawn of the genre.  My point is that the non-original quality of contemporary entertainment is not limited to remakes of previous cultural production; the same formulas are used over and over again, and the cultural quotation that occurs between individual uses of those formulas is so universal that we often don’t realize anything is being quoted.

As an example, let’s look at an ad from the current Foot Locker campaign:

It seems harmless enough, if a bit stupid.  However, the average television viewer may or may not be aware of the internet video series this ad is similar to:  Drunk History.

Four years after the first Drunk History video, the Foot Locker ad is using many of the same triggers for humor:  a loose grasp of historical facts and contemporary language and behaviors used by historical figures in re-enactments.  It is also using a similar laid-back delivery in narration.  It’s not drunken delivery, but it is somewhat slow and a bit monotone.

As I have outlined before in Ad-Stiche, pastiche can be seen as quotation that gives no indication of being aware that it is quotation.  It isn’t satirical or mocking of the original source, nor is it an homage.  It is simply a hollow parody.  Ads as pastiche may seem too easy, too obvious.  Advertisers copying something popular is practically encouraged as a way to tap into the contemporary consciousness.  However, Drunk History is a bit on the obscure side and, more importantly, it is old.  Four years have passed since its first burst of popularity, with thousands of memes, viral videos, and flying Pop-Tart cats produced and distributed in the meantime.  In this case, the parody becomes subsumed, unconscious; hence, it becomes pastiche.

Perhaps my favorite example of originality’s lack of importance in contemporary culture is Lady Gaga.  Some years ago, there was a bit of noise raised over the similarity in sound between her song “Alejandro” and Ace of Base’s 1993 song “All That She Wants.”


There is obvious quotation in the opening few bars with the flute/synthesizer, arguably in the melody, and perhaps even in the narrative.  I don’t see “Alejandro” as an Ace of Base rip-off, but as a knowing acknowledgement of a type of fetishization of Latin American men in popular music—not just by Ace of Base and Lady Gaga, but by Abba as well.  The use of the name “Fernando” is, to me, an obvious allusion to the Abba song of the same name.  It even has flutes!  The reason this becomes pastiche is that, while some of the target audience for the Lady Gaga song might be familiar with Ace of Base, they are largely unaware of Abba, and overall they are unaware that Gaga is seeking to quote and allude to these earlier songs, not to steal the work.

More recent claims about Lady Gaga stealing from previous songs have been made regarding stylistic similarities between “Born This Way” and Madonna’s “Express Yourself.”  And there are, of course, many. (FYI–if you click the “Born This Way” link, it’s the full music video, complete with extended movie-intro.  You may just want to skip to the middle to get the gist of the song.)  In fact, Lady Gaga’s entire persona is built on the kind of performance-based, identity-shifting, strong female presence that Madonna embodied in the 1980s and ’90s.  But Madonna also used pastiche and quotation, most obviously of the look of glamorous, golden-age Hollywood stars like Greta Garbo and Marilyn Monroe, just as much as Gaga uses them.  Gaga is just more blatant, or perhaps I should say more open, and possibly more aware.  Lady Gaga’s name itself is quotation, referring to the 1984 Queen song “Radio Ga Ga.”

Madonna and Marlene Dietrich. You could do this, side by side, with Greta Garbo as well.

As before, I am not denouncing this trend toward unoriginality and pastiche.  Nor am I disparaging Lady Gaga for employing them.  I love Lady Gaga.  If I could find a way to incorporate her into every single class I teach, I would do it.  What I am doing here is highlighting cultural production that is commercially and critically successful even though originality is not its most important draw—it’s the persona, it’s the performance.  Originality is no longer the touchstone of cultural achievement; packaging is.

Packaging... egg... Gaga... it's a metaphor! Get it?
