Sunday, September 27, 2009

Can the Mona Lisa be aura-free?

In "The Work of Art in the Age of Mechanical Reproduction," German cultural critic Walter Benjamin muses on what our ability to reproduce images in mass quantities does to art and culture. His basic argument is that mass production of art - paintings, sculpture, etc. - eliminates the "aura" of a work.

To Benjamin, a work's aura is best described as"that which withers in the age of mechanical reproduction is the aura of the work of art...One might generalize by saying: the technique of reproduction detaches the reproduced object from the domain of tradition (221)."

He used the word to refer
"to the sense of awe and reverence one presumably experienced in the presence of unique works of art. According to Benjamin, this aura inheres not in the object itself but rather in external attributes such as its known line of ownership, its restricted exhibition, its publicized authenticity, or its cultural value...With the advent of art's mechanical reproducibility, and the development of forms of art (such as film) in which there is no actual original, the experience of art could be freed from place and ritual and instead brought under the gaze and control of a mass audience, leading to a shattering of the aura (Wikipedia)."
In 2005, former UC Santa Barbara graduate student David Roh took Benjamin's argument a step further by examining whether it still holds true in the early 21st century:
"Walter Benjamin defines aura as the distance between a purveyor of the work of art and the work itself. With the advent of mechanical reproduction, he [Benjamin] argues, the distance has been closed, aura diminished, and the work of art democratized. Fast-forward nearly 70 years later, and we find that instead of aura having been completely eradicated by perfect and nearly limitless digital reproduction, the distance between the work of art and the purveyor (consumer) grows wider than ever. "
I would argue that they're both right. It may be taking the easy way out, but it's hard to disagree with Benjamin's belief that something is lost when a Kandinsky painting is mass-produced as a greeting card or refrigerator magnet. Having taken a half dozen Art History courses over the years and spent more time in art museums than at home, I think it's somewhat criminal how commercialized art has become. I have my fair share of prints of my favorite works, and I realize that art has always been commercial (how else could an artist survive?), but the commercialization does seem to have multiplied in recent years. For example, during a quick visit to the Dallas Museum of Art's gift shop after viewing the recent King Tut exhibit, I found plastic sarcophagi, fabric replicas of the head garments Tut wore, and numerous poster-size images of objects not even included in the exhibition.

Though I used to be an avid collector of postcards of major works of art, I finally stopped because, as Benjamin argues, viewing a reproduction or copy is not nearly as satisfying as seeing the actual work of art in person. That's because postcards have no "aura" - no soul. They're mere imitations - and often bad ones at that - of something that should be considered non-transferable. Prints aren't much better, but since actual works of art are way out of my financial reach, I often go that route so I can have some semblance of art other than my own or my daughter's on display at home.

To me, it seems that the commercialization and reproduction in mass quantities of art has indeed caused art to lose its "aura."

However, Roh is also right in the sense that just because something is more available to the masses doesn't make it any less important or awe-inspiring when viewed in person. After all, reproductions can also be inspiring. I am not a big fan of Leonardo's Mona Lisa, but I can't ignore that while it is probably the most reproduced image in modern times, millions of people still flock to the Musée du Louvre to see the actual work hanging on the wall. While the story of the work is itself intriguing, many say they're inspired by all the reproductions to go see the real thing. And so, despite having seen countless reproductions, they still stand in line for hours at a time in order to stand six feet away from a 21-by-30-inch painting hidden behind a glass case several inches thick. Clearly, at least some of the work's aura must remain. (What's really funny is that after seeing the Mona Lisa in person, even after previously seeing countless reproductions, few leave the Louvre without yet another copy of the work on a postcard, magnet, coffee cup, T-shirt, etc.)

So, who's more right? Benjamin? Roh? Neither? Benjamin doesn't address it in this article, but I think the answer depends a lot on one's answer to the following question: what is art?

Can a photograph be called art if it's a photograph of another work of art, say a painting or a sculpture? Or what about a film that's a compilation of previous films - can it still be considered art if the only new aspect is the way the clips are arranged? Is a postcard adorned with the image of a work of art actually art? For that matter, can anything commercially and/or mass-produced be called art?

Sunday, September 20, 2009

A 15th-century model for the 21st century?

What to say about a book that captured my attention about as much as the dictionary? Not to be disparaging, as I recognize that this is an important topic, but the book needed some oomph.

That said, there was one passage, near the end of Chapter 5, that struck me as a very acute interpretation of something that holds true to this day. It reads:
"Yet however sophisticated present findings have become, we still have to call upon a fifteenth-century invention to secure them. Even at present, a given scholarly discovery, whatever its nature (whether it entails using a shovel or crane, a code book, a tweezer, or carbon 14), has to be registered in print - announced in a learned journal and eventually spelled out in full - before it can be acknowledged as a contribution or put to further use (141)."
I confront this reality on a daily basis as a science/medical writer at UT Southwestern Medical Center. In my role as a senior communications specialist (basically PR), I witness firsthand our researchers' struggle to publish sometimes groundbreaking discoveries in scientific journals that carry clout in their respective fields. Peer-reviewed journals such as The Proceedings of the National Academy of Sciences, Nature, the Archives of Internal Medicine, the New England Journal of Medicine (NEJM), and the Journal of the American Medical Association (JAMA) have such stringent acceptance policies that only the most thoroughly vetted - and usually staid - research ever sees the light of publication. Researchers must secure approval from numerous peers in order for a paper to be considered, much less accepted. And once accepted, they must go through what often seems like an endless cycle of revisions in which they must answer every question posed to them. All this to get some new findings in print.

Open-access online journals have proliferated in recent years, but open access doesn't mean that anything and everything will be published. There's still an approval process. In this sense, open access means that the material must be posted online for all to see, rather than available only to those with a subscription or some other inside track. The studies don't have to be written in lay language either, which somewhat limits the material's accessibility.

The problem with this system is that in order to be taken seriously - and receive grant money - researchers must publish their findings, no matter how minuscule or incremental they may be. As Eisenstein stated, they must be "registered in print - announced in a learned journal" to be considered worthy of further attention.

It makes sense that a study first published in the New England Journal of Medicine would have more clout than one printed in, let's say, Vogue. But who's to say what is valuable and what isn't? For all we know, someone could have discovered the cure for cancer - but it was so far-fetched that the researcher and his/her findings were shunned or flat-out ignored, never to see the light of day in a "scholarly" journal of any repute. I know several researchers who have stopped short of submitting research that turned out to be revolutionary because they initially considered the idea too far-fetched to be taken seriously. What hope is there for scientific and medical advancement if researchers censor themselves as well as their peers?

One might argue that scientists should immediately post everything online, but few researchers I've spoken with have any interest in publicizing early findings. They say there's inherent danger in publicizing their results or study methods too early; doing so would create an environment ripe for poaching others' ideas. So you have a two-pronged problem: researchers need to publish their findings in print to get validation and support for their research, but at the same time they don't want to publish too early or publish far-fetched (even if valid) results out of fear that they'll be subject to either poaching or ridicule.

I wish I had an answer to this problem. It seems very outdated to rely on a 15th-century model, but no better solution has come to pass. In order for change to take place, there has to be both a new outlet and a collective belief amongst researchers that it's in their best interest to adapt to the new model.

Sunday, September 13, 2009

Remediation remediated

Disclaimer: I read this book a few semesters ago, yet I find it even more fascinating this time around. Maybe it's the fact that I have a much better understanding of what hypermediacy, immediacy and remediation are all about? Then again, maybe not. Suffice it to say that this wasn't my first pass at this reading.

This may sound funny, but reading this text makes me want to scream, cry and laugh all at the same time. Scream because the basic concepts presented here seem so simple - yet most who hold leadership positions in mainstream media corporations fail to grasp the concept that none of their so-called innovations are "new". Cry because if newspapers don't do a better job adapting to the changes brought about by new media, they will continue to falter. And laugh because I can remember listening to the Dallas Morning News' publisher and other members of upper management rave only a few years ago about how adding a team of "online" reporters and editors was going to revolutionize the news industry and turn the newspaper back into "the" source of information for Dallas-Fort Worth residents. Having a team devoted to producing content for the Web was going to provide the paper's online readers with the "immediacy" they desired while giving other reporters time to work on more nuanced articles for the daily paper. Unfortunately, most of those reporters and editors were axed in the latest round of layoffs. The Web site has certainly gotten better over the years, but I wouldn't necessarily call it a "must-read" for many locals.

Though the term remediation may be unfamiliar to some, it makes perfect sense when you think about it. There's nothing new about it - it's just a new term to describe what artists and others have done for centuries. I remember countless K-12 and college art classes where the assignment was to take a work of art and refashion it in another medium. We once fashioned a sarcophagus (a la King Tut) out of cardboard - a material still many centuries away in Tut's time. In another class, we took still photographs and then digitally enhanced them using Photoshop - something impossible when the first photograph was taken. Newspapers have taken similar steps to reinvent themselves in different mediums.

As the authors mention numerous times, all one has to do is look at a single issue of USA Today to note how similar its layout is to that of the Web. Bolter and Grusin note this early on when they state:
"Although the paper has been criticized for lowering print journalism to the
level of television news, visually the USA Today does not draw primarily on
television. Its layout resembles a multimedia computer application more than it
does a television broadcast; the paper attempts to emulate in print the
graphical user interface of a web site."
USA Today isn't the only newspaper to rethink its use of graphics, photos and varied fonts as the Web has become practically omnipresent throughout society. Even the venerable Wall Street Journal has started using color photos and graphics throughout the edition. The same is true of the New York Times, once dubbed "The Gray Lady" for its lack of color. Locally, both the Dallas Morning News and Fort Worth Star-Telegram have taken a stab at emulating USA Today's print edition from time to time. Whether their efforts have been successful is open for debate, but it is interesting how, as the Web has evolved, newspapers have become more like tabloids or magazines than what Western society has historically considered a newspaper. The stories are generally shorter and less nuanced. (There are certainly exceptions to this, but not as many as even a few years ago.) They're also more visually oriented than in years past, with multiple photos and graphics, some of which are only available to online readers.

Though I am no longer employed full-time by a newspaper, I'm still a voracious consumer of news: television, online, print, radio. What I find most frustrating about all of it is that everybody - not just newspapers - is constantly trying to be like everyone else. As the authors also noted, television news broadcasts are more like the Web than ever before, with multiple mini-screens and scroll bars fighting for your attention. Recently, many TV anchors have begun asking viewers to tweet answers to questions posed on the air; the results are shown later in the broadcast. How long will it be before consumers start broadcasting the news from their personal computers? Oh, wait. That's already happening. Rather than buy into the so-called "immediacy" that mainstream media outlets purport to deliver, many consumers are ditching it entirely and reporting the news that matters to them on their own. They're using PDAs, iPhones, laptops, etc., to report in real time what they're seeing, hearing and feeling. This sort of "immediacy" is what mainstream media strives to achieve but oftentimes misses because of a supposed lack of staff and/or money.

I've rambled on long enough about this particular reading, but suffice it to say that I'll be intrigued to hear what others think about remediation, hypermediacy and immediacy. Is it something that mainstream media can achieve, have already achieved, or will never be able to achieve? My bet is on the middle one. I think many outlets have achieved hypermediacy and immediacy, but only to a point. As for remediation - well, most mainstream media is the definition of remediation.

Monday, September 07, 2009

The state of education today

In 2008, Business Week ran an eight-part series by Don Tapscott, the author of Grown Up Digital. In the series, he argues that digital technology has profoundly shaped the children of the baby boomers, a group he's nicknamed the "Net Generation."

Though the entire series is intriguing, I find the Nov. 30 article particularly relevant to our needs. In it, Mr. Tapscott describes a speech he delivered to a group of university presidents:

"The prevailing model of education, I said, made no sense for young people
today. This model revolves around the sage on the stage, the teacher who
delivers a one-size-fits-all, one-way lecture. This model, designed in the
Industrial Age, might have been a good way to condition young people for a
mass-production economy, but it makes sense neither for young people who have
grown up digital nor for the demands of this digital age."
What amazes me is how similar it is to the following statement McLuhan made in the 1969 Playboy interview:

"Our entire educational system is reactionary, oriented to past values and past
technologies, and will likely continue so until the old generation relinquishes
power. The generation gap is actually a chasm, separating not two age groups but
two vastly divergent cultures. I can understand the ferment in our schools,
because our educational system is totally rearview mirror. It's a dying and
outdated system founded on literate values and fragmented and classified data
totally unsuited to the needs of the first television generation."
Who would have - or better yet - could have imagined that 40 years later, we'd be having the exact same debate albeit about the first Internet generation instead of the first television generation?

Few, if any, foresaw how pervasive the Internet would become when it was first introduced - yet here we are today, devoting countless hours to figuring out how best to educate children who literally grew up online.

Just as McLuhan said that the division between the first television generation and those educated beforehand was more a chasm than a simple generation gap, I would argue alongside Tapscott that the same could be said of the division between today's youth and even my generation - which witnessed the explosion of the Internet as we were graduating from high school in the mid- to late 90s.

Consider this: Until my sophomore year in high school, I was using an electric typewriter to prepare term papers. My half-brother, on the other hand, has to my knowledge never used a typewriter. When he graduates from Garland High School in May 2010, he'll have spent his entire academic career preparing presentations and term papers on a computer. He was building PowerPoint presentations in elementary school, at an age when I was expected to present posters or overhead slides, if I wanted extra credit. And the time he has spent in any library is negligible compared to the years I devoted to the Corpus Christi Central Library - where, in the late '80s and early '90s, I used a card catalog to research everything from the anatomy of wolves to the history of China before I reached middle school.

Don't get me started on encyclopedias. It was a huge deal when my mother forked over who-knows-how-much for our family's first and only set of World Book Encyclopedias - yet I have never once seen my half-brother crack an encyclopedia, or even mention using one, for that matter. There is not - nor has there ever been - a set of encyclopedias at my dad's house. That's partly because what my mother considered a major investment in her children's education was made almost completely obsolete by the time my half-brother started school.

The problem this causes in education is that many teachers - not all - have yet to realize just how expansive this generational chasm really is. They look out over a roomful of students who have been online since birth and have no clue how to engage them.

In the Business Week article, Tapscott describes how the academics reacted when he questioned why it is taking so long for the educational system to change. One educator (whose age wasn't disclosed) blamed the problem on his/her colleagues' age: "Their average age is 57, and they're teaching in a 'post-Gutenberg' mode."

"Post-Gutenberg?" another president injected. "I don't think so...Our model of
learning is pre-Gutenberg. We've got a bunch of professors reading from
handwritten notes, writing on blackboards, and the students are writing down
what they say. This is a pre-Gutenberg model—the printing press is not even an
important part of the learning paradigm."
Unfortunately, the university president's assessment that many teachers are still following a pre-Gutenberg model remains right on the money. While this style of teaching is fine for many - but not all - nontraditional (i.e., older) students, the methodology simply doesn't serve the younger "Net" generation. That's because they have grown up to believe, rightfully so, that their education is in their own hands and that the teacher-focused, one-size-fits-all methodology is archaic and doesn't fit their lifestyle.

Luckily, many teachers are beginning to change their ways. As an education reporter at the Dallas Morning News for nearly five years, I witnessed firsthand how classrooms are becoming more "Net Generation" friendly. Lectures still have a place in the classroom, but teachers are encouraging more group interaction and fostering conversations rather than one-way monologues. They're also embracing the Internet as much as their superiors allow, using blogs, wikis and social networking applications to connect with students.

All in all, education seems to be moving in the right direction - just not nearly fast enough. With technology continuing to advance as quickly as it does, educators will continue to be hard-pressed to keep up with the latest and greatest technology. My only hope is that they try - for our (my) children's sake.