Author Archives | Nikolaus Kennelly, Columnist

Can You Separate a Creator From Their Works? Yes.

In a very real sense, works of art are dead. Like Lenin’s corpse, they lie entombed in great mausoleums, a mixture of noxious preservatives serving as a kind of quasi-life support system. Except here, unlike in the case of the dying patient, the life support system serves us—the patient’s family, admirers, lovers. Its purpose is simple: freeze time and make us believe that the patient is only now on the cusp of death (or in the case of Beethoven’s Ninth, recreate an experience that happened 200 years ago). We are motivated in part by the belief that Lenin’s corpse, like the great baroque painting, is missing something essential to its nature: its soul. We believe that without that essential thing it will lose its value, becoming no different from the rocks and valleys and the deep void of space.

This view is flawed. It pretends that the humanity of the viewer should be subordinated to that of the artist. It forces us to believe that artworks are like feeding tubes rather than mirrors. But this isn’t how the process works. Instead, when we look at a piece of art, our own humanities—our pains, joys, sorrows and delights—are reflected back onto us. If your humanity is full, your reflection will seem fuller. If, on the other hand, your humanity is empty, all you’ll see is void. It doesn’t matter if the artist is alive or dead, and consequently, it doesn’t even matter who the artist is.

Can you imagine the alternative to this view? A world where Gauguin’s abandonment of his family somehow impacted the meaning—and dare I say, value—of his artworks? Can you imagine how empty the museums would be if an artist’s immoral deeds rendered her sketches, paintings and writings immoral? Would works by artists of dubious moral standing—artists like Caravaggio, Dostoevsky, Schiele, Picasso, etc.—be seen as corruptive simply because of their creators? The answer, were we to give the holders of this view even the slightest bit of leeway, would seem to be in the affirmative.

There is, of course, a way for these sorts of blacklists to occur even when we stop reading artists into their artworks. They come in the form of economic boycotts, but they require the artist (or her estate) to still be around in order to hold ground. They go roughly like this: Because the artist or her estate gains revenue from my consumption of her art, if I find the artist’s actions reprehensible, I have a reason to avoid consuming said artist’s works. I can’t really contest this in the same way that I can contest so-called “social capital,” except maybe to advocate piracy in those situations.

To sum up, I’m advocating for a shift in how we normally view artworks. Instead of valuing them because of their connection to an artist, we should value them for what they can tell us about ourselves. This means recognizing two things: first, great works of art are dead until we resurrect them; and second, the value of a work of art depends entirely on our own humanities. It follows that, under this framework, the only time boycotting makes sense is when there’s a monetary connection between an artist and her works.


The Pursuit of Authenticity

We humans crave the anxieties of the savanna. To engage in a standoff with a hungry lion over an antelope carcass is a dream many of us secretly harbor. The rush of leading a life on the edge, where only the essentials matter, is shared almost universally by those who play board games, watch TV shows and engage in extreme sports. Replace the antelope with a digital recreation, a little red flag or the words “you win,” and the desire becomes apparent. Once you’ve started paying attention, the craving begins to pop up everywhere: it appears in pharmaceutical commercials, where an anxiety is induced and then relieved; it pops up in video games, where monsters must be defeated in order to achieve a gold star; and it even shows up in academics, where a test begins to take on the form of a ravenous carnivore.

Illustration by Claire Revere

In the short term, these surrogate activities appear to give our lives meaning, but over time their power begins to decline. Sooner or later, we find that in order to really be fulfilled, our activities should serve an essential purpose. These essential purposes are diverse, but they all relate in some way to our basic needs: food, water, shelter, sanitation, healthcare. The problem, of course, is that none of the activities that lead to these needs are direct enough. In order to acquire food, we must sit in front of a computer screen for a set amount of time; in order to acquire healthcare, we need to vote properly in an election that happens every four years; and in order to have proper sanitation, we need to pay taxes.

This sort of indirectness would seem inconsequential were it not for the ubiquity of surrogate activities; we sit in houses bought with funds acquired from office work playing video games that simulate building houses. There’s clearly something slightly off about this, especially when the surrogate activities involve closing oneself off to the world. It’s actually an interesting aside—discussed perhaps by figures like Durkheim—that as social cooperation becomes the norm, people begin to feel more and more alienated from one another, ultimately retreating into solitary surrogate activities.

The alternative, of course, is equally problematic. If living an authentic life means building my own house, growing my own food and managing my own sanitation, does that leave any room for social cooperation? Further, how am I to engage in creative pursuits if I’m constantly tied down by my self-imposed obligations? The answer here really should be simple: Try to live an authentic life within reasonable boundaries. Just as retreating into surrogate activities can be unhealthy, so can retreating completely into authentic ones. The goal really should be to find a healthy mean between the two, where social cooperation is maintained but the fruits of one’s labor are clear.


The Purposes of Protesting

Illustration by Claire Revere

When you are standing in the middle of a crowd of protesters it’s hard not to get the impression that something is being done. Thousands of people chanting in unison “education not deportation” or “this is what democracy looks like” packs the same kind of punch as finishing the first stanza of Allen Ginsberg’s “America” after having consumed two glasses of sherry.

Then there are the signs: printouts of raised fists taken directly from the Industrial Workers of the World, Salvadoran folk art depictions of families being sundered by a terrifying anthropomorphic wall and of course the countless anti-Trump witticisms—phrases like “read a damn book” or “you can’t comb over racism” printed in multicolored Sharpie. It all feels so needed, so important.

But when night arrives and everyone begins dispersing, their throats sore and bellies empty, a terrible sense of unease sets in. Sitting in their cars, they might secretly wonder to themselves, “What, really, has changed?” Others might try to quell their doubts by focusing on the next thing—letter writing campaigns, a piece of protest art, an article, another protest. But, still, the sense of powerlessness lingers, resurfacing in the least expected moments as an amalgam of existential and political questions: “Do I have any power at all?” “What’s the difference between power and freedom?” “Am I free?”

Of course, these doubts are compounded by the fact that the 21st century is full of what are often seen as ineffective protest movements. The Feb. 15 protests against the Iraq War—the single largest protest movement in human history—barely dented public policy, and the Occupy movement looks to many like a confused and jumbled mess, filled with factious infighting and overly utopian goals.

But here’s the thing: the real work of these sorts of social movements doesn’t lie at the level of the catchy chants or nominal goals, but rather at the murky level of human-to-human interaction. Occupy, for example, may not have achieved a Robin Hood tax (one of its most explicit aims), but according to the journalist Michael Levitin it did pave the way for smaller protests that led to wage increases in states like Arkansas, Alaska and Nebraska. Are most of the Occupy participants aware of this fact? Probably not, but they were nonetheless instrumental in its development.

We find exactly the same sort of thing when it comes to the Feb. 15 protests, in response to which numerous small countries decided to opt out of sending troops to Iraq. Here, too, it’s unlikely that your average protester who marched through the streets of Quebec in 2003 is aware that his actions led directly to Canada delaying troop deployment. In this hypothetical protester’s mind, his actions were a failure because the war didn’t end, but he may very well have been instrumental in saving the life of some Canadian soldier.

What’s the point of all this? Well, those deep questions keeping the protester up at night might have a lot to do with the fact that he’s just unaware of the impact of his actions. See, it’s easy to be misled into thinking that you are powerless if you’re unwilling to accept the possibility that the impact of your actions might be more subtle than you can comprehend. If you do accept this, though, you find that those unsettling questions from earlier are replaced by the following even more unsettling one: “Do I feel comfortable impacting the world in ways that I might not understand?”

Now this is the question that should be keeping the protester up at night.


Living in a Post-Cynical Age

To the outcasts of the earth, blue-collar intellectuals, protégés of Salinger and Kerouac (if any of you are still around), know this: cynicism is dead. No longer is it fashionable to lament the decline of the sacred, the rise of consumer culture and the innumerable social ills engendered by urban sprawl. Whereas past ages have been defined by works like “The Catcher in the Rye,” our age is defined by the MacBook Air and the Harry Potter franchise. The first a symbol of our estrangement from reality—a hog of a device whose low profile feigns eco-friendliness—and the second a symbol of our collective desire to enter the Etonian elite.

If you responded to that last sentence with a shudder, you’re out of date. The hip among us have moved way beyond that sort of cynicism to a radical form of unapologetic consumerism—the sort in which driving a Subaru makes you an environmentalist and opting for a kombucha makes you a progressive. The obvious shallowness isn’t so much accepted as secretly treasured—loved even.

Why, you might ask, be so cynical? My response can only be that you misunderstand me. I am as post-cynical as they come. It’s true that I’m the sort of person who will discuss labor violations on Ethiopian coffee farms over a cup of Yirgacheffe and ponder the horrors of factory farming over bacon and eggs (my breakfasts are a bit heated), but that’s where it ends—I never take the leap to full-on cynicism. That is, much of what I say straddles the line between appearance and actual belief.

Does this sort of insincerity remind you of someone? The president-elect, maybe? If so, we are ready for my central claim: although it might at first seem totally contradictory, Trump is the result of post-cynicism. He’s what happens when most people, having long ago accepted that humans are self-centered, move on to treating selfishness and denial as virtues. What’s really sad about this (or would be, if I were a cynic) is that, unlike with post-truth rhetoric (where our side has science), we on the left are not immune to post-cynical rhetoric and so are partly to blame.

Let’s return to the MacBook Air to get at the real point underlying all this. To a cynic of old, this sort of device merely represents the disconnect between ideology and reality, but to the post-cynic it represents freedom from reality. The user gets to perceive the Mac in any way she chooses, even if her perception directly conflicts with the grubby truth. This freedom is extremely alluring and, as I’ve argued above, has permeated the lives of people all across the political spectrum. Further, it is precisely this permeation that made a Trump presidency—one in which reality is unimportant—a possibility.

If I believed truth mattered, I’d use this space to argue for more cynicism, but, as I am a victim of my post-cynical age, doing so would be out of place.


Mainstream Media: No Second Helpings Please

Like most of my colleagues, I am the result of an experiment. I am what happens when you take 25 million kids and raise them on a 24-hour all-you-can-eat information buffet. A buffet in which a few main courses are served ad nauseam and the dessert section emits wafts of irresistible odors. I’ve tried all the usual options—high-circulation newspapers, mainstream television, popular lit—and moved on to the buffet’s poorly traversed corners. Here I’ve found obscurantist newspapers advocating radical and dangerous ideas, films asking their audiences to withhold their expectations about narrative structure and literature existing somewhere on the periphery between high (here come the accusations of elitism) and low culture—too difficult to be widely disseminated, but too recent to be discussed in the lecture halls of academia.

But through all this, I’ve been tempted by those irresistible desserts—kittens wearing top hats, overblown action thrillers, badly written sci-fi novellas—and I’ve grown a bit of a tummy as a result. I know what I should be consuming, but like an addict, my will falls apart at the slightest glimpse of a few-week-old mammal belonging to the family Felidae.

Or, well, that’s the sort of internal conflict I’m supposed to be having.

Here’s a potentially radical alternative: What if both the main courses and the desserts are making me fat? That is, what if the distinction between what I’d normally consider meaningful media and meaningless media isn’t really there and they are both harming me in equal measure? Take, say, all the coverage of the election. I don’t think it would be particularly controversial to say that most of the stuff that’s pumped out about the candidates is filler, but what if I suggested that it has more in common with éclairs than bouillabaisse? What if when I read about Trump or Clinton, really what I am seeing are fluffy nonsense phrases that have a bite but no real lasting value? I mean, when I consider the utter vacuousness of the candidates’ words (“we are going to build a country where all our children can dream and those dreams are within reach,” “It’s time for real leadership and time for change,” etc.), it begins to seem obvious that most mainstream political articles—which are by nature derivative—are fated to be high-fructose.

Illustration by Meg Cuca

If this were the case, and if I cared about my intellectual blood sugar, I’d be left with one option: look elsewhere. But in looking elsewhere, how am I to know when I’ve stumbled upon media that is actually worthwhile? There are a few things to keep an eye out for, I think.

First, it should make me uncomfortable about myself. This is perhaps the most jarring difference between the mass media and some worthwhile alternatives—the mainstream media almost never presents information critical of its viewers. When, say, the U.S. supports an oppressive regime—take the Suharto regime in Indonesia—it receives very little coverage from mainstream sources like The New York Times, but often makes headlines in alternative sources like Dissent, In These Times, Z Mag and The New York Review of Books. Further, domestic policy issues and congressional deadlocks are not portrayed as symptomatic of our collective way of life, but rather as some outside attack on an otherwise ideal system. That is, the mainstream media quibbles about the minutiae existing within an accepted ideology (a warped rum ball of neoliberalism, republicanism and exceptionalism), while alternative media often mulls over a potluck of competing ideologies.

Second, it shouldn’t apply the principle of concision. The mainstream media will often avoid historical background and nuanced analysis in order to keep viewers hooked. If I catch myself unreflectively darting from subject to subject, I might be looking at the wrong type of media. The average word count of sources like The Wall Street Journal is between 600 and 1200 words for op-eds (which is way above CNN and Fox at a few hundred words) whereas the average word count for sources like Dissent and The New Statesman is in the 2500 word range (The New Statesman even has a section titled “Long Reads”).

Third, it shouldn’t use language restricted to a particular time and place. We see this all the time in the context of labor rights, where the mainstream media consistently avoids using Marxist terminology because of the belief that it’s passé. This aversion to historical language and complex jargon might at first appear to be driven by a desire for clarity, but anyone even remotely versed in Orwell should see the dangers lurking under the surface here. No words exist free of ideological baggage, and so the belief that simplistic language will somehow lead to objective reporting is obviously misguided.

Of course, even if I apply all three of these principles, there’s still a good chance that I’ll be wasting my time. Consuming worthwhile media might help me develop a nuanced understanding of the world, but all that understanding will be for naught unless I conquer my appetite and walk out of the buffet.


Stop staring, protect yourself

Illustration by Meg Cuca

I have a friend who has an unhealthy penchant for stale metaphor. Just last week we were discussing the Ruth Fluno retrospective at Sheehan when he referred to the art as “a delicious feast.” Normally I’d treat this sort of remark as an empty flourish, but as I was in a slightly unusual state of mind, I immediately began wondering what, exactly, it would mean for art to be delicious. Well, it’d seem to mean that when you look at a piece of art, you are actually consuming it with your eyes and enjoying the taste. What a strange concept, I thought, as I tried to picture beady-eyed connoisseurs consuming paintings by looking at them.

I think there is something wrong here. Unlike food, where proper digestion is necessary for a worthwhile experience, art sometimes becomes more meaningful (and therefore more worthwhile) when there’s regurgitation, aversion and nausea. Take, say, the Chihuly chandelier hanging in Reid. Sure, the fiery colors and smooth curves are pleasant and maybe even a little mouth-watering (there is a certain resemblance to hard candy), but what happens when you take a closer look? You begin to see hands reaching out in all directions, as if desperately trying to escape the inferno. Suddenly you are filled with a sense of dread and avert your gaze. Did the dread somehow detract from the deliciousness of the piece? Is “deliciousness” really the right word here?

Let’s say that you held your gaze—the equivalent of chewing—until you were able to swallow. Suddenly the chandelier no longer exists outside you, but is instead deep in your gut. And what is lost? Everything really meaningful, I think. As soon as you consume the chandelier, the sacred, terrifying and mysterious—what Burke and Kant called “the sublime”—disappears and all that you’re left with is a pretty thing.

But I can take this further. In our day-to-day lives we are constantly faced with the question of whether to consume or regurgitate. When we drive by a car accident on the highway, when a friend says something unwholesome or when we see something distressing in a class, we are thrust into that split-second decision. By turning away, we retain a degree of awe—the sense that there’s something bigger than us—that would be lost if we persevered. And this sense of awe should be celebrated.

I want to make it clear that I’m not extolling the virtues of ignorance—at least, not in the usual sense—but I am advocating an orientation towards life that stresses a form of abstinence. When we decline to face our fears directly, they begin to take on a form that is deeper—and maybe even truer—than they otherwise would.

In Plato’s Republic, there’s an account of a figure named Leontius who, coming upon a pile of corpses, is overcome by appetite and shouts, “Look for yourselves, you evil wretches, take your fill of the beautiful sight.” By pointing out the terrible scene he trivializes it and eliminates the sacred. Death as a concept becomes a closed book, something fully understood and more or less ignored. Paradoxically, this is the result of sight: by seeing the corpses, death is suddenly rendered less meaningful than it actually is.

So, when faced with the sublime, take no shame in averting your gaze. Instead recognize that through aversion you might be seeing the truth in what is directly in front of you.
