Did the Hobbit Movies “Suck”?

I think it’s safe to say that the Hobbit movies that came out in the past few years proved somewhat disappointing for most people. But is that because the movies themselves were bad, or because so many fans of fantasy, Lord of the Rings, and Peter Jackson had certain expectations after the booming success of the Lord of the Rings trilogy in the early 2000s? I’ve heard some suggest that maybe, just maybe, if we watched the Hobbit trilogy without comparing it to Lord of the Rings, we may actually enjoy the prequels as fun bits of fluff, if nothing else. So what, if anything, makes the Hobbit movies actually bad?

1. They Won’t Let Us Forget About Lord of the Rings

So let’s say we do try to watch the Hobbit trilogy as movies in their own right, separate and apart from Lord of the Rings – not an unreasonable endeavor. How many of us treated the books as separate entities, after all? I read The Hobbit long before I even started on the Lord of the Rings books, and always considered them to be entirely separate from each other, despite occurring in the same world and sharing a couple of characters. True, Lord of the Rings serves as a sort of sequel to The Hobbit, and The Hobbit as a prequel to the trilogy, but the two have such vastly different tones that they provide entirely different experiences for the reader. The Hobbit movies, however, not only take on the same tone as Lord of the Rings, but borrow scenes, scenery, and references that make it feel as though the films are constantly screaming, “Hey! Remember that other, super popular trilogy?” “Hey, remember that time Peter Jackson had a cameo as a carrot-eating villager in Bree??? Look at him do that here!” “Hey, remember Sauron from Lord of the Rings? We’re going to forcibly insert him here, and devote a whole plotline to him!” “Remember Saruman? Already turning evil!” “Remember Galadriel? And Legolas? And Gimli? And…?”

Some of the references and throwbacks don’t even make sense. Thranduil advising Legolas to go seek out Aragorn? Even assuming Thranduil knows who Aragorn is, why would he think anything more of the guy than “that kid who Elrond adopted”? Why would he think of Aragorn as somebody that his son should particularly seek out? Not to mention, Aragorn was just some 10-year-old kid at the time! And even in movie canon, it’s quite questionable whether the two meet each other much – if at all – before the events of The Fellowship of the Ring.

The Hobbit really shouldn’t require references to or knowledge of Lord of the Rings to be enjoyed – goodness knows, the book didn’t rely on such. If anything, Lord of the Rings would have required explicit references to or knowledge of The Hobbit. But Peter Jackson proved otherwise with the Lord of the Rings movies. So why the need to place such trivial yet distracting references in the Hobbit movies?

2. Unnecessary Length

Not only did Peter Jackson & Co. make a completely unsubtle money-grabbing move in deciding to split a 300-ish page book into three movies, but those three movies weren’t short, either. While 90 minutes is all it takes to qualify a movie for the label of “feature length,” all three Hobbit movies spanned upwards of two hours. Battle of the Five Armies clocked in shortest, at just under two and a half hours, while its two predecessors each managed to squeeze over two hours and 40 minutes of runtime out of an already overwrought adaptation. Sometimes it seems almost like some sort of magic trick that they managed to stretch a moderate-length kids’ book into such a long trilogy. How did they do it? Well…

3. New Characters and Plotlines Are Shoehorned In

They’re forced and they feel forced. Let’s take a look at some individual examples…

Tauriel

At first, I was incredibly excited when I heard about the addition of Tauriel, the female elf who serves as Captain of the Guard at Thranduil’s Mirkwoodian palace. Goodness knows, The Hobbit was always such a sausage fest, so introducing a female character into the movies seemed like it would be a welcome relief… until it actually happened. Instead of being a cool and interesting character like the many others that Bilbo and the dwarves encounter (at least, they’re cool and interesting in the books) – Gollum, Beorn, Bard… – Tauriel is written as Hollywood’s stereotypical “strong woman,” who is strong in the literal sense but is otherwise fairly one-dimensional; and she’s really only used to create this awkward, cliched love triangle involving Kili the Highly-Unlikely-On-So-Many-Levels Dwarf Lover and Legolas the Why-Is-He-Even-In-This-Film-Except-As-A-Reference-To-That-Other-Trilogy Elf Prince (who at best made a background appearance in the book and, like so many of the added and/or just-for-reference characters, really has no purpose in the movies except to bloat the run time). To add insult to injury – as if the potentially incredibly feminist character hadn’t been reduced to enough of a sexist cliche – Tauriel is constantly referred to as a “she-elf,” a new term that didn’t come up with Galadriel or Arwen in Lord of the Rings, even with Gimli, who – as a dwarf – had an excuse of sorts to harbor prejudice and use less-than-kind words to describe the elves he encountered. But, of course, now none of the dwarves can resist pointing out that Tauriel is no normal elf – she’s a girl elf! WooOOOooo! So exotic and weird!

And did anybody like the whole “Tauriel has to go save Kili and cure him with athelas root” subplot in The Hobbit: The Desolation of Smaug? Filler if I ever saw any – need I say more?

The Were-Worms

A completely unnecessary addition to Battle of the Five Armies that really only adds to the confusion of just how many armies were actually in that battle. I discovered that the creatures were, in fact, very – very – briefly mentioned in the book, but really only as part of a single throwaway line from Bilbo. Why did Jackson & Co. feel the need to take such a large and literal interpretation of a creature that had only one line – with no actual appearance or description – devoted to it in the book? They seem to have decided that were-worms were not only real (and not just part of tales and legends among hobbits), but literal worms – this despite the fact that Tolkien regularly refers to dragons as “worms” and “wyrms” throughout The Hobbit, which makes it highly likely that “were-worms” were actually a particular breed of dragon rather than literal, monstrously-sized versions of “those things that dig in the dirt and are good for your garden.”

Azog and Bolg

In the book, the only orc commander we meet is Bolg, and even then, he’s only briefly discussed when the Wolves and goblins band together to get revenge on the dwarves who screwed them over. Oh, and did I mention that in the books, Bolg is actually a goblin who’s out (with an army of his kin) to avenge the Great Goblin, who Gandalf and the dwarves killed in the Misty Mountains – something that, you’ll recall, actually happens in the story, rather than requiring a bunch of unnecessary flashbacks and backstory from before the current tale started? Azog isn’t really a thing in the book (as with the were-worms, he gets a brief-but-fairly-insignificant mention), and honestly, given the sudden shift of responsibility from Azog to Bolg (who’s pretty much turned into “discount Azog”) midway through the second Hobbit movie, we’re left to wonder why both of them were needed – even if that whole weird backstory with Thorin and the “pale orc” was necessary (and let’s be clear: it wasn’t), why not just dump that whole backstory onto Bolg instead of adding a new character? (Or just plain replace Bolg with Azog, if Jackson & Co. were so set on the lead orc being “not Bolg.”)

Radagast the Brown

The character actually appears briefly in the Lord of the Rings books, but certainly not in The Hobbit. And he most definitely isn’t some cross-eyed, over-the-top lunatic. In the Lord of the Rings books, he’s presented as a calm, collected wise man of the forest who Gandalf goes to for advice. But in Jackson’s Hobbit movies, he’s just “campy ’90s kids’ film villain”-levels of poorly-done and over-the-top comic relief. Joanna Robinson put it quite well in her article for pajiba.com after the first film came out:

We don’t have time for Tom Bombadil but we’ll manufacture an entire sequence with some addled, guano-smeared wizard on a rabbit sleigh? Like Azog, Radagast is canon, but plays NO part in “The Hobbit.” Many will say that the sillier tone of the Radagast sequence fits with the child-like nature of “The Hobbit” itself. That thinking got me through the snot and bum jokes in the Troll sequence, but did nothing for me here. Sylvester McCoy did a bang-up job with what he was given, but what he was given was bird crap. And that Warg chase scene with Radagast, the dwarves, Gandalf and Bilbo was some of the sloppiest action I’ve seen in a long while. Harrumph.

I’m a big proponent of the “Hobbit movies should have kept more closely to the lighthearted and whimsical nature of the book” argument, but Radagast was far too campy and overdone even for me. It adds insult to injury that Radagast was never actually in The Hobbit, and that he only appears in scenes that should have never existed in the movies to begin with (e.g. the orc/dwarf/wizard chase scene that clearly only happens because you’ve got to fill those two-and-a-half hours somehow and still have room for actual plot to happen in the next two movies).

Sauron

This not only adds extra bloat where extra bloat is neither wanted nor needed, but it contradicts much of what went down in the Lord of the Rings trilogy, especially in Fellowship, by suggesting that Gandalf, Galadriel, & co. knew perfectly well about Sauron coming back/being a threat during Bilbo’s adventures, but just decided to sit on the information for 60 years rather than doing anything about it. Also… did Gandalf’s brain just fall out or something? After the events in Peter Jackson’s version of The Hobbit, why and how does he need to wait 60 years for confirmation that the ring Bilbo found is, in fact, Sauron’s ring?

The Gandalf/Galadriel Love Story

Wait… what? What the hell? And why? I… I just… I’m speechless. It was completely out-of-character, completely out of left field, and completely unnecessary. The only thing that wasn’t “complete” about it was the reasoning behind making it happen.

Gandalf and Galadriel taking a moment to wonder why the hell Galadriel’s even in “The Hobbit.”

4. Decent Performances, But Nothing Amazing (No Thanks to the Script)

This wouldn’t be a problem for many – even most – movies. An actor doesn’t have to blow you away to be convincing, or for the movie to be fun. But when the script is mediocre at best – tirelessly shoehorning in references, characters, and plotlines that serve to distract from rather than add to the story; providing its actors with cheesy or unnecessarily vague and cryptic dialogue – it simultaneously takes away from the actors’ performances and requires them to go above and beyond in order to “save” the story. A movie with a mediocre, even somewhat bad script can still be saved by great performances. A stunning actor who’s working her top game could probably say, “You insensitive bastard – monkeys ate my uncle!” and have you crying in a puddle – or at least sell the line like she’s really experienced a legitimately traumatizing event. But even the acting greats who appear in the Hobbit trilogy don’t give strong enough performances to cover the fact that at least two of them (RIP the beloved Christopher Lee) are really just in there as (a) padding and (b) references to “that other trilogy that was so popular.” And one has to wonder whether they were even able to properly use and show off their acting chops, given the limitations of such a forced script – not to mention one that gave itself so much added fluff action and filler plots that it didn’t really have time to develop the characters beyond the occasional expositional flashback that typically provided an extra action sequence more than any chance at emotional depth.

5. The Films Utterly Betray the Tone of the Book (And Try Too Hard to Create a Different Tone)

The book of The Hobbit was a fun, whimsical children’s story, while its sequel trilogy took on a completely different, much more serious tone. This isn’t uncommon – the first three Harry Potter books were tales of whimsical discovery, while the last four took on a much more grim tone (even if J.K. Rowling’s writing style didn’t exactly evolve with the tone, but that’s a discussion for another time). And while 90% of the Chronicles of Narnia books are completely about seeing a magical world through children’s eyes, and 100% of them are biblical allegories, the very last book, aptly titled The Last Battle, takes on a much graver tone, acting as more of a warning against what could happen if you let “nonbelievers” run everything (compared to the previous books’ sense of “this is how awesome life could be if you just believe in God!”).

But apparently, Jackson & Co. didn’t feel that giving the Hobbit movies a different tone from the Lord of the Rings trilogy would sell. They tried to mesh the brooding atmosphere from a story about a life-or-death mission to literally save the characters’ world with the story of a hobbit being swept up into 13 dwarves’ somewhat less pressing personal quest to reclaim their ancestry. As a result, the grim-and-gritty attempts at semi-realism make the more lighthearted moments feel completely out of place. I highly doubt that I would have heard so many people complaining about “Why the hell is everybody singing all the time?” if Peter Jackson had chosen to keep the whimsicality that allows an audience to more easily suspend disbelief about dwarves suddenly bursting into song while employing physics-defying methods of cleaning Bilbo’s dishes. As it is, when such lighthearted moments do appear, they grate against the films’ otherwise gritty atmosphere.

6. Taking Away Basic Rules of Tolkien’s Universe Created More Problems Than It Solved

As with characters and subplots, Peter Jackson and his writing team decided to add armies willy-nilly in the climactic battle that gave the Hobbit’s third installment its name – Battle of the Five Armies. But no sooner was a new army introduced than it was lost in the chaos of the movie’s messy writing, thus creating quite a confusing head count for a battle supposedly named for its very specific number of armies. The issue might easily have been fixed by remaining more faithful to the source material, both in overall tone and in its more tangible aspects (characters, subplots, creatures and their abilities, etc.) – returning the ability to talk and scheme and plot to the Wolves, rather than just making them “those animals the orcs use like horses,” would have allowed Jackson to keep one of the original five armies in the story (and have it visibly/obviously count as such). That, in turn, would have removed the need to introduce an entirely new species to Middle-earth at the last minute in order to account for the “five” in the title.

Keeping the “animals can talk” rule would have also helped to answer why the Eagles only ever seem to appear at certain moments. Given the way that Peter Jackson portrays and uses the Eagles, who can blame the hordes of viewers who are annoyed with the Eagles’ deus-ex-machina-like appearances and their odd only-semi-usefulness, their arrival on scene seemingly dictated solely by the plot’s need of them? In the films, Gandalf seems to summon them like some sort of familiars who exist only to do his magical bidding. In the source material, their appearances can, in fact, be explained by the Eagles overhearing certain conversations and conflicts or helping out behind the scenes in ways that enable them to interact with people like Gandalf just like any other character might, rather than simply being summoned at will. And the Eagles themselves are able to explain their personal issues and limitations to the books’ heroes, thus dealing with questions regarding, for example, their lack of direct taxi service to the Lonely Mountain or Mount Doom. (Well, we also tend to ignore the fact that the Eagles may be physically powerful, but are not infinitely so, and as such would, at the very least, have had to land at some point between point A and point B, making for less-than-smooth journeys, even if they had provided full taxi service – whether in The Hobbit or the Lord of the Rings trilogy. But I digress.)

You know what else Tolkien’s rules regarding animal communication would have solved in the Hobbit movies? The issues re: “How Do We Kill the Dragon?” In Peter Jackson’s version of the story, we get a whole convoluted backstory that involves one of Bard’s ancestors failing to kill Smaug, but somehow managing to chip off a single scale and create an approximately inch-wide weak spot, which nobody realizes until Bilbo conveniently spots it by standing at the exact perfect angle to see said weak spot and later managing to report back about it. Tolkien also provided a somewhat over-convenient save for Bard, but at least it was relatively straightforward and logical within the laws that he’d set up for Middle-earth: Bard had the gift of understanding birds’ (specifically thrushes’) language, and as such was able to get help and advice from an old thrush who could actually fly up and check this shit without being noticed by a gigantic dragon. And instead of finding some lucky chip in Smaug’s scales, the thrush notices a weak spot that the scales naturally don’t cover on a certain part of the dragon’s belly – which, anatomically, makes a whole lot of fucking sense. Even a dragon can’t be completely covered by his scales. At the very least, joints like his armpits (or wingpits, rather) need a certain amount of flexibility that would be inhibited if they were completely covered by scales. And if we look at animals in real life, the stomach tends to be one of the most vulnerable/least armored parts of the body.

So say what you want about campy moments or over-convenient timing in the book, at least Tolkien created a world where, at the very worst, suspending disbelief was easy, and at best, you had a smile on your face as you read about the dwarves’ shenanigans or Gandalf’s ability to thoroughly troll Bilbo. Peter Jackson’s over-extended movie adaptations are more likely to leave you thinking, “Huh?” “What?” and “But why?” As such, I think we can safely say that, yes, the movies did, indeed, suck.

Infighting in Social Justice

Welcome to another installment of your favorite segment: “Evan gets pissed off on a social media site and ends up writing a post that gets so long, he might as well just make it a proper blog post.” Today’s topic? Counterproductive infighting within social justice movements.

I see it way too often. Two people working for the same cause get into a catfight. Or one subgroup of a movement does its damnedest to scratch the eyes out of another subgroup, while the attack-ee(s) tries to stay above the fray to varying levels of success. I don’t mean calling somebody out on purposeful assholery and bigotry – I’m all for, say, letting TERFs (Trans-Exclusionary Radical Feminists) know that it is, in fact, bigoted and not-at-all-okay to deliberately exclude trans women and insist that they are “not real women”/are actually just privileged fakers, in spite of all evidence to the contrary.

But there’s a problem when “checking yourself” – or, rather, your movement – becomes synonymous with blowing a small issue (say, for example, “This person could have worded their statement a bit better, though their underlying message is a good one”) into, “This person or group is the actual worst. Thing. Ever.” When talking with friends or posting on Facebook, I’m not shy about what I think of the “But-I-Like-Makeup Movement,” so to speak. Somebody should be able to say, “You look beautiful sans makeup,” without being lambasted because some makeup-wearing women like their lipstick and mascara and nail polish. Hell, I love nail polish. But I know that Joe Schmoe over there isn’t trying to ban makeup or take it away from me or say that makeup is the source of all evil. As eloquently or as clumsily as (s)he may have phrased it, one can usually come to the conclusion that all Joe Schmoe is really saying is that, “Women don’t have to and shouldn’t feel obligated to follow the route that society prescribes to them. Despite what makeup ads and ‘office-professional attire’ guidelines tell you, women don’t need to wear makeup to be pretty.” And if you stopped for a moment before knee-jerking into “But I personally like makeup” mode, an entire, completely unnecessary argument could be avoided. We’re all guilty of knee-jerking – I know that I’ve done it before, as much as I hate to admit it – but this method of self-checking a movement has become all too common.

So before you criticize a person or organization for saying something that’s not exactly the way you would have said it, or for supporting the cause in a slightly different way than you personally would, stop and think. Analyze what they’re saying. And if it’s more complicated than just “you don’t have to wear makeup,” do your research. And keep in mind that oftentimes, the wording used to speak out against somebody can be very deceptive. Of course, not every writer is a muck-raker, and some criticisms are very much worth consideration, but be careful of how these criticisms are painted – don’t just blindly accept. I’ve often seen scathing articles leave out tidbits of information that may seem small and unimportant at first, but that shed a completely different light on the lambast-ee once revealed. Articles that tend to omit such details also often use wording that heavily colors a picture in a subtle, but very real way.

To use an example that I see often: people deride Goodwill all the time for paying its workers less than minimum wage. This is definitely not a lie, but if you dig a little deeper, you’ll discover that Goodwill does this because many of its workers depend on disability pay and would continue to depend on it despite also having a job – above minimum wage or no. For example, there are cases in which a person’s disability means that they need expensive home care or equipment, but that they can’t physically work a job that would cover those expenses – ’cause let’s face it, most “easy” jobs that involve folding clothes or stocking shelves won’t be all that high-paying, even if they meet minimum wage requirements. Goodwill helps make sure that they can have a paying job and get disability to help cover the costs of equipment, medicine, etc. In many circumstances, the best way to do this is for the employee in question to work off of an incredibly low income from their paying job so that they can still qualify for disability under the premise that they’re not working full-time/making a living wage. Call it “cheating” if you want, but it’s hugely helpful for people who need a little extra boost and can’t get that much-needed extra cash in the “normal” way.

Congresswoman Loretta Sanchez (D-CA) attends a ‘Real Goodwill’ Tour.

Moral of the story: if you can, talk to a person (or people) before denouncing them as bad. At the very least, do research into the position/organization/what-have-you and make an educated decision as to whether you think they’re actually all that terrible. Catfighting and knee-jerk reactions are not the answer.

My Beef with the English Major, or How a Subject I Almost Majored in Made Me Hate It With a Passion

So somebody posted an article on Facebook entitled “Why You Should Not Major in English,” and the author has some good points both for and against declaring a major in English while in college. I was just going to post the article to Facebook with a couple paragraphs about my own thoughts, but that quickly turned into a long rant that would probably feel more at home here…

While I don’t agree with everything the author said, and I can understand why many people choose an English major, this article does a good job of explaining a few, though not all, of the reasons English departments at colleges and universities bug me so much. It’s not that “if you’re an English major you’ll never get a job.” And English majors, no need to whine about how everybody tells you that. Trust me, plenty of other majors get the same grief from non-majors, if not more, without the benefit of their choice being a no-brainer, resume fluffer, “employers/people in general don’t question its validity as an area of study” kind of major.

No. It’s that the English Major does get talked up to the point of puke-worthy annoyance for certain liberal arts students who decided not to take the major for perfectly valid reasons. And the language used to talk up English majors – or to talk to English majors (and non-majors taking English classes) – is way too often extremely high, mighty, and holier-than-thou without the acknowledgment of, “Oh, right, you need practical, real-life skills because, y’know, humans can’t live on ‘Yummmm, symbolism in 19th century novels.'” And the sad thing is that “English” does have a huge potential to teach practical skills that are useful for both making a living and positively contributing to your community.

When I was studying at Oberlin, I would have jumped on board an English major in a heartbeat had the English professors there demonstrated any promise of focusing more on, “This is how author X writes. What works about this style? What doesn’t? How does one achieve that style – or any personal writing style – while maintaining clarity?” (etc.); and less on, “Write me a précis of Lord of the Flies and then give me 10 pages on all the symbolism in the book, especially the background/descriptive details that, 90% chance, have no meaning outside of ‘this book would be frakking boring if the author didn’t describe anything.’ Actual symbolism represented by Simon, the conch, etc. is optional, as long as your BS is pretty enough.”* That is not to say that people can’t have fun hyper-analyzing books on their own time; or that writers never ever ever insert symbolism in their work; or that nobody could ever conceivably search for that kind of thing for their own amusement. But if that’s your thing, you can do that without making a class out of it or laying it out like the symbolism is definitely, 100% there and intentional and not just you going on a scavenger hunt for hidden meaning that is questionably there. (Hint: Writers usually make symbolism painfully obvious when it’s intentional. Like with a bad joke, if you have to reach into nooks and crannies to explain it, it’s probably not there. At that point, it’s your personal added meaning/interpretation that may be meaningful to you, but should not be expected to hold meaning for anybody else.) Sadly, such a “read and let read” attitude is not, generally speaking, the English course way, and so – partly in order to learn something I found useful and partly to protect my ability to enjoy reading – I steered clear of an English major while I was at Oberlin.

Because when William Golding writes that some kids startled a red bird when trekking through the forest in ‘Lord of the Flies,’ he definitely means, “They startled communism. See? Red – communism? Huh? Huh? Huh?” The kids startling a red bird definitely does not in any way mean, “They startled a red bird. I thought you should know details like this because it would be incredibly boring if I just wrote, ‘And the boys walked. And then they walked some more. And then they kept walking.'”

Instead I went with a Cinema Studies major** and a Rhetoric and Composition (or “Rhet/Comp”) minor. I chose that minor because one of the professors from the department showed me during the first semester of my freshman year that Rhet/Comp professors didn’t just know the value of appreciating a good book, but that they also knew the value of writing a good book (or article, or script, or grant application…). And the more classes I took with Rhet/Comp professors, the more that proved true. It wasn’t pure philosophical debate over, “Did this one sentence Othello says in this one particular moment of Act IV, scene iii symbolize his flighty nature and indicate that he had a stick up his bum?” It wasn’t, “Don’t worry about practical skills. All you need to do is go out of your way to analyze whether Othello had a stick up his bum, or find communist symbolism in a single sentence about a red bird in Lord of the Flies, and that alone makes you a human with superior intellectual skills.” (Please note that most of my English major/former English major friends are not, in fact, so stuck up as to think this of themselves. This is simply the way I’ve seen many – not all, but many – college English departments and English professors flaunt their subjects. Even one of my favorite writers/writing advice gurus, Anne Lamott, is guilty of some of the “Ah, don’t worry about the practical parts of the game” sin that the aforementioned article complains about.)

For Rhet/Comp classes, the mentality was always along the lines of, “Here. Read this. You may not have heard about it or about the author, but it’s good. Learn about the cultural context in which the author wrote their poetry or prose. Analyze the give and take between that culture and the book’s content and style.*** But also analyze how that style worked and how it didn’t. Put that analysis into practice by actually writing an article or a script or a poem – whatever you might actually find yourself writing (or wanting to write) once you’re out of college. Exercise your writing muscles where they need the exercise, not just where it’s easiest to bullshit.” And English departments across the U.S. need to find their way back to that mode of thinking.

___

*Please note that, while some of these situations draw inspiration from real-life events, they are hypothetical and do not, as such, represent individual events in real life.

**Which, as much as I complain about certain aspects of that department, I do not regret in the least.

***Note the emphasis on using concrete facts to make educated hypotheses over the pure speculation and personal opinion that many English profs prefer.

Modern art = I could have done that + Yeah, but you didn’t

There is a phrase that has been at least semi-popularized to describe modern art, especially the brand that produces blank canvases, overpriced attempts at children’s scribbling, and oversized red paint swatches: “Modern art = I could have done that + Yeah, but you didn’t.”

Now as somebody who is thoroughly annoyed with such cheap versions of modern art, I have to admit that is a valid point – if it’s followed up by the clause, “because you wouldn’t have thought to do it yourself.” Otherwise, the phrase just validates any art, no matter the quality. Which, unfortunately, seems to be the mindset of quite a few modern art over-enthusiasts.

If you start reading a book that has poor grammar and whose main character is an annoying Mary Sue, it’s a bad book. If you listen to a song that sounds like pretty much every other overly-autotuned pop song you’ve heard and has shallow, poorly-written lyrics, it’s a bad song. A painting or similar form of art does not have quite as precise a delineation between “good” and “bad” – or, at the very least, “not good” – but that doesn’t mean that variations in quality don’t exist. Just like any other art form, it needs to provide its audience with something unique, whether it’s pleasure from aesthetics that they don’t get at home or an insight into somebody else’s story (whether it’s an individual or a group, fiction or fact).

Now, before I keep going, I should note that I am not one of those critics who thinks that simplicity automatically makes for something terrible. Those kinds of people frequently miss out on some really neat stuff because they’re too busy grumbling. Oftentimes, “simple” is the key to perfect presentation. I have seen a lot of modern art that I could have probably done myself, but that I still enjoy because, well, I wouldn’t have thought to make visual word puns by overlaying two similar-looking words; or I wouldn’t have thought to combine and blend those two particular color swatches in that particular way, and y’know, that combination and texture is actually pretty. Things can be simple and still unique and enjoyable to look at. This is when “Yeah, but you didn’t” applies – when the audience actually wouldn’t have made the art because it never occurred to them, not because they don’t like it.

Found at a Chelsea Gallery some years back. The words “lavatory” and “love story” laid over each other. An excellent example of “I could do that + Yeah, but you didn’t (think of it).” That, along with an obvious sense of humor, gives it some value beyond just an Orwellian fetishization of simplicity. (artist unknown)

But you know why I didn’t paint a giant canvas plain, solid red, or just lazily leave it blank and then hang it up? Because, given that I had the means to create art, why would I want to look at that over anything else I could have done, much less hang it in my house or pay to stare at it in a museum? I wouldn’t. It’s not that I necessarily couldn’t physically do it, or that I wouldn’t have thought of it. Rather, it’s that if I want to see a solid block of red or to just look at a white wall, it’s called painting the house. If I don’t want to put in that kind of effort and I still, for whatever reason, have an incurable craving for a swatch of some unvarying shade of red, there are plenty of ways I can see that and pay zilch to do so. Like, y’know, looking at an actual paint swatch, which costs little to nothing to acquire.

One of these items is “Red Square” by turn-of-the-century artist Kasimir Malevich (founder of the Suprematist art movement). The other is a “Pioneer Red” paint swatch. Can you tell which is which? Really?

And to be honest, if I’m looking to temporarily enjoy plain, solid blocks of color, I see plenty of that on my way to work or walking down the street, often in vastly more interesting forms than a giant, nondescript square of unvaried color hanging lifelessly on a wall. A blank stretch of grey on the side of a bridge that is otherwise covered by graffiti. A swatch of colored paint that covers up a graffiti artist’s mistake. Unintentional grayscale on the once entirely white sides of a dirty cargo truck. And every one of those pieces of “unintentional art” is virtually screaming that it has a story behind it* – a real, non-high-falootin’-maybe-it’s-there-maybe-it’s-not-but-gosh-darn-it-people-like-to-overanalyze-things-and-look-for-phantom-metaphors-in-everything kind of story behind it, and one that, even if it’s not grandiose and complex, rarely touches the realms of, “Well, here’s something I can sell to rich snobs at a gallery.” The unintentional art of the world around you is so much more interesting than many cases of forced simplicity on a canvas precisely because, in large part, the “art” that surrounds us in everyday life was made by accident, or for some other purpose than being stared at by people desperate to find some sort of hidden meaning or metaphor that just as likely isn’t there. (Which is not to say that you can’t make a game of looking for hidden, probably nonexistent metaphors on your own time if you want to, but Lord(s) almighty, some of us are tired of hearing how every apple to ever appear in any kind of literature or work of art is symbolic of the Adam and Eve story.)

So, long story short, if you’re debating whether or not to go to a Suprematist art exhibit, why not get outside and appreciate the world around you instead? It’s cheaper, and generally a lot nicer. At the very least, it’s more interesting.

But the blank canvas or single solid square techniques aren’t the only methods that allow laziness to pass for abstraction in modern art. I have seen enough 3-year-old-worthy scribbles done by 30-somethings to last a lifetime (not that it takes all that many). Hon, if I wanted a child’s scribbles, I wouldn’t pay $500 for a 5″ x 8″ canvas that some 30-year-old stranger took crayons to. Instead, I’d happily commission some free scribbles on printer paper from, say, one of my old students** who has actually and for real not hit puberty yet and has actual real reasons for drawing tactlessly messy and semi-formless scribbles for anybody but themselves. Cause let’s see…

  1. They don’t necessarily have the ability to do anything better yet. But with a few exceptions, an adult who commits to being an artist typically does have the ability to do better, and…
  2. …they better have if they’re asking money for it. The six-year-old who gives me a heart she cut out of printer paper is not asking for money. She’s just showing affection towards somebody who she considers to be a friend or, at least, a likable teacher. And that brings us to our last point:
  3. If I get a bunch of scribbles from a real kid*** who I actually know, the drawing has genuine sentimental value and not just the wishful thought of, “Well, I threw a lot of money at this, so it ought to be valuable, right?”
Some modern art for you. Have it. No charge. No, really, take it. It took me less than five minutes to make, and it’s about on par with a good chunk of the modern art out there. The best part is that the most you’ll have to spend is the cost of printing it on whatever canvas you choose. Printer paper, for all I care. Now explain to me why you want to pay $100+ for so-called “professional” scribbles again?

Now, don’t get me wrong, there’s plenty of really cool abstract modern art out there, some of which has merit partly because it is abstract, but art also needs to give its audience some reason to care about it. It could be as simple as, “That’s a really cool pattern or combination of objects that I wouldn’t have encountered otherwise.” But just like any other art form, the fact that some abstract art is good doesn’t make it all good. And because of this notion that abstract automatically equals good, “fine art” has, in many ways, gone the way of “fine dining,” where customers will often pay out the wazoo to get a serving of prawns that would barely feed a mouse. (Although the prawns will probably be delivered to you with more artistry than some of the paintings in New York City’s Chelsea galleries.) Many artists’ policies seem to have become, “Pay us more so we can give you less.” And while sometimes “less is more,” other times less is just less.

*This is what makes photography so wonderful. Momentarily putting aside the fact that framing a shot is an under-appreciated art in and of itself, when a photo’s candid, it’s capturing a moment, or something that has a genuine meaning and story, even if somebody else could have physically snapped the shot. And even when a photographer sets up a staged shot – even when a picture isn’t candid – they’re creating a moment in a similar way to how, say, Vermeer created a moment in “Girl With a Pearl Earring,” or Frida Kahlo did with her self portraits. Besides, painting does not have quite the same ability to capture slightly more natural, just-pick-up-a-camera-and-click sentiment the way that photography does, so sorry painters – you have to do a little something more than act as a glorified house painter to get me to pay you for something that doesn’t even cover the entire wall.

**I taught math and grammar to 5- and 6-year-olds last summer, and still have some of their doodles and cards in which they misspell my name. This is probably some of my favorite art. It’s priceless to me because I bonded with the artists over things like “5 + 1 = 6” and “big is the opposite of small.” It is worthless to most other people precisely because most other people did not have that kind of experience with those kids. And that’s okay. Sentimental value is non-transferable. The fact that you, quite reasonably, would not likely want to pay even 10 bucks for it does not diminish its value to me personally, and vice versa.

***The same might be true if an adult gave me something like that, depending on the adult and on the story behind the gift. But the value of the piece would still be sentimental, and personal sentimental value does not, in and of itself, qualify anything for an art gallery or museum.

I Hate the Word ‘Embarrassed’

Hypothetical situation:
Jack and Jill are out in a park one evening. They’re both super into each other, heavily making out on a park bench. Jack is feeling incredibly horny, and there’s a small, somewhat secluded area amidst a bunch of bushes behind the bench. He proposes that they really get it on in the bushes. Jill says no. Not now, not here, at least. She says this because…
a) she is a prude.
b) she is embarrassed.
c) she is legitimately uncomfortable at the idea of having sex in such a public space.

If you picked option c, good job. You are a smart person, and you probably don’t need to read this blog post. (Well, I am assuming you didn’t just pick it because I put obvious emphasis on it as the one I wanted you to pick. If that was your motivation for picking option c, shame on you.) However, if you didn’t pick “c”…

Did you pick option a? If so, wow, good job being an asshat. It does not matter in the slightest whether somebody doesn’t want to have sex now, doesn’t want to have sex ever, or, in fact, doesn’t want to do anything that another person asks of them. Trust me, it doesn’t make them a prude. They have their reasons, and 99% of the time it’s not because they’re stuck in the 1700s or because they want to make you miserable. Get over yourself.

What about option b? While option a is pretty obviously misguided, option b gets into murkier waters, and this is the option we are here to talk about. In the situation above, I’d like to think that most people would definitively say that Jill is uncomfortable, not embarrassed. Though their definitions have some elements in common, they are decidedly different words. Google defines embarrassment as “a feeling of self-consciousness, shame, or awkwardness”; and while discomfort can be related to that, this definition perfectly describes the tone in which “embarrass” and “embarrassment” are generally used – to refer to a mild unease, usually associated with concern about how one appears to others, but nothing more.

Discomfort, however, makes somebody “feel uneasy, anxious, or embarrassed.” While being uncomfortable can involve embarrassment, it often – if not usually – implies that something more serious may be going on than just a mild case of self-consciousness. And as long as she’s not harming anybody, it’s Jill’s prerogative to avoid situations (like this one) that make her uncomfortable. Most decent people can agree on this, yes?

But put somebody in a situation that is not quite so potentially rape-y, and suddenly “uncomfortable” becomes 100% synonymous with “embarrassed.” There is something about non-potentially-PTSD-inducing levels of discomfort that, it would seem, makes people think, “Well, Person X is just wimping out,” or “Person Y is being vain and shallow and is only refusing to do Action Z because she doesn’t want people to judge her.”

And describing somebody who is legitimately uncomfortable as “embarrassed” – often implying that her concerns are irrational, or that she doesn’t want to enter a situation merely because a stranger might give her a quizzical look – easily becomes a bullying tactic, even by people who don’t initially intend it as such.

When I didn’t want to do something as a child, one of my mother’s favorite refrains was, “You’re just embarrassed,” thrown at me in the same tones as one might tell an angsty pre-teen, “Don’t be ridiculous. Not going to one party isn’t going to ruin your life.”

Once I was old enough to make decisions without my mother’s approval, I thought I was done hearing that tactic. I was wrong. Years later, once I was out of the house with some semblance of a real life of my own, I heard the word “embarrassed” used against me in a way that I hadn’t heard since my tweens. It wasn’t spat at me like an insult or wielded like a weapon – I’m pretty sure it was, in fact, well intentioned – but it had the same effect.

You see, my boyfriend at the time was incredibly physically affectionate. In public. He was the kind of person who would, mid-day, decide that it was a good time for a big smooch in the middle of a most-definitely-not-empty sidewalk. I was not. He would do it anyway. I’d face away in an attempt to turn it into a kiss on the cheek. He’d take my chin in his hand and force me to face him. He thought it was some sort of game. Or that it was romantic. Or something. He meant well, but nonetheless, I spent a good month feeling wary whenever we went out in public together. Not that he forced public displays of affection on me 24/7. He didn’t. But he did it enough to make me nervous whenever we were together.

Finally, one evening when we were lying in bed, I mentioned something – “I would like to spend more time just the two of us alone because I don’t feel comfortable with being that affectionate in public.”

Roundabout as I can be sometimes, he immediately knew what I meant. “Sorry if I’ve been a bit too pushy. I can be kinda oblivious to social cues sometimes,” – okay, understatement of the year, but he clearly felt bad, so I gave him a pass on that one – “so please just tell me if you ever feel embarrassed.” There it was. That word again. Embarrassed. I wasn’t embarrassed. I loved my boyfriend. I wasn’t ashamed of him at all. The idea that he might think I was ashamed of him because I didn’t want to mack on him in public made me feel incredibly guilty.

Maybe I should have stopped him and said, “Embarrassed and uncomfortable are two different things. It’s hurtful when you say I’m embarrassed by something that makes me legitimately uncomfortable. And it’s not my job to tell you, ‘Hello, Captain Obvious here. When I quite literally dramatically duck out of your attempted kiss,* that’s not a good thing.’”

What I did say went something like this: “Oh, no, no, no! I wouldn’t say I’m embarrassed by you. I don’t go out and think, ‘Oh, God, people are seeing me with you.’ I just, you know…”

Still, despite my fumble, things got slightly better. Cringe-worthy situations did go down. But they didn’t go away. And every so often, an incident felt like it was even worse than before I’d had a talk with my boyfriend.

I remember one evening, walking back from a dinner out, he pushed me to the point of vocalizing my discomfort after physical attempts to disentangle myself from his grip (cringing, pulling away, pushing) failed to get his attention. And even after my boyfriend’s invitation to “tell him” if I ever felt “embarrassed,” it took pushing me to a snapping point to get me to say the simple words, “Too much.”

Now, I’m sure I seem like a very loud, vocal person to many – I go on rants in my blog posts, and when talking with the right people, I will often easily (if accidentally) dominate a conversation if I get excited. But my reaction to the slightest amount of discomfort is usually to go quiet and remove myself from the situation as soon as possible. Basically, I go through the mental equivalent of an opossum’s “if I freeze, this person will definitely ignore me” instinct, and boy is it a struggle to get words out when you’re in that state. So if my man was able to push far enough for me to fight against my instincts and spit out so much as two words, he was taking it about 10 steps too far.

The Opossum Effect

But we’d already had a conversation about that. Besides, my boyfriend was going through some tough shit at the time, and still handholding me when I was going through shit that must have paled in comparison; so I didn’t want to accidentally make him feel like he was shame-worthy. Again. So after that, I stayed silent.

That is the abusive power – accidental or not – of conflating “discomfort” and “embarrassment.” That is why word choice is so incredibly important. So, please, check yourself before you speak to somebody. Because words can be hurtful without you even realizing it. Hell, on a couple of occasions, I’ve been the one to accidentally insult somebody, and that is no more fun than being on the receiving end. Trust me.

*Actual, real thing that happened at one point.
