The Single Biggest Issue with Postmodernism

It’s interesting to note that, of all philosophical trends in history, only modernism was declared dead due to a failure of architecture.  The demolition of the Pruitt-Igoe complex in St. Louis was hailed by everyone from serious sociologists to literary critics as the end of one era and the beginning of the next, which they imaginatively called “postmodernism”.


Pruitt-Igoe complex demolition – hailed as the end of Modernism.

Of course, the aspiring postmodernists were simply using a fortuitous event to further their cause while ignoring inconvenient truths.  While it’s true that Pruitt-Igoe was undoubtedly designed on modernist principles, its failure had more to do with mismanagement and public policies than with modernism itself*.

In the long tradition of social reformers, however, the postmodernists ignored the facts and pushed their way of thinking forward – successfully.

In its original form, postmodernism was a typical adolescent rebellion by social theorists against what had come before, turning a skeptical eye towards both antique institutions and modernism itself.

So far, so good.  They say nothing is more predictable to intelligent people than the avant-garde, and postmodernism lived up to that truism from the outset; it would soon settle down to become the established norm, with new rules and values of its own.

They did this admirably.  Nowadays, if you know what his particular pet topic is, you can write a postmodernist scholar’s paper for him before he knows he is going to write it**.

And therein lie the problem and the ultimate barrenness of postmodern thought.  At some point, postmodernism began searching for tools with which to give form to what had begun as a rejection of what came before, and it seems to have taken a wrong turn.

The central tenet they ended up embracing is, in layman’s terms, that there is no such thing as a “big picture”, and that it is perfectly valid to analyze individual elements separately – and in a separate, but ultimately equally damaging turn, that the observer is a critical part of the analysis.

While subjectivists have been around since Ancient Greece, the idea that single-element analysis is valid (it’s called deconstruction, by the way) has been particularly detrimental in combination with subjectivism, damaging fields as disparate as History and Architecture.

We can dispense with the architectural elements easily – all one needs to do is to envision a building where the elements are meant to be viewed individually with no concern for the whole.  There are some out there (you can see one below – and it isn’t even the ugliest), but most architects have a grounding in art history, and an appreciation for aesthetics, so they have, on the whole, rejected the idea that the big picture is irrelevant.


The K2 building is pure postmodernism.

Where things do get unfortunate, however, is in the softer sciences such as history or literary criticism (I won’t repeat the XKCD joke here – go find it yourself!).

History students suffering the postmodern wave of revisionism (every movement has its revisionist wave) are being taught that unimportant groups and people were just as important as the movers and shakers of their era.  That slaves were historically important in societies where they were just used as human cattle, or that minority groups were politically influential in ancient India, or whatever.  The justification seems to be that the history of anyone who ever existed is important, so it must be taught as important.

The reality is that the suffering of minorities, slaves, or any other disenfranchised group is only important in times when the group managed to get some kind of power… if not, their suffering actually was in vain.

And yet, historians today are telling a different story.  It’s all very democratic, but will ultimately prove as damaging to the science as any other philosophically-based prejudice (see Eugenics for another 20th century attempt to fit history to philosophy – that one didn’t turn out so well either).

Criticism is often the butt of jokes about the academic worth of its practitioners, but we have to admit that, lately, the discipline has earned the scorn.

The problem is that with deconstruction allowing one to choose the focus one wants, it becomes easy – nay, obligatory – to focus on a single dimension when evaluating a work of art.


Soup Can: very pretty, but how does it speak to animal rights?

So a novel that touches the human spirit can be attacked for not being feminist enough, and a beautiful sculpture dismissed as worthless because it doesn’t address the plight of oppressed minorities.  Postmodernism’s obsession with minutiae blinds it to everything other than minutiae, to its own detriment.  Political arguments in the early 21st century seem to be imbued with the same kind of narrow-gauge thinking.

It ends up feeling like postmodernism is the whiny self-absorbed teenager of philosophical movements…  Even to the point where there are already rumblings of a post-postmodernism.

However, like whiny teenagers, this one will be hard to steer to a good port.  You see, the death blow to postmodernist thought was already dealt nearly two decades ago.

In 1996, physicist Alan Sokal submitted an intentionally flawed, parodic academic article to the postmodern journal Social Text.  Not only did the ridiculous piece pass the journal’s review process, but, after Sokal came forward to announce the hoax, some of the journal’s defenders actually said that (and I paraphrase) “Sokal didn’t understand the actual depth and significance of the piece he had written”.

Now that is more embarrassing than a simple demolition, don’t you think?


*Modernism clearly had its moronic moments, but Pruitt-Igoe wasn’t its fault.

**For example, that last sentence would be rewritten by a feminist postmodernist using “her” in place of “him” and “she” in place of “he”.  A multi-gender postmodernist would attempt to use an invented gender-neutral word in its place, etc.

The Synchronicity of Birds

It seems like this was destined to be a Hitchcock-themed week, even though we didn’t plan it this way.  Our Tuesday post and this one were planned completely separately, but there is no denying that Daphne du Maurier and Alfred Hitchcock are inextricably linked, so it’s a happy coincidence for those who are fans of both! –Ed.


Daphne du Maurier

Most writers would probably kill to write a string of popular best-selling books spanning four decades and be made a Dame Commander of the Order of the British Empire for their efforts, but it’s arguable that, in Daphne du Maurier’s case, she might have been better off having written just two books.

Du Maurier will always be linked to one of the great novels of the 20th century, the brilliant Rebecca.  Despite modern covers that attempt to fool readers into thinking that the book is aimed at the 50 Shades audience, or possibly the crowd that prefers tamer romances, this one is not a piece of entertaining fluff.  It’s a mature, unflinching look at adults who are less than perfect, but who do what they must and deal with the consequences as best they can.

Rebecca also contains one of the most memorable (some people say the best) opening lines in literature:  “Last night I dreamt I went to Manderley again”… a haunting preview of what is to come and perfect for the novel.

It’s a bit sad that, while attempting to recapture the magic of her first hit, du Maurier focused on the romantic elements of the novel and produced a string of books that has since been completely dismissed by the establishment – with some justification – as mere time-passers not worthy of a second look.


The Birds Film Still

The true tragedy is that the dismissal of her work often extends to Rebecca itself (which is both ignorant and unforgivable) and to her other noteworthy book: The Birds and Other Stories.

That du Maurier was a master of suspense is clearly evident from the fact that Alfred Hitchcock decided to film no less than three of her tales:  The Birds, Jamaica Inn and Rebecca – and it’s arguable that The Birds is Hitchcock’s most famous film (although, admittedly, he has so many that it could be quite an argument!).  Nevertheless, that’s not the way she’s remembered, and most people wouldn’t be able to connect The Birds with her at all.

It’s their loss.

Originally published as The Apple Tree, the collection was retitled and reissued as a companion to the film in 1963… and it’s well worth reading.

It’s a book that clearly shows that du Maurier was wasting her time with romance.  While love interests were fine to sustain the plot, what she really, truly did well was a kind of weird suspense, a mix of slightly surreal elements that never let the reader understand whether events are caused by natural or supernatural forces, or even if, perhaps, the characters are imagining it all.

It’s a slim book of just six stories, but, with a deft touch, it explores everything from adultery to cults to much the same effect as Rebecca, in bite-sized chunks.  Anyone wanting to learn how to write a modern suspense tale – or wishing to consume one – need look no further.  Even though they are well over a half-century old, the stories feel perfectly modern (if one overlooks technology, of course).  The prose is that good.

And the title story feels very different from the film… so even if you think you know the tale, you don’t (it’s also interesting to read the original material as Hitchcock did, to see what inspired him about it).

Of course, this review is being written for Classically Educated, so we’d be truly remiss if we failed to mention that a beautiful edition of this one was published by Easton Press, although we don’t know if it’s currently available (eBay should help if not…).

All in all, we strongly recommend you pop into the local bookstore, buy these two du Maurier books and make a comment to the clerk about how sad it was that she never wrote anything else.  It would be a small white lie, and who knows – you might possibly be starting the restoration of her reputation.

Did this guy ever screw up a film?


Ingrid Bergman and Gregory Peck in Spellbound

Today, we look back on a rare beast – a suspense film from the mid-forties that had no noir pretensions whatsoever.  Spellbound (1945) is a Hitchcock vehicle, and the second psychological thriller to have appeared on the list – the first was 1942’s Cat People.

The two films feel completely different, since the older movie is more about the shadowy workings of the mind, while Spellbound actually looks into both the methods and profession of psychology.  Whether or not it’s an accurate portrayal of the state of the field in the 1940s is not something we’re qualified to discuss, but for the purposes of the movie, it worked well.

As usual with Hitchcock, the movie is well thought out and reasonably convoluted – and the ending is impossible to guess, despite one’s best efforts.  Hitchcock was a master of foreshadowing just enough that the partial reveal wouldn’t surprise the more intelligent viewers, while the whole picture would only really appear when the director himself felt the time was right.

That technique actually works much better in Spellbound than it did in the film that old Alfred himself said was his favorite.  In fact, of the movies he directed that have been on the list so far, this is the best of his Hollywood efforts (although there are still plenty more to come, so that might change over the coming months).


Alfred Hitchcock’s Spellbound. Dream sequence by Salvador Dali.

We won’t get into the plot of the film itself, as it’s well worth watching, but it’s interesting to see the kind of talent they put together for it.  As the leading couple, no less than Ingrid Bergman and Gregory Peck.  Then there was famous acting coach Michael Chekhov.  The film even had the collaboration of Salvador Dali, who designed the dream sequence, which was reputed to be completely insane, but, sadly, was cut by the production team and is now mostly lost (although Dali’s unmistakable flavor can still be seen in what remains).

Perhaps this film would give To Have and Have Not a run for its money for the title of the old film with the most still-recognizable names involved.  All that talent created a good flick – go find a copy and enjoy it!  It does somehow seem that most Hitchcocks fall into this category…


As always, a mention of two of the actors involved in this one who are still with us: Rhonda Fleming and Norman Lloyd.  Here’s a shout-out and a thank-you, if you’re reading this!


Somewhat Scholarly Reflections on Science Fiction – Part 1

Today, we begin what we hope will become a popular, long-running and Nobel-Prize-Winning* series on Science Fiction.  It will likely have a focus on literature, at least initially, but will be perfectly willing to include movies, comics and any other interesting subjects.

It will also be open to Fantasy and certain types of Horror, as much of the audience for the three genres overlaps.


1969 World’s Best SF – Edited by Donald A. Wollheim

A couple of weeks ago we reviewed and analyzed one of the many Year’s Best collections that the Science Fiction / Fantasy genre produces: the Wollheim 1989 Year’s Best SF.  Suddenly, it dawned on us that that review of a book that has proven to be a minor volume in genre history is actually an excellent starting point for comparing eras.  So let’s call that post the honorary “Part Zero” of this series.

A logical place to start was with a couple of collections that could be compared directly to that ’89 book.  We chose the 1969 and 1972 Wollheim Year’s Best collections, but not without some trepidation, as we will explain a little further below.  But misgivings aside, these fit the bill perfectly – by choosing the same editor, we avoid questions of wildly differing taste and bias, and by going back nearly two decades, we get enough of a gap that contrasts are notable.

The first thing one notices about these two titles is how much more recognizable the names of the authors are than on the 1989 edition of the same collection.  Genre fans will all recognize Sheckley, Anderson, Silverberg, Aldiss, Knight, Delany, Lafferty, Foster, Sturgeon and Leiber from the older books.  And everyone, even non-genre readers, will perk up at seeing the names Vonnegut, Clarke and Ellison – three writers whose names appear on the tables of contents of the ’69 and ’72 books and whose stature simply isn’t matched on the ’89.

Why were so many important names present?  Well, there are a couple of reasons.  The first is that during the late sixties and early seventies, the writers that made the genre important were still active and close to their primes.  The amazing Golden Age of Science Fiction has, to date, never been equalled, and the writers active in the 30s and 40s were still around.  Just look at that list again.


1972 Year’s Best SF – Edited by Donald A. Wollheim

The second reason is that SF briefly became chic in the sixties due to a combination of experimental writing in the genre and, quite possibly, an excess of recreational drug use by editors of journals such as The Atlantic and The New Yorker, who allowed their hallowed pages to be sullied by this basest of genres.  Also, in order to be able to say that one read Playboy for the articles, one needed to be able to discuss the articles – and there was some SF there as well.  This mainstream exposure is still why casual readers recognize names like Bradbury or the aforementioned Clarke, Vonnegut and Ellison.

The second thing one notices is just how much difference the editor makes in one of these collections.  We’d had some trepidation in selecting the era because of an intimate knowledge of Judith Merril’s anthologies of the same era.  Possibly fueled by the same drugs as the editors of the journals, she seemed to have a knack for selecting kaleidoscopic  jumbles of words which, though possibly beautiful, were not ideal places to extract meaning.  It was like reading a modern artist or looking for the truth in the patterns generated by a lava lamp. Perhaps you had to be fully immersed – in every aspect – in sixties culture to appreciate the stories.  Like they say: If you remember the sixties, you weren’t actually there.

Wollheim’s selections were not aimed at making a statement about pop culture, but rather are core SF tales that explore ideas about what the world will be like some years in the future.  That is what SF used to be about, and is still what good SF is about today, whether the changes be technical, social, ecological or political.  It can reflect and comment on the present, of course, but if it isn’t done obliquely, it becomes preachy and unreadable – and a lot of the (thankfully now forgotten) SF of the sixties fell into that trap.

It defeats the purpose of this analysis to do a story-by-story rundown, but suffice to say that even the Vonnegut tale is almost completely devoid of impossible dreaming – although it is admittedly weird.

So, compared to their peers, these two books hold up reasonably well, but how do they stack up against the 1989 edition?


Without taking into account individual highs and lows, such as the excellent “Peaches for Mad Molly” in the ’89 collection, the older books are better overall.  I believe that is driven mainly by the fact that the level of the writers was higher, as was the purity of the genre elements.  By 1989, science fiction was in a transition between the popular but looked-down-upon work of the 40’s and the literary but boring SF of today.  Sometimes that transition produced masterpieces (Dune and Ender’s Game are examples), but more often it produced muddled works that attempt to be socially relevant but really only succeed in being vague, preachy or both.

The older anthos are highly recommended, with some true classics among the more pedestrian tales.


Ad Space:  If you know someone who is classically educated, and has the personality to say so and damn the torpedoes and accusations of elitism, you might want to consider getting that person something from the Classically Educated Product Store this Holiday Season!


*We refuse to believe that there is no Nobel Prize for blogs.  This should be reviewed.  We may need to put a clause in our manifesto making this an explicit goal of the site.

The Curse of the Polymath


Photo of the Vitruvian Man

Most of the time, the Classically Educated Manifesto is a document which we are all proud of.  But, on occasion, we stop and look around the world and realize that modern human society is not really designed to cater to polymaths.

Generalists as a species have been out of favor even in places where they should thrive, such as multinational corporations, for twenty years or so.  But this is just a deepening of a trend that has been around for a century or more.

The case of companies can be quickly studied.  The reason generalists are useful for corporations is that, from a certain size onwards, companies need managers.  A manager’s job is twofold: to get results for their particular area of responsibility through the work of others, and to coordinate activity with other managers with a view toward optimizing shareholder value.

So, for instance, the company’s best programmer really can’t be promoted to management unless a) he has a grasp of human resources management, and b) an understanding of what the rest of the company is doing, from finance to marketing to production.  This is why people with MBAs tended to get those promotions.

Over the last few years, however, many companies have been ignoring this hard-learned truth and simply promoting the best-performing functional experts, people who really, really  know how their department works, causing much laughter among experienced managers who then get to watch the train wreck while munching popcorn.

There are many explanations for this phenomenon.  It starts with a sense that MBAs are elitist, and elitism goes against the inclusive culture of many new companies, especially in the tech arena.  It continues with the fact that a lot of HR people have gotten extremely conservative and only hire / promote technical experts within their fields in order to cover their own asses – they seem to have forgotten the immutable truth that a good manager can manage anything, even complex technical departments.  And it ends with the fact that companies aren’t getting any smarter.

While this is all very interesting, it doesn’t seem to cover the root problem, which is that as the world becomes more complex, obsession is beginning to trump… well, everything else.


Lewis Carroll portrait of Beatrice Hatch

So, you have people who live, breathe and dream computers, all day, every day.  Or any number of individuals who take their company work home with them and think about it to the exclusion of all else.

Even those people who aim at balance tend to have one all-consuming hobby, whether it be rock climbing or model trains.  They then get together with people who have the same hobby.

So a person who works as an engineer at an airplane factory, reads renaissance literature during his lunch break, practices amateur theater two nights a week, plays softball with friends over the weekend before his painting class, and then gets together with friends from none of these activities is about as common as hen’s teeth.

It wasn’t always like this.  As recently as the Victorian and Edwardian ages, amateurs were making important contributions to both the arts and sciences (and probably even more so to that ultimate mixture of the two: the soft “sciences”).

Lewis Carroll was a mathematician and a social critic who is best remembered for his children’s books (although a close reading of Alice will show that “children’s” is a bit of a misnomer).  He was an example of the gentleman polymath of his time.

And perhaps therein lies the problem.  The twentieth century was a century of democracy, and elitist concepts such as that of the gentleman with the leisure time to be an expert in various fields fell into disfavor – and distrust.  Even today, deep knowledge on too many subjects can get one branded as elitist extremely quickly. (If someone brands you as elitist, please let us know immediately, and we’ll offer you a place on our writing staff – unpaid, but proud to join a whole raft of elitists).

The loss of polymath pride since the turn of the 20th century is a tragedy, perhaps, but even those Victorians and Edwardians were but a pale shadow of the true colossi of polymathy: the men of the renaissance.  Why, even today, the term “renaissance man” is used to refer to anyone who masters various disciplines.

Choosing one giant from among them would be an arduous task were it not for the unsurpassed genius of Leonardo, of course, but he was simply the giant among giants.  From Michelangelo to Galileo, they reveled in a society that celebrated breadth of genius far more than depth of expertise in a single subject.  They were even allowed to build huge buildings… although they were actually painters and astronomers (clearly, there were fewer lawyers back then, or the lawyers were also polymaths who got it).

That is what we have lost.  Today, the admiration that was once reserved for giants of the intellect is reserved for actors who often can’t count to ten and for surgeons who likely wouldn’t understand references to Humbert H. Humbert.  Guitar players for whom impressionism is a side effect of cocaine.  Geniuses in their fields, all, but limited in scope.

And it won’t change.  The 21st century will see a deepening of democracy globally, and one of the central tenets of democracy  is that equality is a right.  Most peoples of the world have chosen to interpret that as “no one is better than anyone else”, and if achievements show the contrary, then the person flaunting those achievements must be brought down a peg.

So polymathy, especially in “elitist” intellectual pursuits, will only get less popular as time passes and the world panders to the easily-bruised egos of the masses.  Polymaths will increasingly become dinosaur-like rebels flying in the face of social convention, the crazy old uncle no one ever talks about.

But that’s fine.  It’s more fun to offend than to conform.

Anything that requires  an exertion of sheer bloody-mindedness must, necessarily, be a good thing.

So onward the polymaths.

When Bad Propaganda is Good

John Huston

John Huston is best known for directing Hollywood classics ranging from The Maltese Falcon to Annie, but perhaps his most interesting films are three that were shot at the behest of the US Army during the Second World War.  The Army Signal Corps requested a series of propaganda films, which Huston duly filmed…  and which were then released only in a limited way, never really used during the war effort.

Perhaps the least controversial of the three was Report from the Aleutians, which was reasonably aligned with what the government wanted, but was delayed by Huston’s portrayal of Army life as monotonous – not a particularly welcome message for a wartime propaganda film, obviously.  It can be viewed in its entirety here.

The most surprising thing about Let There Be Light is that it was allowed to be filmed in the first place.  1946 was hardly a time to focus on the “nervous condition” and treatment of veterans.  It brought to light a whole raft of issues that are only really being taken seriously today, and which were extremely unwelcome in the dawning light of the cold war.  The one unsurprising chapter in its history is that it was banned by the Army until 1981.  It can be viewed here.

The final film is, by far, the most interesting of the three.  It is called The Battle of San Pietro, and was filmed during and immediately following the battle of the same name, during the Italian campaign.

As a film that documents a victory for the Allies, this one could have been (it is arguable that it should have been, as that was what he was being paid for, after all) a paean to the justice of the Allied cause and an ode to the heroism of its troops and to the inevitability of victory when one took into account the combined virtues of justice and heroism.

The Battle of San Pietro Still

But Huston, unlike his Nazi counterpart, documentary genius Leni Riefenstahl, decided not to obey his masters’ commands to the letter.  He let an evident love for truth in documentary filmmaking overrun his assignment, and showed just how hard-fought the victory had been.  Dead GIs are not something one normally expects from a film meant to raise US morale, nor is the effect of the war on civilian populations… but they are present.  The film was released in 1945 (though some troops saw it in 1944), and was eventually even allowed to be called a classic in its own right (it is by far the most famous of Huston’s wartime work), but it was a close-run thing.  Curious people can watch it here.

Perhaps the fact that Huston could do this kind of thing and still get promoted means that the correct side won the war.  The fact that he wasn’t sent to Siberia or executed without trial for disobeying the spirit, if not the letter, of his contract makes these films even more valuable today.

The fact that both of the above were close-run things…  bears thinking about.

Quick Thoughts on the November 2015 Paris Attacks

November 2015 Paris Terror Attack

Everyone interested in world affairs has probably been glued to the news over the past twenty hours or so, so there is no need to review the horror of the crimes that have been committed, but it’s definitely worth sharing a couple of immediate thoughts about the situation, as they may be worth reflecting on.

1.  Extremist groups, it seems, are incapable of learning.  We’ve examined before the fact that these extremists are incapable of accepting the inevitability of a modern, free and inclusive world, in which globalization is a given and women are equal to men, but the sheer stupidity of this latest series of attacks surprises even in that context.

In the first place, France has traditionally been a lukewarm supporter of the international war on terror, at best.  The French combination of arrogance and an anachronistic view of their own importance has seen the country often holding back nations who would pursue the war more aggressively.  In fact, as a staunch opponent of the Al-Assad regime in Syria, France has actually been hindering the war against ISIS.

While it’s true that France is the origin of freedom in the modern sense, and thereby represents a highly symbolic target, an armed insurrection that has been catalogued as a criminal enterprise by all respectable elements on the worldwide stage should be a little more pragmatic when selecting targets.  All this attack will do is galvanize the French people against ISIS… an organization that seems not to understand that ANY of the countries they are attacking could wipe them out in a few weeks if they have popular support.  And now, the French do – and after listening to Hollande last night, I wouldn’t be surprised if they sent in troops and did just that.  It would be the best thing for everyone.

And ISIS can’t say that this is a surprise.  In 2001, Osama Bin Laden decided it would be a good idea to attack the US.  That ended extremely badly for him, his Afghan allies, his organization, and also for Saddam Hussein, who had nothing to do with any of it, but was a target of opportunity.  A people that had been supporting a fight against terrorism half-heartedly suddenly awoke, rallied behind an otherwise unloved president and kicked some ass.

Ku Klux Klan

It’s not just recent examples that show how silly this is, either.  After the US Civil War ended, the Ku Klux Klan was born as a terrorist group to attempt to end Reconstruction, which, though a colossal injustice in practice, had the might of the Union army behind it.  That original incarnation of the Ku Klux Klan was eventually disbanded… because the leaders understood that the terror attacks were only serving to intensify the crackdown, and that claims had to be pursued using other methods.  Which shows that even white supremacists, a group not noted for their brilliance, are less moronic than the current generation of Islamic extremists.

2.  Has Al-Jazeera replaced the BBC as the go-to news source when something globally important happens?  In the 1990s, especially during the first Gulf War, CNN was often the only option for watching international news live, and it offered the most complete coverage on cable.

But as more and more options became available, most global audiences grew to prefer the BBC’s news channel, as the stories were covered with a much more global and complete set of assumptions.  CNN was clearly too US-centric to be useful, while Fox News, of course, was ridiculous (last night they referred to Hollande as the President of Paris).  Watching feeds from France and Italy last night left me impressed with the RAI’s coverage, while I think the French channels were in shock.  But both the RAI and the French channels are hampered by the fact that not everyone understands French or Italian (my own French means that I need to concentrate hard on that), while almost everyone interested in world affairs speaks English.  The BBC was plodding along, and Euronews, caught with its late-night anchors on the air, was a mess.

And then I turned to Al-Jazeera.  What a revelation.  Impeccable British accents giving the news without stridence or partiality, combined with interviews with security analysts from the US, political analysts from everywhere – including the middle east – and French government officials.   A near-perfect balance.

And they had a team on the ground: a hyper-professional, impeccably dressed reporter (British accent, of course) and a couple of cameramen.  And what they were saying was better and more informed than anything else going on at the time.

I’d never paid much attention to Al-Jazeera before, but a quick side-by-side with everyone else gives me the feeling that impartial audiences are going to keep increasing for them if they keep up the good work. I know I’ll be looking to them within the first few minutes (as opposed to just out of curiosity) the next time anything big happens.

A Novel Point of View

You know what a novel is, right?

Of course you do.  It’s any one of those fat books on the shelves at Barnes & Noble that isn’t divided into short stories or something.  What a silly question.

Well…

Most people use a working definition of the word “novel” which is pretty similar to the one above, but scholars most certainly do not.  In fact, a good way to amuse oneself if one were trapped in a college of literary pretensions during a hurricane would be to ask a random professor to define the term in the presence of other professors.  It is very important to be prepared for the little disagreements this will generate: bandages, iodine, and possibly a fully-equipped trauma ward would be good things to have handy.

The Theory of the Novel Edited by Philip Stevick

Just as an example of how hard the novel is to pin down, the book that started the mental process towards this article (Philip Stevick’s The Theory of the Novel) is divided into sections that analyze the novel from different angles (Generic Identity; Narrative Technique; Point of View; Plot; Structure and Proportion; Style; Character; Time and Place; Symbol; and Life and Art), each filled with essays written by such luminaries as Conrad or Cervantes.

It’s quite an impressive piece of name dropping–and an extremely interesting, albeit somewhat dry, read–but it hardly fits the popular perception of what a novel is.

So let’s put that popular perception into words quickly, in order to have a rough working definition moving forward:

Novel: Any work of prose fiction longer than about a hundred and fifty pages or so that tells a story, and which has a beginning, middle, and end.

This clearly isn’t an academic definition, but it gives us the gist – the novel is longer than a short story or a novella, it tells one story, as opposed to being a collection of shorter works, and at the end of the thing, the reader knows how it turns out for the people involved – even if what happens next may be a bit open-ended as in more modern work.  Most people would agree with this definition.

Most scholars would probably move to have anyone proposing such tripe burned at the stake.  Even Wikipedia, that supposedly democratic collection of worldly wisdom, has a long, rambling article about novels that touches on every possible inclusion and ancestor, and even has a handy little chart on reading habits in England in the 18th century (and a bonus discussion of whether Dan Brown’s The Da Vinci Code is an anti-Christian novel.  Don’t believe that?  See for yourself).

This is one of those cases where the insistence on a rigorous definition of the subject matter, and the academic obsession with going beyond popular knowledge, are counter-productive.  Sadly, however, it is clear that few branches of study other than Sociology are quite as dominated as literary criticism is by obsessive people who wouldn’t be able to survive in any other discipline.  The fact that some people consider critical race theory a valid approach to literary criticism should be enough to convince you of the unfortunate state of the discipline.  If that doesn’t convince you, simply pick up or browse your chosen newspaper – you will see that books are not judged based on their literary or artistic merit, but by the politics of their authors (try it, it’s fun – The Guardian is particularly unsubtle about it, which is sad because their cultural section is otherwise among the world’s best).

But if none of the above convinces you, here’s XKCD.  XKCD cannot be argued with.

XKCD impostor

(As always, you can see the original – with the mouse-over, at their site.  Plus, buy their t-shirts and stuff – anyone producing material of that quality and not charging others to use it deserves to be supported.)

But if you really want to start a fight, ask one of your captive professors what the first novel was.  You won’t even need the trauma room, as survivors are unlikely.

When Everyone is Out to Get Everyone Else

Murder My Sweet Poster

We’re on an unapologetic film noir binge here at CE, and we don’t care who knows it.

After our recent review of Double Indemnity–which established a lot of the basic format of noir while simultaneously ignoring the most important element, the hardboiled detective–we’re back in more familiar territory.  In fact, we’re entering hallowed ground, for we are about to speak of Philip Marlowe‘s film debut.

While other Chandler novels had been filmed–even Marlowe ones–the character had never appeared by name until 1944’s Murder My Sweet (which British audiences will likely know as Farewell My Lovely).

Possibly the most notable element of this film is that Dick Powell, known for light-hearted roles as opposed to anything Marlowe-esque, was cast in the lead role… and, seventy years later, therein lies a problem.  The major issue is that the hard-boiled dick actor par excellence is Humphrey Bogart, and no amount of thespian versatility by lesser men could ever really equal that.  Having anyone else play Marlowe seems somehow sacrilegious.

This is still a great film, mainly because the plot is so twisted that one ends up needing a corkscrew to figure it all out… that is until the end, where the spider at the center of the web is revealed, and the motivations become a bit clearer.

Dick Powell in Murder my Sweet

We won’t spoil it by giving away the final revelation, but will limit ourselves to noting that most of the comments about human sordidness that we made about Double Indemnity are still valid, but detract less from this film.  It’s one of those cases where having the plot focus less closely on the relationships between people, with more actually going on, shifts the focus away from the baser elements of behavior.  This one feels more like a roller-coaster ride through the murky depths than the view through a microscope of that same muck… and gains by it immeasurably.

Watching the two films back to back is recommended for anyone who wishes to truly understand the extremes of noir, and how two aesthetically similar films in the same genre which touch on similar themes, and even use a similar flashback framing to tell the story, can feel completely different, and yet be unmistakably related.

And a final reflection is how dark films seemed to find favor during dark times, despite the best efforts of the Hays office.  Noir is a product of the early and mid forties, which would seem to be counterintuitive; one would think that a people weary of war would look for light-hearted films.

But that clearly wasn’t the case.  Noir would never be done as well as it was then, much like comedy would never be as good as the screwball type of the thirties – Hollywood simply never recovered that particular magic.

We give this one four Schlemmons.

The Unbearable Heaviness of Being

Double Indemnity Movie Poster

The nice thing about our Manifesto is that it allows us to cheerfully jump from the horrors of WWII weapons of terror to light-hearted reviews of science fiction anthologies without batting an eyelash.  Perhaps the move we are making today is conceptually much smaller (although, admittedly, the last time we discussed films we went on and on about crazy Russians), but it does take us back to somewhat darker themes.

Film noir has often been analyzed from an aesthetic standpoint, and with good reason.  The darkness and visual cues (such as venetian blind lighting) are signature moves.  But today, in analyzing a film that is often credited with creating the noir look, we’re going to be contrary and look at the characters, a sordid little bunch.

Let’s begin by saying that 1944’s Double Indemnity is a film with a bunch of unforgettable scenes and plot devices – perhaps the most memorable of which is the dictation of the story into a recording device by the main character.  Having said that, it’s not actually an enjoyable film.  One doesn’t watch this one with the same pleasure as, say, The Maltese Falcon.  Though the characters are equally down-on-their-luck, and often just as self-serving as the ones surrounding Sam Spade, they don’t have that touch of black humor or dogged streak of hidden nobility.

Barbara Stanwyck's towel in Double Indemnity

What I applaud most is that they managed to get it past the Hays code – even if they had to make some compromises (notably the size of Barbara Stanwyck’s towel).

It may not be enjoyable, but that’s probably what has made this film so respected even seventy years later.  Think back to 1944.  There was a war on.  The public was thinking of heroism, of sacrifice – and so many films of the time reflected that.  The ones that didn’t at least attempted to give the audience some sense of humanity’s redeeming qualities…  and along comes Billy Wilder with an unflinching look at the seamier side of human nature.

This is a film where the main character is a heel, where the girl is worse than he is, and where even the supposedly pure younger woman’s innocence and decency can very easily be called into question by cynical viewers.  It looks at the real world, a world in which Sam Spade is as likely as a flock of flying unicorns.

It moves along at a decent clip, piling intrigue upon intrigue until, by a commodius vicus of recirculation, we are back where we started, but we now know why the man is dictating into the machine.

It works, it’s powerful, it’s much more true to life than most of the hardboiled genre… but you won’t like it as much. On a scale of one to five, we give it three Schlemmons.*

*For an explanation of the Schlemmon system, see here… now we just need to get someone to create a Schlemmon icon for us.