Krauthammer on Churchill: The Indispensable Man


One of the primary goals of every writer is to have those who read his work regard him as brilliant. Another goal, and a far more difficult and impressive one, is to have the reader arrive at brilliant thoughts while reading that work. Whether Charles Krauthammer’s new book Things That Matter: Three Decades of Passions, Pastimes and Politics accomplishes the former depends on the reader, but in my humble opinion, the book definitely accomplishes the latter.

In the second chapter, following the requisite intro and the requisite chapter describing the author’s youthful days playing baseball, Charles Krauthammer posits the notion that Time magazine got it wrong when it named Albert Einstein “Man of the Century”. “Einstein may have been vital,” argues Krauthammer, and he is “certainly the best mind of the century”, but Britain’s Prime Minister Winston Churchill “carried that absolutely required criterion: indispensability” in the 20th century, and to the 20th century.

One thought this reader had, while reading, is the provocative, bar-stool notion that no person had a more prominent effect on the 20th century than Adolf Hitler. While that is arguably true, the question to pose against that provocative notion is this: were the lessons of Hitler’s evil transgressions more transcendent than Winston Churchill’s efforts to, as Krauthammer later describes it, “slay that dragon”?

Hitler is, of course, indispensable to any study of the 20th century, in that he illustrated much of what’s wrong with human nature, and his rise gave us a cautionary template for how victors should treat defeated countries after a war (in his case, after World War I). Though evil can take many forms, Hitler provided students of history a model of unprecedented evil that we can now use as a guide to detect evil, based on the precedent he set. We will hopefully never again allow an evil despot to rise to such prominence in his country that he is in a position to coerce its citizens to do such evil things to one another. With all these lessons and precedents regarding absolute evil, some students of the 20th century say that Hitler has to be the man of that century.

It’s a provocative notion, and it would probably give Hitler the stature, and the historical value, that he sought all along. How many men, and how many precedents of the 20th century, will be cited more often over the centuries to come than those Hitler provided humanity? Young people involved in bar-stool discussions love such provocative notions, for such shock-and-awe proclamations give all listeners the impression that the provocateur is intelligent. Most of us love such impressions when we’re younger. As we age, and move past the desire to be perceived as intelligent through provocation, we actually become more intelligent, and we realize that most provocative thoughts should be put through careful examination and attempts at disproof. The final conclusions we reach may not be as provocative, or as memorable, but as we age, read, and learn to temper our temperament, we realize that being correct is more valuable than being memorable or provocative. There is no doubt that the lessons evil men leave behind are monumental in history, but too often these provocative conversations leave out the dragon slayers who should be considered at least as prominent, if not more so.

To say that Winston Churchill hasn’t already achieved a prominent place in history would be foolish, as most historians continue to rank him in their top five most prominent figures of the 20th century, and most left-leaning historians will rank him in their top twenty. Does he deserve even greater prominence than we’ve already allowed, however?

One of the reasons Churchill is not higher on the list, I would submit, is that hindsight has proven him so obviously correct in his doomsayer predictions about Hitler. The idea that all of his warnings were so obviously on the mark, however, makes it almost boring to declare him the most prominent person of the 20th century. It’s an of-course statement that causes readers to yawn at the headline, when a more prominent listing of others, such as Einstein, proves more provocative, compelling, and newsworthy.

Churchill was, as Krauthammer writes, “A 19th century man parachuted into the 20th,” but “it took a 19th century man –traditional in habit, rational in thought, conservative in temper– to save the 20th century from itself.” Yawn. Such lines don’t play well on the cover of a magazine, suggesting that Churchill was right about Hitler and should therefore be named Man of the Century for speaking out and saving Britain, and most likely the rest of the world. Especially when compared to the exciting, revolutionary bullet points a writer can compile about Einstein’s accomplishments.

Before dismissing the obviousness of Churchill’s warnings, one has to examine what he was up against while still in the British Parliament. Most of Parliament, and Prime Minister Neville Chamberlain, dismissed his warnings. They did not want to view Hitler through Churchill’s simplistic, black-and-white lens. Churchill’s warnings were viewed as the impulsive, irrational, and unreasonable views of a war hawk. Chamberlain has since been viewed, by historians of the right and the left, as one of the obvious fools of the 20th century, but is it a glaring headline that Churchill should be viewed as the most obvious hero of the 20th century? No, because it is just so obvious. It doesn’t require any creativity to back up. It just is what it is, as we now say.

Churchill suggested that the year’s delay between the Munich Pact and what he deemed an inevitable war worsened Britain’s position, in direct opposition to Chamberlain’s assessment. (Editor’s note: Chamberlain would later declare that that year allowed the British to bolster their troops, and that the British military had not been prepared for war during the previous year.) In that year between Munich and World War II, Chamberlain also exhausted the possibilities of diplomacy, with détente, blockades, and anything and everything the world could use to achieve “peace in our time”. To refute Chamberlain’s claims, Churchill stated that a grand coalition of European states could have removed Hitler from power in the year in question and prevented World War II from happening.

That suggestion, that in some cases waiting too long can worsen one’s position, would rear its ugly head before Hitler’s body even went cold, when U.S. General George S. Patton warned General Eisenhower about Russia. Eisenhower, presumably recognizing that Patton’s warnings were not unfounded, responded that Americans were simply too war-weary to make any moves against Russia. The suggestion would later haunt the world in the 21st century, with Iraq in 2003, in a manner some would call the reverse of the Churchill suggestion, saying that we acted too impulsively. It will probably haunt nations around the world for many more years, because the human instinct is to avoid war at all costs, no matter how black and white, and simplistic, and obvious the need for action becomes.

In later writings, “Churchill depicted Chamberlain as well-meaning but weak, blind to the threat posed by Hitler, and oblivious to the fact that (according to Churchill) Hitler could have been removed from power by a grand coalition of European states. Churchill suggested that the year’s delay between Munich and war worsened Britain’s position, and criticized Chamberlain for both peacetime and wartime decisions. In the years following the publication of Churchill’s books, few historians questioned his judgment.”{1}

It may appear redundant to call a historian a hindsight historian, since history is documented in hindsight, but some historians document the facts of an era while others provide hindsight commentary on historical events that were not as clear to the historical figures of the day. These historians provide the omniscience that hindsight affords. Hindsight historians may document Churchill’s warnings as obvious now, but most will not tell you how popular Neville Chamberlain’s “peace in our time” efforts were at the time.

Those who believe Hitler’s quest for power was so obvious that it’s now redundant to talk about should ask themselves how obvious it was to Neville Chamberlain at the time. How obvious was it to the British Parliament, the isolationists in America, and the world at large? Much like today, Churchill was regarded as a war hawk, and presumably a fear monger, when he spoke of what he believed to be Hitler’s aspirations. Some have said that Churchill is almost solely responsible for the meetings with Stalin at Tehran and Yalta (alongside FDR) and at Potsdam (alongside Truman) that eventually won the war for the Allied forces.

We’ve all read hindsight historians document that America shouldn’t have been “so stupid” as to allow the attack on Pearl Harbor, when so many signs pointed to its eventuality. It’s easy for them to look at the decade preceding the terrorist attacks of September 11, 2001, and declare that we were obviously naïve in trying terrorists as criminals rather than wartime adversaries. It’s also easy for them to write that the call to war in Iraq, in 2003, was impulsive, based on our inability to find weapons of mass destruction there. What’s not so easy is for the figures involved in the present tense of history to stick their necks out, speak against the conventional wisdom of their day, and declare that it’s “weak and blind” to continue following the conventional line of thinking. Hindsight historians now slightly diminish Churchill’s role in the 20th century, because it is now so obvious that Hitler was the epitome of evil. To read through an objective telling of the history, however, it obviously wasn’t so obvious to some at the time.

As Krauthammer wrote in Things That Matter:

“And who is the hero of that story?  (The story of the 20th century’s ability to defeat totalitarianism, and leave it as a “cul-de-sac” in the annals of human history.) Who slew the dragon? Yes, it was the ordinary man, the taxpayer, the grunt who fought and won the wars. Yes, it was America and its allies. Yes, it was the great leaders: FDR, de Gaulle, Adenauer, Truman, John Paul II, Thatcher, Reagan. But above all, victory required one man without whom the fight would have been lost at the beginning. It required Winston Churchill.”{2}

Krauthammer, Charles. Things That Matter: Three Decades of Passions, Pastimes and Politics. New York: Random House, 2013. Print.

James Joyce: Incomparable or Incomprehensible?


Those of us who are always on the lookout for edgy, racy content have heard the term “Joycean” thrown about with little discretion over the years. If you’ve heard this term as often as I have, you’ve no doubt asked, what does it mean to be “Joycean”? To listen to critics, it can mean whatever you want it to mean. They appear to be more interested in using the term than using it properly, but how do we use it properly? What does “Joycean” mean? If James Joyce were still alive, we would love to ask him whether his last two books were two of the most erudite, most complicated pieces of fiction ever written, or a great practical joke he played on the literature community to expose reference makers and elitist scholars for who they are.


Readers who seek to up their erudite status by reading “difficult” books have all heard of Joyce’s final two works of fiction, Ulysses and Finnegans Wake, as literary scholars list these books among the most difficult, most complicated works of fiction ever created. Some of us, intrigued, picked them up as a challenge of the mind; others attempt to read them to gain entrance into their subjective definition of elite status. Most are confused and disoriented by the books, but some have the patience, the wherewithal, and the understanding of all the references made, and languages used, in these books necessary for comprehension. Those readers deserve either a hearty salute, or the scorn and laughter that Joyce provided as a gift to the have-nots who are honest enough to admit that they don’t know what is going on in them. Was Ulysses such an ingenious book that it’s worth all the effort it requires for greater understanding, or was it a book about nothing?

I don’t understand either of these books, and I have gone back numerous times to try to further my understanding. Some have said that Ulysses is the more palatable of the two, but I have found it too elliptical, too erratic, and too detail-oriented to maintain focus, and I have purchased three different reading guides to walk me through it. Some of the readers who claim to enjoy Ulysses admit that Finnegans Wake is ridiculously incomprehensible.

Most people enjoyed Dennis Miller’s tenure as an announcer on Monday Night Football, but most of those same people complained that they didn’t understand two-thirds of the man’s references. I didn’t keep a journal of his references, but I’m willing to bet that at least a third of them were Joycean in nature (Ulysses specifically). Miller stated that his goal, in using such obscure references, was to make fellow announcer Al Michaels laugh, but any fan who has followed Miller’s career knows that he enjoys using complicated and obscure references to make himself sound erudite. There are, today, very few references more obscure than those that recall the work of James Joyce, a man who described his last book, Finnegans Wake, as “A book obscure enough to keep professors busy for 300 years.”

Andy Kaufman referenced James Joyce when trying to describe his method of operation. The import of the reference was that Kaufman wanted to be a comedian’s comedian, in the manner that Joyce was a writer’s writer. Kaufman wanted to perform difficult and complicated acts that the average consumer would not understand, and the very fact that they didn’t “get it” was what invigorated him. He wanted that insider status an artist uses to gain entrée into the “in the know” groups. After he achieved some fame, audiences began laughing with Kaufman in a manner that appears to have bored him, and he spent the rest of his career trying to up the ante. From this, we can guess that there was something genuine about Kaufman’s path, in that he was only trying to entertain himself and his friends, and if anyone else wanted on board, that was up to them. Joyce and Kaufman, it appears, shared this impulse.

Anytime an artist creates a difficult piece of work, there is going to be a divide between the haves (those who get it) and the have-nots. When Mike Patton formed the relatively obscure band Fantomas, he did so under no illusion that he was going to unseat the Eagles’ Greatest Hits, or Michael Jackson’s Thriller, atop the list of best-selling albums of all time. He knew that his group would be playing to a very select audience.

What is the audience for such difficult subject matter? Most people seek music as background noise, something to dance to, or something to which they can tap a finger. Most people read a book to gain a little more characterization and complication than a movie can provide, but they don’t want too much characterization, or too much complication. Most people only buy art to feng shui their homes. Most people don’t seek excessively difficult art, and those who do are usually seeking something more: something more engaging, something more provocative, that can only be defined by the individual. Those who seek something so different that it becomes difficult generally have such a strong foundation in the arts that they reach a point where their artistic desires can only be satiated by something different.

Yet different can mean different things at different times to different people. Different can be complicated and discordant, but it can also be limited to style. At this point in history, it’s difficult to be different in a manner that cannot be called derivative of someone or something, so some people seek any separation they can find. When the latest starlet of the moment twerks in a provocative manner, has a construction worker find her pornographic video, or accidentally has her reproductive organ photographed, we know that these incidents were created by the starlet, and her people, to get noticed after they have exhausted all other attempts to be perceived as artistically brilliant and different.

There are other artists who are different for the sole sake of being different. This is often less than organic, and it often bores those who seek a true separation from the norm, because we feel that this approach has been explored to the point of exhaustion. Andy Kaufman created something organically different that can never be completely replicated, in much the same manner that Chuck Palahniuk, Mike Patton, David Bowie, Quentin Tarantino, and Jerry Seinfeld and Larry David did. Can it be said that James Joyce’s final two books were different in the artistically brilliant, cutting-edge manner of all these artists’ creations, or were Joyce’s writings more symbolism than substance? Put another way, was Joyce a substantive artist whose true messages need to be unearthed through careful examination, or was he simply twerking in a provocative manner with the hope of getting noticed by the elite scholars of his generation after exhausting the limits of his talent in other works?

Judging by his short stories, James Joyce could’ve written some of the best novels in history. Those who say that he already did would have to admit that his final two works were not overly concerned with story, or plot. Those who defend those works would probably say that I am judging them by traditional standards, and that they were anything but traditional. They would probably also argue that the final two works sought to shake up the traditional world of literature, and anyone who dared to take up the challenge of reading them would probably say Joyce sought to confound us more than interest us. If these defenders concede that the final two works were different for the sole sake of being different, they would add that Joyce was one of the first to do so. Some defenders say that the books are not as difficult to read, or as complex, as some would lead you to believe, and that reading them only requires more patience, and examination, than the average work. Anyone who states such a thing is attempting to sound either hyper-intelligent or hyper-erudite, for it was Joyce’s express purpose to be difficult, complicated, and hyper-erudite.

To understand Ulysses, one needs an annotated guide to 1920s Dublin; a guide that describes the Irish songs of the day, some limericks, and mythology; and a fluent understanding of Homer’s The Odyssey. Readers who lack a well-versed knowledge of what occurred nearly one hundred years ago may not understand the parodies, or jokes, Joyce employs in Ulysses. Yet the Modern Library, in 1998, ranked it first on its list of the 100 best English-language novels of the 20th century.

“Everyone I know owns Ulysses, but no one I know has finished it.”  —Larry King.

To fully understand, and presumably enjoy, Finnegans Wake, the reader needs at least a decent understanding of Latin, German, French, and Hebrew, and a basic understanding of Norwegian linguistic and cultural elements. The reader will also need to be well-versed in the Egyptian Book of the Dead, Shakespeare, the Bible, and the Qur’an. They will also need to understand the English language on an etymological level, for one of Joyce’s goals with Finnegans Wake was to mess with the conventions of the English language.

Some have opined that one of Joyce’s goals in Ulysses was to use every word in the English language, and others have stated that this is a possibility, since he used approximately 40,000 unique words throughout the work. If this is true, say others, his goal for Finnegans Wake was to extend the confusion by incorporating German, French, Latin, Hebrew, and other languages into his text. When he did use English in Finnegans Wake, Joyce sought to use it in unconventional and etymological ways to describe what he believed to be the language of the night. He stated that Finnegans Wake was “A book of the night” and Ulysses “A book of the day”.

“In writing of the night, I really could not, I felt, use words in their ordinary connections . . . that way they do not express how things are in the night, in the different stages – conscious, then semi-conscious, then unconscious. I found that it could not be done with words in their ordinary relations and connections. When morning comes of course everything will be clear again . . .  I’ll give them back their English language. I’m not destroying it for good.” —James Joyce on his novel Finnegans Wake.

This use of the “language of the night” could lead one to say that Joyce was one of the first deconstructionists, ahead of his time in destroying the meaning of meaning in the immediate sense. Those obsessed with James Joyce could interpret the quote, and the methodology subsequently used in Finnegans Wake, to mean that Joyce had such a profound understanding of linguistics that normal modes of communicating an idea bored him. He wanted something different. He wanted to explore language, and meaning, in a manner that made his readers question their fundamentals. Readability was not his goal, nor was storytelling, or reaching a best-seller list. He sought to destroy conventions, and common sense, and achieve a higher realm of perfection, in which timeless abstractions cannot be communicated to those who adhere to common sense. This makes for an interesting conversation on high art, and philosophy, but does it lend itself to quality reading?

“What is clear and concise can’t deal with reality,” Joyce is reported to have told friend Arthur Power, “For to be real is to be surrounded by mystery.”

In the modern age, there is much discussion of the widening gap between the haves and the have-nots. That particular discussion revolves around economic distinctions, as it has since time immemorial, but in the Joycean world, the gap involves those who “get” his works and those who do not. Those who get it usually prefer to have deeper meanings shrouded in clever wordplay. They usually prefer symbolism over substance; writing over storytelling; and interpretation over consistent and concretized thoughts.

The two schools of thought between the haves and the have-nots can probably best be explained by breaking them down into the different approaches of James Joyce and one of his contemporaries, Ernest Hemingway. Hemingway wrote clear and concise sentences. He stated that his methodology was to write something that was true:

“The hardest thing is to make something really true and sometimes truer than true.”—Ernest Hemingway.

Putting Joyce’s final two works through the Hemingway school of thought, one could say that Joyce’s methodology was: Sometimes, it’s more interesting to make it false and allow others to define it as true.

“Though people may read more into Ulysses than I ever intended, who is to say that they are wrong: do any of us know what we are creating? … Which of us can control our scribblings? They are the script of one’s personality like your voice or your walk.” —James Joyce

Those of us who have had a discussion on a deep, multifaceted topic with a deep thinker know that sooner or later a declarative distinction will be made if we stubbornly insist that we are not wrong. “You don’t get it, and you probably never will,” is something they say in a variety of ways. We all know what it feels like to be summarily dismissed as an anti-intellectual by a deep thinker. Those who aren’t snobbish in an anti-social manner often avoid openly dismissing us when we’re around, but even the polite snobs give us a vibe, a look, or a chuff intended to let us know our place.

“Well, what do you think of it then?” is the response some of us have given, after being backed into an anti-intellectual corner by deep thinkers.

If they are anti-social, elite intellectual snobs, they will say something along the lines of: “I simply choose to think deeper!” It’s a great line, and it purportedly puts us stubborn types in our place, but it’s a self-serving non-answer. Those of us who are more accustomed to interaction with deep thinkers will then ask them to expound upon their complicated, deep thinking. Pushing deep thinkers deeper will often reveal a lack of substance beneath their piles of style, and the careful observer will find that the results of their deep thinking are no deeper than the deep-thinker cap they wear to the pub.

A number of attempts at reading Joyce have led me to believe that he probably didn’t have much substance beneath his piles of style, so he muddied the waters of his message with puns, songs, gibberish, abstractions, foreign languages, and overly complicated complications. He did this, in my opinion, to conceal the fact that, compared to his colleagues, he didn’t have all that much to say. If that’s true, he was definitely artistically accomplished in saying it.

Who can forget the many sayings that Finnegans Wake dropped on our culture, such as the transcendental sound of the thunderclap that announced the fall of Adam and Eve from the garden of Eden:

“bababadalgharaghtakamminarronnkonnbronntonnerronntuonnthunntrovarrhounawnskawntoohoohoordenenthurnuk!”

What about the mirthsome giggles we have had in social gatherings with the catchphrase:

“A way a lone a last a loved a long the riverrun, past Eve and Adam’s, from swerve of shore to bend of bay, brings us by a commodius vicus of recirculation back to Howth Castle and Environs.”

Or the ever present: 

“(Stoop) if you are abcedminded, to this claybook, what curios of signs (please stoop), in this allaphbed! Can you rede (since We and Thou had it out already) its world?”

If you just read those passages three or four times, and you still have no idea what they say, and you just went back to read them again because you want to be a have that “gets it”, you’re not alone. If these passages were merely anecdotal evidence of the difficulty involved in reading Finnegans Wake, that would be one thing, but such difficulties litter just about every sentence of every paragraph of the book, as evidenced by the exhaustive assistance provided at the site Finwake.com for readers who have no idea what this writer is going on about.

Finnegans Wake is reported to be in English, but it’s not the standard version of English in which words have specific meanings. The “language of the night” was intended for linguists tired of reading words with exact meanings; it was meant to be playful, mind-altering, and rule-breaking. James Joyce made references intended to be obscure even to readers of his day, who might not share Joyce’s wealth of knowledge of history, or of the manner in which the meanings of words in the English language have changed throughout history.

“What is really imaginative is the contrary to what is concise and clear.” —James Joyce

James Joyce was a stream-of-consciousness writer who believed that all “mistakes” were intended on some level that superseded awareness. In the 500-plus pages of Finnegans Wake, Joyce found 600 errors after publication. He was informed of some, if not all, of these errors, and he reportedly fought his publishers to keep them in. Later editions corrected many of these errors to provide readers “the book in the manner Joyce had intended.” If Joyce didn’t believe in errors, however, how can those who corrected them state that the corrected edition is the definitive edition that “Joyce intended”?

“The man of genius makes no mistakes, his errors are volitional and portals of discovery.” –James Joyce

Throughout the seventeen years Joyce spent writing Finnegans Wake, he began to go blind, so he had a friend, Samuel Beckett, take dictation to complete the novel. At one point during this dictation, someone knocked on Joyce’s door. Joyce said, “Come in!” to the knocker, and Beckett wrote the words “Come in!” into the narrative of Finnegans Wake. When Joyce spotted this error and the confusion was sorted out, he insisted that Beckett leave it in. On another occasion, when a printer’s error was pointed out, he said, “Leave it. It sounds better that way than the way I wrote it.”

There are three different versions of the text. The first and second are the editions that Joyce submitted for publication, with all of the errors intact. The third corrects both the errors the editors located and the 600 errors that Joyce himself spent two years locating. Some would have you believe that the first two editions are the definitive editions, but you have to be a Joyce purist to appreciate them.

Can it be called anything short of egotistical for an author to believe that his subconscious choices and decisions are somehow divine? If, as Joyce said, and Picasso later repeated in regard to his paintings, mistakes are portals of discovery, then we can say that’s great, and incredibly artistic, in the process of creation. To leave them in the finished product, however, and subject your readers to the confusion, just seems narcissistic. “Here’s what I was thinking at the time,” Joyce is basically telling his readers. “I don’t know what it means, but this is a higher plane of thinking than simple conscious thought. Isn’t it magical? Maybe you can make some sense of it. Maybe you can attribute it to your life in some manner.” This method of operation may say something profound about the random nature of the universe, but when we’re reading a novel we don’t necessarily want to know about the randomness of the universe, unless it’s structured in a manner that leads us to that statement.

Not everyone can write a classic, and some realize this after a number of failed attempts. Once they arrive at this fork in the road, they can either write simple books that provide them and theirs an honest living, or they can grow so frustrated by their inability to write classics that they separate themselves from the pack through obscurity. The advantage of creating such an alleged contrivance is that beauty is in the eye of the beholder, and the beholder can assign their own relative beauty to it. Some would say this is the very definition of art, but others would say even that definition has limits. Some would say that the most obscure painting is art, because they “see it”, where others see only schlock for elitists to crib note to death, until meaning is derived.

James Joyce is considered the exception to this rule, fellow writers have told me, and if you are going to attempt to write an important novel in the 21st century, you had better be familiar with him. I’ve tried, and I now believe that I’m destined to be a have-not in the Joycean world … even with Ulysses. The question that arises out of those ashes is this: am I going a long way toward becoming more intelligent by recognizing my limits, or should it be every aspiring intellectual’s responsibility to keep pushing beyond any self-imposed limits to the point of finally achieving a scholarly understanding of difficult material? If this is a conundrum that every person encounters when facing challenges to their intelligence, is Ulysses, or more pointedly Finnegans Wake, the ultimate barometer of intelligence, or is it such an exaggerated extension that it had to have been a practical joke James Joyce played on the elitist literary community, to expose them as the in-crowd, elitist snobs that they are when they “get it” just to get it? Do they really “get it”, or are they falling prey to Joyce’s clever ruse, exposed as people who “get” something that was never intended to be “got”?

What the World Needs Now is Another Calvin Coolidge


President John Calvin Coolidge Jr. said “No!” He said “No!” so often, through vetoes, that nearly 100 years after he left office, he still ranks 9th among presidents for most vetoes. Does his unflinching, non-prejudicial ability to say “No!” so often make him the best president the United States has ever had? “No!” But his courage in the face of mounting pressure does land him on my personal Mount Rushmore.

Before we categorically dismiss this with “The guy said no, who cares?” think about whom he said “No!” to. If we became politicians, our first job would be to gather coalitions, groups of other people to help us amass power. Calvin Coolidge became a governor because he was the lieutenant governor for a successful governor. He became president by being vice-president to a president who died. If this happened to us, we would probably be overwhelmed by the idea of it, but Calvin Coolidge went back to bed moments after he learned he would be president. When he awoke, he went about saying no to powerful Representatives, Senators, and the most powerful power brokers in Washington. In every session of Congress, there are always those scary politicians and power brokers to whom everyone is afraid to say no, but the historical record shows that Coolidge was not intimidated. He dropped nos on everyone in a patient, reasonable, rational, and nonprejudicial manner.

“No!” carries a lot of power, as any two-year-old just learning the rudimentary power of language can tell you, but “That depends on who you’re saying no to,” the seasoned politician might argue. “Saying no to the wrong person in Washington could just as easily render you powerless.” President Calvin Coolidge didn’t care. He was either one of the least ambitious presidents in terms of amassing a power base, or simply fearless, as the record states he said “No!” to everyone.

In the nearly one hundred years that have followed that great president’s tenure in office, our politicians-turned-presidents have fallen prey to the seductive power of “Yes!”, and they have found creative ways to say “Yes!” to other politicians and constituents. Even the most ardent supporters of “Yes!” would have to admit that its seductive power has led to a more centralized government, with the strongest power residing in the office of the president. Before you say, “No, that’s not true,” ask yourself: is your party in power at the moment? Will your opinion change when the other party assumes power? We should all succumb to the power of “No!”

We want our politicians, our leaders, and other authority figures to learn how to say “Yes!” more often, and we throw childish temper tantrums when they don’t. “Yes!” builds affinity and loyalty that can evolve into love when we hear it often enough, but what we want and what we need are two entirely different hemispheres. Before we categorically reject “No!” we should consider what “Yes!” has wrought us: annual deficits that have led to a federal debt spiraling so far out of control that economic forecasters warn of an eventual disaster.

Psychologists say that not only do we learn to adjust to hearing “No!”, but, as much as we hate having authority figures dictate how we live our lives, we do adjust, and those adjustments can lead to a sense of appreciation for the structure and parameters “No!” provides.

Political scientists might admit that a world of “No!” might be idyllic in terms of economic survival, but modern Americans are too far down the path of “Yes!” to ever elect a Calvin Coolidge President of the United States. The modern United States presidential election is now a battle of the yeses. Only an unimaginable economic disaster could turn that around, political scientists might agree, but even then, the power of “No!” would hold no sway. At this point in our history, the only difference between the parties, on this issue, is in the creative ways their candidates can find to say yes.

Historians suggest that even as far back as 1918, Calvin Coolidge’s “No!” policies may not have resulted in election victories. Coolidge ran for Governor of Massachusetts as the sitting Lieutenant Governor, on the previous administration’s record, and he later assumed the office of the President when the sitting president died an untimely death. If he had been a relative unknown in either of those contests, it’s probable he wouldn’t have won them.

President Calvin Coolidge’s claim to fame was that he was all about budgets. Budgets, creative accounting, and numbers might win you an article on Rilaly.com, but to win a presidential election Calvin Coolidge probably needed to be viewed as an incumbent in a prosperous time period. After reading a Coolidge biography, we get the idea that he was more at home in the company of numbers than in that of a Senator, Congressman, or power broker addressing him as “Mr. President”.

As Coolidge himself put it:

“I believe in budgets. I want other people to believe in them. I have had a small one to run my own home; and besides that, I am the head of the organization that makes the greatest of all budgets, that of the United States government. Do you wonder then that at times I dream of balance sheets and sinking funds, and deficits and tax rates and all the rest?”

Read that how you want, but it’s pretty hard to chant in a convention hall.    

Coolidge Enters Stage Right


Following the Warren G. Harding/Coolidge ticket’s victory in the 1920 presidential election, President Harding’s inaugural address set a dramatically different tone from that of the outgoing Woodrow Wilson administration:

“No altered system will work a miracle,” President Harding said, “Any wild experiment will only add to the confusion. Our best assurance lies in efficient administration of our proven system.”

Harding’s ego-less approach was that he would be nothing more than a steward of the American system that had worked just fine in the 130 years of America preceding his election. Harding’s stance, as opposed to Woodrow Wilson’s, was that his administration wouldn’t try to outdo the prosperous model The Founders created. Put in this light, what kind of ego looks at the model of America, a model that was, and is, the envy of the world, and thinks he can do it better? How many of those who tried succeeded? Harding was basically saying that he didn’t regard himself as a “miracle worker” who would step into office with think tank notions and a “new and improved” model to cure what ails us. Isn’t that what politicians do? Yes, but is “Yes!” the solution to our problems or the source of them? If I were running for office, I would build my campaign around no: “No, we can’t! We can’t, because of the miserable mess we’ve all created. We have to clean this (expletive deleted) up!” That campaign probably wouldn’t help me get a job as a drive-thru attendant at Hardee’s, but I would go down with that ship with a righteous right fist held high.

Harding was basically telling the American public that he wouldn’t present what we now call a “New Coke” formula that no one had ever thought of before. The actual “New Coke” campaign involved the Coca-Cola Company attempting to gain greater market share in 1985 by essentially copying the formula of its less popular competitor, Pepsi-Cola. {2} Similarly, numerous narcissistic U.S. presidents, before and after Harding and Coolidge, have attempted to impose formulas that had already been tried and tested by other countries in history. The fact that those formulas failed in those other countries, while America’s model remains the envy of the world, doesn’t stop “New Coke” advocates from believing they are the ones who can administer a failed formula to success. The legacy of Coca-Cola’s “New Coke” campaign, like the “New Coke” ideas in politics, endures as a cautionary tale against tampering with a well-established and successful brand. By saying that he would act as nothing more than a steward of the prosperous model The Founders created, Harding was displaying what some call the pinnacle of intelligence: being smart enough to know what he didn’t know.

One of Warren G. Harding’s first steps was to shepherd through Congress the Budget and Accounting Act of 1921. This act allowed Harding to create a special budget bureau, the forerunner to today’s Office of Management and Budget, whose director could cajole and shame Congress into making spending cuts. Unfortunately, some of Harding’s privatization policies, combined with some ill-advised appointments, led to bribery and favoritism, and ultimately to what historians now call the Teapot Dome Scandal.

Coolidge as President

After the untimely death of Harding, Calvin Coolidge became the 30th president of the United States, serving from 1923 to 1929. Coolidge sustained a budget surplus and left office with a smaller budget than the one he inherited. Over the same period, America experienced a proliferation of jobs, a dramatic increase in the standard of living, higher wages, and three to four percent annual economic growth. The key to this level of success was Coolidge’s penchant for saying “no.” If President Ronald Reagan was “The Great Communicator,” Coolidge was “The Great Refrainer.”

Calvin Coolidge separated himself almost immediately from Harding with his willingness to say “No!” to appointees, Congressmen, and various other “New Coke” bills. (Coolidge ended up vetoing fifty bills, more than the previous three presidents combined.) Coolidge summed up his penchant for vetoing these bills, saying:

“It is much more important to kill bad bills than to pass good ones.”

How many of today’s issues would be resolved with that mindset, that philosophy, and that president? Calvin Coolidge was the type of president, the type of person, who, if you asked him what time it was, would tell you. Modern presidents get their tongues so tied up with advice from advisers, pollsters, and focus-group testing that they’re almost afraid to tell you what time it is, because a direct answer might be taken seven different ways by seven different networks appealing to a 24-7 audience.

Within 24 hours of arriving in Washington after Harding’s death, Calvin Coolidge met with his budget director, Herbert Lord, and together they went on offense, announcing deepened cuts in two politically sensitive areas: spending on veterans and District of Columbia public works. In his public statements, Coolidge made clear he would have scant patience with anyone who didn’t go along:

“We must have no carelessness in our dealings with public property or the expenditure of public money. Such a condition is characteristic of undeveloped people, or of a decadent generation.”

Perhaps reflecting his temperament, Coolidge favored a tool more modern presidents could use to reject a bill without the political consequences of a formal veto: the pocket veto. This is a method a president can use to kill a bill without actually vetoing it, while giving Congress little ability to override him. Grover Cleveland, whom Coolidge admired, used this type of veto in his day, as had Theodore Roosevelt, but Coolidge raised its use to an art form. The New York Times referred to it as “disapproval by inaction.” Perfect, I say, ingenious. It’s what the world needs now.

The words “perhaps reflecting his temperament” paint a nice portrait of President Calvin Coolidge, for when given the choice between grandstanding on an issue and quietly advocating or dismissing a bill, Coolidge opted for the quiet approach. The most illustrative story on this theme of restraint involved one of the greatest tragedies of Coolidge’s presidency. The great Mississippi River flood of 1927 was the Coolidge administration’s Hurricane Katrina. Rather than stage a photo op, Coolidge chose not to appear on the grounds of the devastation, fearing that doing so might encourage federal spending on relief. Another issue that might define the Coolidge administration historically involved the Ku Klux Klan. When faced with the problem of how to handle the then-powerful Klan, Coolidge quietly avoided appointing any Klan members to prominent positions, and he thereby helped decimate the power of that group in America. When faced with the dilemma of what to do with farming subsidies, the man from farming country chose to veto them. He also vetoed veterans’ pensions and government entry into the utilities sector. What current politician would favor vetoing farming bills and veterans’ pensions? The man had no qualms about vetoing bills he likely personally favored, because he didn’t want to set a bad precedent.

If a modern politician for any office even flirted with doing any of these things (the maneuver with the Klan excluded) and listed one of them in their campaign, how many of us would laugh them off the stage? The party’s leaders wouldn’t even consider them for a nomination. The only obstacle for modern politicians is how to find a creative way to say yes that doesn’t tick off too many constituents who might want them to say no.

Yet, how many tragedies does a nation as large as America face every day? How many constituents suffer as a result? The impulsive reaction for any person, politician, or president is to do whatever they can to end that suffering, yet how many unintended consequences arise from a president’s, and Congress’s, decision to provide federal aid? Before you reveal yourself as a person somewhat addicted to federal spending, imagine if a President Calvin Coolidge denied federal aid for even “logical” and “heartfelt” expenditures. Imagine if a president said, “I’d much rather not set the precedent of the federal government coming in to rescue all of the people, places, and things. I’d much rather leave such aid to the states and local municipalities.”

How many of these problems could’ve been avoided if we had more presidents who did whatever they could to train the country to expect less of the federal government’s ability to fix their problems? As many informed politicos will tell us, it’s too late now. The country, thanks to nearly 100 years of conditioning from ego-driven, narcissistic presidents seeking praise and adulation for their administrations, has come to expect the president to do something. It’s a fait accompli now, and there’s little to nothing anyone can do to roll it back. All of this may be true, but what if Harding’s special budget bureau had survived the politics of the ’70s, and the president and Congress had conditioned the country to accept that the money the federal government attains from taxpayers is finite? Would the American public let a locale drown, or would the most generous people in the world, Americans, do whatever they could to help their fellow Americans out? Would the American citizen learn to look to their state, their locality, and even their own communities for aid in times of crisis? It’s easier and far more popular for a president to just say yes, but I don’t think many objective, dispassionate observers would dispute that America would be in a far better place if the presidents who followed Coolidge had invested more of their political capital in his politics of no.

“Four-fifths of all our troubles would disappear if we would only sit down and keep still.”

Which came first, the chicken or the egg? Did the “yes” politicians condition us to expect more yes from them, or did we condition our candidates for the office to say “yes” to everything? How many candidates stubbornly insist that we need to say no more often? Long question made short: are we in unprecedented debt because of the ruling class, or because Americans have the country we want? I don’t know about you, but I would love to see that specific flowchart with historical bullet points.

The current barometer of the presidency is not set on “yes or no” but on “when, how much, and how often” they spend other people’s money. Coolidge, by contrast, exhibited a level of restraint politicians often reserve only for their own money.

Despite the budget surpluses his administration accrued, Coolidge met with his budget director every Friday morning before cabinet meetings to identify budget cuts and discuss how to say “no” to the requests of cabinet members and other politicians up and down the ticket. Think about that for just a moment before reading on. Think about how a modern politician, at any level and in either party, would react to even a momentary surplus. The impulsive, some might even say instinctive, reaction politicians have to surpluses is to find the best way to allocate that surplus for greater political gain, and to reward those who played a pivotal role in securing it by allocating funds for a bridge or a hospital in a Congressman’s district. How many politicians, by comparison, would meet with budget directors, Congressmen, and others to find further ways to cut? Most presidents give in after a time, Eisenhower being a good example, but Coolidge did not.

In a conference call with Jewish philanthropists, Coolidge explained his consistency by returning to the same budget credo quoted earlier: “I believe in budgets. I want other people to believe in them … Do you wonder then that at times I dream of balance sheets and sinking funds, and deficits and tax rates and all the rest?”

Speaking of tax rates, in December 1923, Coolidge and Treasury Secretary Andrew Mellon launched a campaign to lower top rates from the fifties to the twenties. Mellon believed, and informed Coolidge, that these cuts might result in additional revenue. He referred to this as “scientific taxation”, an early formulation of the idea economist Arthur Laffer would later popularize as the Laffer curve. Coolidge passed the insight on:

“Experience does not show that the higher tax rate produces larger revenue. Experience is all the other way,” he said in a speech in early 1924. “When the surtax on incomes of $300,000 and over was but 10 percent, the revenue was about the same as it was at 65 percent.”
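The intuition behind “scientific taxation” can be sketched with a toy model. Everything below is an illustrative invention, not Coolidge-era data: it assumes the tax base shrinks linearly as rates rise, with an elasticity picked purely so that a 10 percent and a 65 percent rate raise the same revenue, echoing the spirit of Coolidge’s remark.

```python
def revenue(rate, base=100.0, elasticity=4 / 3):
    """Toy Laffer-curve revenue: rate times a base that shrinks as rates rise.

    The elasticity default is chosen only so that the 10% and 65% rates
    from Coolidge's example yield equal revenue -- an illustration,
    not an empirical claim about any real tax code.
    """
    taxable_base = base * max(0.0, 1.0 - elasticity * rate)
    return rate * taxable_base

# A modest rate on a large base and a steep rate on a shrunken base
# can raise roughly the same amount.
low, high = revenue(0.10), revenue(0.65)

# Somewhere between the two extremes sits a revenue-maximizing rate,
# found here by brute force over whole-percent rates.
peak_rate = max(range(0, 101), key=lambda r: revenue(r / 100)) / 100
```

The point of the sketch is only that revenue is not monotonic in the rate: past some peak, further increases shrink the base faster than the rate grows, which is the pattern Coolidge claimed experience had shown.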

The more recent egos who have occupied the taxpayer-funded seat of president would likely feign a blush at any mention of the power and prestige they achieved by attaining residence in The White House, a humble blush shown in the manner of a ’70s comedian holding up one hand to wave off the applause he was receiving while the other jokingly asked for more. Calvin Coolidge rejected congratulatory mentions of his power completely. When Senator Selden Spencer took a walk with Coolidge around the White House grounds, the Senator playfully asked the president, “Who lives there?”

“Nobody,” Coolidge replied. “They just come and go.”

For all the praise that authors like Amity Shlaes heap on Coolidge, some of his critics state that his policies caused the Great Depression, and others say he did nothing to prevent it.

“That is an argument I take up at length in my previous book, The Forgotten Man, and is a topic for another day,” Amity Shlaes said. “Here let me just say that the Great Depression was as great and as long in duration as it was because, as economist Benjamin Anderson put it, the government under both Hoover and Franklin Roosevelt, unlike under Coolidge, chose to ‘play God.’”

Three Lessons We Can Learn from the Coolidge Presidency

Beyond the inspiration of Coolidge’s example of principle and consistency, what are the lessons of his story that are relevant to our current situation? One certainly has to do with the mechanism of budgeting: The Budget and Accounting Act of 1921 provided a means for Harding and Coolidge to control the budget and the nation’s debt, and at the same time gave the people the ability to hold someone responsible. That law was gutted in the 1970s, when it became collateral damage in the anti-executive fervor following Watergate. The law that replaced it tilted budget authority back to Congress and has led to over-spending and a lack of responsibility ever since. On this note, one could say that Congressional control of the budget is outlined in The Constitution, and that Congress is more representative of the American citizenry. As I wrote above, however, Coolidge’s budget director’s primary job was to cajole and shame Congress into making spending cuts. That wouldn’t play in the ’70s, and it definitely wouldn’t play in the modern era. As such, Coolidge’s quote, “I don’t fit in with these times,” would definitely describe a modern-day Coolidge, as he probably couldn’t be elected dog catcher today. The American people have stated that they prefer an out-of-control budget with massive spending.

A second lesson we can derive from the Coolidge administration concerns how we view tax rates. Our natural inclination is to believe that higher tax rates produce larger revenue. As Coolidge states, “Experience is all the other way.” The reason, current supply-side economists suggest, is that raising taxes results in people and corporations engaging in less taxable activity. Coolidge’s experience with the code suggested that we should consider lowering taxes until we find the sweet spot that encourages greater taxable activity, and thus more tax revenue arriving in the government’s coffers. Tax policy can also be a mechanism to expand government, with the goals of legitimate government, American freedom and prosperity, left by the wayside. Thus the best case for lower taxes is the moral one, and as Coolidge well understood, a moral tax policy does not demand higher taxes but tougher budgeting from those paid employees of the state we call our representatives.

Finally, a lesson about politics. The popularity of Harding and Coolidge, and the success of their policies — especially Coolidge’s — following a long period of Progressive ascendancy, should give today’s conservatives hope. Coolidge in the 1920s, like Democrat Grover Cleveland in the previous century, distinguished government austerity from private-sector austerity, combining a policy of deficit cuts with one of tax cuts, and made a moral case for saying “no.” A political leader who does the same today is likely to find an electorate more inclined to respond “yes” than he or she expects. {1}

The point, I believe, is that in the current climate of “yes” in Washington D.C., we could use a little “no”. In the event of a natural disaster, and there will always be “unprecedented” disasters in a land mass as large as America, “yes” ingratiates the president to the people of the area, the media, the nation, and history, but it is also “yes” that ends up contributing to the national debt and to the idea that the federal government is a parent who should clean up the messes of her children. It could also be argued that federal intervention discourages smaller-scale charity and communities seeing themselves through a disaster of this sort.

“Yes” also lends itself to the already massive egos of those who sit in our most prestigious seat of representation, and it leads them to believe they can invent “New Coke” formulas until we’re swirling around the drain in them. These massive egos can’t withstand one commentator saying negative things about them, so they start saying “yes” to everything, because “yes” doesn’t have the political consequences of “no”. Saying no to Congressmen and Senators can bruise egos and cause negative sentiments and statements; saying no to Governors who ask for federal aid will lead to political fallout in the media, as every story on the tragedy of the day would be accompanied by that “no”; telling a woman who asks for a car in a town hall debate the meaning of the word no, and telling her exactly what time of day it is, would utterly devastate that candidate’s campaign. Why would a politician, in today’s media cycle, say no, expound on it by saying that’s not the federal government’s role, and refrain from the photo ops that encourage Americans to believe it is? By saying no, a politician sticks his or her neck out, and it takes courage and humility to risk everything by refusing a power grab of this sort. While Coolidge never faced the 24-7 news cycle modern politicians do, a decent search of his history will reveal that his “no” policies faced a relatively intense amount of scrutiny, and he continued to stubbornly say “no” throughout.

It would probably be a fool’s errand to try to find another person in our current political climate with the temerity and resolve to say no as often as Coolidge did. The nation has stated that it would much rather live in the fairy-tale land of yes, even if that means the “New Coke” ideas lead to greater complexities, long-term consequences, and probable economic turmoil. The greater question, one that draws closer every day, is not whether a “great refrainer” is a better president than one who believes the nation can “yes” its way out of every problem, but whether the nation will ever be ready for such an answer without the assistance of a cataclysmic economic incident that affects its citizens directly.

Calvin Coolidge’s obituary states that his prestige at the time of a potential third term* was such “that the leaders of the Republican Party wished to override the tradition* that no President should have a third term.” His response was, “I do not choose to run for President in 1928.” When a “draft Coolidge” movement arose to place him on the GOP ticket, Coolidge said no. When they attempted to override his refusal, believing it was a shrewd attempt to conceal his ambition, he told them no again. President John Calvin Coolidge Jr. may not go down as the greatest president who ever served the public, and judging by the remark that he was one of the few who managed to be “silent in five languages,” Coolidge will never go down as one of the most charismatic individuals to sit in the seat. No, I have Calvin Coolidge’s face on my personal Mount Rushmore for his ability to say “No!”

***

*Calvin Coolidge ended up serving nearly six years, as a result of Harding’s death two years into Harding’s term, so Coolidge’s re-election in 1928 would not, technically, have been a third term.

*In 1928, a president serving no more than two terms was still only a tradition; the 22nd Amendment, proposed in 1947 and ratified in 1951, later constitutionally limited presidents to two terms. The tradition began with George Washington refusing to run for a third term. Theodore Roosevelt initially continued it before running again, and some suggest Harry Truman could have run for a third term, because the 22nd Amendment exempted the sitting president, Truman, but Truman was deemed too unpopular to seek one.

{1}http://www.hillsdale.edu/news/imprimis/archive/issue.asp?year=2013&month=02

{2}https://en.wikipedia.org/wiki/New_Coke