Krauthammer on Churchill: The Indispensable Man

One of the goals of every writer should be to have those who read his work regard him as brilliant. Another goal, a far more difficult and impressive one, is to have the reader think brilliant thoughts while reading that work. Whether Charles Krauthammer’s new book Things That Matter: Three Decades of Passions, Pastimes and Politics accomplishes the former depends on the reader, but in my humble opinion, the book definitely accomplishes the latter.

In the second chapter, following the requisite intro and the requisite chapter describing the author’s days of youth playing baseball, Charles Krauthammer posits the notion that Time Magazine got it wrong when it named Albert Einstein “Man of the Century”. Einstein may have been vital, argues Krauthammer, and he is “certainly the best mind of the century”, but Britain’s Prime Minister Winston Churchill “carried that absolutely required criterion: indispensability” in the 20th century, and to the 20th century.

One thought this reader had, while reading, is that a provocative, bar stool discussion would hold that no person had a more prominent effect on the 20th century than Adolf Hitler. While that is arguably true, the question to ask of that provocative notion is this: were the lessons of Hitler’s evil transgressions more transcendent than Winston Churchill’s efforts to, as Krauthammer later describes it, “slay that dragon”?

Hitler was, of course, indispensable to any study of the 20th century, in that he illustrated much of what’s wrong with human nature, and his rise gave us a template for how to treat defeated countries after a war, a lesson drawn from the aftermath of World War I. Though evil can take many forms, Hitler provided students of history a model of unprecedented evil, a precedent we can now use as a guide to detect it. We will hopefully no longer allow an evil despot to rise to such prominence in his country that he is in a position to coerce its citizens to do such evil things to one another. With all these lessons and precedents regarding absolute evil, students of the 20th century say that Hitler has to be the man of that century.

It’s a provocative notion, and it would probably grant Hitler the stature, and the historical value, he sought all along. How many men, and how many precedents of the 20th century, will be cited more often in the centuries to come than those Hitler provided humanity? Young people, involved in bar stool discussions, love such provocative notions, for such shock and awe proclamations give every listener the impression that the provocateur is intelligent. Most of us love creating such impressions when we’re younger. As we age, and move past the desire to be perceived as intelligent, we actually become more intelligent, and we realize that most provocative thoughts should go through careful examination and attempts to disprove them. The final conclusions we reach may not be as provocative, or as memorable, but as we age, and read, we realize that being right is more valuable than being memorable or provocative. There is no doubt that the lessons evil men leave behind are monumental in history, but too often these provocative conversations leave out the dragon slayers who should be considered at least as prominent, if not more so.

To say that Winston Churchill hasn’t already achieved a prominent place in history would be foolish, as most historians continue to rank him in their top five most prominent figures of the 20th century, and most left-leaning historians will rank him in their top twenty. Does he deserve even greater prominence than we’ve already allowed, however?

One of the reasons Churchill is not higher on the list, I would submit, is that hindsight has proven him so obviously correct in his doomsayer predictions about Hitler. The idea that all of his warnings were so obviously on the mark, however, makes it almost boring to declare him the most prominent person of the 20th century. It’s an of-course statement that causes readers to yawn over the headline, when a more prominent listing of others, such as Einstein, proves far more provocative, compelling, and newsworthy.

Churchill was, as Krauthammer writes, “A 19th century man parachuted into the 20th,” but “it took a 19th century man –traditional in habit, rational in thought, conservative in temper– to save the 20th century from itself.” Yawn. Such lines don’t play well on the cover of a magazine. The suggestion that Churchill was right about Hitler, and thus should be named Man of the Century for speaking out and saving us, pales next to the exciting, revolutionary bullet points a writer can compile about Einstein’s accomplishments.

Before dismissing the obviousness of Churchill’s warnings, one has to examine what he was up against while still in the British Parliament. Most of Parliament, and Prime Minister Neville Chamberlain, dismissed Churchill’s warnings. They did not want to view Hitler through Churchill’s simplistic, black and white lens. Churchill’s warnings were viewed as the impulsive, irrational, and unreasonable views of a war hawk. Neville Chamberlain has been viewed, by historians right and left, as one of the obvious fools of the 20th century, but is it a glaring headline that Churchill should be viewed as the most obvious hero of the 20th century? No, because it is just so obvious. It doesn’t require any creativity to back up. It just is what it is, as we now say.

Churchill suggested that the year’s delay between the Munich Pact and what he deemed an inevitable war worsened Britain’s position, in direct opposition to Chamberlain’s assessment. (Editor’s note: Chamberlain would later declare that that year allowed the British to bolster their troops, and that the British military had not been prepared for war during the previous year.) In that year between Munich and World War II, Chamberlain also exhausted the possibilities of diplomacy, with détente, blockades, and anything and everything the world could use to achieve “peace in our time”. To refute Chamberlain’s claims, Churchill stated that Hitler could have been removed from power by a grand coalition of European states, preventing World War II from happening in the year in question.

That suggestion, that in some cases waiting too long can worsen one’s position, would rear its ugly head before Hitler’s body even went cold, when U.S. General George S. Patton warned General Eisenhower about Russia. Eisenhower, presumably recognizing that Patton’s warnings were not unfounded, responded that Americans were simply too war-weary to make any moves against Russia. The suggestion would haunt the world again in the 21st century, with Iraq in 2003, in a manner some would call the reverse of the Churchill suggestion, saying that we acted too impulsively, and it will probably haunt nations for many years more, because the human instinct is to avoid war at all costs, no matter how black and white, how simplistic, and how obvious the need for action becomes.

In later writings, “Churchill depicted Chamberlain as well-meaning but weak, blind to the threat posed by Hitler, and oblivious to the fact that (according to Churchill) Hitler could have been removed from power by a grand coalition of European states. Churchill suggested that the year’s delay between Munich and war worsened Britain’s position, and criticized Chamberlain for both peacetime and wartime decisions. In the years following the publication of Churchill’s books, few historians questioned his judgment.”{1}

It may appear redundant to call an historian a hindsight historian, since history is documented in hindsight. Some historians document the facts of the era, while others provide commentary on historical events that were not as clear to the figures of the day, commentary written with the omniscience that hindsight provides. Hindsight historians may document Churchill’s warnings as obvious now, but most of them will not tell you how popular Neville Chamberlain’s “peace in our time” efforts were at the time.

Another question those who believe Hitler’s quest for power was so obvious that it’s now redundant to talk about should ask themselves is how obvious it was to Neville Chamberlain at the time. How obvious was it to the British Parliament, the isolationists in America, and the world at large? Much as a war hawk would be today, Churchill was regarded as a fear monger when he spoke of what he believed to be Hitler’s aspirations. Some have said that Churchill is almost solely responsible for the meetings that occurred at Tehran, Yalta, and Potsdam with FDR and Stalin, meetings that eventually won the war for the allied forces.

We’ve all read hindsight historians document that America shouldn’t have been “so stupid” as to allow the attack on Pearl Harbor, when so many signs pointed to its eventuality. It’s easy for them to look at the decade preceding the terrorist attacks of September 11, 2001, and declare that we were obviously naïve in trying terrorists as criminals rather than wartime adversaries. It’s also easy for them to write that the call to war in Iraq, in 2003, was impulsive, based on our inability to find weapons of mass destruction there. What’s not so easy is for the figures involved in the present tense of history to stick their necks out, speak out against the conventional wisdom of their day, and declare that it’s “weak and blind” to continue following the conventional line of thinking. Hindsight historians now slightly diminish Churchill’s role in the 20th century, because it is now so obvious that Hitler was the epitome of evil. Read through an objective telling of the history, however, and it obviously wasn’t so obvious to some at the time.

As Krauthammer wrote in Things That Matter:

“And who is the hero of that story?  (The story of the 20th century’s ability to defeat totalitarianism, and leave it as a “cul-de-sac” in the annals of human history.)  Who slew the dragon? Yes, it was the ordinary man, the taxpayer, the grunt who fought and won the wars. Yes, it was America and its allies. Yes, it was the great leaders: FDR, de Gaulle, Adenauer, Truman, John Paul II, Thatcher, Reagan. But above all, victory required one man without whom the fight would have been lost at the beginning. It required Winston Churchill.”{2}

Krauthammer, Charles.  Things That Matter: Three Decades of Passions, Pastimes and Politics.  New York, New York:  Random House, 2013.  Print.


James Joyce: Incomparable or Incomprehensible?

Those of us on the lookout for edgy, racy content have heard the term “Joycean” thrown about with little discretion over the years. Critics appear to be more interested in using the term than in properly applying it to the product they are reviewing. The question that those of us driven to the source would have for Joyce, if he were still alive, is this: were your final two works the most erudite, most complicated pieces of fiction ever written, or were they a great practical joke played on the literature community to expose the reference makers and elitist scholars for who they are?


Readers who seek to up their erudite status by reading difficult books have heard of Joyce’s final two works: Ulysses and Finnegans Wake. Some literary scholars list them among the most difficult, most complicated works of fiction ever created. Some of us have attempted to tackle them as the challenge that they are; others have attempted to read them for entrance into their subjective definition of elite status. Most are confused and disoriented by the books, but some have the patience, the wherewithal, and the understanding of all of the references made, and languages used, in these books that comprehension requires. Those readers deserve either a hearty salute, or the scorn and laughter that Joyce provided, as a gift, to the havenots who openly admit that they don’t understand these books.

I don’t understand either of these books, and I have gone back numerous times to try to further my understanding. Some have said that Ulysses is the more palatable of the two, but I have found it too elliptical, too erratic, and too detail-oriented to maintain focus, and I have purchased three different aids to guide me through it. Some of those same readers readily admit that Finnegans Wake is ridiculously incomprehensible.

Most people enjoyed Dennis Miller’s tenure as an announcer on Monday Night Football, but most of those same people complained that they didn’t understand two-thirds of the man’s references. I didn’t keep a journal of his references, but I’m willing to bet that at least a third of them were Joycean in nature. Miller stated that his goal, in using such obscure references, was to make fellow announcer Al Michaels laugh, but any fan who has followed Miller’s career knows that he enjoys the cachet he gains by using complicated and obscure references to make himself sound erudite. There are, today, very few references more obscure than those to the work of James Joyce, a man who described his last book, Finnegans Wake, as “A book obscure enough to keep professors busy for 300 years.”

Andy Kaufman referenced James Joyce when trying to describe his own method of operation. The import of the reference was that Kaufman wanted to be a comedian’s comedian, in the manner that Joyce was a writer’s writer. He wanted to perform difficult and complicated acts that the average consumer did not understand, and the very fact that they didn’t “get it” was what invigorated him. He wanted that “insider” status that an artist uses to gain entrée to the “in the know” groups. After Kaufman achieved some fame, audiences began laughing with him in a manner that appears to have only bored him, and he spent the rest of his career trying to up that ante. From that, we can guess that there was something genuine about Kaufman’s path, in that he was only trying to entertain himself and his friends, and if anyone else wanted on board, that was up to them. Perhaps Joyce and Kaufman shared this same impulse.

Anytime an artist creates a difficult piece of work, there is going to be a divide between the haves (those who get it) and the havenots. When Mike Patton formed the band Fantomas, he did not do so under the illusion that he was going to unseat the Eagles’ Greatest Hits, or Michael Jackson’s Thriller, atop the list of greatest selling albums of all time. He knew, or should’ve known, that he was playing to a very select audience.

What is the audience for such difficult subject matter? Most people seek music as background noise, something to dance to, or something to tap a finger to. Most people read a book to gain a little more characterization and complication than a movie can provide, but they don’t want too much characterization, or too much complication. Most people only buy art to feng shui their homes. Most people don’t seek excessively difficult art, and those who do are usually seeking something more, something more engaging, and something more provocative, something that can only be defined by the individual. The audience for the difficult generally has such a strong foundation in the arts that its members reach a point where their artistic desires can only be satiated by something different.

Different can mean different things at different times to different people. Different can be complicated, and discordant, but it can also be limited to style. At this point in history, it’s difficult to be different, in a manner that cannot be called derivative of someone or something, so most people seek whatever separations they can find. When the latest starlet of the moment twerks in a provocative manner, has a construction worker find her pornographic video, or accidentally has her reproductive organ photographed, we know that these are incidents created by the starlet, and her people, to get noticed after they have exhausted all other attempts to be perceived as artistically brilliant and different.

There are also some artists who are different for the sole sake of being different. This is often less than organic, and it often bores those of us seeking a true separation from the norm, because we feel that this ground has been explored to the point of exhaustion. Andy Kaufman created something organically different that can never be completely replicated, in much the same manner that Chuck Palahniuk, Mike Patton, David Bowie, Quentin Tarantino, and Jerry Seinfeld and Larry David did. Can it be said that James Joyce’s final two books were different in the artistically brilliant, cutting edge manner of all of these artists’ creations, or were James Joyce’s writings more symbolism than substance? Put another way, was Joyce a substantive artist whose true messages need to be unearthed through careful examination, or was he simply twerking in a provocative manner with the hope of getting noticed by the elite scholars of his generation, after exhausting the limits of his talent in other works?

Judging by his short stories, James Joyce could’ve written some of the best novels in history. Those who say that he already did would have to admit that his final two works were not overly concerned with story, or plot. Those who defend his final two works would probably say that I am judging them by traditional standards, and that they were anything but traditional. They would probably also argue that the final two works sought to shake up the traditional world of literature, and anyone who dared take up the challenge of reading them. They would probably say Joyce sought to confound people more than interest them, and if they did concede that the final two works were different for the sole sake of being different, they would add that he was one of the first to do so. Those who defend his final two works also say that they are not as difficult to read, or as complex, as some would lead you to believe, that reading them only requires more patience, and examination, than the average work. Anyone who states such a thing is attempting to sound either hyper-intelligent or hyper-erudite, for it was Joyce’s express purpose to be difficult, complicated, and hyper-erudite.

To understand Ulysses, one needs an annotated guide to 1920s-era Dublin, a guide that describes the Irish songs of the day, some limericks, mythology, and a fluent understanding of Homer’s The Odyssey. If readers don’t have a well-versed knowledge of that which occurred nearly one hundred years ago, they may not understand the parodies, or jokes, Joyce employs in Ulysses. Yet, in 1998, the Modern Library ranked it the greatest English-language novel of the 20th century.

“Everyone I know owns Ulysses, but no one I know has finished it.”  —Larry King.

To fully understand, and presumably enjoy, Finnegans Wake, the reader needs a decent understanding of Latin, German, French, and Hebrew, and a basic understanding of Norwegian linguistic and cultural elements. The reader will also need to be well-versed in the Egyptian Book of the Dead, Shakespeare, The Bible, and The Qur’an. They also need to understand the English language on an etymological level, for one of Joyce’s goals with Finnegans Wake was to mess with the conventions of the English language.

Some have opined that one of Joyce’s goals, in Ulysses, was to use every word in the English language, and others have stated that this is a possibility, since he used approximately 40,000 unique words throughout the work. If this is true, say others, his goal for Finnegans Wake was to extend the confusion by incorporating German, French, Latin, Hebrew, and other languages into his text. When he did use English in Finnegans Wake, Joyce sought to use it in unconventional and etymological ways to describe what he believed to be the language of the night. He stated that Finnegans Wake was “A book of the night” and Ulysses “A book of the day”.

“In writing of the night, I really could not, I felt, use words in their ordinary connections . . . that way they do not express how things are in the night, in the different stages – conscious, then semi-conscious, then unconscious. I found that it could not be done with words in their ordinary relations and connections. When morning comes of course everything will be clear again . . .  I’ll give them back their English language. I’m not destroying it for good.” —James Joyce on his novel Finnegans Wake.

This use of the “language of the night” could lead one to say that Joyce was one of the first deconstructionists, and thus ahead of his time, destroying the meaning of meaning in the immediate sense. Those obsessed with James Joyce could interpret the quote, and the subsequent methodology used in Finnegans Wake, to mean that Joyce had such a profound understanding of linguistics that normal modes of communicating an idea bored him. He wanted something different. He wanted to explore language, and meaning, in a manner that made his readers question their fundamentals. Readability was not his goal, nor was storytelling, or achieving a best-seller list. He sought to destroy conventions, and common sense, and achieve a higher realm of perfection, in which timeless abstractions cannot be communicated to those who adhere to common sense. This makes for an interesting conversation on high art, and philosophy, but does it lend itself to quality reading?

“What is clear and concise can’t deal with reality,” Joyce is reported to have told friend Arthur Power,  “For to be real is to be surrounded by mystery.”

In the modern age, there is much discussion of the widening gap between the haves and the havenots. That particular discussion revolves around economic distinctions, as it has since time immemorial, but in the Joycean world, the gap involves those who “get” his works and those who do not. Those who get it usually prefer to have deeper meanings shrouded in clever wordplay. They usually prefer symbolism over substance; writing over storytelling; and interpretation over consistent and concretized thoughts.

The two schools of thought between the haves and the havenots can probably best be explained by breaking them down to the Hemingway manner of writing and that of Joyce. Hemingway wrote clear and concise sentences. Hemingway stated that his methodology was to write something that was true:

“The hardest thing is to make something really true and sometimes truer than true.”—Ernest Hemingway.

Putting Joyce’s final two works through the Hemingway school of thought, one could say that Joyce’s methodology was: sometimes it’s easier to make it false and let others define it as true.

“Though people may read more into Ulysses than I ever intended, who is to say that they are wrong: do any of us know what we are creating? … Which of us can control our scribblings? They are the script of one’s personality like your voice or your walk.” —James Joyce

Those of us who have had a discussion on a deep, multifaceted topic with a deep thinker know that sooner or later a declarative distinction will be made if we stubbornly insist that we are not wrong. “You don’t get it, and you probably never will,” is something they will say in a variety of ways. We all know what it feels like to be summarily dismissed as an anti-intellectual by a deep thinker. Those who aren’t snobbish in an anti-social manner often avoid openly dismissing you, but even the polite snobs give you a vibe, a look, or a chuff that is intended to let us know our place.

“Well, what do you think of it then?” is the response some of us have given, after being backed into an anti-intellectual corner by deep thinkers.

If they are anti-social, elite intellectual snobs, they will say something along the lines of: “I simply choose to think deeper!” It’s a great line, and it purportedly puts us stubborn types in our place, but it’s a self-serving non-answer. Those of us who are more accustomed to interaction with deep thinkers will then ask them to expound upon their complicated, deep thinking. Pushing deep thinkers deeper will often reveal a lack of substance beneath their piles of style, and the careful observer will find that the results of their deep thinking are no deeper than the deep thinker cap they wear to the pub.

A number of attempts at reading Joyce have led me to believe that he probably didn’t have much substance beneath his piles of style, so he muddied the waters of his message with puns, songs, gibberish, abstractions, foreign languages, and overly complicated complications. He did this, in my opinion, to conceal the fact that, when compared to his colleagues, he didn’t have all that much to say, but he was definitely artistically accomplished in saying it.

Who can forget the many sayings that Finnegans Wake dropped on our culture, such as the transcendental sound of the thunderclap that announced the fall of Adam and Eve from the garden of Eden:


What about the mirthsome giggles we have had in social gatherings with the catchphrase:

“A way a lone a last a loved a long the riverrun, past Eve and Adam’s, from swerve of shore to bend of bay, brings us by a commodius vicus of recirculation back to Howth Castle and Environs.”

Or the ever present: 

“(Stoop) if you are abcedminded, to this claybook, what curios of signs (please stoop), in this allaphbed! Can you rede (since We and Thou had it out already) its world?”

If you just read those sentences three or four times, and you still have no idea what they say, and you just went back to read them again, because you want to be a have who “gets it”, you’re not alone. If these passages were merely anecdotal evidence of the difficulty involved in reading Finnegans Wake, that would be one thing, but such difficulties litter just about every sentence of every paragraph of the book, as evidenced by the exhaustive assistance fan sites provide readers who have no idea what this writer is going on about.

Finnegans Wake is reported to be in English, but it’s not the standard version of English, in which words have specific meanings. The “language of the night” was intended for linguists tired of reading words with exact meanings; it was intended to be playful, mind-altering, and rule breaking. James Joyce made references intended to be obscure even to readers of his day, who might not have had Joyce’s wealth of knowledge of history, or of the manner in which the meanings of words in the English language have changed throughout history.

“What is really imaginative is the contrary to what is concise and clear.” —James Joyce

James Joyce was a stream of consciousness writer who believed that all “mistakes” were intended on some level that superseded awareness. In the 500-plus-page Finnegans Wake, Joyce found 600 errors after publication. He was informed of some, if not all, of these errors, and he was reported to have fought his publishers to keep them in. Later editions were published to correct many of these errors and provide readers “the book in the manner Joyce had intended.” If Joyce didn’t believe in errors, however, how can those who corrected them state that the corrected edition is the definitive edition that “Joyce intended”?

“The man of genius makes no mistakes, his errors are volitional and portals of discovery.” –James Joyce

Throughout the seventeen years Joyce spent writing Finnegans Wake, he was going blind, so he had a friend, Samuel Beckett, take dictation to complete the novel. At one point in this dictation setting, someone knocked on Joyce’s door. Joyce said, “Come in!” to the knocker, and Beckett wrote the words “Come in!” into the narrative of Finnegans Wake. When Joyce spotted this error, and the confusion was sorted out, Joyce insisted that Beckett “Leave it in!” On another occasion, when a printer’s error was pointed out, he said, “Leave it. It sounds better that way than the way I wrote it.”

There are three different versions of the text. The first and second are the editions Joyce submitted for publication, with all of the errors intact. The third edition corrects the errors the editors located, along with the 600 corrections Joyce spent two years compiling. Some would have you believe that the first two editions are the definitive editions, but you have to be a Joyce purist to appreciate them.

Can it be called anything short of egotistical for an author to believe that his subconscious choices and decisions are somehow divine? If, as Joyce said, and Picasso later repeated in regard to his paintings, mistakes are portals of discovery, then we can say that’s great, and incredibly artistic, in the process of creation. To leave them in the finished product, however, and subject your readers to the confusion, just seems narcissistic. “Here’s what I was thinking at the time,” Joyce is basically telling his readers. “I don’t know what it means, but this is a higher plane of thinking than simple conscious thought. Isn’t it magical? Maybe you can make some sense of it. Maybe you can attribute it to your life in some manner.” This method of operation may say something profound about the random nature of the universe, but when we’re reading a novel we don’t necessarily want to know about the randomness of the universe, unless it’s structured in a manner that leads us to that statement.

Not everyone can write a classic, and some realize this after a number of failed attempts. Once they arrive at this fork in the road, they can either write simple books that provide them and theirs an honest living, or they can grow so frustrated by their inability to write classics that they separate themselves from the pack through obscurity. The advantage of creating such an alleged contrivance is that beauty is in the eye of the beholder, and the beholder can assign their own relative beauty to it. Some would say this is the very definition of art, but others would say even that definition has limits. Some would say that the most obscure painting is art, because they “see it”, where others see only schlock for elitists to crib note to death, until meaning is derived.

James Joyce is considered the exception to this rule, fellow writers have told me, and if you are going to attempt to write an important novel in the 21st century, you had better be familiar with him. I’ve tried, and I now believe that I’m destined to be a havenot in the Joycean world … even with Ulysses. The question that arises out of these ashes is: am I going a long way toward becoming more intelligent by recognizing my limits, or should it be every aspiring intellect’s responsibility to push beyond any self-imposed limits to a point where they can finally achieve a scholarly understanding of difficult material? If this is a conundrum that every person encounters when facing challenges to their intelligence, is Ulysses, or more pointedly Finnegans Wake, the ultimate barometer of intelligence, or is it such an exaggerated extension of one that it had to have been a practical joke James Joyce played on the elitist literary community, to expose them as the in-crowd, elitist snobs that they are when they “get it” just to get it? Do they really “get it”, or are they falling prey to Joyce’s clever ruse to expose them as people who “get” something that was never meant to be “got”?

Don’t Go Chasing Eel Testicles: A Brief, Select History of Sigmund Freud

We all envy those who knew, at a relatively young age, what they wanted to do for a living. Most of us experience some moments of inspiration that might lead us toward a path, but few of us ever read medical journals, law reviews, or business periodicals during our formative years. Most of the young people I knew preferred an NFL preview guide of some sort, teenage heartthrob magazines, or one of the many other periodicals that offer soft entertainment value. Most of us opted out of reading altogether and chose to play something that involved a ball instead. Life was all about playtime for the kids I grew up around, but there were other, more serious kids, who we wouldn’t meet until we were older. Few of them knew they would become neurosurgeons, but they were so interested in medicine that they devoted huge chunks of their young lives to learning everything their young minds could retain. “How is this even possible?” some of us ask. How could they achieve that level of focus at such a young age, we wonder. Are we even the same species?

At an age when so many minds are so unfocused, these people claim they had tunnel vision. “I didn’t have that level of focus,” some said to correct the record, “not the level of focus to which you are alluding.” They may have diverged from the central focus, but they had more direction than anyone I knew, and that direction put them on the path of doing what they ended up doing, even if it wasn’t as specific as I guessed.

The questions we have about what to do for a living have plagued so many for so long that comedian Paula Poundstone captured it with a well-placed joke, and I apologize, in advance, for the paraphrasing: “Didn’t you hate it when your relatives asked what you wanted to do for a living? Um, Grandpa, I’m 5. I haven’t fully grasped the mechanics or the importance of brushing my teeth yet. Those of us of a certain age have now been on both sides of this question. We’ve been asking our nieces and nephews this question for years without detecting any irony. What do you want to do when you grow up? Now that I’ve been asking this question long enough, I’ve finally figured out why we ask it. Our aunts and uncles asked us this question when we were growing up, because they were looking for ideas. I’m in my forties now, and I’m still asking my nieces and nephews these questions. I’m still looking for ideas.”

Pore through the annals of great men and women of history, and that research will reveal legions of late bloomers who didn’t accomplish anything of note until late in life. The researcher will also discover that most of the figures who achieved success in life were just as carefree as children as the rest of us were, until the seriousness of adulthood directed them to pursue a venture that would land them in the annals of history. Some failed more than once in their initial pursuits, until they discovered something that flipped a switch.

Those who know anything about psychology, and many who don’t, are familiar with the name Sigmund Freud. Those who know anything about Freud are aware of his unique theories about the human mind and human development. Those who know anything about his psychosexual theory know we are all repressed sexual beings plagued with unconscious desires to have relations with some mythical Greek king’s mother. What we might not know, because we consider it ancillary to his greater works, is that some of his theories might have originated from Freud’s pursuit of the Holy Grail of nineteenth-century science, the elusive eel testicles.

Although some annals state that an Italian scientist named Carlo Mondini discovered eel testicles in 1777, other periodicals state that the search continued up to and beyond the efforts of an obscure 19-year-old Austrian in 1876.[1] Other research states that the heralded Aristotle conducted his own research on the eel, and his studies resulted in postulations that the beings either came from the “guts of wet soil” or were born “of nothing”.[2] One could guess that these answers resulted from great frustration, since Aristotle was so patient with his deductions in other areas. Then again, he also posited that maggots were born organically from a slab of meat. “Others, who conducted their own research, swore that eels were bred of mud, of bodies decaying in the water. One learned bishop informed the Royal Society that eels slithered from the thatched roofs of cottages; Izaak Walton, in The Compleat Angler, reckoned they sprang from the ‘action of sunlight on dewdrops’.”

Before laughing at any of these findings, one must consider the limited resources these researchers had at their disposal, given the science of their day. The young Freud might not yet have had the wisdom to know how futile the task would be when a nondescript Austrian zoological research station employed him. It was his first real job, he was 19, and it was 1876. He dissected approximately 400 eels over a period of four weeks, “amid stench and slime for long hours,” as The New York Times described Freud’s working environment.[3] His ambitious goal was to write a breakthrough research paper on the mating habits of an animal that had confounded science for centuries. Conceivably, a more seasoned scientist might have considered the task futile much earlier in the process, but an ambitious 19-year-old, looking to make a name for himself, was willing to spend long hours slicing and dicing eels, hoping to arrive at an answer no one could disprove.

Unfortunately for the young Freud, but perhaps fortunately for the field of psychology, we now know that eels don’t have testicles until they need them. The subjects of Freud’s studies must not have needed them at the time he studied them, for Freud ended up writing that his total supply of eels was “of the fairer sex.” Freud eventually penned that research paper, but it detailed his failure to locate the testicles. Some have said Freud correctly predicted where the testicles should be, and that he argued the eels he received were not mature. His experiments nevertheless ended in failure, and he moved into other areas as a result. The question on the mind of this reader is how profound an effect this failure to find eel testicles had on his research into human sexual development.

In our teenage and young adult years, most of us had odd jobs that affected us, in a variety of ways, for the rest of our working lives. For most, these were low-paying, manual-labor jobs that we slogged through for the sole purpose of getting paid. Few of us pined over anything at that age, least of all a legacy that we hoped might land us in the annals of history. Most of us wanted to do well in our entry-level jobs, to bolster our character, but we had no profound feelings of failure if we didn’t. We just moved on to other jobs that we hoped would prove more financially rewarding and fulfilling.

Was Freud’s search for eel testicles the equivalent of an entry-level job, or did he believe in the vocation so much that the failure devastated him? Did he slice the first 100 or so eels open and throw them aside with the belief that they were immature? Was there nothing but female eels around him, as he wrote, or was he beginning to see what had plagued the other scientists for centuries, including the brilliant Aristotle? There had to be a moment, in other words, when Sigmund Freud realized that they couldn’t all be female. He had to know, at some point, that he was missing the same something everyone else missed. He must have spent some sleepless nights struggling to come up with a different tactic. He might have lost his appetite at various points, and he may have shut out the world in his obsession to achieve renown in marine biology. He sliced and diced over 400 eels, after all. If even some of this is true, even if it only occupied his mind for four weeks of his life, we can feasibly imagine that the futile search for eel testicles affected Sigmund Freud in a profound manner.


If Freud Never Existed, Would There Be a Need to Create Him?


Every person approaches a topic of study from a subjective angle. It’s human nature. Few of us can view the people, places, or things in our lives with total objectivity. The topic we are least objective about, say some, is ourselves. Some say that we are the central topic of speculation whenever we theorize about humanity. All theories are autobiographical, in other words, and we pursue such questions in an attempt to understand ourselves better. Bearing that in mind, what was the subjective angle from which Sigmund Freud approached his most famous theory on psychosexual development in humans? Did he bring objectivity to his patients? Could he have been more objective, or did Freud have a blind spot that led him to chase the elusive eel testicles throughout his career in the manner Don Quixote chased windmills?

After his failure, Sigmund Freud switched his focus to a field of science that would later become psychology. Soon thereafter, patients sought his consultation. We know now that Freud viewed most people’s problems through a sexual lens, but was that lens tinted by the set of testicles he couldn’t find a lifetime earlier? Did his inability to locate the eel’s reproductive organs prove so prominent in his mind that he saw them everywhere he went, in the manner that the owner of a rare car begins to see that car everywhere soon after driving it off the lot? Some say that if this is how Freud conducted his sessions, he did so in an unconscious manner, and others might say that this could have been the basis for his theory on unconscious actions. How different would Freud’s theories on sexual development have been if he had found his Holy Grail, the Holy Grail of the science of his time? How different would his life have been? We could also wonder whether Freud would have switched fields at all had he found fame as a marine biologist.

How different would the field of psychology be today if Sigmund Freud had remained a marine biologist? Alternatively, if he had still made the switch to psychology after achieving fame in marine biology as the man who spotted the eel testicle, would he have approached the study of human development, and the human mind, from a less subjective angle? Would his theory on psychosexual development have occurred to him at all? If it hadn’t, is it such a fundamental truth that it would’ve occurred to someone else over time, even without Freud’s influence?

We can state, without much fear of refutation, that Sigmund Freud’s psychosexual theory, a theory many now consider disproved, sexualized the beliefs many hold about human development. How transcendental was that theory, and how much subjective interpretation was involved in it? How much of that subjective interpretation was fueled by his inability to find the eel testicle? Put another way, did Freud ever reach a point where he began overcompensating for that initial failure?

Whether it’s an interpretive extension or a direct reading of Freud’s theory, some modern research theorizes that most men want some form of sexual experience with another man’s testicles. This theory, influenced by Freud’s theories, suggests that those who claim they don’t are lying in a latent manner, and that the more a man denies it, the more repressed his homosexual desires are.

The Williams Institute at UCLA School of Law, a sexual orientation law think tank, released a study in April 2011 stating that 3.6 percent of males in the U.S. population are either openly gay or bisexual.[4] If these findings are even close to correct, that leaves 96.4 percent who are, according to Freud’s theory, closeted homosexuals in some manner. Neither Freud nor anyone else has been able to put even a rough estimate on the percentage of heterosexuals who harbor unconscious, erotic inclinations toward members of the same sex, but the very fact that the theory has achieved worldwide fame leads some to believe there is some truth to it. Analysis of some psychological studies on this subject yields quotes such as, “It is possible … Certain figures show that it would indicate … All findings can and should be evaluated by further research.” In other words, there is no conclusive data, and all findings and figures are vague. Some would suggest that these quotes are ambiguous enough to be used by those who would have their readers believe that most of the 96.4 percent who express contrarian views are actively suppressing a desire not just to support that way of life, but to involve themselves in it.[5]

Some label Sigmund Freud history’s most debunked doctor, but his influence on the field of psychology, and on the ways society at large views human development and sexuality, is indisputable. The greater question, as it pertains specifically to Freud’s psychosexual theory, is whether Freud was a closeted homosexual, or whether his angle on psychological research was shaped by his initial failure to find eel testicles. To put it more succinctly, which being’s testicles was Freud more obsessed with finding during his lifetime?






If you enjoyed this unique perspective on Sigmund Freud, you might also enjoy the following:

Charles Bukowski Hates Mickey Mouse

The History of Bloodletting by Mark Twain

The Perfect Imperfections of Franz Kafka’s Metamorphosis

James Joyce: Incomparable or Incomprehensible?

Rasputin I: Rasputin Rises

Rasputin II: A Miracle at Spala

Rasputin III: Rasputin’s Murder

What the World Needs Now is Another Calvin Coolidge

If we were able to conduct an objective, dispassionate analysis of U.S. presidents, history would judge the administration of Calvin Coolidge in a far more favorable light. If the scholarly surveys conducted on presidents gave points for general and Constitutional restraint, for adherence to the ideas of limited government and popular sovereignty, I think Calvin Coolidge might even end up in the top five. The current top-ten lists are largely comprised of presidents who sat in office during wars and moments of general and economic turmoil, none of which occurred before or during Coolidge’s tenure, though some might suggest that the lack of economic turmoil was largely due to the Harding/Coolidge policies.

The list of greatest presidents is also generally comprised of presidents who said yes, and we can surmise that throughout her history, America has always sided with presidents who said yes to her citizens. Yet one could argue that all the creative ways American presidents since Coolidge have found to say yes to the American public have led to a federal debt now spiraling out of control toward probable economic disaster. One would think that Americans would recognize an urgent need to find a political leader able to say “no” more often. Yet recent administrations of both parties show that Americans find variations of yes far more appealing.

Conservatives impulsively leap to the name Ronald Reagan when anyone mentions a president who exhibited noteworthy restraint, but is the Reagan administration the perfect model for the problems that currently sit before us? Reagan was a tax cutter, of course, reducing the top marginal rate from 70 to 28 percent. But his tax cuts, which vindicated those supply-side economists influenced by Coolidge by vastly increasing federal revenue, were bought partly through a bargain in which Reagan said yes to Democrats eager to spend all of the revenue those cuts generated. Compared to Coolidge, Reagan was not a true budget cutter, as the federal budget rose by over a third during his administration.

A better model for conservatives who prefer a more libertarian program of austerity might be the Calvin Coolidge administration. While president from 1923 to 1929, Coolidge sustained a budget surplus and left office with a smaller budget than the one he inherited. Over the same period, America experienced a proliferation of jobs, a dramatic increase in the standard of living, higher wages, and three to four percent annual economic growth. The key to this level of success was Coolidge’s penchant for saying “no.” If Reagan was the Great Communicator, Coolidge was the Great Refrainer.

Calvin Coolidge

Following the Harding/Coolidge ticket’s victory in 1920, President Warren G. Harding’s inaugural address set a dramatically different tone from that of the outgoing Woodrow Wilson administration:

“No altered system,” Harding said, “will work a miracle. Any wild experiment will only add to the confusion. Our best assurance lies in efficient administration of our proven system.”

Harding’s ego-less stance was that he would be nothing more than a steward of an American system that had worked just fine in the 130 years that preceded his election. His position, as opposed to Wilson’s, was that his administration wouldn’t try to outdo the prosperous model the Founders created. Put in this light, what kind of ego looks at the model of America, which was, and is, the envy of the world, and thinks it can do better, and how many who have tried have succeeded? Harding was saying, in essence, that he didn’t regard himself as a “miracle worker” who would step into office with think-tank notions and tell the nation he had a “new and improved” cure for all that ails us.

He was telling the American public, in effect, that he wouldn’t present what we might now call a “New Coke” campaign. The “New Coke” campaign was the Coca-Cola Company’s 1985 attempt to gain greater market share by reformulating its flagship product to taste more like its competitor, Pepsi-Cola. Similarly, numerous narcissistic U.S. presidents, before and after Harding and Coolidge, have attempted to impose formulas that other countries in history had already tried and tested. The fact that those formulas failed elsewhere, while America’s remains the envy of the world, doesn’t stop “New Coke” advocates, for they believe they can administrate them to success. The legacy of the “New Coke” campaign, and of “New Coke” ideas in politics, is that of a cautionary tale against tampering with a well-established and successful brand. By saying that he would act as nothing more than a steward of the prosperous model the Founders created, Harding displayed what some call the pinnacle of intelligence: being smart enough to know what he didn’t know.

One of Warren G. Harding’s first steps was to shepherd through Congress the Budget and Accounting Act of 1921. This bill allowed Harding to create a special budget bureau, the forerunner of today’s Office of Management and Budget, whose director could cajole and shame Congress into making spending cuts. Unfortunately, some of Harding’s privatization policies, combined with some ill-advised appointments, led to bribery and favoritism, and ultimately to what would be called the Teapot Dome Scandal.

Enter Coolidge

Calvin Coolidge entered office after Warren G. Harding’s sudden death, and he separated himself from Harding almost immediately with his willingness to say “no” to appointees, Congressmen, and various “New Coke” bills. (Coolidge vetoed fifty bills, a total greater than that of the last three presidents combined.) Coolidge summed up his penchant for vetoing these bills by saying:

“It is much more important to kill bad bills than to pass good ones.”

Calvin Coolidge was the type of president, the type of person, who, if you asked him what time it was, would tell you. Modern presidents get their tongues so tied up with advice from advisers, pollsters, and focus-group testing that they’re almost afraid to tell you what time it is, because a direct answer might be taken seven different ways by seven different networks appealing to a 24-7 audience.

Within 24 hours of arriving in Washington after Harding’s death, Calvin Coolidge met with his budget director, Herbert Lord, and together they went on offense, announcing deepened cuts in two politically sensitive areas: spending on veterans and District of Columbia public works. In his public statements, Coolidge made clear he would have scant patience with anyone who didn’t go along:

“We must have no carelessness in our dealings with public property or the expenditure of public money. Such a condition is characteristic of undeveloped people, or of a decadent generation.”

Perhaps reflecting his temperament, Coolidge favored the pocket veto, a way for the president to reject a bill without formally vetoing it, while giving Congress no chance to override him. Grover Cleveland, whom Coolidge admired, used this type of veto in his day, as had Theodore Roosevelt. But Coolidge raised its use to an art form. The New York Times referred to it as “disapproval by inaction.”

The words “perhaps reflecting his temperament” paint a nice portrait of President Calvin Coolidge, for when given the choice between grandstanding on an issue and quietly advocating or dismissing a bill, Coolidge opted for the quiet approach. Perhaps my favorite story on this theme of restraint involved one of the greatest tragedies of Coolidge’s presidency. The great Mississippi River flood of 1927 was the Coolidge administration’s Hurricane Katrina. Rather than stage a photo op, Coolidge chose not to appear at the site of the devastation, fearing that doing so might encourage federal spending on relief. Another issue that might define the Coolidge administration historically involved the Ku Klux Klan. When faced with the problem of what to do about the then-powerful Klan, Coolidge quietly avoided appointing any Klan members to prominent positions in his cabinet, and he thereby decimated the power of that group in America. When faced with the dilemma of what to do with farming subsidies, the man from farming country chose to veto the subsidies. He also vetoed veterans’ pensions and government entry into the utilities sector.

If a modern politician even thought of doing any of these things (the maneuver with the Klan excluded) and listed one of them in a campaign, how many of us would laugh them off the stage? The party’s leaders wouldn’t even consider them for the ticket. The only obstacle for modern politicians is finding a creative way to say yes that doesn’t tick off too many constituents who want them to say no.

Yet how many tragedies does a nation as large as America face every day? How many constituents suffer as a result? The impulsive reaction of any person, politician, or president is to do whatever they can to end that suffering, yet how many unintended consequences arise from a president’s, and Congress’s, decision to provide federal aid? How many of these problems could’ve been avoided if more presidents had done whatever they could to train the country to expect less from the federal government when it comes to fixing their problems?

As many informed politicos will tell us, it’s too late now. The country, thanks to nearly 100 years of conditioning by ego-driven presidents seeking praise and adulation for their administrations, has come to expect the president to do something. It’s a fait accompli, and there’s little to nothing anyone can do to roll it back. All of that may be true, but what if Harding’s special budget bureau had survived the politics of the ’70s, and the president and Congress had conditioned the country to accept that the money the federal government attains from taxpayers is finite? Would the American public let a stricken locale drown, or would the most generous country in the world do whatever it could to help its fellow Americans? Would the American citizen learn to look to state, local, and even community resources for aid in times of crisis? It’s easier and far more popular for a president to just say yes, but I don’t think many objective, dispassionate observers would dispute that America would be in a far better place if the presidents who followed Coolidge had invested more of their political capital in his politics of no.

“Four-fifths of all our troubles would disappear if we would only sit down and keep still.”

Which came first, the chicken or the egg? Did the “yes” politicians condition us to expect more yes from them, or did we condition our candidates for the office to say “yes” to everything? How many candidates stubbornly insist that we need to say no more often? Long question short, are we in unprecedented debt because of the ruling class, or because Americans have the country we want?

Whereas the current barometer of the presidency is how much, and how often, a president spends other people’s money, Coolidge exhibited a level of restraint politicians usually reserve for their own money.

Despite the budget surpluses the Coolidge administration accrued, Coolidge met with his budget director every Friday morning before cabinet meetings to identify budget cuts and discuss how to say “no” to the requests of cabinet members. Think about that for just a moment before reading on. Think about how a modern politician, on any level, would react to even a momentary surplus. The impulsive, some might even say instinctive, reaction is to find the best way to allocate that surplus for greater political gain, and to reward those who played a pivotal role in securing it by allocating funds for, say, a bridge in a Congressman’s district. How many politicians, by comparison, would meet with budget directors and Congressmen to find further ways to cut? Most presidents give in after a time, Eisenhower being a good example, but Coolidge never did.

In remarks to a group of Jewish philanthropists, Coolidge explained his consistency this way:

“I believe in budgets. I want other people to believe in them. I have had a small one to run my own home; and besides that, I am the head of the organization that makes the greatest of all budgets, that of the United States government. Do you wonder then that at times I dream of balance sheets and sinking funds, and deficits and tax rates and all the rest?”

Speaking of tax rates: in December 1923, Coolidge and Treasury Secretary Andrew Mellon launched a campaign to lower top rates from the fifties to the twenties. Mellon believed, and informed Coolidge, that these cuts might result in additional revenue. This idea was referred to as “scientific taxation,” an early formulation of what economist Art Laffer would later develop into what we know as the Laffer curve. Coolidge passed the insight on:

“Experience does not show that the higher tax rate produces larger revenue. Experience is all the other way,” he said in a speech in early 1924. “When the surtax on incomes of $300,000 and over was but 10 percent, the revenue was about the same as it was at 65 percent.”

The more recent egos who have occupied the taxpayer-funded seat of the presidency would likely blush at any mention of the power and prestige they achieved by attaining residence in the White House, but only in the manner of a ’70s comedian holding up one hand to reject the applause he was receiving while the other, jokingly, asked for more. Calvin Coolidge rejected congratulatory mentions of his power completely. When Senator Selden Spencer took a walk with Coolidge around the White House grounds, the Senator playfully asked the president, “Who lives there?”

“Nobody,” Coolidge replied. “They just come and go.”

For all the praise that authors like Amity Shlaes heap on Coolidge, some of his critics state that his policies caused the Great Depression, and others say they failed to prevent it.

“That is an argument I take up at length in my previous book, The Forgotten Man, and is a topic for another day,” Shlaes said. “Here let me just say that the Great Depression was as great and as long in duration as it was because, as economist Benjamin Anderson put it, the government under both Hoover and Franklin Roosevelt, unlike under Coolidge, chose to ‘play God.’”

Three lessons we can learn from the Coolidge presidency:

Beyond the inspiration of Coolidge’s example of principle and consistency, what lessons from his story are relevant to our current situation? One certainly has to do with the mechanism of budgeting. The Budget and Accounting Act of 1921 provided a means for Harding and Coolidge to control the budget and the nation’s debt, and at the same time gave the people the ability to hold someone responsible. That law was gutted in the 1970s, when it became collateral damage in the anti-executive fervor following Watergate. The law that replaced it tilted budget authority back toward Congress and has led to overspending and a lack of responsibility ever since. One could object that Congressional control of the budget is outlined in the Constitution, and that Congress is more representative of the American citizenry. As I wrote above, however, the budget director’s primary job was to cajole and shame Congress into making spending cuts. That wouldn’t play in the ’70s, and it definitely wouldn’t play in the modern era. As such, Coolidge’s quote, “I don’t fit in with these times,” would surely describe a modern-day Coolidge, who probably couldn’t be elected dog catcher. The American people have stated that they prefer an out-of-control budget with massive spending.

A second lesson we can derive from the Coolidge administration concerns how we view tax rates. Our natural inclination is to believe that higher tax rates produce larger revenue. As Coolidge stated, “Experience is all the other way.” The reason, some suggest, is that raising taxes leads people and corporations to engage in less taxable activity. Coolidge’s experience with the code suggested that lowering taxes, until we find that sweet spot, encourages greater taxable activity, and thus more tax revenue arriving in the government’s coffers. Tax policy can also be a mechanism to expand government, and when it is, the goals of legitimate government, American freedom and prosperity, are left by the wayside. Thus the best case for lower taxes is the moral one, and as Coolidge well understood, a moral tax policy demands not higher taxes but tougher budgeting.

Finally, a lesson about politics. The popularity of Harding and Coolidge, and the success of their policies — especially Coolidge’s — following a long period of Progressive ascendancy, should give today’s conservatives hope. Coolidge in the 1920s, like Democrat Grover Cleveland in the previous century, distinguished government austerity from private-sector austerity, combined a policy of deficit cuts with one of tax cuts, and made a moral case for saying “no.” A political leader who does the same today is likely to find an electorate more inclined to respond “yes” than he or she expects. {1}

The point, I believe, is that in the current climate of “yes” in Washington D.C., we could use a little “no”. In the event of a natural disaster, and there will always be “unprecedented” disasters in a land mass as large as America’s, “yes” ingratiates the president to the people of the affected area, the media, the nation, and history. But it is also “yes” that contributes to the national debt, feeds the idea that the federal government is a parent who should clean up the messes of her children, and discourages smaller-scale charity and communities seeing themselves through a disaster of this sort.

“Yes” also feeds the already massive egos of those who sit in our most prestigious seat of representation, and it leads them to believe they can invent a “New Coke” formula, until we’re swirling around the drain in it. These massive egos can’t withstand one commentator saying negative things about them, so they start saying “yes” to everything and everyone before them, because “yes” doesn’t carry the political consequences of “no”. Saying no to Congressmen and Senators can bruise egos and breed negative sentiments and statements. Saying no to a Governor who asks for state aid invites political fallout, as every story on the tragedy of the day would be accompanied by that “no”. Telling a woman who asks for a car in a town-hall debate the meaning of the word no, and telling her exactly what time of day it is, would utterly devastate that candidate’s campaign. Why would a politician, in today’s media cycle, say no, explain that such aid is not the federal government’s role, and refrain from the photo ops that encourage Americans to believe it is? By saying no, a politician sticks his or her neck out, and it takes courage and humility to risk everything by refusing a power grab of this sort. While Coolidge did not face the 24-7 news cycle modern politicians do, a decent search of his history will reveal that his “no” policies faced a relatively intense amount of scrutiny, and he continued to say “no” throughout.

It would probably be a fool’s errand to try to find another person in our current political climate with the temerity and resolve to say no as often as Coolidge did. The nation has stated that it would much rather live in the fairy tale land of yes, even if that means the New Coke ideas lead to greater complexities, long-term consequences, and probable economic turmoil. The greater question, one that appears to draw closer every day, is not whether “a great refrainer” is a better president than one who believes the nation can “yes” its way out of every problem, but whether the nation will ever be ready for such an answer without the assistance of a cataclysmic economic incident that affects its people directly.

Calvin Coolidge’s obituary states that his prestige at the time of his prospective third term* was such “that the leaders of the Republican Party wished to override the tradition* that no President should have a third term.” His response was, “I do not choose to run for President in 1928.” When a “draft Coolidge” movement arose to place him on the GOP ticket, Coolidge said no. When they attempted to override his wishes, believing his refusal to run was a shrewd attempt to conceal his ambition, he told them no.


*Calvin Coolidge ended up serving nearly six years, as a result of Harding’s death roughly two and a half years into his term.

*In 1928, the idea that a president should serve no more than two terms was still only a tradition; the 22nd Amendment later limited a president to two terms constitutionally. The “tradition” began with George Washington refusing to run for a third term. Theodore Roosevelt continued it, initially, before running again, and some suggest Harry Truman could have run for a third term, because the 22nd Amendment, proposed in 1947, exempted the sitting president (Truman), but Truman was deemed too unpopular to seek a third term.



Details, Details, Details

Epiphanies, like women, can pop up when you least expect them, and they can free you from a troubling part of your life that you didn’t recognize as a problem until it was revealed.

In a PBS documentary on Mark Twain, a number of issues arose in the building of Twain’s home, and the construction team began “badgering” Twain with questions about how he wanted them handled. The questions concerned the construction of his home, the place he would presumably live for the rest of his life, so the observer should forgive the construction crew’s chief for the badgering. The team didn’t know what he wanted, and they presumably had hundreds of questions about his desired specifics. What the team did not know, however, was that Twain had an oft-expressed aversion to details.


Putting myself in a similar situation, I realize that, like Twain, I’m not a detail-oriented guy. I’ll listen to every question put to me, but I’ll be listening with a sense of guilt. Details make me feel stupid; they fire far too many neurons in my brain for me to handle, and I usually get overwhelmed and exhausted by them. I know that I should be listening to every question, and I know I should be pondering the details given to me to come up with the ideal solution for my family, but my capacity for such matters is limited.

In the beginning of the process, I’m all hopped up. My mind is acutely focused, and I’m knocking out every question with sharp answers. I’m considering every perspective involved, and I’m asking for advice from all of those not involved. I’m reading what others have done, and I’m gathering as much information as possible to make an informed decision, but I will eventually grow overwhelmed and exhausted, because I’m not a detail-oriented guy.

By the time we reach the 7th and 8th questions, I’ll be out of gas. I’ll be mentally saying, “Whatever, just get it done!” I’ll be falling away from creative answers and onto what is expected in the situation, or onto what those still paying attention want. I will be answering in an autonomic manner. “Yes, that sounds fine,” I’ll say without knowing what has been said. I’ll just want the damn thing built already by that point, because I’m not a detail-oriented guy. I’ll want to make the big decisions, but I’ll want to leave all of the “inconsequential” detail-oriented questions to others.

I do feel guilty about being this way. I want to be involved, informed, and constantly making acutely focused decisions throughout the process. I’ll feel guilty when others start making the decisions that affect me, because I know I’m an adult now, and I should be making all these decisions. There is also some fear that drives me to constantly pretend that I’m in prime listening mode, based on the fact that I may not like the finished product if I’m not involved in every step. I may not like, for example, the manner in which the west wing juts out on the land and makes the home appear ostentatious, or obtuse, or less pleasing to the eye with various incongruities, and I’ll wish I hadn’t been so obvious with my “Whatever, just do it!” answers. Details exhaust me, though, and they embarrass me when I don’t know the particulars the other person is referencing.

I don’t know if this guilt is born of the fact that I know I’m an intelligent being, and I should be able to make these decisions in a more consistent manner, or if I’m just too lazy to maintain acute focus. I do have a threshold, though, and I know how my brain works. I know that if there are seven ways to approach a given situation, I will usually select one of the first two offered. I usually do this because I’m not listening after the second one. Everything beyond that involves the other party showing off the fact that they know more than I do. I know this isn’t always the case, but it’s the only vine I can cling to when dealing with my limited attention span and the limited arsenal of my brain.

Knowing my deficiencies for retaining spoken detail, I will ask for literature on the subject, something tangible that can be consumed at my own pace. If I do that, and I have, I will then pretend to read every excruciating word, but I will usually end up selecting one of the first two options offered. I like to think I have a complex brain. I like to think that I display all that I’m about in my own way, but I’m always reminded of the fact that most of the people around me give full participation to the details of life, no matter how overwhelming and exhausting those details can be to me. It’s humbling to watch these brains, which I like to consider inferior, operate on planes of constant choices, and decisions, and retentions, and details I am incapable of retaining.

I have this daydream that I will one day be afforded an excuse for having a limited brain by the relative brilliance I reveal to the world in the form of a novel. I am interviewed in this dream, and I am asked, “So, what does it mean to you to have crafted such a fine book?” I am far wittier than reality would suggest in this dream when I reply: “It will help me deal with my faults better. The fact that I cannot fix my own plumbing can now be countered with, ‘But he wrote a fine book.’ The fact that I cannot fix my own car, compete with my wife in certain areas of intelligence, or hold down a decent job can now be countered with, ‘But he wrote a fine book that is held up as a fine book in certain quarters.’”

We’ve all heard the line “Everybody’s brain works differently,” but until we learn that the brilliant brain that composed Huckleberry Finn had similar deficiencies, we cannot help but feel guilty about them. “Well, work on your deficiencies,” those around us suggest, and we do when that next project comes about. We’re out to prove ourselves in that next project. We answer every question, from the first few to the 7th and 8th, with prolonged mental acuity. When the third and fourth projects roll around, however, we revert to those inferior brains that can’t retain details, and it is then that we envy the “inferior” brains consistently showing their superiority. This could leave those of us who never knew we were suffering from such a recognized deficiency with feelings of incompletion, until someone like Mark Twain recognizes and vocalizes his deficiencies for us.

A President’s Day Guide through obscure presidents, and Lincoln

Most people know the major events of the major presidencies that shaped our nation. Most of us memorized facts and tidbits of information for American History tests and quizzes. There are lesser-known presidents, however, who affected this nation in their own way, and had they been defeated in their elections, this nation would be very different. There have been times in our nation’s history when we needed a strong man with a bold hand, such as the one Abraham Lincoln displayed during the Civil War. There have been times when our nation hung in the balance, and we needed a Lincoln to come along and do what he could to preserve what George Washington, John Adams, and all the Founders envisioned. There have been other times, times far less documented in historical records, when our nation needed a humble leader who displayed restraint in times of national scandal and turmoil.

Were it not for the statesmanlike restraint displayed by a Calvin Coolidge, for example, we would be a less free nation. Quiet, obscure presidents like Coolidge vetoed legislation and exhibited restraint throughout their tenures. Restraint, vetoing legislation, and acting to preserve individual freedoms are less sexy than passing sweeping legislation and pressing the thumb of government on the throat of individuals and businesses for the purpose of helping people.

Our nation’s history is composed of the strong, Lincoln types and the quiet, Coolidge types who have shaped our country in unique ways, and on this President’s Day I thought we should all be reminded of how we came to be the nation we are today, through the more obscure presidents (and Lincoln) who helped guide us to modernity.

Grover Cleveland

1) Stephen Grover Cleveland (March 18, 1837 – June 24, 1908)

The 22nd and 24th President

Cleveland was a Democrat who served the people from 1885–1889 and 1893–1897, the only president ever to serve non-consecutive terms.

Stephen Grover Cleveland won the popular vote for president on three different occasions, but in the second of those elections he lost to Benjamin Harrison in the Electoral College tallies. He was the only Democrat to defeat a Republican for the office during the period of Republican domination that dated back to Abraham Lincoln’s first electoral victory. He was the second president to marry while in office, and the only president (as depicted above) to marry at the White House. During his first term, he signed the enabling act that paved the way for the admission of North Dakota, South Dakota, Montana, and Washington to the union, and Utah was admitted during his second term. His last words were, “I have tried so hard to do right.”{1}

Ronald Reagan may have been the president who “tried to give the government back to the people,” but Grover Cleveland was one of only two presidents of the 19th and 20th centuries –Calvin Coolidge being the other– to accomplish the feat. By the time their tenures ended, the size and scope of government was more limited than when they began their terms.

Others spoke of limiting the size of government; Cleveland delivered. His first goal was to end the spoils of the political system. He did not fire any of the previous administration’s Republicans who were doing their jobs well. He cut down the number of federal employees, and he attempted to slow the growth of what he perceived to be a bloated government. He attempted to always make appointments based on merit, as opposed to the usual spoils system that dictated position holders in previous administrations. He also used his veto power far more than any other president of his day. Although Cleveland was a Democrat, he was one of the few who sided with business. Cleveland opposed high tariffs, free silver, inflation, imperialism, and subsidies to business, farmers, or veterans. His battles for political reform and fiscal conservatism made him an icon for American conservatives of the era. Cleveland’s reform ideas and ideals were so strong and influential that a reform wing of the Republican Party, called the “Mugwumps”, bolted from the GOP ticket and swung to his support in 1884.

The great Abraham Lincoln

2) Abraham Lincoln

The 16th President.

Lincoln was a Republican that served the people from March 4, 1861 – April 15, 1865.

Abraham Lincoln, it could be said, is our most famous president. If one were to chart fame by the number of books written about a historical figure, Lincoln has had more books written about him than any other president. By some accounts, more has been written about him than about any historical figure, alive or dead, save for Jesus of Nazareth.

His fame is derived from serving as president during the Civil War and from the fight to abolish slavery. Lincoln’s anti-slavery views were so well known that some have suggested the South seceded on the basis of his election victory. Others suggest that tensions were so fierce, owing to the mismanagement of Lincoln’s presidential predecessor James Buchanan and the strife in the Nebraska and Kansas territories, that the secession and the eventual war were inevitable. Lincoln was also made famous by his assassination at the hands of an actor named John Wilkes Booth.

Quick Quip: Lincoln’s Democratic rival in the 1860 presidential election, Stephen A. Douglas, once called him two-faced. “If I had two faces,” Lincoln replied, “do you honestly think I would wear this one?”{2}

William Henry Harrison

3) William Henry Harrison

9th President

Harrison was a member of the short-lived Whig Party, and he served the people as president from March 4, 1841 to April 4, 1841.

William Henry Harrison is most famous for dying after serving one month in office as president. He took the oath on a cold and rainy day, and he refused to wear a coat or a hat. He also rode to the inaugural on horseback rather than in the closed carriage that had been offered to him. He then proceeded, after the oath, to deliver the longest inaugural address in American history; it took him almost two hours to complete. He then rode away from the inaugural on horseback. Some believe this reckless disregard for his health brought on the illness from which his sixty-eight-year-old body could not recover, but historians note that the illness did not set in until three weeks after the inaugural. Regardless of how he contracted the cold, it progressed into pneumonia and pleurisy. His last words, presumed to be addressed to his successor John Tyler, were: “Sir, I wish you to understand the true principles of the government. I wish them carried out. I ask nothing more.” {3}

Quick Quip: There is some debate over whether W.H. Harrison’s 8,460-word inaugural address (the longest in history) led to his demise. Harrison refused to dress appropriately for the forecast cold rain, or to follow any of the advice of those concerned with his well-being. Mindful of his grandfather’s fate, Benjamin Harrison made sure his own inaugural address ran a little over half the length of his grandfather’s.

Martin Van Buren

4) Martin Van Buren

8th President

Van Buren was a Democrat that served the people from March 4, 1837 to March 4, 1841.

Van Buren is regarded, in some quarters, as the father of the Democrat Party, even though Andrew Jackson was the first Democrat to be elected president. Van Buren was the first individual born a U.S. citizen to be elected president. He was the first non-British, non-Irish man to serve as president; he was Dutch. He was also the first self-made man to become president: all earlier presidents had acquired wealth through inheritance or marriage, while Van Buren was born into poverty and became wealthy through his law practice. Van Buren’s presidency was marked by a depression, the Panic of 1837, that lasted throughout his term. As a result, Van Buren issued the statement for which his tenure is also famous: “As to the presidency, the two happiest days of my life were those of my entrance upon the office and my surrender of it.”{4}

James A. Garfield

5) James A. Garfield

20th President

Garfield was a Republican that served the people from March 4, 1881 to September 19, 1881.

Garfield was another president known in history more for his death than for his life or tenure as president. Garfield was taken down by a deranged office seeker named Charles J. Guiteau. Though Garfield had only four months of health while serving the people as president, he did manage to reassert the president’s authority over Senatorial courtesy in making executive appointments. He also energized naval power, purged corruption in the Post Office, and appointed several African-Americans to prominent positions. During the eighty days in which Garfield suffered from the cruelty of the assassin’s bullet, he signed a single extradition paper. Some historians have suggested that Garfield might have proven one of our most talented and eloquent presidents had he lived long enough to show it to the nation, but he was able to serve the nation in Congress for nine consecutive terms, and he did what he could in the short time he served as president. Candice Millard’s brilliant book Destiny of The Republic captures the essence of Garfield with the line: “Born into abject poverty, he rose to become a wunderkind scholar, a Civil War hero, a renowned congressman, and a reluctant presidential candidate who took on the nation’s corrupt political establishment.”

Knowing his death was imminent, James A. Garfield’s final words were: “My work is done.” {5}

Benjamin Harrison

6) Benjamin Harrison

23rd President

Harrison was a Republican that served the people from March 4, 1889 to March 4, 1893.

Harrison is most notable for being the grandson of William Henry Harrison, and for defeating the mighty Grover Cleveland in the Electoral College vote in 1888. Harrison’s tenure was also notable for the passage of the McKinley Tariff and the Sherman Antitrust Act, and for federal spending reaching one billion dollars. Harrison advocated for federal funding for education, though he was unsuccessful in that regard. He also pushed for legislation that would protect the voting rights of African-Americans; the latter would be the last attempt at civil rights legislation in the country until the 1930s. Learning from the after-effects of his grandfather’s record-long inaugural address, which some believe led to his grandfather’s death, Benjamin Harrison kept his own inaugural address brief. Though historians tend to disregard Harrison as a prominent president, they regard his foreign policies as laying the groundwork for much that would be accomplished in the 20th century. {6}

Calvin Coolidge

7) Calvin Coolidge

30th President

Calvin Coolidge was a Republican that served the people from August 2, 1923 to March 4, 1929.

Coolidge would not stand a chance in today’s 24-7 news network, internet definition of politics. In the current climate of celebrity presidential candidates climbing all over one another for more air time, a better sound bite, and a better image, “Silent Cal” Calvin Coolidge would have been run over. In this age of bigger and better governments, where politicians on both sides of the aisle flex their legislative muscle in bill signings that are celebrated media events, Calvin Coolidge signed legislation into law in the privacy of his office. In a quote that could be applied to the current progression of big government, Calvin Coolidge said: “The requirements of existence have passed beyond the standard of necessity into the region of luxury.” Calvin Coolidge would be a laughingstock in our day and age, a man on the outside looking in, a statesman who would’ve faded into the woodwork of our society.

Social critic and satirist Dorothy Parker once said: “Mr. Coolidge, I’ve made a bet against a fellow who said it was impossible to get more than two words out of you.”

Coolidge’s famous reply: “You lose.”

After hearing that Coolidge passed away, four years after leaving office, Parker remarked: “How can they tell?”

Although Coolidge was known to be a skilled and effective public speaker, in private he was a man of few words and was referred to as “Silent Cal” in most quarters. On this reputation, Coolidge said:

“The words of a President have an enormous weight, and ought not to be used indiscriminately.” 

Although known as a quiet man, Coolidge held over five hundred press conferences during his roughly 2,000 days as president; that is an average of one press conference every four days. Coolidge took over the office of president after his predecessor’s death, amid his predecessor’s controversy, the Teapot Dome Scandal. The Teapot Dome Scandal was regarded as the “greatest and most sensational scandal in the history of American politics” until the media discovered the Watergate scandal. In the wake of this scandal, Coolidge told a reporter:

“I think the American people want a solemn ass as a President, and I think I will go along with them.”

Coolidge may have been the last statesman the American people had serve as president. He was against the Ku Klux Klan, for instance, but he didn’t make grandstanding statements against the Klan; he simply didn’t appoint Klan members to positions in his administration. This may seem such an obvious move that it’s not worth discussion, but the KKK had a lot of influence at this time in America, and Coolidge’s move caused the Klan to lose much of it. Coolidge tried to take this one step further, calling for anti-lynching laws, but the attempts to pass this legislation were stopped by Democrat filibusters. He attempted to make war illegal through the Kellogg-Briand Pact, but that agreement proved ineffective. Coolidge was a laissez-faire president who didn’t believe the federal government should have a role in farm subsidies or flood relief. As much as he wanted to help these people, he wanted to avoid setting the precedent of the federal government resolving problems he believed could be better solved, on a case-by-case basis, locally. By the end of his administration, he had achieved a tax bill that left all but the top 2% of earners paying no federal income taxes. Coolidge disdained federal regulation and appointed commissioners who shared his belief in states’ rights, and this caused a divide in historical opinion of his administration.

Some believe this laissez-faire approach led to “The Roaring Twenties”; others argue it led to “The Great Depression.” As with all matters such as these, the opinions are based on where the historian falls on the ideological divide. Some historians say that “The Roaring Twenties” was built on a bubble similar to the 1990s tech bubble, in that it wasn’t built on hard assets, and when that bubble burst, as the tech bubble did, a recession occurred as a result. That recession, say other historians, was prolonged into a depression that lasted into the forties by the recovery measures put in place by future administrations. The latter argument holds that the economy may have experienced a dip as a result of the bubble bursting, but that the extended duration of this natural down cycle was caused by the measures future administrations put in place to recover from what might otherwise have been a temporary dip. Arguments such as these are impossible to resolve, however, because one cannot remove some facts to prove others.

Historians from both sides of the aisle have also interpreted his last words in varying ways. Those who oppose Coolidge’s policies state that his last words were a lamentable admission that his limited-government policies didn’t work. Those who favor his policies state that he was lamenting the course America was on, toward a country of big-government policies. They state that Coolidge’s administration was, itself, a temporary blip in a progression that Theodore Roosevelt started, and they suggest that, based on everything Coolidge saw during his tenure, he foresaw this.

His last words were: “I feel I no longer fit in with these times.”{7}


Presidential trivia for President’s Day

With President’s Day approaching, we thought we would compile a list of relatively obscure facts, trivia, and interesting stories about the forty-four men who have served in the office of president throughout our nation’s relatively young history. These are not easy trivia questions, and one of my friends informed me that people enjoy questions they have a chance of answering correctly. For those who sit in an office and send out trivia questions to their team members, we thought we would provide one or two questions that office workers cannot Google up as easily. (Unless they cheat and Google up this page, of course.)

10) Which president was never on a ticket that won a presidential election?

Answer: Gerald Ford. Andrew Johnson never personally won a presidential election, but he was on the winning 1864 ticket as Abraham Lincoln’s vice president. Gerald Ford was not on Richard M. Nixon’s winning 1968 ticket; that honor went to Spiro Agnew, originator of the famous, erudite insult “nattering nabobs of negativism”. A third trivia question spawns from Ford’s unsuccessful run for president in 1976: his running mate on that ticket was future presidential candidate Bob Dole. The victor of the 1976 election was James Earl Carter, and his vice president was future presidential candidate Walter Mondale. Both of those running mates were unsuccessful in their own future runs for the office.

9) Other than President Bill Clinton, what president was successfully impeached by the House of Representatives?

Andrew Johnson

Answer: Andrew Johnson. Both were acquitted by the Senate, but Johnson’s presidency survived by a single vote, while Clinton faced four separate articles of impeachment. Two of the articles passed in the House of Representatives, in a vote that included five Democrats voting in favor of three of the four articles. As opposed to the House’s requirement of a simple majority to impeach, the Senate requires a two-thirds majority to convict. Both of the articles brought against Clinton failed to convict the impeached president: the obstruction of justice charge failed by seventeen votes, and the perjury charge failed to reach the two-thirds requirement by twenty-two votes. Some say these votes were cast along party lines, and they were, almost exclusively, while others say the charges themselves were partisan by nature. For those who suggest that Richard Nixon was impeached: he probably would have been, but he resigned from office before the full House could vote on impeachment.


8) Historians list President Donald J. Trump as the forty-fifth president, but he is the forty-fourth man to serve in this role. Is this a discrepancy or an error?

Grover Cleveland

Answer: President Grover Cleveland served two terms that were non-consecutive. Thus, he is considered both the 22nd and the 24th president in U.S. History.


7) We all use the idiom O.K. to inform others that we are doing well. “I’m O.K. How are you?” Some have stated that the idiom may have been mistakenly applied to a presidential candidate to describe his qualifications for office. What president was this?

Answer: President Martin Van Buren. The origins of O.K. are widely disputed, and there are many theories about its etymology. The most interesting one I’ve heard comes from the candidacy of Van Buren. He was from Kinderhook, New York. Associates and voters began referring to him as Old Kinderhook, or O.K., and he continued to be referred to as O.K. in speeches and in print. O.K. clubs formed in support of Van Buren, and some began to believe the idiom referred to his qualifications when supporters chanted that Van Buren was “O.K.” at rallies. Voters soon decided he was not as “O.K.” as they once thought when they booted him from power and refused him re-election in two subsequent bids for the office, but those losses did not affect the power of the idiom that some believed described his qualifications. Others state that the idiom predates Van Buren, and that it only achieved national prominence through Van Buren’s successful use of it.

6) Is Abraham Lincoln related to Tom Hanks?

Answer: Abraham Lincoln and Tom Hanks were first cousins and childhood friends. Lincoln’s mother’s maiden name was Hanks, and there was a cousin on that side named Thomas, and the two of them were quite close. So, I cheated. The star of Forrest Gump that we know today was not the same Tom Hanks that Abe palled around with, but Abe did have a first cousin named Tom Hanks. Recent genealogy research has also revealed that the Forrest Gump actor, Tom Hanks, is a third cousin, four times removed, of Abraham Lincoln, so the question and answer work both ways for those seeking to trip up their friends and colleagues. {1}

5) Many people know that George H.W. Bush and George W. Bush were the second father-son presidents, following John Adams and John Quincy Adams, but there was one former Congressman that was the son of one president and the father of another. Who was that man?

Answer: John Scott Harrison. Congressman John Scott Harrison was the son of President William Henry Harrison and the father of President Benjamin Harrison. Unfortunately, William Henry did not live long enough to see John Scott win his seat in Congress, and John Scott did not live long enough to see his son, Benjamin, win the presidency. The three of them did achieve quite a legacy in politics, however, and we have to feel for all of the generations of Harrisons that followed in their attempts to continue and further such an historic legacy.

4) Which president was the most successful former president?

Answer: This is debatable, of course, but one-term President William Howard Taft was the only former president to be appointed Chief Justice of the United States. In an argument devoted solely to the positions former presidents achieved, no former president matches Taft’s level of prestige.

3) Which president survived the first attempt at an assassination?

Answer: Andrew Jackson. This is primarily noteworthy, in history, because after two unsuccessful attempts to fire his pistols at Jackson, failed assassin Richard Lawrence secured his place in history as not only a failed assassin, but as a man who got beat down for his efforts by an angry old man. (It is reported that after the failed attempts on his life, Andrew Jackson participated in subduing the failed assassin by beating him with his cane.){2}

2) Which president was the first to have an amphibious car?

Answer: Lyndon Baines Johnson. This question is included less for its mind-bending quality and more for the story. History has it that LBJ loved to take unsuspecting aides and dignitaries for a ride in this amphibious car toward a lake. When approaching the lake, LBJ would begin screaming and hollering hysterically that the brakes were failing, only to say something along the lines of “Gentlemen, I’d like to introduce you to the world’s first amphibious car” once they were all afloat together.{3} One has to imagine that the feminine shrieks these dignitaries issued when approaching the lake eventually gave LBJ a lot of power in geopolitical negotiations.

1) Other than William Henry Harrison, which president served the shortest tenure?

Answer: James A. Garfield. Although Garfield technically served six months and fifteen days, he was shot four months into his tenure as president. He then suffered for eleven weeks following the assassination attempt, while attending doctors tried to locate the bullet. Alexander Graham Bell was even brought in with an invention, an early metal detector, to try to locate the bullet. Most historians and medical experts now believe that Garfield probably would have survived had the attending doctors not placed their unsterilized fingers into the president’s wounds while searching for the bullet. Some have even theorized that Garfield may have had a better chance at survival if the doctors had done nothing, and that it was the unsterile measures these doctors employed to locate the bullet that led to Garfield’s death.{4}

*bonus) The numbers:

  1. There have been 58 presidential elections.
  2. Five future presidents lost the popular vote and became president.
  3. Thirteen presidents have been reelected to office and served out that second term.
  4. The youngest president ever to take office was Theodore Roosevelt. He was not yet forty-three at the time. (The interesting note on this point is that Theodore Roosevelt stated that the one bad thing about being elected so young was that he had nowhere to go but down after that.) He was not yet fifty years old when he decided not to seek reelection. Roosevelt assumed office after President McKinley was assassinated, and Roosevelt ended up serving almost eight years. He considered that enough at the time, but changed his mind. This might be another mind bender: who was the last president to serve seven and a half years and decide not to seek reelection?

{1} {2} {3} {4}