Krauthammer on Churchill: The Indispensable Man


One of the goals of every writer should be to have those who read his work regard him as brilliant. Another goal, and a far more difficult and impressive one, is to have the reader think brilliant thoughts while reading that work. Whether or not Charles Krauthammer’s new book Things That Matter: Three Decades of Passions, Pastimes and Politics accomplishes the former is relative to the reader, but in my humble opinion, the book definitely accomplishes the latter.

In the second chapter, following the requisite intro, and the requisite chapter describing the author’s days of youth playing baseball, Charles Krauthammer posits the notion that Time Magazine got it wrong when it named Albert Einstein “Man of the Century”. Einstein may have been vital, argues Krauthammer, and he is “certainly the best mind of the century”, but Britain’s Prime Minister Winston Churchill “carried that absolutely required criterion: indispensability” in the 20th century, and to the 20th century.

One thought this reader had, while reading, is that provocative bar stool discussion would have it that no person had a more prominent effect on the 20th century than Adolf Hitler. While that is arguably true, the question to put to that provocative notion is this: were the lessons of Hitler’s evil transgressions more transcendent than Winston Churchill’s efforts to, as Krauthammer later describes it, “slay that dragon”?

Hitler was, of course, indispensable to any study of the 20th century, in that he illustrated much of what’s wrong with human nature, and he gave us a template for how we should treat defeated countries after a war (a lesson drawn from the aftermath of World War I). Though evil can take many forms, Hitler provided students of history a model of evil so unprecedented that it now serves as a guide for detecting it. We will hopefully no longer allow an evil despot to rise to such prominence in a country that he is in a position to coerce its citizens to do such evil things to one another. With all these lessons and precedents regarding absolute evil, students of the 20th century say that Hitler has to be the man of that century.

It’s a provocative notion, and it would probably give Hitler the stature, and historical value, that he sought all along. How many men, and how many precedents of the 20th century, will be cited more often over the coming centuries than those Hitler provided humanity? Young people, involved in bar stool discussions, love such provocative notions, for such shock and awe proclamations give all listeners the impression that the provocateur is intelligent. Most of us love such impressions when we’re younger. As we age, and move past the desire to be perceived as intelligent, we actually become more intelligent, and we realize that most provocative thoughts should go through careful examination and attempts to disprove them. The final conclusions we reach may not be as provocative, or as memorable, but as we age, and read, we realize that being right is more valuable than being memorable or provocative. There is no doubt that the lessons evil men leave behind are monumental in history, but too often these provocative conversations leave out the dragon slayers who should be considered at least as prominent, if not more so.

To say that Winston Churchill hasn’t already achieved a prominent place in history would be foolish, as most historians continue to rank him in their top five most prominent figures of the 20th century, and most left-leaning historians will rank him in their top twenty. Does he deserve even greater prominence than we’ve already allowed, however?

One of the reasons Churchill is not higher on the list, I would submit, is that hindsight has proven that he was so obviously correct in his doomsayer predictions about Hitler. The idea that all of his warnings were so obviously on the mark, however, makes it almost boring to declare him the most prominent person of the 20th century. It’s an of-course statement that causes readers to yawn over the headline, when a more prominent listing of others, such as Einstein, proves far more provocative, compelling, and newsworthy.

Churchill was, as Krauthammer writes, “A 19th century man parachuted into the 20th,” but “it took a 19th century man –traditional in habit, rational in thought, conservative in temper– to save the 20th century from itself.” Yawn. Such lines don’t play well on the cover of a magazine suggesting that Churchill was right about Hitler, and thus should be named the Man of the Century for speaking out and saving us, especially when compared to the exciting, and revolutionary, bullet points a writer can compile about Einstein’s accomplishments.

Before dismissing the obviousness of Churchill’s warnings, one has to examine what he was up against while still in the British Parliament. Most of the British Parliament, and Prime Minister Neville Chamberlain, dismissed Churchill’s warnings. They did not want to view Hitler through Churchill’s simplistic, black and white lens. Churchill’s warnings were viewed as the impulsive, irrational, and unreasonable views of a war hawk. Neville Chamberlain has been viewed, by historians on the right and the left, as one of the obvious fools of the 20th century, but is it a glaring headline that Churchill should be viewed as the most obvious hero of that century? No, because it is just so obvious. It doesn’t require any creativity to back up. It just is what it is, as we now say.

Churchill suggested that the year’s delay between the Munich Pact and what he deemed an inevitable war worsened Britain’s position, in direct opposition to Chamberlain’s assessment. (Editor’s note: Chamberlain would later declare that that year allowed the British to bolster their troops, and that the British military had not been prepared for war the previous year.) In that year, between Munich and World War II, Chamberlain also exhausted the possibility of diplomacy with détente, blockades, and anything and everything the world could use to achieve “peace in our time”. To refute Chamberlain’s claims, Churchill stated that Hitler could have been removed from power by a grand coalition of European states, preventing World War II from happening in the year in question.

That suggestion, that in some cases waiting too long can worsen one’s position, would rear its ugly head before Hitler’s body even went cold, when U.S. General George S. Patton warned General Eisenhower about Russia. Eisenhower, presumably recognizing that Patton’s warnings were not unfounded, responded that Americans were simply too war-weary to make any moves against Russia. The suggestion would later haunt the world in the 21st century, with Iraq in 2003, in a manner some would call the reverse of Churchill’s, saying that we acted too impulsively, and it will probably haunt nations for many years to come, because the human instinct is to avoid war at all costs, no matter how black and white, and simplistic, and obvious the need for action becomes.

In later writings, “Churchill depicted Chamberlain as well-meaning but weak, blind to the threat posed by Hitler, and oblivious to the fact that (according to Churchill) Hitler could have been removed from power by a grand coalition of European states. Churchill suggested that the year’s delay between Munich and war worsened Britain’s position, and criticized Chamberlain for both peacetime and wartime decisions. In the years following the publication of Churchill’s books, few historians questioned his judgment.”{1}

It may appear redundant to call an historian a hindsight historian, since history is documented in hindsight. Some historians document the facts of the era, while others offer commentary on historical events that were not as clear to the historical figures of the day, writing with the omniscience that hindsight affords. Hindsight historians may document Churchill’s warnings as obvious now, but most of them will not tell you how popular Neville Chamberlain’s “peace in our time” efforts were at the time.

Another question those who believe Hitler’s quest for power was so obvious that it’s now redundant to talk about should ask themselves is how obvious it was to Neville Chamberlain at the time. How obvious was it to the British Parliament, the isolationists in America, and the world at large? Much like today, Churchill was regarded as a war hawk, and presumably a fear monger, when he spoke of what he believed to be Hitler’s aspirations. Some have said that Churchill is almost solely responsible for the Allied conferences at Tehran, Yalta, and Potsdam that eventually won the war for the Allied forces.

We’ve all read hindsight historians document that America shouldn’t have been “so stupid” as to allow the attack on Pearl Harbor, when so many signs pointed to its eventuality. It’s easy for them to look at the decade preceding the terrorist attack on September 11, 2001, and declare that we were obviously naïve in trying terrorists as criminals rather than wartime adversaries. It’s also easy for them to write that the call to war in Iraq, in 2003, was impulsive, based on our inability to find weapons of mass destruction in Iraq. What’s not so easy, however, is for those figures involved in the present tense of history to stick their necks out, speak out against the conventional wisdom of their day, and declare that it is “weak and blind” to continue to follow the conventional line of thinking. Hindsight historians now slightly diminish Churchill’s role in the 20th century, because it is now so obvious that Hitler was the epitome of evil. Reading through an objective telling of the history, however, it clearly wasn’t so obvious to many at the time.

As Krauthammer wrote in Things That Matter:

“And who is the hero of that story?  (The story of the 20th century’s ability to defeat totalitarianism, and leave it as a “cul-de-sac” in the annals of human history.)  Who slew the dragon? Yes, it was the ordinary man, the taxpayer, the grunt who fought and won the wars. Yes, it was America and its allies. Yes, it was the great leaders: FDR, de Gaulle, Adenauer, Truman, John Paul II, Thatcher, Reagan. But above all, victory required one man without whom the fight would have been lost at the beginning. It required Winston Churchill.”{2}

Krauthammer, Charles. Things That Matter: Three Decades of Passions, Pastimes and Politics. New York, New York: Random House, 2013. Print.


James Joyce: Incomparable or Incomprehensible?


Those of us on the lookout for edgy, racy content have heard the term “Joycean” thrown about with little discretion over the years. Critics appear to be more interested in using the term than they are in properly applying it to the product they are describing. The question that those of us driven to the source would have for Joyce, if he were still alive, is: were your final two works the most erudite, most complicated pieces of fiction ever written, or were they a great practical joke played on the literature community to expose these reference makers and elitist scholars for what they are?

James Joyce

Most readers who seek to up their erudite status by reading difficult books have heard of Joyce’s final two works: Ulysses and Finnegans Wake. Even literary scholars list the books as some of the most difficult, most complicated works of fiction ever created. Some of us have attempted to tackle them as the challenge that they are; others have attempted to read them for entrance into their subjective definition of elite status. Most are confused and disoriented by the books, but some have the patience, the wherewithal, and the understanding of all of the references made, and languages used, in these books necessary for comprehension. Those readers either deserve a hearty salute, or the scorn and laughter that Joyce provided, as a gift, to the have-nots who openly admit that they don’t understand these books.

I don’t understand either of these books, and I have gone back numerous times to try to further my understanding. Some have said that Ulysses is the more palatable of the two, but I have found it to be too elliptical, too erratic, and too detail-oriented to maintain focus, and I have purchased three different aids to guide me through it. Some of those same readers readily admit that Finnegans Wake is ridiculously incomprehensible.

Most people enjoyed Dennis Miller’s tenure as an announcer on Monday Night Football, but most of those same people complained that they didn’t understand two-thirds of the man’s references. I didn’t keep a journal on his references, but I’m willing to bet that at least a third of them were Joycean in nature. Miller stated that his goal, in using such obscure references, was to make fellow announcer Al Michaels laugh, but any fan who has followed Miller’s career knows that he enjoyed the cachet gained by using complicated and obscure references to make himself sound erudite. There are, today, very few references more obscure than those to the work of James Joyce, a man who described his last book, Finnegans Wake, as “a book obscure enough to keep professors busy for 300 years.”

Andy Kaufman referenced James Joyce when trying to describe his method of operation. The import of the reference was that Kaufman wanted to be a comedian’s comedian, in the manner that Joyce was a writer’s writer. He wanted to perform difficult and complicated acts that the average consumer did not understand, and the very fact that they didn’t “get it” was what invigorated him. He wanted that “insider” status that an artist uses to gain entrée into the “in the know” groups. After he achieved some fame, audiences began laughing with Kaufman in a manner that appears to have only bored him, and he spent the rest of his career trying to up that ante. From this, we can guess that there was something genuine about Kaufman’s path, in that he was only trying to entertain himself and his friends, and if anyone else wanted on board, that was up to them. Perhaps Joyce and Kaufman shared this impulse.

Anytime an artist creates a difficult piece of work, there is going to be a divide between the haves (those who get it) and the have-nots. When Mike Patton formed the band Fantomas, he never did so with the illusion that he was going to unseat the Eagles’ Greatest Hits, or Michael Jackson’s Thriller, atop the list of best-selling albums of all time. He knew, or should’ve known, that he was playing to a very select audience.

What is the audience for such difficult subject matter? Most people seek music as either background noise, something to dance to, or something to tap their finger to. Most people read a book to gain a little more characterization and complication than a movie can provide, but they don’t want too much characterization, or too much complication. Most people only buy art to feng shui their homes. Most people don’t seek excessively difficult art, and those who do are usually seeking something more, something more engaging, and something more provocative that can only be defined by that individual. The audience for the difficult generally has such a strong foundation in the arts that it reaches a point where its artistic desires can only be satiated by something different.

Different can mean different things at different times to different people. Different can be complicated, and discordant, but it can also be limited to style. At this point in history, it’s difficult to be different, in a manner that cannot be called derivative of someone or something, so most people seek whatever separations they can find. When the latest starlet of the moment twerks in a provocative manner, has a construction worker find her pornographic video, or accidentally has her reproductive organ photographed, we know that these are incidents created by the starlet, and her people, to get noticed after they have exhausted all other attempts to be perceived as artistically brilliant and different.

There are also some artists who are different for the sole sake of being different. This is often less than organic, and it often bores those of us seeking a true separation from the norm, because we feel that this approach has been explored to the point of exhaustion. Andy Kaufman created something organically different that can never be completely replicated, in much the same manner Chuck Palahniuk, Mike Patton, David Bowie, Quentin Tarantino, and Jerry Seinfeld and Larry David did. Can it be said that James Joyce’s final two books were different in the artistically brilliant, cutting edge manner that all of these artists’ creations were, or were James Joyce’s writings more symbolism than substance? Put another way, was Joyce a substantive artist whose true messages need to be unearthed through careful examination, or was he simply twerking in a provocative manner with the hope of getting noticed by the elite scholars of his generation after exhausting the limits of his talent in other works?

Judging by his short stories, James Joyce could’ve written some of the best novels in history. Those who say that he already did would have to admit that his final two works were not overly concerned with story, or plot. Those who defend his final two works would probably say that I am judging them by traditional standards, and that they were anything but traditional. They would probably also argue that the final two works sought to shake up the traditional world of literature, and anyone who dared to take up the challenge of reading them. They would probably say Joyce sought to confound people more than interest them, and if they did concede to the idea that the final two works were different for the sole sake of being different, they would add that he was one of the first to do so. Those who defend his final two works say that they are not as difficult to read, or as complex, as some would lead you to believe. These people suggest that reading these two works only requires more patience, and examination, than the average work. Anyone who states such a thing is attempting to sound either hyper-intelligent or hyper-erudite, for it was Joyce’s express purpose to be difficult, complicated, and hyper-erudite.

To understand Ulysses, one needs an annotated guide to the Dublin of 1904, a guide that describes the Irish songs of the day, some limericks, mythology, and a fluent understanding of Homer’s The Odyssey. If the reader doesn’t have a well-versed knowledge of that which occurred roughly one hundred years ago, they may not understand the parodies, or jokes, Joyce employs in Ulysses. Yet it was ranked, by the Modern Library, in 1998, as the greatest English-language novel of the 20th century.

“Everyone I know owns Ulysses, but no one I know has finished it.”  —Larry King.

To fully understand, and presumably enjoy, Finnegans Wake, the reader needs to have a decent understanding of Latin, German, French, and Hebrew, and a basic understanding of Norwegian linguistic and cultural elements. The reader will also need to be well-versed in the Egyptian Book of the Dead, Shakespeare, The Bible, and The Qur’an. The reader also needs to understand the English language on an etymological level, for one of Joyce’s goals with Finnegans Wake was to mess with the conventions of the English language.

Some have opined that one of Joyce’s goals in Ulysses was to use every word in the English language, and others have stated that this is a possibility, since he used approximately 40,000 unique words throughout the work. If this is true, say others, his goal for Finnegans Wake was to extend the confusion by incorporating German, French, Latin, Hebrew, and other languages into his text. When he did use English in Finnegans Wake, Joyce sought to use it in unconventional and etymological ways to describe what he believed to be the language of the night. He stated that Finnegans Wake was “a book of the night” and Ulysses was “a book of the day”.

“In writing of the night, I really could not, I felt, use words in their ordinary connections . . . that way they do not express how things are in the night, in the different stages – conscious, then semi-conscious, then unconscious. I found that it could not be done with words in their ordinary relations and connections. When morning comes of course everything will be clear again . . . . I’ll give them back their English language. I’m not destroying it for good.” —James Joyce on his novel Finnegans Wake.

This use of the “language of the night” could lead one to say that Joyce was one of the first deconstructionists, and thus ahead of his time, destroying the meaning of meaning in the immediate sense. Those obsessed with James Joyce could interpret the quote, and the subsequent methodology used in Finnegans Wake, to mean that Joyce had such a profound understanding of linguistics that normal modes of communicating an idea bored him. He wanted something different. He wanted to explore language, and meaning, in a manner that made his readers question their fundamentals. Readability was not his goal, nor was storytelling, or achieving a best-seller list. He sought to destroy conventions, and common sense, and achieve a higher realm of perfection, in which timeless abstractions cannot be communicated to those who adhere to common sense. This makes for an interesting conversation on high art, and philosophy, but does it lend itself to quality reading?

“What is clear and concise can’t deal with reality,” Joyce is reported to have told friend Arthur Power, “For to be real is to be surrounded by mystery.”

In the modern age, there is much discussion of the widening gap between the haves and the have-nots. That particular discussion revolves around economic distinctions, as it has since time immemorial, but in the Joycean world the gap involves those who “get” his works and those who do not. Those who get it usually prefer to have deeper meanings shrouded in clever wordplay. They usually prefer symbolism over substance; writing over storytelling; and interpretation over consistent and concretized thoughts.

The two schools of thought, the haves and the have-nots, can probably best be explained by comparing the Hemingway manner of writing with that of Joyce. Hemingway wrote clear and concise sentences. Hemingway stated that his methodology was to write something that was true:

“The hardest thing is to make something really true and sometimes truer than true.”—Ernest Hemingway.

Putting Joyce’s final two works through the Hemingway school of thought, one could say that Joyce’s methodology was: “Sometimes, it’s easier to make it false and let others define it as true.”

“Though people may read more into Ulysses than I ever intended, who is to say that they are wrong: do any of us know what we are creating? … Which of us can control our scribblings? They are the script of one’s personality like your voice or your walk.” —James Joyce

Those of us who have had a deep discussion, on a deep, multifaceted topic, with a deep thinker know that sooner or later a declarative distinction will be made if we stubbornly insist that we are not wrong. “You don’t get it, and you probably never will,” is something they will say in a variety of ways. We all know what it feels like to be summarily dismissed as an anti-intellectual by a deep thinker. Those who aren’t snobbish in an anti-social manner often avoid openly dismissing you, but even the polite snobs give you a vibe, a look, or that chuff that is intended to let you know your place.

“Well, what do you think of it then?” is the response some of us have given, after being backed into an anti-intellectual corner by deep thinkers.

If they are an anti-social, elite intellectual snob, they will say something along the lines of: “I simply choose to think deeper!” It’s a great line, and it purportedly puts stubborn types in our place, but it’s a non-answer. Those of us who are more accustomed to interaction with deep thinkers will then ask them to expound upon their complicated, deep thinking. Pushing deep thinkers deeper will often reveal a lack of substance beneath their piles of style, and the careful observer will find that the results of their deep thinking are no deeper than the deep thinker cap they wear to the pub.

A number of attempts at reading Joyce have led me to believe that he probably didn’t have much substance beneath his piles of style, so he muddied the waters of his message with puns, songs, gibberish, abstractions, foreign languages, and overly complicated complications. He did this, in my opinion, to conceal the fact that when compared to his colleagues, he didn’t have all that much to say, but that he was artistically accomplished in saying it.

Who can forget the many sayings that Finnegans Wake dropped on our culture, such as the transcendental sound of the thunderclap that announced the fall of Adam and Eve from the Garden of Eden:

“bababadalgharaghtakamminarronnkonnbronntonnerronntuonnthunntrovarrhounawnskawntoohoohoordenenthurnuk!”

What about the mirthsome giggles we have had in social gatherings with the catchphrase:

“A way a lone a last a loved a long the riverrun, past Eve and Adam’s, from swerve of shore to bend of bay, brings us by a commodius vicus of recirculation back to Howth Castle and Environs.”

Or the ever-present:

“(Stoop) if you are abcedminded, to this claybook, what curios of signs (please stoop), in this allaphbed! Can you rede (since We and Thou had it out already) its world?”

If you just read those sentences three or four times, and you still have no idea what they say, and you just went back to read them again, because you want to be a have that “gets it”, you’re not alone. If these passages were merely anecdotal evidence of the difficulty involved in reading Finnegans Wake, that would be one thing, but these difficulties litter just about every sentence of every paragraph of the book, as evidenced by the exhaustive assistance provided at the site Finwake.com for readers who have no idea what this writer is writing about.

Finnegans Wake is reported to be in English, but it’s not the standard version of English in which words have specific meanings. The “language of the night” was intended for linguists who are tired of reading words that have exact meanings, and it was intended to be playful, mind-altering, and rule-breaking. James Joyce made references that were intended to be obscure even to the reader of his day, who may not have had Joyce’s wealth of knowledge of history, or of the manner in which the meanings of words in the English language have changed throughout history.

“What is really imaginative is the contrary to what is concise and clear.” —James Joyce

James Joyce was a stream of consciousness writer who believed that all “mistakes” were intended on some level that superseded awareness. In Finnegans Wake, a book of more than 500 pages, Joyce found 600 errors after publication. He was informed of some, if not all, of these errors, and he was reported to have fought his publishers to keep them in. Later editions corrected many of these errors, to provide readers “the book in the manner Joyce had intended.” If Joyce didn’t believe in errors, however, how can those who corrected them state that the corrected edition is the definitive edition that “Joyce intended”?

“The man of genius makes no mistakes, his errors are volitional and portals of discovery.” –James Joyce

Throughout the seventeen years Joyce spent writing Finnegans Wake, he began to go blind, so he had his friend Samuel Beckett take dictation to help complete the novel. At one point during this dictation, someone knocked on Joyce’s door. Joyce said, “Come in!” to the knocker, and Beckett wrote the words “Come in!” into the narrative of Finnegans Wake. When this error was spotted by Joyce, and the confusion was sorted out, Joyce insisted that Beckett “Leave it in!” On another occasion, when a printer’s error was pointed out, he said, “Leave it. It sounds better that way than the way I wrote it.”

There are three different versions of the text: the first and second are the editions that Joyce submitted for publication, with all of the errors intact. The third corrects the errors the editors located, along with the 600 corrections that Joyce himself spent two years compiling. Some would have you believe that the first two editions are the definitive editions, but you have to be a Joyce purist to appreciate them.

Can it be called anything short of egotistical for an author to believe that his subconscious choices and decisions are somehow divine? If, as Joyce said, and Picasso later repeated in regard to his paintings, mistakes are portals of discovery, then we can say that’s great, and incredibly artistic, in the process of creation. To leave them in the finished product, however, and subject your readers to the confusion, just seems egotistical. “Here’s what I was thinking at the time,” Joyce is basically telling his readers. “I don’t know what it means, but this is a higher plane of thinking than simple conscious thought. Isn’t it magical? Maybe you can make some sense of it. Maybe you can attribute it to your life in some manner.” This method of operation may say something profound about the random nature of the universe, but when we’re reading a novel we don’t necessarily want to know about the randomness of the universe, unless it’s structured in a manner that leads us to that statement.

Not everyone can write a classic, and some realize this after a number of failed attempts. Once they arrive at this fork in the road, they can either write simple books that provide them and theirs an honest living, or they can grow so frustrated by their inability to write classics that they separate themselves from the pack through obscurity. The advantage of creating such an alleged contrivance is that beauty is in the eye of the beholder, and the beholder can assign their own relative beauty to it. Some would say this is the very definition of art, but others would say even that definition has limits. Some would say that the most obscure painting is art, because they “see it”, where others see only schlock for elitists to crib note to death, until meaning is derived.

James Joyce is considered the exception to this rule, fellow writers have told me, and if you are going to attempt to write an important novel in the 21st century, you had better be familiar with him. I’ve tried, and I now believe that I’m destined to be a have-not in the Joycean world, even with Ulysses. The question that arises out of these ashes is: am I going a long way toward becoming more intelligent by recognizing my limits, or should it be every aspiring intellect’s responsibility to continue to push themselves beyond any self-imposed limits to a point where they can finally achieve a scholarly understanding of difficult material? If this is a conundrum that every person encounters when facing challenges to their intelligence, is Ulysses, or more pointedly Finnegans Wake, the ultimate barometer of intelligence, or is it such an exaggerated extension that it had to have been a practical joke James Joyce played on the elitist literary community, to expose them as the in-crowd, elitist snobs that they are when they claim to “get it”? Do they really “get it”, or are they falling prey to Joyce’s clever ruse to expose them as people who “get” something that was never meant to be “got”?

Don’t Go Chasing Eel Testicles: A Brief, Select History of Sigmund Freud


We all envy those who knew, at a relatively young age, what they wanted to do for a living. Most of us experience some moments of inspiration that might lead us toward a path, but few of us ever read medical journals, law reviews, or business periodicals during our formative years. Most of the young people I knew preferred an NFL preview guide of some sort, teenage heartthrob magazines, or one of the many other periodicals that offer soft entertainment value. Most of us opted out of reading altogether and chose to play something that involved a ball instead. Life was all about playtime for the kids I grew up around, but there were other, more serious kids, who we wouldn’t meet until we were older. Few of them knew they would become neurosurgeons, but they were so interested in medicine that they devoted huge chunks of their young lives to learning everything their young minds could retain. “How is this even possible?” some of us ask. How could they achieve that level of focus at such a young age, we wonder. Are we even the same species?

At an age when so many minds are so unfocused, they claimed to have tunnel vision. “I didn’t have that level of focus,” some said to correct the record, “not the level of focus to which you are alluding.” They may have diverged from the central focus, but they had more direction than anyone I knew, and that direction put them on the path of doing what they ended up doing, even if it wasn’t as specific as I guessed.

The questions we have about what to do for a living have plagued so many for so long that comedian Paula Poundstone captured it with a well-placed joke, and I apologize, in advance, for the paraphrasing: “Didn’t you hate it when your relatives asked what you wanted to do for a living? Um, Grandpa, I’m 5. I haven’t fully grasped the mechanics or the importance of brushing my teeth yet.” Those of us of a certain age have now been on both sides of this question. We’ve been asking our nieces and nephews this question for years without detecting the irony. What do you want to do when you grow up? Now that I’ve been asking this question long enough, I’ve finally figured out why we ask it. Our aunts and uncles asked us this question because they were looking for ideas. I’m in my forties now, and I’m still asking my nieces and nephews these questions. I’m still looking for ideas.

Pore through the annals of great men and women of history, and that research will reveal legions of late bloomers who didn’t accomplish anything of note until late in life. The researcher will also discover that most of the figures who achieved success in life were just as dumb and carefree as children as the rest of us were, until the seriousness of adulthood directed them to pursue a venture in life that would land them in the annals of history. Some failed more than once in their initial pursuits, until they discovered something that flipped a switch.

Those who know anything about psychology, and many who don’t, are familiar with the name Sigmund Freud. Those who know anything about Freud are aware of his unique theories about the human mind and human development. Those who know anything about his psychosexual theory know we are all repressed sexual beings plagued with unconscious desires to have relations with some mythical Greek king’s mother. What we might not know, because we consider it ancillary to his greater works, is that some of his theories might have originated from Freud’s pursuit of the Holy Grail of nineteenth-century science, the elusive eel testicles.

Although some annals state that an Italian scientist named Carlo Mondini discovered eel testicles in 1777, other periodicals state that the search continued up to and beyond an obscure 19-year-old Austrian’s search in 1876.[1] Other research states that the heralded Aristotle conducted his own research on the eel, and his studies resulted in postulations that the beings came either from the “guts of wet soil” or were born “of nothing”.[2] One could guess that these answers resulted from great frustration, since Aristotle was so patient with his deductions in other areas. On the other hand, he also purported that maggots were born organically from a slab of meat. “Others, who conducted their own research, swore that eels were bred of mud, of bodies decaying in the water. One learned bishop informed the Royal Society that eels slithered from the thatched roofs of cottages; Izaak Walton, in The Compleat Angler, reckoned they sprang from the ‘action of sunlight on dewdrops’.”

Before laughing at any of these findings, one must consider the limited resources these researchers had at their disposal, given the science of their day. As is oft said of young people, the young Freud might not yet have had the wisdom to know how futile this task would be when a nondescript Austrian zoological research station employed him. It was his first job, he was 19, and it was 1876. He dissected approximately 400 eels, over a period of four weeks, “amid stench and slime for long hours”, as the New York Times described Freud’s working environment.[3] His ambitious goal was to write a breakthrough research paper on an animal’s mating habits, a subject that had confounded science for centuries. Conceivably, a more seasoned scientist might have considered the task futile much earlier in the process, but an ambitious 19-year-old, looking to make a name for himself, was willing to spend long hours slicing and dicing eels, hoping to arrive at an answer no one could disprove.

Unfortunately for the young Freud, but perhaps fortunately for the field of psychology, we now know that eels don’t have testicles until they need them. The subjects of Freud’s studies must not have needed them at the time he studied them, for Freud ended up writing that his total supply of eels was “of the fairer sex.” Freud eventually penned that research paper, but it detailed his failure to locate the testicles. Some have said Freud correctly predicted where the testicles should be, and that he argued that the eels he received were not mature. Freud’s experiments resulted in a failure to find the testicles, and he moved into other areas as a result. The question on the mind of this reader is how profound an effect this failure to find eel testicles had on his research into human sexual development.

In our teenage and young adult years, most of us had odd jobs that affected us in a variety of ways for the rest of our working lives. For most, these jobs were low-paying, manual labor jobs that we slogged through for the sole purpose of getting paid. Few of us pined over anything at that age, least of all a legacy that we hoped might land us in the annals of history. Most of us wanted to do well in our entry-level jobs, to bolster our character, but we had no profound feelings of failure if we didn’t. We just moved on to other jobs that we hoped we would find more financially rewarding and fulfilling.

Was Freud’s search for eel testicles the equivalent of an entry-level job, or did he believe in the vocation so much that the failure devastated him? Did he slice the first 100 or so eels open and throw them aside with the belief that they were immature? Was there nothing but female eels around him, as he wrote, or was he beginning to see what had plagued the other scientists for centuries, including the brilliant Aristotle? There had to be a moment, in other words, when Sigmund Freud realized that they couldn’t all be female. He had to know, at some point, that he was missing the same something everyone else had missed. He must have spent some sleepless nights struggling to come up with a different tactic. He might have lost his appetite at various points, and he may have shut out the world in his obsession to achieve renown in marine biology. He sliced and diced over 400 eels, after all. If even some of this is true, even if it only occupied his mind for four weeks of his life, we can feasibly imagine that the futile search for eel testicles affected Sigmund Freud in a profound manner.

 

If Freud Never Existed, Would There Be a Need to Create Him?

 

Every person approaches a topic of study from a subjective angle. It’s human nature. Few of us can view people, places, or things in our lives, with total objectivity. The topic we are least objective about, say some, is ourselves. Some say that we are the central topic of speculation when we theorize about humanity. All theories are autobiographical, in other words, and we pursue such questions in an attempt to understand ourselves better. Bearing that in mind, what was the subjective angle from which Sigmund Freud approached his most famous theory on psychosexual development in humans? Did he bring objectivity to his patients? Could he have been more objective, or did Freud have a blind spot that led him to chase the elusive eel testicles throughout his career in the manner Don Quixote chased windmills?

After his failure, Sigmund Freud switched his focus to a field of science that would later become psychology. Soon thereafter, patients sought his consultation. We know now that Freud viewed most people’s problems through a sexual lens, but was that lens tinted by the set of testicles he couldn’t find a lifetime ago? Did his inability to locate the eel’s reproductive organs prove so prominent in his studies that he saw them everywhere he went, in the manner that the owner of a rare car begins to see that car everywhere soon after driving it off the lot? Some say that if this is how Freud conducted his sessions, he did so in an unconscious manner, and others say this might have been the basis for his theory on unconscious actions. How different would Freud’s theories on development have been if he had found his Holy Grail, the Holy Grail of the science of his time? How different would his life have been? We could also wonder whether Freud would have switched his focus at all if he had found fame as a marine biologist.

How different would the field of psychology be today if Sigmund Freud had remained a marine biologist? Alternatively, if he still made the switch to psychology after achieving fame in marine biology, for being the eel testicle spotter, would he have approached the study of human development, and the human mind, from a less subjective angle? Would his theory on psychosexual development have occurred to him at all? If it didn’t, is it such a fundamental truth that it would’ve occurred to someone else over time, even without Freud’s influence?

We can state, without too much refutation, that Sigmund Freud’s psychosexual theory has sexualized the beliefs many have about human development, even though others now consider the theory disproved. How transcendental was that theory, and how much subjective interpretation was involved in it? How much of that subjective interpretation was fueled by his inability to find the eel testicle? Put another way, did Freud ever reach a point where he began overcompensating for that initial failure?

Whether it’s an interpretive extension, or a direct reading of Freud’s theory, modern scientific research theorizes that most men want some form of sexual experience with another man’s testicles. This theory, influenced by Freud’s theories, suggests that those who claim they don’t are lying in a latent manner, and the more a man says he doesn’t, the more repressed his homosexual desires are.

The Williams Institute at UCLA School of Law, a sexual orientation law think tank, released a study in April 2011 stating that 3.6 percent of males in the U.S. population are either openly gay or bisexual.[4] If these findings are even close to correct, this leaves 96.4 percent who are, according to Freud’s theory, closeted homosexuals in some manner. Neither Freud nor anyone else has been able to put even a rough estimate on the percentage of heterosexuals who harbor unconscious, erotic inclinations toward members of the same sex, but the very idea that the theory has achieved worldwide fame leads some to believe there is some truth to it. Analysis of some psychological studies on this subject yields quotes such as, “It is possible … Certain figures show that it would indicate … All findings can and should be evaluated by further research.” In other words, there is no conclusive data, and all findings and figures are vague. Some would suggest that these quotes are ambiguous enough that they can be used by those who would have their readers believe that most of the 96.4 percent who express contrarian views are actively suppressing their desire to not just support the view, but to actively involve themselves in that way of life.[5]

Some label Sigmund Freud as history’s most debunked doctor, but his influence on the field of psychology, and on the ways society at large views human development and sexuality, is indisputable. The greater question, as it pertains specifically to Freud’s psychosexual theory, is whether Freud was a closeted homosexual, or whether his angle on psychological research was affected by his initial failure to find eel testicles. To put it more succinctly: which being’s testicles was Freud more obsessed with finding during his lifetime?

 

[1]https://en.wikipedia.org/wiki/Eel_life_history

[2]http://www.theguardian.com/environment/2010/oct/27/the-decline-of-the-eel

[3]http://www.nytimes.com/2006/04/25/health/psychology/analyze-these.html

[4]https://en.wikipedia.org/wiki/Demographics_of_sexual_orientation

[5]http://www.pbs.org/wgbh/pages/frontline/shows/assault/roots/freud.html

 

If you enjoyed this unique perspective on Sigmund Freud, you might also enjoy the following:

Charles Bukowski Hates Mickey Mouse

The History of Bloodletting by Mark Twain

The Perfect Imperfections of Franz Kafka’s Metamorphosis

James Joyce: Incomparable or Incomprehensible?

Rasputin I: Rasputin Rises

What the World Needs Now is Another Calvin Coolidge


With the federal debt spiraling out of control, many Americans sense an urgent need to find a political leader who is able to say “no” to spending. Yet they fear that finding such a leader is impossible. Conservatives long for another Ronald Reagan. But is Reagan the right model for the problems that currently sit before us? Reagan was of course a tax cutter, reducing the top marginal rate from 70 to 28 percent. But his tax cuts—which vindicated supply-side economics by vastly increasing federal revenue—were bought partly through a bargain with Democrats who were eager to spend all that newfound revenue the Reagan tax cuts generated. Reagan was no budget cutter—indeed, the federal budget rose by over a third during his administration.

An alternative model for conservatives may be Calvin Coolidge. President from 1923 to 1929, Coolidge sustained a budget surplus and left office with a smaller budget than the one he inherited. Over the same period, America experienced a proliferation of jobs, a dramatic increase in the standard of living, higher wages, and three to four percent annual economic growth. And the key to this was Coolidge’s penchant for saying “no.” If Reagan was the Great Communicator, Coolidge was the Great Refrainer.

Calvin Coolidge

Following the Warren G. Harding/Coolidge ticket’s victory in the presidential election of 1920, Harding’s inaugural address set a dramatically different tone from that of the outgoing Woodrow Wilson administration (and from that of the Barack Obama administration today):

“No altered system,” Harding said, “will work a miracle. Any wild experiment will only add to the confusion. Our best assurance lies in efficient administration of our proven system.”

Harding was basically saying that he would be a steward of the American system that had worked just fine in the 130 years of America that preceded his election. Harding was saying—as opposed to what Wilson said—that he didn’t believe he was inordinately gifted in bettering the prosperous model that the founders had created. He was basically saying that he wasn’t the type of “miracle worker” who would step into office with his think tank notions to tell the nation that he had a “new and improved” cure for all that ails us. He was basically telling the American public that he wouldn’t be presenting a “New Coke” formula that no president had thought of before, as most of the “New Coke” formulas presented by brilliant, think tank presidents have been tried and tested before and proven to be about as successful as New Coke was. In making such a statement, many would say that Harding was displaying what some call the pinnacle of intelligence: being knowledgeable enough to know what he didn’t know. Put another way, why would the number one, most powerful nation in the world tinker with the formula that made it number one? Why would it attempt to create a different Coke?

One of Warren G. Harding’s first steps was to shepherd through Congress the Budget and Accounting Act of 1921. This bill allowed Harding to create a special budget bureau—the forerunner to today’s Office of Management and Budget—where the director of the bureau could cajole and shame Congress into making spending cuts. Unfortunately, some of Harding’s privatization policies, combined with some ill-advised appointments, led to bribery and favoritism, and ultimately to what would be called the Teapot Dome Scandal.

Enter Coolidge

Calvin Coolidge entered office after Warren G. Harding’s sudden death, and he separated himself almost immediately from Harding with his willingness to say “no” to appointees, Congressmen, and various “New Coke” bills. (Coolidge ended up vetoing fifty bills, a total greater than that of the last three presidents combined.) Coolidge summed up his penchant for vetoing these bills, saying:

“It is much more important to kill bad bills than to pass good ones.”

Calvin Coolidge was the type of president, the type of person, who, if you asked him what time it was, would tell you. Modern presidents get their tongues so tied up with advice from advisers, pollsters, and focus group testing that they’re almost afraid to tell you what time it is, because a direct answer might be taken seven different ways by seven different networks that need to appeal to a 24-7 audience.

Within 24 hours of arriving in Washington after Harding’s death, Calvin Coolidge met with his budget director, Herbert Lord, and together they went on offense, announcing deepened cuts in two politically sensitive areas: spending on veterans and District of Columbia public works. In his public statements, Coolidge made clear he would have scant patience with anyone who didn’t go along:

“We must have no carelessness in our dealings with public property or the expenditure of public money. Such a condition is characteristic of undeveloped people, or of a decadent generation.”

Perhaps reflecting his temperament, Coolidge favored the pocket veto—a way for the president to reject a bill without a veto message and without affording Congress a chance to override a veto. Grover Cleveland, whom Coolidge admired, used this veto in his day, as had Theodore Roosevelt. But Coolidge raised its use to an art form. The New York Times referred to it as “disapproval by inaction.”

The words “perhaps reflecting his temperament” paint a nice portrait of President Calvin Coolidge, for when given the choice between grandstanding on an issue and quietly advocating or dismissing a bill, Coolidge opted for the quiet approach. When faced with the Hurricane Katrina of his day, the great Mississippi River flood of 1927, Coolidge chose not to appear on the grounds of the devastation, fearing that an appearance might encourage federal spending on relief; when faced with the problem of what to do with the powerful Ku Klux Klan, Coolidge quietly avoided appointing any Klan members to prominent positions in his cabinet, and he thereby decimated the power of that group in America; when faced with the dilemma of what to do with farming subsidies, the man from farming country chose to veto the subsidies. He also vetoed veterans’ pensions and government entry into the utilities sector.

Whereas the current barometer of the presidency is set on how much, and how often, a president spends other people’s money, Coolidge exhibited a restraint politicians often reserve only for their own money.

Coolidge and his budget director met every Friday morning before cabinet meetings to identify budget cuts and discuss how to say “no” to the requests of cabinet members. Most presidents give in after a time—Eisenhower being a good example—but Coolidge did not, despite the budget surpluses accrued during his presidency. He held 14 meetings with his budget director after coming to office in late 1923, 55 meetings in 1924, 52 in 1925, 63 in 1926, and 51 in 1927.

In a conference call with Jewish philanthropists, Coolidge explained his consistency this way:

 “I believe in budgets. I want other people to believe in them. I have had a small one to run my own home; and besides that, I am the head of the organization that makes the greatest of all budgets, that of the United States government. Do you wonder then that at times I dream of balance sheets and sinking funds, and deficits and tax rates and all the rest?”

Speaking of tax rates, in December 1923, Coolidge and Treasury Secretary Andrew Mellon launched a campaign to lower top rates from the fifties to the twenties. Mellon believed, and informed Coolidge, that these cuts might result in additional revenue. This was referred to as “scientific taxation”—an early formulation of what would later be called the Laffer Curve. And Coolidge passed the word on:

“Experience does not show that the higher tax rate produces larger revenue. Experience is all the other way,” he said in a speech in early 1924. “When the surtax on incomes of $300,000 and over was but 10 percent, the revenue was about the same as it was at 65 percent.”

The more recent egos that have occupied the taxpayer-funded seat of president would likely show a blush at the mention of the power and prestige they have achieved by attaining residence in the White House. That humble blush would be shown in the manner a ’70s comedian would hold up one hand to reject the applause he was receiving, while the other, jokingly, ushered more applause forward. Calvin Coolidge rejected congratulatory mentions of his power completely. When Senator Selden Spencer took a walk with Coolidge around the White House grounds, the Senator asked the president playfully, “Who lives there?”

“Nobody,” Coolidge replied. “They just come and go.”

For all the praise that authors like Amity Shlaes heap on Coolidge, his critics state that his policies did not prevent the Great Depression.

Shlaes replies: “That is an argument I take up at length in my previous book, The Forgotten Man, and is a topic for another day. Here let me just say that the Great Depression was as great and as long in duration as it was because, as economist Benjamin Anderson put it, the government under both Hoover and Franklin Roosevelt, unlike under Coolidge, chose to ‘play God.’”

Three lessons we can learn from the Coolidge presidency:

Beyond the inspiration of Coolidge’s example of principle and consistency, what are the lessons of his story that are relevant to our current situation? One certainly has to do with the mechanism of budgeting: The Budget and Accounting Act of 1921 provided a means for Harding and Coolidge to control the budget and the nation’s debt, and at the same time gave the people the ability to hold someone responsible. That law was gutted in the 1970s, when it became collateral damage in the anti-executive fervor following Watergate. The law that replaced it tilted budget authority back to Congress and has led to over-spending and lack of responsibility ever since.

A second lesson concerns how we look at tax rates. When tax rates are set and judged according to how much revenue they bring in due to the Laffer Curve—which is how most of today’s tax cutters present them, thereby agreeing with tax hikers that the goal of tax policy is to increase revenue—tax policy can become a mechanism to expand government. The goals of legitimate government—American freedom and prosperity—are left by the wayside. Thus the best case for lower taxes is the moral case—and as Coolidge well understood, a moral tax policy demands tough budgeting.

Finally, a lesson about politics. The popularity of Harding and Coolidge, and the success of their policies—especially Coolidge’s—following a long period of Progressive ascendancy, should give today’s conservatives hope. Coolidge in the 1920s, like Grover Cleveland in the previous century, distinguished government austerity from private-sector austerity, combined a policy of deficit cuts with one of tax cuts, and made a moral case for saying “no.” A political leader who does the same today is likely to find an electorate more inclined to respond “yes” than he or she expects. {1}

The point, I believe, is that in the current climate of “yes” in Washington D.C., we could use a little “no”. In the event of a natural disaster, and there will always be “unprecedented” disasters in a landmass as large as America’s, “yes” ingratiates the president to the people of the affected area, the media, the nation, and history. But it is also “yes” that ends up adding to the national debt, feeding the idea that the federal government is a parent that should clean up the messes of her children, and discouraging the smaller-scale charity of communities seeing themselves through a disaster of this sort.

“Yes” also lends itself to the already massive egos of those that will sit in our most prestigious seat of representation, and it leads them to believe they can keep inventing “New Coke” formulas until we’re swirling around the drain in them. These massive egos can’t withstand one commentator saying negative things about them, so they start saying “yes” to everything and everyone before them, because “yes” doesn’t carry the political consequences of “no”. Saying no to Congressmen and Senators can bruise egos and provoke negative sentiments and statements; saying no to Governors that ask for federal aid will lead to political fallout in the media, as every story on the tragedy of the day would be accompanied by that “no”; telling a woman who asks for a car in a town hall debate the meaning of the word no, and exactly what time of day it is, would lead to utter devastation for that candidate’s campaign. While Coolidge did not face the 24-7 news cycle, a decent search of his history will reveal that his “no” policies faced a relatively intense amount of scrutiny, and he continued to say “no” throughout.

It would probably be a fool’s errand to try to find another person in our current political climate with the temerity and resolve to say no as often as Coolidge did. The nation has shown that it would much rather live in the fairy tale land of yes, even if that means New Coke ideas that lead to greater complexities and more economic turmoil. The greater question, which appears to draw closer every day, is not whether a “great refrainer” makes a better president than a “yes we can” president who believes the nation can use taxpayer money to spend its way out of every problem, but whether the nation will ever be ready to answer that question without the assistance of a cataclysmic economic incident that affects them directly.

{1}http://www.hillsdale.edu/news/imprimis/archive/issue.asp?year=2013&month=02

Are they celebrating the legacy, or the death, of Margaret Thatcher?


Many conservatives wonder why intellectuals from the left would attack the recently departed former Prime Minister Margaret Thatcher with such blood lust. Non-political types wonder how anyone could take to the streets to celebrate the death of a leader who didn’t commit mass genocide, incarcerate her enemies, or commit any atrocities. There’s nothing wrong with disagreeing with the politics of a leader, those that don’t follow politics say, but why would anyone riot, loot, and injure other people to celebrate her death?

Thatcher and Reagan

As for the revelers that took to the streets to celebrate the former Prime Minister’s death, IB Times UK reporter Ewan Palmer stated: “There was the notion that this morbid celebration has been planned in thousands of people’s heads for more than 30 years.”{1} Dominic Gover’s IBT World article continued: “Many revelers appeared younger than the 23 years which have passed since Thatcher left office.”

Josef Stalin had a term for these people. He called them “useful idiots”. It was a term he reserved for those he perceived to be uninformed propagandists that cynical leaders could use to promote a cause. Most useful idiots will “support” any cause that allows for public drinking, theft, looting, and violence against police under the guise of a “noble” cause they know little to nothing about.

The answer regarding why a more academic theoretician, like Paul Krugman, would join the dance upon Thatcher’s grave is simple: politics. If “The Iron Lady’s” legacy were allowed to flourish unchallenged, it would not speak well of Krugman’s Keynesian goals for worldwide acceptance of government-controlled economies, or of the thrust of President Barack Obama’s belief that he can turn the U.S. economy around through government infusions of cash. Margaret Thatcher may not have provided the world the antithesis of Obama’s policies—she was for socialized medicine, increased taxes, and some gun control legislation, and she believed in global warming—but the changes in taxes she passed, and the limits on the regulation of business she called for, could undoubtedly be called different from the policies Obama and Krugman favor. It’s vital to their continued progress, therefore, that Krugman, and all leftist intellectuals, take any opportunity to diminish a conservative’s historical record, even if that opportunity arises before the dirt is on the coffin.

“Did Thatcher turn Britain around?” Paul Krugman asks in his most recent column. The thrust of Krugman’s argument, that she didn’t necessarily do so, rests on two points. The first, which Krugman never explicitly states, is that Britain’s turnaround was coincidental to Thatcher’s tenure as Prime Minister, and that Thatcher happened to be in the right place at the right time. He elucidates this point with a graph that compares Britain’s per capita GDP, and unemployment rate, with France’s during Thatcher’s tenure. The title of the chart is “UK GDP per capita relative to France,” but this author cannot see where the France line appears on the chart; perhaps the comparison is built into the single line, but that’s difficult to discern. Regardless, Krugman notes: “A long decline ended and turned into a revival.” The second graph shows that France’s unemployment rate was lower for much of her tenure, until around 1994, when Britain’s unemployment rate fell below France’s and has stayed there to the present day. This leads to Krugman’s second argument: if Thatcher did, in fact, turn Britain’s economy around, “why did it take so long?”

“Thatcher came to power in 1979, and imposed a radical change in policy almost immediately.  But the big improvement in British performance doesn’t really show in the data until the mid-1990s.  Does she get credit for a reward so long delayed?

This is, by the way, somewhat like a similar issue in America: right-wingers were eager to give Ronald Reagan credit for the productivity boom of the Clinton years, which also didn’t start until around 1995; if Reagan could get credit for events that were 14 years or more after his 1981 tax cut, shouldn’t Richard Nixon be given credit for anything good that happened in the Reagan years?”{3}

The latter argument can be defused with one name: James Earl Carter. If Nixon did anything to turn the economy around in the manner that Reagan or Thatcher did, with the same degree of lag before it could take full effect, it would have been thrashed by the inept policies of the Carter administration. Reagan, by contrast, was followed by George H.W. Bush, who continued, for the most part, the policies of Ronald Reagan, so the analogy doesn’t hold up on that front. On the administrative front, Nixon was not the conservative that Reagan was. He enacted wage and price controls, expanded Social Security, and continued LBJ’s Great Society programs. Reagan’s policies stood almost in direct contrast to many of Nixon’s, even though many historians now say that Nixon was a secret advisor of Reagan’s.

As for the question regarding why Thatcher’s policies, and Reagan’s, had such a lag in terms of results—which Krugman admits “didn’t start until around 1995”—Krugman once posed this question rhetorically in a CNBC debate with Fox News host Bill O’Reilly. O’Reilly responded:

“Call any corporation — any high-tech corporation in Silicon Valley and just ask them when their R&D ramped up and when their machinery that has led the world, the United States and the world, when it started getting — they will all tell you it happened during the Reagan administration when corporate taxes were cut.  There was more income to devote to that.”{4}

The point O’Reilly was making was that a company’s R&D (research and development) department allows that company to create better products, and that it may take some time to create those products, get them to market, and eventually show a profit on them. One could say that Microsoft became Microsoft during this period, and that many of the companies that would make up the tech bubble—which fueled the soaring stock market of the mid-’90s—were made during this period. But it’s tough to say definitively that the Reagan tax cuts on corporations led to the corporate profits that funded greater R&D and eventually fueled the mid-’90s tech bubble. We do know that the time it takes a pharmaceutical company to research and develop a product, test that product, and eventually show a profit on it can be around twelve years, though that figure could be discounted on account of the impediments the FDA puts into the process. We can assume, however, that putting out something as complicated as a computer operating system involves a great deal of the same trial and error as the pharmaceutical industry, at least when it comes to the time it takes to perfect it, make it market ready, and then turn it profitable. We also know that any projections Krugman puts forth on this topic are no better informed than ours, and perhaps less so, for he has never spent any time in the private sector, and the basis for his initial interest in economics was the novels of science fiction writer Isaac Asimov. {5}

Because it took more than a decade for the changes that Thatcher and Reagan enacted to work through their respective economies, it is difficult to say definitively whether their policies turned their countries’ economies around or merely coincided with the turn. It is also difficult to say, economic cycles being what they are, that an upswing was not inevitable regardless of who was in office. But with all of these coincidental circumstances lining up so neatly, leftist economists like Krugman feel it incumbent upon them to insert as many question marks as possible into their retellings of the historical record.

{1}http://www.ibtimes.co.uk/articles/455054/20130409/thatcher-death-party.htm

{2}http://www.examiner.com/article/progressive-reaction-to-thatcher-death-shows-no-amount-of-appeasement-enough

{3}http://krugman.blogs.nytimes.com/2013/04/08/did-thatcher-turn-britain-around/

{4}http://www.foxnews.com/on-air/oreilly/2004/08/10/nbcs-russert-refs-debate-between-bill-and-krugman?page=4#ixzz2PzRbcM6r

{5}http://en.wikipedia.org/wiki/Paul_Krugman

Details, Details, Details


Epiphanies, like women, can pop up when you least expect them, and they can free you from a troubling part of your life you didn’t recognize as a problem until it was revealed.

In a PBS documentary on Mark Twain, a number of incidents arose during the building of Twain’s home, and the construction team began “badgering” Twain with questions regarding how he wanted them handled. The questions concerned the construction of his home, the place he would presumably live for the rest of his life, so the observer should forgive the construction crew for the badgering. The team didn’t know what he wanted, and they presumably had hundreds of questions about his desired specifics. What the team did not know, however, was that Twain had an oft-expressed aversion to details.

Twain

Putting myself in a similar situation, I realize that, like Twain, I’m not a detail-oriented guy. I’ll listen to every question put to me, but I’ll be listening with a sense of guilt. Details make me feel stupid; they start firing far too many neurons in my brain for me to handle, and I usually get overwhelmed and exhausted by them. I know that I should be listening to every question, and I know I should be pondering the details given to me to come up with the ideal solution for my family, but my capacity for such matters is limited.

In the beginning of the process, I’m all hopped up. My mind is acutely focused, and I’m knocking out every question with focused answers. I’m considering every perspective involved, and I’m asking for advice from all of those not involved. I’m reading what others have done, and I’m gathering as much information as possible to make an informed decision, but I will eventually grow overwhelmed and exhausted, because I’m not a detail-oriented guy.

By the time we reach the 7th and 8th questions, I’ll be out of gas. I’ll be mentally saying, “Whatever, just get it done!” I’ll be falling away from creative answers and onto what is expected in the situation, or what those still paying attention want. I will be answering in an autonomic manner. “Yes, that sounds fine,” I’ll say without knowing what has been said. I’ll just want the damn thing to be built already by that point, because I’m not a detail-oriented guy. I’ll want to make the big decisions, but I’ll want to leave all of the “inconsequential” detail-oriented questions to others.

I do feel guilty about being this way. I want to be involved, informed, and constantly making acutely focused decisions throughout the process. I’ll feel guilty when others start making the decisions that affect me, because I know I’m an adult now, and I should be making all these decisions. There is also some fear that drives me to constantly pretend that I’m in prime listening mode, based on the fact that I may not like the finished product if I’m not involved in every step. I may not like, for example, the manner in which the west wing juts out on the land and makes the home appear ostentatious, or obtuse, or less pleasing to the eye with various incongruities, and I’ll wish I had not been so obvious with my “Whatever, just do it!” answers. Details exhaust me, though, and they embarrass me when I don’t know the particulars the other person is referencing.

I don’t know if this guilt is born of the fact that I know I’m an intelligent being who should be able to make these decisions in a more consistent manner, or if I’m just too lazy to maintain acute focus. I do have a threshold, though, and I know how my brain works. I know that if there are seven ways to approach a given situation, I will usually select one of the first two offered. I usually do this because I’m not listening after the second one. Everything beyond that involves the other party showing off the fact that they know more than I do. I know this isn’t always the case, but it’s the only vine I can cling to when dealing with my limited attention span and the limited arsenal of my brain.

Knowing my deficiency in retaining verbal detail, I will ask for literature on the subject, something tangible that can be consumed at my own pace. If I do that, and I have, I will then pretend to read every excruciating word, but I will usually end up selecting one of the first two options offered. I like to think I have a complex brain. I like to think that I display all that I’m about in my own way, but I’m always reminded of the fact that most of the people around me give full participation to the details of life, no matter how overwhelming and exhausting those details are to me. It’s humbling to watch these brains, which I like to consider inferior, operate on planes of constant choices, and decisions, and retentions, and details I am incapable of retaining.

I have this daydream that I will one day be afforded an excuse for having a limited brain by the relative brilliance I reveal to the world in the form of a novel. I am interviewed in this dream, and I am asked, “So, what does it mean to you to have crafted such a fine book?” I am far wittier than reality would suggest when I reply: “It will help me deal with my faults better. The fact that I cannot fix my own plumbing can now be countered with ‘but he wrote a fine book.’ The fact that I cannot fix my own car, compete with my wife in certain areas of intelligence, or hold down a decent job can now be countered with ‘but he wrote a fine book that is held up as a fine book in certain quarters.’”

We’ve all heard the line “Everybody’s brain works differently,” but until we learn that the brilliant brain that composed Huckleberry Finn had similar deficiencies, we cannot help but feel guilty about our own. “Well, work on your deficiencies,” those around us suggest, and we do when that next project comes about. We’re out to prove ourselves on that next project. We answer every question, from the first few to the 7th and 8th, with prolonged mental acuity. When the third and fourth projects roll around, however, we revert back to those brains that can’t retain details, and it is then that we envy the brains we once considered “inferior,” consistently showing their superiority. This could leave those of us that never knew we were suffering from such a deficiency with feelings of incompletion, until someone like Mark Twain recognizes and vocalizes those deficiencies for us.

A President’s Day Guide through obscure presidents, and Lincoln


Most people know the major deeds of the major presidents that shaped our nation. Most people memorized those facts and tidbits of information for American History tests and quizzes. But there are lesser-known presidents who affected this nation in their own ways, and had they been defeated in their elections, this nation would be very different. There have been times in our nation’s history when we needed a strong man with a bold hand, such as the one Abraham Lincoln displayed during the Civil War. There have been times when our nation hung in the balance, and we needed a Lincoln to come along and do what he could to preserve what George Washington, John Adams, and all the Founders envisioned. There have been other times, times far less documented in historical records, when our nation needed a humble leader who displayed restraint in times of national scandal and turmoil.

Were it not for the statesmanlike restraint displayed by a Calvin Coolidge, for example, we would be a less free nation. Quiet, obscure presidents like Coolidge vetoed legislation and exhibited restraint throughout their tenures. Restraint, vetoing legislation, and acting to preserve individual freedoms are less sexy than passing sweeping legislation and pressing the thumb of government on the throat of individuals and businesses for the purpose of helping people.

Our nation’s history is composed of the strong, Lincoln types and the quiet, Coolidge types that have shaped our country in unique ways, and on this President’s Day I thought we should all be reminded how we came to be the nation we are today, through the more obscure presidents (and Lincoln) that helped guide us to modernity.

Grover Cleveland

1) Stephen Grover Cleveland (March 18, 1837 – June 24, 1908)

The 22nd and 24th President

Cleveland was a Democrat who served the people from 1885 to 1889 and from 1893 to 1897; he is the only president to have served non-consecutive terms.

Stephen Grover Cleveland won the popular vote for president on three different occasions, but in the second election he lost to Benjamin Harrison in the Electoral College tally. He was the only Democrat to defeat a Republican for the presidency during the period of Republican domination that dated back to Abraham Lincoln’s first electoral victory. He was the second president to marry while in office, and the only president to marry at the White House. During his tenure, he and the Republican Congress admitted North Dakota, South Dakota, Montana, Washington, Idaho, Wyoming, and later Utah to the Union. His last words were “I have tried so hard to do right.”{1}

Ronald Reagan may have been the president that “tried to give the government back to the people,” but Grover Cleveland was one of only two presidents of the 19th and 20th centuries (Calvin Coolidge being the other) to accomplish the feat. By the time their tenures ended, the size and scope of government was more limited than when they began their terms.

Others spoke of limiting the size of government; they failed. Cleveland’s first goal was to end the spoils of the political system. He did not fire any of the previous administration’s Republicans who were doing their jobs well. He cut down the number of federal employees, and he attempted to slow the growth of what he perceived to be a bloated government. He tried to fill appointments based on merit, as opposed to the usual spoils system that dictated position holders in previous administrations. He also used his veto power far more than any other president of his day. Although Cleveland was a Democrat, he was one of the few who sided with business. Cleveland opposed high tariffs, free silver, inflation, imperialism, and subsidies to business, farmers, or veterans. His battles for political reform and fiscal conservatism made him an icon for American conservatives of the era. Cleveland’s reform ideas and ideals were so strong and influential that a reform wing of the Republican Party, called the “Mugwumps,” bolted from the GOP ticket and swung to his support in 1884.

The great Abraham Lincoln

2) Abraham Lincoln

The 16th President.

Lincoln was a Republican that served the people from March 4, 1861 – April 15, 1865.

Abraham Lincoln, it could be said, is our most famous president. If one were to chart fame by the number of books written about an historical figure, Lincoln has had more books written about him than any other president. By some accounts, he has had more written about him than any historical figure alive or dead save for Jesus of Nazareth.

His fame derives from serving as president during the Civil War, and from the fight to abolish slavery. Lincoln’s antislavery views were so well known that some have suggested the South seceded because of his election victory. Others suggest that tensions were already so fierce, due to the mismanagement of Lincoln’s predecessor James Buchanan and the turmoil in the Nebraska and Kansas territories, that secession and the eventual war were inevitable. Lincoln was also made famous by his assassination at the hands of an actor named John Wilkes Booth.

Quick Quip: Stephen A. Douglas, Lincoln’s Democratic rival in the 1860 presidential election, once called Abraham Lincoln two-faced. “If I had two faces,” Lincoln replied, “do you honestly think I would wear this one?”{2}

William Henry Harrison

3) William Henry Harrison

9th President

Harrison was a member of the short-lived Whig Party, and he served the people as president from March 4, 1841 to April 4, 1841.

William Henry Harrison is most famous for dying after serving one month in office as president. He took the oath on a cold and rainy day, and he refused to wear a coat or a hat. He also rode to the inaugural on horseback rather than in the closed carriage that had been offered to him. He then proceeded, after the oath, to deliver the longest inaugural address in American history; it took him almost two hours to complete. He then rode away from the inaugural on horseback. Some believe that this reckless disregard for his health brought on the illness from which his sixty-eight-year-old body could not recover, but historians note that the illness did not set in until three weeks after the inaugural. Regardless of how he contracted the cold, it progressed into pneumonia and pleurisy. His last words, presumed to be addressed to his successor John Tyler, were: “Sir, I wish you to understand the true principles of the government. I wish them carried out. I ask nothing more.” {3}

Quick Quip: There was some debate over whether W.H. Harrison’s 8,460-word inaugural address (the longest in history) led to his demise. Harrison refused to dress appropriately for the forecasted cold rain, or to follow any of the advice of those concerned with his well-being. As a result of his demise, Harrison’s grandson Benjamin Harrison made sure his own inaugural address was a little over half the length of his grandfather’s.

Martin Van Buren

4) Martin Van Buren

8th President

Van Buren was a Democrat that served the people from March 4, 1837 to March 4, 1841.

Van Buren is regarded, in some quarters, as the father of the Democrat Party, even though Andrew Jackson was the first Democrat to be elected president. He was the first individual born a U.S. citizen to be elected president. He was also the first president not of British or Irish descent; his family was Dutch. He was also the first self-made man to become president: all earlier presidents had acquired wealth through inheritance or marriage, while Van Buren was born into poverty and became wealthy through his law practice. Van Buren’s tenure was marked by a depression, the Panic of 1837, that lasted throughout his presidency. As a result, Van Buren made a now-famous remark about his time in office: “As to the presidency, the two happiest days of my life were those of my entrance upon the office and my surrender of it.”{4}

James A. Garfield

5) James A. Garfield

20th President

Garfield was a Republican that served the people from March 4, 1881 to September 19, 1881.

Garfield was another president known more for his death than for his life or tenure as president. Garfield was taken down by a delusional office-seeker named Charles J. Guiteau. Though Garfield only had four months of health while serving the people as president, he did manage to reassert the president’s authority over senatorial courtesy in making executive appointments. He also energized naval power, purged the corruption in the Post Office, and appointed several African-Americans to prominent positions. During the eighty days in which Garfield suffered through the cruelty of the assassin’s bullet, he signed one, single extradition paper. Some historians have suggested that Garfield may have been one of our most talented and eloquent presidents had he lived long enough to show it to the nation, but he had served the nation in Congress for nine consecutive terms, and he was able to do what he could in the short time that he served as president. Candice Millard’s brilliant book Destiny of the Republic captures the essence of Garfield with the line: “Born into abject poverty, he rose to become a wunderkind scholar, a Civil War hero, a renowned congressman, and a reluctant presidential candidate who took on the nation’s corrupt political establishment.”

Knowing his death was imminent, James A. Garfield uttered his final words: “My work is done.” {5}

Benjamin Harrison

6) Benjamin Harrison

23rd President

Harrison was a Republican that served the people from March 4, 1889 to March 4, 1893.

Harrison is most notable for being the grandson of William Henry Harrison, and for being the man who defeated the mighty Grover Cleveland in the Electoral College vote in 1888. Harrison’s tenure was also famous for the passage of the McKinley Tariff and the Sherman Antitrust Act, and for allowing federal spending to reach one billion dollars. Harrison also advocated for federal funding for education, though he was unsuccessful in that regard, and he pushed for legislation that would protect the voting rights of African Americans. The latter would be the last attempt at civil rights legislation in the country until the 1930s. Learning from the after-effects of a long inaugural, courtesy of his grandfather’s record-long speech that some believe led to his death, Benjamin Harrison kept his own inaugural address brief. Though historians tend to disregard Harrison as a prominent president, they regard his foreign policies as laying the groundwork for much that would be accomplished in the 20th century. {6}

Calvin Coolidge

7) Calvin Coolidge

30th President

Calvin Coolidge was a Republican that served the people from August 2, 1923 to March 4, 1929.

Coolidge would not stand a chance in today’s 24-7 news network, internet-driven definition of politics. In the current climate of celebrity presidential candidates climbing all over one another for more air time, a better sound bite, and a better image, “Silent Cal” Calvin Coolidge would have been run over. In this age of bigger and better governments, where politicians on both sides of the aisle flex their legislative muscle in bill signings that are celebrated media events, Calvin Coolidge signed legislation into law in the privacy of his office. In a quote that could be applied to the current progression of big government, Calvin Coolidge said: “The requirements of existence have passed beyond the standard of necessity into the region of luxury.” Calvin Coolidge would be a laughing stock in our day and age, a man on the outside looking in, a statesman who would’ve faded into the woodwork of our society.

Social critic and satirist Dorothy Parker once said: “Mr. Coolidge, I’ve made a bet against a fellow who said it was impossible to get more than two words out of you.”

Coolidge’s famous reply: “You lose.”

After hearing that Coolidge had passed away, four years after leaving office, Parker remarked: “How can they tell?”

Although Coolidge was known to be a skilled and effective public speaker, in private he was a man of few words and was referred to as “Silent Cal” in most quarters. On this reputation, Coolidge said:

“The words of a President have an enormous weight, and ought not to be used indiscriminately.” 

Although known as a quiet man, Coolidge participated in over five hundred press conferences during his roughly 2,000 days as president; that is an average of about one press conference every four days. Coolidge took over the office of president after his predecessor’s death, amid the controversy surrounding his predecessor that came to be called the Teapot Dome scandal. The Teapot Dome scandal was regarded as the “greatest and most sensational scandal in the history of American politics” until the media discovered the Watergate scandal. In the wake of this scandal, Coolidge told a reporter:

“I think the American people want a solemn ass as a President, and I think I will go along with them.”

Coolidge may have been the last statesman to serve the American people as president. He was against the Ku Klux Klan, for instance, but he didn’t make grandstanding statements against the Klan; he just didn’t appoint Klan members to positions in his administration. This may seem such an obvious move that it’s not worth discussion, but the KKK had a great deal of influence in America at the time, and Coolidge’s move caused it to lose much of that influence. Coolidge tried to take this one step further, calling for anti-lynching laws, but the attempts to pass that legislation were stopped by Democrat filibusters. He attempted to make war illegal through the Kellogg-Briand Pact, but that pact proved ineffective. Coolidge was a laissez-faire president who didn’t believe the federal government should have a role in farm subsidies or flood relief. As much as he wanted to help these people, he wanted to avoid setting the precedent of the federal government resolving problems that he believed could better be solved, on a case-by-case basis, locally. By the end of his administration, he had achieved a tax bill under which all but the top 2% paid no federal income taxes. Coolidge disdained federal regulation and appointed commissioners who shared his belief in states’ rights, and this has caused a divide in historical opinion of his administration.

Some believe that this laissez-faire approach led to “The Roaring Twenties”; others argue that it led to “The Great Depression.” As with all matters such as these, the opinions depend on where the historian falls on the ideological divide. Some historians say that “The Roaring Twenties” was built on a bubble similar to the 1990s tech bubble, in that it wasn’t built on hard assets, and when that bubble burst, as the tech bubble did in the ’90s, a recession occurred as a result. That recession, say other historians, was prolonged into a depression that lasted into the forties by the recovery measures put in place by future administrations. The latter argument has it that the economy may have experienced a dip as a result of the bubble bursting, but the extended duration of this natural down cycle was caused by the measures future administrations put in place to recover from what might otherwise have been a temporary dip. Arguments such as these are impossible to resolve, however, because one cannot remove some facts to prove others.

Historians from both sides of the aisle have also interpreted his last words in varying ways. Those who oppose Coolidge’s actions state that his last words were a lamentable admission that his limited-government policies didn’t work. Those who favor his policies state that he was lamenting the course America was on, toward a country of big-government policies. They state that Coolidge’s administration was, itself, a temporary blip in a progression that Theodore Roosevelt started, and they suggest that, based on everything Coolidge saw during his tenure, he foresaw this.

His last words were: “I feel I no longer fit in with these times.”{7}

{1}http://en.wikipedia.org/wiki/Grover_Cleveland

{2}http://en.wikipedia.org/wiki/Abraham_Lincoln

{3}http://en.wikipedia.org/wiki/William_Henry_Harrison

{4}http://en.wikipedia.org/wiki/Martin_Van_Buren

{5}http://en.wikipedia.org/wiki/James_A._Garfield

{6}http://en.wikipedia.org/wiki/Benjamin_Harrison

{7}http://en.wikipedia.org/wiki/Calvin_Coolidge#cite_note-128