Willie and Kenneth


“I have a death voice,” Kenneth Greene said after interrupting a conversation I was having with my fellow employees on break. Kenneth Greene was the manager of this restaurant, and the only time he interrupted our conversations in the breakroom was to inform us that the restaurant was so busy that we would have to cut our breaks short to help the staff out. When he first entered our breakroom, we thought that’s what he was doing, but he looked too insecure for that.

Kenneth Greene operated from a baseline of insecurity. Kenneth didn’t think the staff took him seriously enough in the first few months of his tenure as our manager, so he grew a Fu Manchu. Kenneth’s Fu Manchu did not have handlebars, à la Salvador Dali; it was more late-60s Joe Namath. Kenneth would never admit that he grew a Fu Manchu for the sole purpose of generating respect from his peers, but when that Fu Manchu grew to fruition, the psychological effect on him was all but emanating from his head. Kenneth Greene went from a greasy, overweight ginger with a mullet to a greasy, overweight ginger with a mullet and a Fu Manchu.

The psychological influence of the Fu Manchu became apparent when he progressed from a manager that asked his employees if they wouldn’t mind cutting their breaks short for business needs to a manager that instructed us to do so. Yet, when the new Kenneth Greene stepped into our breakroom that day, it appeared that the Fu Manchu might have lost its psychological influence. After a moment of hesitation, in which it appeared that Kenneth had something to say, he left without saying a word. When he returned, after apparently recognizing how vital this moment was to the new Kenneth Greene, he stared at me with renewed conviction.

“What’s up?” I asked.

“I have a death voice,” Kenneth Greene said.

“What’s a death voice?” I asked.

“I front a death metal band,” Kenneth said. “In my off time.”

Kenneth Greene’s goal, I can only assume, was to display a talent that matched the subjects of the discussion he interrupted. In that discussion, a friend and I spoke about the various artistic talents of those on the staff, and Kenneth Greene wanted us to know that he had a talent equivalent to those that we were discussing. He wanted us to know that he was much more than a manager of a low-rent restaurant chain that would go out of business within a year, and he wanted us to know that this death voice was his gift and artistic calling.

‘Beauty is in the eye of the beholder’ is an expression that dates back, in various forms, to the Ancient Greeks. The reason such a notion exists, as Benjamin Franklin’s version of the expression states, is that at the core of one’s definition of beauty is an opinion.

I would never consider myself an arbiter of art, in other words, but I thought Kenneth Greene would have a tough road ahead of him if he hoped to convince those of us sitting in a restaurant break room that we should consider a skilled death voice for our conversation of artistic talents. I was, as I always am, eager to have another prove me wrong.

I didn’t know what to do with this information, however, so I assumed that he wanted to show us. After several attempts to goad him into it, Kenneth decided against performing his death voice for us. I think he saw something in our faces that suggested that the moment after one lets loose a death voice in the middle of a restaurant breakroom, they become the person that let a death voice loose in the middle of a restaurant breakroom. When he invited us to hear it in person, at one of his shows, I could tell he knew we wouldn’t attend, but he needed to say something to get out of the uncomfortable situation he created.

***

I thought Willie Bantner was a real character when I met him. Willie and I found that our backgrounds were similar, and I thought this was odd considering that our outlooks were so dissimilar. Willie’s worldview was foreign to my own, yet there was something about him I couldn’t quite put my finger on. This sense of familiarity became so hard to deny that it stirred feelings of déjà vu, until Willie revealed to me the actual character he was playing in life.

My initial inclination was that once one meets a significant number of odd characters in life, they begin to overlap. There are only so many odd characters out there, in other words, and I thought Willie reminded me of one of them.

These odd, weird sensibilities were the reason I was so fascinated with Willie Bantner. They were the reason I would go to him with very specific scenarios. I wanted to learn what he thought, why he thought what he did, and how someone could arrive at such notions. The funny, thought-provoking things he said were the reasons that we became friends. This friendship lasted for over ten years. Over the course of those ten years, I grew so familiar with Willie that his peculiarities were not so peculiar, but there was still that nagging sense of familiarity about him that plagued me.

When we began one of those lists that seem indigenous to the male gender, this one of the best television shows ever, we mentioned the usual shows that we considered the best of their day. When we moved on to what we thought deserved honorable mention, the list was lengthy. I mentioned the show Family Ties. Willie agreed that the show should be on the list of honorable mentions. I added, “If nothing else, the show gave us Michael J. Fox, and the character Alex P. Keaton, and I think Alex P. Keaton was one of the best TV characters ever written.”

“I modeled my life after him,” he said. After some confusion, Willie clarified that he did not model his life after Michael J. Fox. He modeled his life after Alex P. Keaton.

Over the years, I’ve learned that one of the reasons young men swear so often is that they lack confidence. They don’t know how to articulate an opinion in a manner that will impress their peers. They are also unable, at this point in their lives, to provide detailed analysis of the subject of their opinion, so they choose to coat those opinions in superlatives that they hope will provide cover for any unformed intellect. If one person says that Marlon Brando was the best actor of all time, another may agree with that person. Rather than enter into a detailed discussion of the sense of spontaneity Brando brought to his roles, or the fleshed-out nuances he brought to method acting that influenced a generation of actors, they say, “I’ve built a personal shrine to him in my bedroom.” When one person says that a movie was the scariest movie they’ve ever watched, another might say, “That movie was so scary that I didn’t sleep right for weeks.” In most cases, there were no shrines built or hours of sleep lost, but in the absence of detailed analysis, a young man thinks he has to say something over the top to pound the point home. I thought Willie was doing the same thing when he said he modeled his life after Alex P. Keaton. The more I chewed on it, however, the more I began to see a truth mixed into that admission.

I would watch him, going forward, with that admission in mind. The idea that the man modeled his reactions, his physical gestures, and his life after a situation comedy character became obvious once my search for that nagging sense of familiarity had its conclusion. Once I saw that elusive sense of déjà vu for what it was, I couldn’t believe I didn’t see it earlier.

I was also disappointed that my initial assessment of Willie Bantner proved so prescient. I thought he was a character, and he was, but not in the general sense that I intended. I was disappointed to learn that individual experiences did not inform Willie Bantner’s personality as much as I thought, unless one considers tuning into NBC’s early-to-mid-80s Thursday night lineup at 7:30 Central to be an individual experience.

Willie Bantner made me think, he made me laugh, and I thought he earned it all with ingenious, individualistic takes. After his admission, I began to wonder how many of those comments were off the cuff, and how many of them he lifted from Family Ties’ scripts. The unique personality that I wanted to explore became, to me, a carefully manufactured character created by some screenwriters in a boardroom on Melrose Avenue. The odd sense of familiarity plagued me, as I wrote above, but I can’t remember putting much effort into trying to pinpoint the core of Willie Bantner’s character. If I had, I probably would’ve overestimated what influenced his core personality, but that’s what young men do. Even if I had been able to temper my search to more reasonable concepts, I don’t think I would’ve considered something as banal as watching too much TV to be the sole influence for what I considered such a fascinating personality, until he admitted it.

Now, I have no illusions that I’ve scrubbed the influence of TV characters from my personality. I imagine I still have some remnants of the Fonz in my cavalcade of reactions, and I’m sure that Jack Tripper is in there somewhere. I also know that an ardent fan of David Letterman could spot his influence somewhere in how I react to the people, places and things that surround me, but I think it’s almost impossible to develop a personality without some degree of influence from the shows we watched every week for years. To model one’s entire life on one fictional, television character, however, speaks of a level of insecurity I think the American Psychiatric Association should consider in their next edition of the Diagnostic and Statistical Manual of Mental Disorders.


The Master Reset on Washing Machines


Our washing machine stopped spinning. It would reach the spin cycle and just stop, until the spin cycle ended. I went to the phone for answers. I pictured YouTube videos that would instruct me to tear the machine apart to get to a belt that needed replacing. I pictured an afternoon of frustration and uselessness, as I attempted to fix something above my pay grade.

The first internet page I pulled up informed me that my first step was to perform what it called a “Master Reset”. It sounded complicated. I read the definition of the Master Reset. It said, “To perform a Master Reset, carefully unplug the washing machine from the power outlet and leave it unplugged for one minute. After one minute is up, plug the washer cord back into the wall. Next, open and close the door of the washing machine 6 times within 12 seconds to send a “reset” signal to all the components.” I read through those steps a couple of times. It seemed too simple, and I knew that a remedy this simple would not work for someone like me. My cynicism leads me to believe that corporations build these things to keep people like me from fixing them, and to keep the whole industry of washing machine sales and repairmen afloat. I also thought this sounded like one of those “home remedies” that people spread via word of mouth but no one uses, because they never work for “me,” and such solutions only leave those of us who can’t fix anything feeling inept for being among the few for whom miracle cures don’t work.
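For anyone who thinks better in code, here is a minimal sketch of the rule those instructions imply: six door open/close cycles inside a twelve-second window. The function, its names, and the timestamps are my own illustration of that logic, not anything from a manufacturer’s service manual.

```python
def reset_signal_sent(door_cycle_times, required=6, window=12.0):
    """Return True if `required` door open/close cycles occurred
    within `window` seconds of each other.

    door_cycle_times: timestamps (in seconds) of each completed
    open-and-close of the washer door, in chronological order.
    """
    if len(door_cycle_times) < required:
        return False
    # Slide over every run of `required` consecutive cycles and check
    # whether that run fits inside the window.
    for i in range(len(door_cycle_times) - required + 1):
        if door_cycle_times[i + required - 1] - door_cycle_times[i] <= window:
            return True
    return False

# Six cycles, two seconds apart: ten seconds total, so the
# hypothetical "reset" signal would fire.
print(reset_signal_sent([0, 2, 4, 6, 8, 10]))   # True
# Six cycles, three seconds apart: fifteen seconds, too slow.
print(reset_signal_sent([0, 3, 6, 9, 12, 15]))  # False
```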

In my mind, I was already at Sears writing the check for a new washing machine, but I considered the idea of trying this step-by-step process on a ‘what the heck’ basis. I thought this option might have a better chance of working than stabbing myself in the eye would, so I tried it, and it … it worked. It worked so well that we did it twice just to convince ourselves that it actually worked.

I went back to the website that said, “This is a common fix that many appliance repair mechanics use – it works on about 50% of all washing machines.” This led me to wonder how many times an appliance repairman has removed the back panel on a washing machine while we were in the room. How many of them fiddled with the machine until we left that room? How many of them then executed the steps of this master reset and called us back to show us their prowess, and hand us a bill of $130 for parts and labor?

“You just needed a new flux capacitor, and I happened to have one on me,” they said to our amazement.

How many of us were so relieved that our old washing machine now works, and that we do not have to pay $300 for a new one, that we didn’t question it? How many hundreds of thousands of dollars have passed from desperate customers to appliance repair mechanics over the years and decades in which this master reset option has been available to us? How many new washing machines have desperate customers purchased to replace a washing machine that most people, salesmen or not, will tell you is cheaper to replace than fix? How many of those same washing machines just needed a master reset? This led me to two conclusions: I could either become an appliance repairman that specializes in fixing washing machines, and fix 50% of them, or I could spread the word and hopefully prevent others from being duped by repairmen and salespeople that tell their customers it is in their best interests, over the long haul, to just buy a new one.

I Could Be Wrong, But …


“You’re wrong,” a friend of mine said. “You’re wrong about me, you’re wrong about these little theories you have about other people, and you’re so wrong about so many things that I’m beginning to wonder if you might be just plain stupid.”

I don’t care what level of schooling one achieves, or the level of intelligence they gain through experience, a charge as harsh as that hurts. The subject of such an assessment might attempt to defuse the power of the characterization by examining their assessor’s intelligence level, and their motivations for making such a charge, but it still leads to some soul searching.

“How can I be wrong about everything?” was the question I asked after she made the charge. “I may be wrong about some things, but how can I be wrong about everything?”

“I don’t know,” she said. “You just are.”

In the course of licking my wounds, I remembered something my eighth grade teacher once told me.

She gave me a harsh grade on a position paper that she assigned. I worked my tail off on that paper. I poured my soul into that paper. The reason I devoted so much energy to that paper had to do with the fact that I was not a good student. I rarely applied myself. I had this notion that if I ever did apply myself, my true intelligence would finally be revealed. This particular paper, I thought, was that opportunity. I also thought it might prove something to this teacher I respected. As a result, I looked forward to receiving her grade and all of the effusive praise I felt sure would follow. It was one of the few times in my life I looked forward to receiving a grade.

“I worked my tail off on that assignment,” I said when I held that graded paper in hand.

“It was mealy-mouthed,” she said. After she explained what mealy-mouthed meant, I informed her that she instructed us to be careful to present both sides in this paper. I said I did that. “You were instructed to provide evidence of the opposing opinion,” she said. “You presented too much evidence. The assignment involved taking a position. At the end of your paper, I wasn’t sure what side you were taking.” In the midst of the back and forth that followed, she added ten words that have stuck with me since: “If you’re going to be wrong, be wrong with conviction.”

***

“Have you ever considered the possibility that you might be wrong?” another person would ask me years later.

Some people pose this notion as often as possible. It’s a silky, smooth method of stating that they think the speaker is wrong, and so wrong that they might be stupid. They often pose the notion as if the speaker has never considered the idea before. If it’s not that, then they need the speaker to satisfy their needs, and their ego, before the speaker continues. As for the idea that I’ve never considered it before, I want to ask them if they’ve ever met my dad. The person that asked me this question, on this occasion, knew my dad well. They knew that my dad questioned everything that came out of my mouth. They also knew that my dad believed I was wrong about everything, and he assumed that I didn’t have the faculties to be an independent thinker. I considered this an insult in my younger years, but I now understand how difficult it is for a parent to believe that the person they knew as a toddler can arrive at independent thought, though it took me a while to reach that understanding. I don’t think my dad introduced this mindset to lead me to try to prove him wrong, but that was the result.

The interesting dynamic in these conversations is that prolonged involvement with a person that makes such a charge will reveal that they’ve never considered that they could be wrong themselves. Their vantage point is often that of the contrarian, and that contrarian challenges what they consider a status quo relative to their own life. This mindset does not lead to reflection on one’s own set of beliefs. They have focused their energy on refuting the speaker’s words, and “Have you ever considered the idea that you might be wrong?” is the best weapon they have in their arsenal.

The ideal method of deflecting further questions of this sort is to qualify every statement with, “I could be wrong, but …”. As I’ll note below, I used to do this, but I found it tedious after a while.

***

I could be wrong, but I think any attempt a person makes to describe human nature is going to be fraught with peril. Most people will not agree with such descriptions, and they might view that person’s conclusions as simplistic, trite, and anecdotal. Some might even view the positions a person takes as so wrong they could be stupid.

In one regard, I view such assessments with envy. I don’t understand how one person can unilaterally reject another’s opinion with such certitude. I still don’t, as evidenced by the fact that I still remember my friend’s ‘You might be stupid’ charge more than twenty years after she made it. I assume that she dismissed the assessments I made of her so well that she doesn’t remember them, as she was as certain then, as I assume she is now, that she was right and I was not only wrong, but I could be stupid.

Somewhere along the way, I learned that one’s definition of human nature relies on the perspective they’ve gained through their interactions and experiences. If it’s true that definitions of human nature are relative, and that one author’s assessments are based on the details of their upbringing, then the only thing anyone can say with any certitude is that the best story an author can tell is the one listed in their autobiography.

What if I am as wrong as my friends have stated, and my stories don’t even come close to achieving what some would call a comprehensive study of human nature? What if every belief I’ve held over the course of the last twenty years is so off the mark, or so wrong, that it might be stupid? These questions should haunt every writer, artist, and theoretician that attempts to explain the nouns (people, places, and things) that surround them. For those plagued by the enormity of trying to explain the otherwise unexplainable, my advice is to pare it down to the knowable. An author can only write what they know, and oftentimes what they know is that which is told to them.

Those that know me often say that for all of my faults, I am a great listener. They also say that my curiosity appears genuine. I don’t listen with an eye towards developing content, in other words, but content is a natural byproduct to those that are curious enough to learn another person’s truth. The trick to achieving such a truth is to go beyond whatever personal roadblocks we place in front of those with whom we interact to the point of experiencing their triumphs and failures vicariously, until we are processing their autobiographies so thoroughly that they become a part of our own. Go beyond hearing what a person wants others to hear, to fortify a thesis, and listen to what these people are saying.

Some will dismiss some of the stories I use as anecdotal evidence of human nature. Some of them may be. To my mind, they explain the motivations of the characters involved, and the stories and theories that have shaped my definition of human nature, and presumably my autobiography, better than any other stories can.

If there is a grain of truth to the Chinese proverb, “A child’s life is like a piece of paper on which people leave a mark,” then those that preceded the author have shaped their definition of human nature. This is not to say that one’s definition of human nature is limited to experience. Yet, when we read theories and see movies that depict questions and answers, we’re apt to be most interested in those that apply to our own experience. So, the question a reader might ask is, ‘Why did these particular stories appeal to your theories?’ The only suitable answers I’ve been able to find are, “All theory is autobiography,” and “I’m telling my story, as I heard and responded to others.”

These quotes form the foundation of these pieces, coupled with an attachment, via a complicated circuitry, to the philosophy that drove Leonardo da Vinci’s numerous accomplishments. I don’t know if he said these actual words, but from that which I’ve read on da Vinci, questions informed his process more than answers, and I derived a quote from that: “The answers to that which plagues man can be found in the questions he asks of himself.” The second is a direct quote from playwright Anton Chekhov: “It is the role of the storyteller to ask questions, not to answer them.”

As such, the curious reader might find more questions than answers in these stories, and they may not derive anything beyond simple entertainment, but to the author each story comprises a central theme: the questions I have regarding motivation. The goal of each of these pieces was to explain, to one curious mind, the nature of mankind. The answers hit me based on the questions I asked people in the interactions I had, from my very small corner of the world. Some of the people I interacted with were on the fruitloopery index, and some of them were a bit delusional, but most of the characters of these stories appeared so normal that I thought they might be boring, until they told their story. When they told their story, I asked suitable questions, the characters opened up, and I engaged with the storyteller until I all but physically entered the dark caverns of their mind.

Even though most of these stories are based on real life experiences, there will always be some readers that require “I may be wrong, but …” qualifications, lest they view the author as obnoxiously sure of himself. This reader should wonder how interesting it would be if an author qualified all of their characterizations and conclusions with various forms of “I could be wrong here, but …” For these readers, I would suggest they find another author. Those authors are out there, and I’ve read them. They spend so much of their time dutifully informing their readers that they’re not “obnoxious blowhards” that they end up saying little more. It’s so redundant and tedious that I can’t help thinking that if those authors fear they might be wrong, they should be so with conviction.

Rilalities X


Are you offended? Have you ever met someone that was easily offended? Have you ever told them that that gives the other side ammunition? Their response centers on the idea that it’s not them. They’re not easily offended. They just find the other party offensive. A younger Dennis Prager, the talk show host, took questions from the audience after a speech. A woman asked Mr. Prager a question. In the course of that question, she informed him that his views on the subject of her question offended her. When she finished her question, Dennis Prager answered it. He then went back to the idea that he offended her. “You said you were offended,” he said. “Why were you offended?” The two went back and forth for a bit before it became clear that her basis for declaring Dennis Prager offensive was that Dennis Prager had a different view on the subject. An older, wiser Dennis Prager looked back on that Q&A and suggested that the sole reason the woman was offended was that she disagreed with his opinion on the matter. “This,” Dennis Prager said, “is what is going on in our culture today. Too many people confuse having a different opinion with being offended.”

We all believe that we have special insight on a given subject that leads us to know more than others. The others could be wrong, and they could be ill informed, but those that are offended believe it’s more likely that the others have a nefarious motivation for believing the way they do. Some of us do have a motive, and some of those motives are nefarious. We cannot discount that. We can say, however, that not everyone that disagrees with another has a nefarious reason for doing so. This is what we call painting with a broad brush. When a loved one disagrees with us, we know that we can’t paint them with this broad brush, so we find, or fabricate, a motive for their disagreement with our impassioned pursuit of the truth. It seems impossible that educated people that have put some thought into their opinions can disagree with ours, so the only answer can be that they’ve arrived at their notion by nefarious means, and that offends us. Claiming offense seems like a shortcut to persuading another of our views. It’s a way of saying that I hold passionate beliefs based upon my special insight into the human condition, and you are not only wrong and lacking by virtue of your limited insight, but you are irredeemable.

Here’s how it works, for the uninformed. If a member of an audience hears a comment from someone with whom that audience member shares a worldview, or someone they like on a personal level, it doesn’t matter what that comment is; the audience member finds a way to support, excuse, or forgive that provocateur’s comment. The general thesis of their reply is, “I know what’s in his heart.” If a provocative point comes from an individual that has an opposing worldview, it doesn’t matter what’s in that person’s heart. In an attempt to portray themselves as well informed, the offended react to the provocateur’s point. In the face of what they deem to be an offensive statement, they react. They don’t argue against the merits of the case the provocateur presents, and they don’t offer a substantive counterargument. They react, and that reaction is to claim offense. Being offended permits them, and some would say obligates them, to be offensive in return.

Bowie: The difference between rock stars and musicians/artists is a wide chasm. The groups AC/DC, Eagles, and ZZ Top developed a formula that consumers enjoyed, and they enjoyed the formula. The bond between the two was such that the rock stars didn’t venture outside the formula. Even fans who would argue that point would have to concede that the groups put less effort into making their albums as different from one another as the albums in David Bowie’s catalog. The consumer never knew what to expect from David Bowie. Most of us now know the history of David Bowie, and we now assume that long-term success was a foregone conclusion, but a broader look at his career suggests that Bowie could’ve rested on his laurels after delivering Hunky Dory and Ziggy Stardust and the Spiders from Mars. How many artists would’ve sold their souls for half of the longevity of these two albums? How many artists would’ve done whatever it took to carve out the niche Bowie did in the rock world with these two albums? How many artists would’ve then released various incarnations of the formula found on those two albums, at such a young age? How many of those same artists would’ve been so grateful for the financial support that the record industry offered them to achieve such success that they would’ve been susceptible to its advice? Bowie could’ve had a successful career based almost entirely on the Ziggy Stardust character. What David Bowie decided to do was retire the Ziggy character soon after he achieved a peak with it. Three years later, he delivered an entirely different sound in Young Americans, and five years after Ziggy Stardust, he delivered three albums (in the space of two years) that demolished everything he built to that point and rebuilt a new sound for himself, in what some call his Berlin trilogy.

The thing with invention, and reinvention, is that an artist is bound to disappoint those that expect a more regular, consistent product. The thing with experimentation, on par with some of David Bowie’s discography, is that not all of it will work. No one that listened to Ziggy Stardust for the first time would expect that artist to produce the Low and Scary Monsters albums. Those albums are a stark departure from that which preceded them, as are Hours, Heathen, and Blackstar. I’ve listed but a few albums here, but most of the albums in Bowie’s catalog had an individual beauty that any music lover should explore. Not everything Bowie touched turned to gold, of course, but I would say that he, more than just about any artist in his rarified air, believed that the essential ingredient of the artist was to take a risk and pursue avenues their audience might not, and oftentimes did not, find entertaining. It is for this reason that I list David Bowie at, or near, the top of the list of rock stars that also happen to be musicians and artists.

These Dreams: Every person has dreams, hopes, and aspirations. Our individual dreams describe us as well as anything else does. I knew a person that believed that he discovered a ticket to ride. He spent a number of years compiling a VHS tape of nude scenes from movies and television shows. Before anyone begins assigning modern techniques to this pursuit, they should know that my friend made this tape in the basement of a home, in the Midwest, in the 80s. My friend had no technical equipment. He had a pen, a notepad, and a VCR. My friend had no idea how many hours he logged compiling this tape, but he had to watch a movie, document the minute mark at which the actress removed her top, wait for the scene to arrive in the second viewing, and hit record at the perfect moment. For those that don’t remember, the cable channels of that era assisted my friend by replaying the same movie repeatedly. My friend spent years collecting these scenes, and he swears he had a three-hour tape almost full of, on average, four-to-five-second scenes, when his sister found the tape and recorded over it. She recorded the movie Vamp over it, for those interested in history.

Much later on in our friendship, I found a book that documented these scenes for him, so he no longer had to do it. I gave it to him for his birthday. He considered that book a bittersweet present. I was confused. I didn’t see how he could be anything less than overjoyed at the prospect that he was onto something with that tape. I told him the book had become a best seller.

“I should’ve written that book,” he said. “That book led me to the realization that I wasted years of my life making a tape that wouldn’t have seen the light of day. I was a dumb kid,” he added. “I didn’t know anything about licensing and obtaining a person’s rights to use their image. That book had those scenes documented down to the minute, with descriptions of those scenes, just as I had. You joked about those little notepads, but I filled them with descriptions of the scenes in which Hollywood’s brightest stars showed their hoo hoos. It also had the exact minutes and seconds into the movie at which those top stars removed their tops. I didn’t think of rating those scenes, like that author did, but if I had spent some time writing a book on it, I probably would have come up with that. I could’ve made some real money off a book like that.”

Objectivity versus obliviousness: A friend of mine, we’ll call her Fawn, opened a story from her life with the qualifier, “This is not a story that you will view objectively.” She said, “I don’t want you playing devil’s advocate with me. I just want you to listen.” When she finished, I went silent, as a form of rebellion against her direct order. “Well, what do you think?” she asked. I told her that she had not permitted me to answer. She said I was permitted to answer; it just had to be within the parameters that she drew up.

Everyone wants their listener to side with them in a story from their life; some just want the listener to listen and respond with a comment that sides with them. Most are not this blatant about it. I thought her qualifier was a hilarious comment on the idea that I rarely take her side.

Another friend, a woman named Maddie, informed me that her friend Patricia invited her to lunch at a restaurant. Maddie informed me that she reluctantly agreed to meet Patricia in this restaurant. After agreeing to go, Maddie decided that she wouldn’t be going. Maddie informed me that she had no other plans. She just didn’t want to go. Maddie also admitted that she never attempted to call Patricia beforehand to inform Patricia that she had changed her mind. Maddie then informed me that by not going, she would be leaving Patricia alone at that table in the restaurant. As the morning hours crept toward noon, Maddie confirmed her decision not to go. Rather than go through the painstaking process of developing an excuse, “I just blew her off,” Maddie told me.

After the questions and answers established the particulars of this situation, I asked, “How would you like it if she did that to you?” I considered this a time-honored question that my dad asked me so many times that it’s an ingrained response. I dare say that most people have a version of this question ingrained in their brain. I didn’t consider this question a brilliant display of my skills, and I didn’t consider it confrontational. I considered it a question that my dad would’ve asked of me, if I had relayed such a story to him. It’s a question we ask of one another when we think the other side doesn’t see the error of their ways.

Maddie had apparently never had a parent put her through the character-building exercise of viewing matters such as these objectively. She informed me that Patricia wouldn’t do that to her. “You’re missing the point,” I said. In the course of this email exchange, I set off a firestorm by saying, “I have to tell you that I think what you did was wrong. It would be one thing if you had conflicting plans, or if you had called Patricia to cancel your lunch, but leaving her at the restaurant alone was wrong.” This was the gist of my reply. It might have been a little longer, but I can report with confidence that I did not disparage Maddie’s character in any way. I did write a harsher email, I must confess, but I deleted it. I sent a second email that omitted my personal feelings on the matter. I wanted Maddie to continue to be my friend, but I thought someone needed to tell her what she did to Patricia was wrong.

This set Maddie off, I would later learn. She couldn’t understand why I would do this. She spoke to her brother, my best friend, to try to understand why I would say such a thing to her. They both knew that I had no allegiances to Patricia, so they couldn’t understand how I could condemn Maddie’s character in such a way.

“What’s wrong with your friend?” Maddie asked her brother. “He’s freaking out. Accusing me of stuff. He’s hysterical.” My friend, her brother, asked her for the details of the story. Maddie told him. He believed that it was all about him. He had a history of telling me that he would show up to a restaurant, and then he wouldn’t show. He did this to me more than ten times. He believed I was harboring ill will towards him. I could see how he would think that, and I could even see that he might have viewed me as sympathetic to Patricia in that regard, based on our history, but I can tell you that I didn’t consciously call upon those moments in my defense of Patricia. I am the type that will judge people for their actions, as often as I expect them to hold me accountable for mine, but that had no bearing on my exchange with Maddie. In the email exchange I had with her, she provided me a scenario, and I reacted. If there had been prior occasions that matched that one, I would’ve provided that qualification in my answer.

As for the hysterical charge, our conversation occurred via email, so there was no way she could’ve determined if I was hysterical. I wasn’t hysterical. I just thought it was wrong, and I think 99% of the population would agree with me. Maddie is a princess though. Maddie lived a life where she could do no wrong, and she never had people call her out like that. Therefore, even though she couldn’t say I was wrong, she found an interesting way to make me the bad guy.

I think the two parties concerned should applaud me for my objectivity in this matter. It’s true I have no allegiances to Patricia, but Maddie was my friend until this argument. I could have viewed this episode from her perspective, but I didn’t. I made an effort to be objective. I tried to give Maddie every out possible. I asked her if Patricia had ever done this to her in the past, and I asked if Patricia had ever done anything that warranted such an action on Maddie’s part, as a form of revenge, and Maddie assured me that Patricia hadn’t. I don’t think she knew what I was getting at.

No matter how many times I experience a situation similar to this, it amazes me how oblivious some people can be. My dad raised me to ask that “How would you like it if they did that to you?” question. He raised me to abide by the “Treat others the way you want to be treated” credo that we all know, and we all nod our heads in agreement to it. The years I’ve spent interacting with people have taught me that most people don’t abide by the tenets they nod their heads to, but the obliviousness to confronting it in a given situation often shocks me.

I can see how an outsider that doesn’t know Maddie might think some form of guilt guided her into projecting me into the role of the bad guy, but I know Maddie. I know that guilt is not on her wheel of emotions. I believe her attempts to understand my simple reaction to her real-life scenario were genuine. When she couldn’t find my motivation for condemning her, because it made no sense to her that I should consider her actions wrong, she deemed me hysterical. When that didn’t make sense to her, she approached her brother. He came up with an answer, a plausible answer that I hadn’t considered, and the two of them were satisfied with that answer. The idea that telling someone you will have lunch with them, only to blow them off and leave them at that restaurant alone, is the wrong thing to do didn’t even enter their conversation. It may sound like there’s more to the story on the part of Maddie and her brother, but I can assure you there isn’t. They simply didn’t see it as wrong.

Prescription Drugs: “I think that we should take away the control doctors have over prescriptions,” a friend of mine once said. How many problems in our country are drug-related? How many people have progressed from using illicit drugs to prescription drugs? How many more problems would result from the population having unfettered access to prescription drugs? At this point in a theoretical situation such as this one, the libertarian would suggest that we don’t give people enough credit. One could suppose that suspecting widespread chaos is unilaterally cynical. Yet, my counter proposal is that it’s not cynical to state that good and honest people, experiencing chronic pain, can accidentally develop a habit of taking painkillers as a means of soothing that pain. It’s also possible that these good and honest people can either ignore the harm these drugs can do in their quest to seek relief, or they might not have a thorough understanding of the harm some of these drugs can do. A possible overdose could occur if informed third parties do not govern usage. Some of these informed third parties rely on test studies and outside research to understand the benefits and harm of these drugs. They might better inform the chronic pain sufferer of the damage they may do, they might advise curtailing use, and they might suggest a less addictive alternative.

I attempt to be as libertarian as anyone else, and I try to maintain an openness to suggestions regarding how America can become more libertarian, but I suspect that even one of the most libertarian politicians in Washington, the ophthalmologist Rand Paul, would agree that keeping access to prescription drugs restricted is a good thing. The answer my friend had was to take away all controls, so that we might thin the herd. You’ll have to trust me on the characterization of my friend when I say that he was not joking.

Death: We will exit our celestial plane on a waterslide. A centrifugal force greater than gravity will pull us to the portal. The force will be such that it takes our breath away. It will dawn on us, before we hit the portal, that we are dead. We will consider all we’ve left undone before we hit the slide that will take us to our next existence. Those thoughts will consume us to the point that not only will we not enjoy the ride, we won’t remember it. At the end of our ride, we will enter our local bar. The bar will be so close to our home that we will see our house on the hill. The lights will be out. Our family members will still be sleeping. We will wonder about the effect our departure will have on their lives. We will sit with many people in this bar, some of the associates we knew in life, some of the friends, and some of our loved ones that have passed on. They will tell us that this bar is our way station, our purgatory if you will, to ease us into the transition of our afterlife. They will tell us that the mystery of life is beyond most mortals, and that the only thing we do understand is that it moves on. This will soothe us and depress us. We were never as vital to their existence as we once thought. We will eventually run into an individual whose existence mirrors ours. They will tell us, “Life goes on. My son even laughed at my funeral. He wasn’t disrespectful. He wasn’t laughing at me. Somebody told a well-placed joke that had nothing to do with anything, and he laughed. He laughed hard. I find it a depressing exclamation point on the idea that life goes on.”

Rilalities XI


Electoral College: We can provide an answer to the debate over whether the Electoral College is an outmoded way of electing presidents in two simple sentences: America is a Representative Republic. It is not, as some have suggested in a variety of ways, a democracy. The distinction, as it pertains to the Electoral College and presidential elections, is that the American voter is not voting for a presidential candidate when they cast a ballot, but for a representative that votes on their behalf in the Electoral College meeting that occurs a month after the election to determine its official winner.

Those of us that are not scholars cannot claim to know all of the ideas that went into the formation of America’s federal government, but one of the Founders’ goals was to create a system that made change difficult. They made it difficult to pass legislation, they made the Amendment process even more difficult, and they instituted numerous checks and balances on the powers of the branches. The Founders also instituted federalism to give the states more power, and thus provide an even greater check on federal power. From all of this, we can make the educated guess that for all the consternation the Founders’ system has caused legislators and their constituents, their goal was directed more towards stability than towards the equal representation that a democracy can provide.

In that vein, the Founders created the Electoral College. The Electoral College was, in effect, a check on the majority to provide some balance for the minority. The Founders knew that the majority would rule regardless of their efforts, but they did not want the majority (i.e. the passions of the mob) to hold a tyrannical rule over the interests of those in the minority. The Representative Republic form of government was their answer to allow minority interests, such as those in modern day Nebraska and Kansas, to have some say in the manner in which the federal government conducted affairs. The Founders believed that Rome’s version of a Republic was a superior form of government, because it allowed its representatives to make tradeoffs, or compromises, to form legislation for the common good. The Founders also believed that the people would hold these representatives accountable for their tasked role of providing representation. If America were a pure democracy, the interests of the larger states in our union would hold a tyrannical rule on the minds of national politicians.  

Some state that because such a large percentage of the nation’s population now lives in urban areas of California and New York, the votes of individuals living in Wyoming, Kansas, and Nebraska are given more prominence in the Electoral College system. They state that this violates the principle of voter equality, and they declare that this is a violation of democracy, and it is, but the United States of America is not a true democracy.
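To put a number on that prominence claim, here is a back-of-the-envelope comparison. The figures are my own illustration, assuming the 2010 census populations and the electoral vote allocations in force for the 2012-2020 elections (55 for California, 3 for Wyoming); the argument above cites no specific numbers.

```python
# Residents per electoral vote, assuming 2010 census populations and
# the electoral vote allocations used in the 2012-2020 elections.
states = {
    "California": (37_253_956, 55),
    "Wyoming": (563_626, 3),
}

for name, (population, electors) in states.items():
    print(f"{name}: {population / electors:,.0f} residents per electoral vote")

# California: 677,345 residents per electoral vote
# Wyoming: 187,875 residents per electoral vote
# A Wyoming ballot carries roughly 3.6 times the Electoral College
# weight of a California ballot, which is the disparity critics cite.
```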

Those that pose this argument rarely encounter a counterargument, for it is tough to argue against the ideas that our system of representation should be population-based to provide greater voter equality, and that a vote from a citizen in Kansas is disproportionately more important than a vote from a citizen in California. One of the many counterarguments is that the Founders based three-fourths of our governing bodies on equal representation, as opposed to providing population-based representation. The only branch of our government that provides population-based representation is the House of Representatives.

A proponent of what they believe to be the more equal representation provided by the Electoral College might be willing to concede that the system we have in place regarding presidential elections is inherently flawed. They might also be amenable to changing it, if the opponents of the Electoral College were willing to concede that the other two branches of our government should also provide population-based representation. If the proponent began his argument with the notion that we change the Senate to population-based representation, most opponents of the Electoral College might be willing to compromise on that, as that would give the larger states more power in the Senate. Would these same opponents be amenable to changing the Supreme Court into a more representative body? The proponent could argue that the unelected nine jurists on the Supreme Court do not represent the population as well as the judicial branch could if fifty-one jurists sat on the highest court in the land. (This proposition suggests that Washington D.C. be included, and we would deem it necessary to have an odd number of jurists, I suspect.) Not only would that provide more representation for a wider variety of interests on the Supreme Court, it would provide some dilution of the vast power the nine jurists currently wield. In this scenario, we could have Governors, or even State Legislatures, nominate jurists to make sure that the jurists represented their state well. The proponent could also argue that one president doesn’t represent the population well, and that we might want to consider having fifty-one presidents, or 435, or however many it takes to provide better representation.

Those that seek to “guarantee the Presidency to the candidate who receives the most popular votes in all fifty states and the District of Columbia” have made strides toward ending the original intent of the Electoral College. The first and last question these reform-minded citizens should ask themselves is this: if we are going to make changes to the federal government, the Electoral College, and the manner in which the government represents the people, how far do we take it?

The idea that these reformers only want to change that which furthers their agenda is obvious, but there are other agendas. That question begets another: ‘Can a reform movement make all of the people happy all of the time?’ Of course not, and they are not driven by that goal. Their goal is to satisfy a personal, partisan agenda. Most reforms begin as personal, partisan agendas, however, and if this action makes America a better place, then we should all be for it. That’s the question. Would this National Popular Vote bill make the country a better place? It would provide greater voter equality, of course, but the goal of the Founders was to provide the nation what they believed would result in long-term stability. Those efforts have resulted in the fact that America is still on her first Republic since 1776, while France is now on her fifth since 1792, so one could say that if stability was their goal, they have succeeded. If that stability is a direct result of all of the checks and balances on government power, including the check that the Electoral College places on what the Founders believed would result in a tyranny of the majority, what would be the unforeseen and unintended consequences of such a change?

Diet: “Pay attention to what you eat,” nutritionists say. We ignore some of the nuggets of information nutritionists provide, because some of them can go a little overboard. They suggest that we follow a plan that we don’t want to follow, from food we don’t want to eat, to smaller portions, to massive intakes of various vitamins and supplements. Most of us do not want to spend our free time reading ingredients and creating detailed charts of protein intake versus carbohydrates and fiber. That could be overwhelming, and it could leave us eating nothing but grain and tofu. We may do this short term, but we don’t want to deprive ourselves of the goodies that make life enjoyable. Yet, from every philosophy comes a nugget of useable information.

“If you are what you eat, why would I want to mimic the diet of a person from the Paleolithic Era (AKA the Paleo Diet), if that person had a life expectancy of thirty-five point four years if they were a man, and thirty if they were a woman? Why would I want to mimic anything from an era whose highlights consisted of some use of tools, art that was limited to cave paintings, and whose controlled use of fire came so late in their existence?”

The answer to these questions, say some, is anatomical. The answer lies in various places along what Rob Dunn of Scientific American calls “the most important and least lovely waterway on Earth”, and what he calls “a masterwork, evolutionarily speaking”. What Mr. Dunn is describing is the human body’s alimentary canal, or our digestive tract. Rob Dunn also states that while “most canals take the shortest course between two points, the one inside you takes the longest.” The theory behind the Paleo Diet, put simply, is that we eat only the food the human alimentary canal recognized before we added enhancements and preservatives to foods through various agricultural cultivations.

What’s better for the human body, margarine or butter? The competitor to butter lists the tale of the tape. The makers of margarine state that it is a vegetable oil based product, as opposed to butter’s saturated fats. They state that butter contains milk, and milk is a dairy product, and anyone that knows anything about losing weight knows to eliminate dairy from their diet. Butter contains 100 calories per tablespoon, a typical serving size. One serving has 11 grams of fat, and 7 grams of it is artery-clogging saturated fat, about one-third of your recommended daily value! It also contains 30 milligrams of dietary cholesterol (10% of your daily value). Butter also contains vitamins A, E, and K2, and it “contains a type of fat called butyric acid, which helps maintain colon health. It’s also rich in conjugated linoleic acid, a type of fat that may actually help protect against weight gain.”
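As a sanity check on those percentages, here is the arithmetic. The daily values I plug in, 20 grams of saturated fat and 300 milligrams of cholesterol, are the long-standing FDA reference amounts and are my assumption; the per-tablespoon figures are the ones quoted above.

```python
# Assumed daily values: 20 g saturated fat, 300 mg dietary cholesterol
# (standard FDA reference amounts, not taken from the label above).
SAT_FAT_DV_G = 20.0
CHOLESTEROL_DV_MG = 300.0

sat_fat_g = 7.0        # grams of saturated fat per tablespoon of butter
cholesterol_mg = 30.0  # milligrams of cholesterol per tablespoon of butter

print(f"Saturated fat: {sat_fat_g / SAT_FAT_DV_G:.0%} of the daily value")
# Saturated fat: 35% of the daily value, about one-third as the label claims
print(f"Cholesterol: {cholesterol_mg / CHOLESTEROL_DV_MG:.0%} of the daily value")
# Cholesterol: 10% of the daily value
```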

Margarine is a plant-based alternative, but some margarine contains trans fats. Some margarine products suggest that they contain no calories, but most of the products simply have fewer calories than butter, so margarine is the winner, right?

The question that Rob Dunn and most enthusiasts of the paleo diet ask, and the one that might be a usable nugget of information in the debate between butter and margarine, is this: which does your digestive tract consume in a quicker and more efficient manner?

The human digestive tract does not process the imitation egg, for example, as well as it does a natural egg prepared in the most natural manner possible. The theory holds that weight can be lost as a result of the digestive tract recognizing how to metabolize that egg in the most efficient, quickest, and most natural manner possible. The theory also holds that the more familiar our digestive tract is with the eggs, butter, meats, fish, vegetables, fruits, and nuts that could be found in the Paleolithic Era, the more it knows what to do with the food introduced to it, and the greater the health benefits.

I may be wrong in my assertions here, regarding the import of the Paleo diet philosophy, but I do not believe it calls for an exact mimicry of the diet of the Paleolithic man. Rather, it suggests that based on the current evolutionary design of the human body, we should study the diet of the Paleolithic man. We should take the nuggets of information that we believe made the Paleolithic man healthier, in lieu of the more processed foods whose additives and preservatives can inhibit the processing of food in the digestive system, and make choices about our dietary habits based on that information. The paleo diet does not call for a complete overhaul of our diet; it just provides details that allow humans to make choices. Mimicry is a stretch, in other words, but imitation is the sincerest form of flattery.

Historical Inevitability


The idea that history is cyclical has been put forth by many historians, philosophers, and fiction writers, but one Italian philosopher, named Giovanni Battista Vico (1668-1744), wrote that a fall is an historical inevitability. In his book La Scienza Nuova, Vico suggested that evidence of this can be found by reading history from the vantage point of the cyclical process of the rise-fall-rise, or fall-rise-fall recurrences, as opposed to studying it in a straight line, dictated by the years in which events occurred. By studying history in this manner, Vico suggested, the perspective of one’s sense of modernity is removed and these cycles of historical inevitability are revealed.

To those of us that have been privy to the lofty altitude of the information age, this notion seems implausible to the point of being impossible. If we are willing to concede the probability of a fall, as it pertains to a certain historical inevitability, we would only do so in a manner that suggests that if there were a fall, it would be defined relative to the baseline that our modern advancements have created. To these people, an asterisk may be necessary in any discussion of cultures rising and falling in historical cycles. This asterisk would require a footnote that suggests that all eras have had creators lining the top of their era’s hierarchy, and those that feed upon their creations at the bottom. The headline-grabbing accomplishments of these creators might then define an era, in an historical sense, to suggest that the people of that era were advancing, but were the bottom feeders advancing on parallel lines? Or did the creators’ accomplishments, in some way, inhibit their advancement?

“(Chuck Klosterman) suggests that the internet is fundamentally altering the way we intellectually interact with the past because it merges the past and present into one collective intelligence, and that it’s amplifying our confidence in our beliefs by (a) making it seem like we’ve always believed what we believe and (b) giving us an endless supply of evidence in support of whatever we believe. Chuck Klosterman suggests that since we can always find information to prove our points, we lack the humility necessary to prudently assess the world around us. And with technological advances increasing the rate of change, the future will arrive much faster, making the questions he poses more relevant.” –Will Sullivan on Chuck Klosterman

My initial interpretation of this quote was that it sounded like a bunch of gobbledygook, until I reread it with the latest social issue of the day plugged into it. What did a person think about that particular social issue as far back as a year ago? Have they had their mind changed on the topic? Have they been enlightened, or have they been proved right on something they didn't believe as recently as one year ago? If we do change our minds on an issue as quickly as Klosterman suggests, with the aid of our new information resources, are we prudently assessing these changes in a manner that allows for unforeseen consequences? This tendency we now have to change our minds quickly reminds me of the catchphrase mentality. When one hears a particularly catchy, or funny, catchphrase, they begin repeating it. When another asks that person where they first heard the catchphrase, the person that now often uses it, and didn't start using it until a month ago, says that they've always been saying it.

Another way of interpreting this quote is that with all of this information at our fingertips, the immediate information we receive on a topic, in our internet searches, loses value. Who is widely considered the primary writer of the Constitution, for example? A simple Google search will produce a name: James Madison. Who was James Madison, and what were his influences in regard to the document called the Constitution? What was the primary purpose of this finely crafted document that provided Americans near-unprecedented freedom from government tyranny, and rights that were nearly without precedent when coupled with the amendments in the Bill of Rights? How much blood and treasure was spent to pave the way for the creation of this document, and how many voices were instrumental in the Convention that crafted and created it?

Being able to punch these questions into a smartphone and receive the names of those involved can give those names a static quality. The names James Madison, Gouverneur Morris, Alexander Hamilton, and the other delegates of the Constitutional Convention that shaped, crafted, and created this document could become answers to a Google search, nothing more and nothing less. Over time, and through repeated searches, a Google searcher could accidentally begin to assign a certain historical inevitability to the accomplishments of these men, the notion being that if these names weren't the answers, other names would be.

Setting aside, for just a moment, my personal opinion that Madison, Morris, Hamilton, and those at the Constitutional Convention composed a brilliant document, the question has to be asked: could the creation of Americans' rights and liberties have occurred at any time, with any men or women, in the history of our Republic? The only answer, as I see it, involves another question: how many politicians in the history of the world have voted to limit their present power, and any future power they might achieve should their aspirations come to fruition? How many current politicians would vote for something like term limits? Only politicians that have spent half their lives under what they considered tyrannical rule would fashion a document that could result in their own limitations.

How many great historical achievements, and people, have been lost to this idea of historical inevitability? Was it an historical inevitability that America would gain her freedom from Britain? Was the idea that most first world people would have the right to speak out against their government, vote, and thus have some degree of self-governance inevitable? How many of the freedoms, opportunities, and other aspects of American exceptionalism crafted in the founding documents are now viewed as so inevitable that someone, somewhere, would've come along and figured out how to make them possible? Furthermore, if one views such people and such ideas as inevitable, how much value does one attach to them? If they attain a certain static inevitability, how susceptible are they to condemnation? If an internet searcher has only a loose grasp of the comprehensive nature of what these men did, and the import of these ideas for the current era, will it become an historical inevitability that they're taken away, in a manner that might bear out the philosopher Vico's theory on the historical inevitability of a fall?

I’ve heard it theorized that for every 600,000 people born, one will be a transcendent genius. I heard this secondhand, and the person that said it attributed it to Voltaire, but I’ve never been able to properly source it. The quote does provide a provocative point, however, one that I interpret to mean that the difference between one that achieves the stature of genius by scoring high enough on a standardized Intelligence Quotient (IQ) test to reach that lofty plateau and the transcendent genius lies in the area of application. We’ve all met extremely intelligent people in the course of our lives, in other words, and some of us have met others that qualify as geniuses, but how many of them figured out a way to apply that abundant intelligence in a productive manner? This, I believe, is the difference between what many have asserted is a genius, at a one-in-fifty-seven ratio, and the one in 600,000 born. The implicit suggestion of this idea is that every dilemma, or tragedy, is waiting for a transcendent genius to come along and fix it. These are all theories, of course, but they do beg the question: what happens to the 599,999 that feed off the ingenious creations and thoughts of others for too long? They also beg the question: if the Italian philosopher Vico’s theories on the cyclical nature of history hold true, and modern man is susceptible to a great fall, will there be a transcendent genius able to fix the dilemmas and tragedies that await the victims of this great fall?

Philosophical Doubt versus the Certitude of Common Sense


If philosophy is “primarily an instrument of doubt”, as Scientific American contributor John Horgan writes in the fifth part of his series, and it counters our “terrible tendency toward certitude”, can that sense of doubt prevail to the point that it collides with the clarity of mind one achieves with common sense? In an attempt to provide further evidence for the proclamation that philosophy is an instrument of doubt, Horgan cites Socrates’ definition of wisdom as the knowledge one has of how little one knows. Horgan also cites Socrates’ allegory of the cave, and its warning that we’re all prisoners of our own delusions.

“In Socrates’ Allegory of the Cave, Plato details how Socrates described a group of people who have lived chained to the wall of a cave all of their lives, facing a blank wall. The people watch shadows projected on the wall from objects passing in front of a fire behind them, and give names to these shadows. The shadows are the prisoners’ reality. Socrates explains how the philosopher is like a prisoner who is freed from the cave and comes to understand that the shadows on the wall are not reality at all, for he can perceive the true form of reality rather than the manufactured reality that is the shadows seen by the prisoners. The inmates of this place do not even desire to leave their prison; for they know no better life.”

“In the allegory, Plato (also) likens people untutored in the Theory of Forms to prisoners chained in a cave, unable to turn their heads. All they can see is the wall of the cave. Behind them burns a fire. Between the fire and the prisoners there is a parapet, along which puppeteers can walk. The puppeteers, who are behind the prisoners, hold up puppets that cast shadows on the wall of the cave. The prisoners are unable to see these puppets, the real objects, that pass behind them. What the prisoners see and hear are shadows and echoes cast by objects that they do not see.” 

What does Socrates’ cave symbolize? This allegory has probably been interpreted a thousand different ways over the thousands of years since Plato first relayed it. A strict reading suggests that the cave is a place where the uneducated are physically held prisoner. The people are also prisoners in a figurative sense, in that they’re prisoners of their own ideas about the world, formed from a narrow perspective. A strict reading would also detail that the philosopher is the one person in the story free of the cave, and thus an enlightened man that now knows the nature of the forms. One could also say that various caves litter the modern era, and that philosophers have a cave of their own. One could also say that those that remain in the philosopher’s cave for too long find that it, too, becomes an insular echo chamber in which they become prisoners.

Socrates bolstered this interpretation when he informed a young follower of his named Glaucon that:

“The most excellent people must follow the highest of all studies, which is to behold the Good. Those who have ascended to this highest level, however, must not remain there but must return to the cave and dwell with the prisoners, sharing in their labors and honors.”

A strict reading of this quote might suggest that the philosopher should return to the prisoners’ cave to retain humility. Another reading could lead one to believe Socrates is suggesting that it is the responsibility of the philosopher to share his new insight with the cave dwellers. A more modern interpretation might be that the philosopher must return to the cave to round out his newfound intelligence by commingling it with the basic, common sense of the other cave dwellers. Inherent in the latter interpretation is the idea that in the cave of philosophical thought, one might lose perspective and clarity and become a victim of the cave’s collective delusions.

The philosopher could accept an idea as fact based on nothing more than the groupthink of the philosophical cave accepting it as such. This philosopher may surround themselves with like-minded people for so long that they no longer see that cave for what it is. The intellectual might also fall prey to the conceit that they’re the only one not living in a cave. The intellectual might see all other caves for what they are, until they come upon their own, for theirs is the cave they call home. As Horgan says, citing the responses of “gloomy” students responding to the allegory of the cave, “If you escape one cave, you just end up in another.”

One of the only moral truths John Horgan allows in part five of his series, one that trends toward his own “terrible tendency toward certitude”, is the argument that “ending war is a moral imperative.” This is not much of a courageous or provocative point, as most cave dwellers have come to the same conclusion as Mr. Horgan. Most cave dwellers now view war as something to be utilized only as a last resort, if at all.

For whom are we issuing this moral imperative? That is the question I would ask if I were lucky enough to attend one of Mr. Horgan’s classes. If we were to issue the imperative to first world countries, I would suggest that we would have a very receptive audience, for most of the leaders of these nations would welcome our proposed solutions. If we were to send it out to the tyrannical leaders and oppressive governments of the third world, I am quite sure that we would have an equally receptive audience, as long as our proposed solutions pertained to the actions of first world countries.

Former Beatle John Lennon engaged in a similar pursuit with his “make love not war” campaign, but Lennon directed that campaign at first world leaders almost exclusively. Some of us now view this venture as a colossal waste of time. If Lennon had directed his moral imperative at the third world, and their dictators had been genuinely receptive to it, Lennon could’ve changed the world. If these third world leaders had agreed to stop slaughtering and starving their countries’ people, and had also agreed to avoid engaging in skirmishes with their neighbors, all of us would view John Lennon as a hero for achieving peace in our time. This scenario also presupposes that these notoriously dishonest leaders weren’t lying to Lennon for the purposes of their own public relations, and that they did their best to live up to such an agreement while quashing coups by tyrannical rivals with other plans. This is, admittedly, a mighty big asterisk and a relative definition of peace, but if Lennon had been able to achieve even that, the praise he received would be universal.

What Lennon did, instead, was direct the focus of his sit-ins and bed-ins at the leaders of Britain and the United States. The question I would’ve had for John Lennon is: how often, since World War II, have first world countries gone to war with one another? Unless one counts the Cold War as an actual war, or the brief skirmish in Yugoslavia, there hasn’t been a great deal of military action between the first world and the second world since World War II either. Most of what accounts for the need for military action, in modern times, involves first world countries attempting to clean up the messes that have occurred in third world countries.

If Lennon’s goals were as genuinely altruistic as some have suggested, and not a method through which he could steal some spotlight from his rival, Paul McCartney, as others have suggested, he would have changed the focus of his efforts. Does this suggest that Lennon’s sole purpose was achieving publicity, or does it suggest that Lennon’s worldview was either born or nurtured in an echo chamber in which everyone he knew believed that first world countries were the source of the problem when it came to the militaristic actions involved in war?

Those isolationists that acknowledge that most of the world’s problems occur in the third world suggest that if the United States and Britain would stop playing world police and let these third world countries clean up their own messes, we would achieve a form of peace. To these people, I would suggest that the world does have historical precedent for such inaction: Adolf Hitler. Some suggest that war with Hitler was inevitable. They declare that Hitler was such a bloodthirsty individual that he could not be appeased. Britain’s Prime Minister Neville Chamberlain did try, however, and the world trumpeted Chamberlain’s name for achieving “peace in our time”. Chamberlain’s nemesis in Parliament, Winston Churchill, suggested that Chamberlain tried so hard to avoid going to war that he made war inevitable. Churchill suggested that if Britain had engaged in more forceful diplomatic actions, actions that Germany could have viewed as war-like, such as attempting to form a grand coalition of Europe against Hitler, war might have been avoided. We’ll never know the answer to that question, of course, but how many of those living in the caves of the idealistic utopia of ending war as we know it would’ve sided with Prime Minister Neville Chamberlain, and against Churchill, in the lead-up to, and after, the Munich Agreement? How many of them would’ve suggested that Hitler signing the agreement meant that he did not want war, and that heeding Churchill’s warnings would’ve amounted to a rush to war? Churchill stated, and some historians agree, that the year that passed between Munich and Britain’s declaration of war left Britain in a weaker position and led to a prolonged war. How many of those that live in anti-war caves would’ve been against the proposal to form a grand coalition of Europe against Germany, because it might make Germany angry, and Germany could use it as a recruiting tool?

The point of listing these contrarian arguments is not to suggest that war is the answer, for that would be a fool’s errand, but to suggest that even those philosophers that believe they have the strongest hold on a truth may want to give doubt a chance. It is also a sample of a larger argument: while the philosopher’s viewpoint is vital to those seeking a well-rounded perspective, philosophers are not the only people in need of one.

If the only people a person speaks to in a day confirm their bias, they may need to visit another cave for a day. They may not agree with the other cave’s dwellers, but they may hear different voices on the matter that influence their approach to problem solving. The point is, if the only thing a student of philosophy hears in a day is doubt directed at the status quo, and that they must defeat that certitude, how far can that student venture down that road before they reach the tipping point of the fulcrum, beyond which everything they learn progressively divorces them from common sense?

In the hands of quality teachers and writers, philosophy can be one of the most intoxicating disciplines to explore, and some are so fascinated that they choose to follow it as their life’s pursuit. Those of us that have explored the subject beyond Philosophy 101, on our own time, have learned to doubt our fundamental structures in ways we feel compelled to share. This period of discovery can lead some of us to question everything that those that formed us hold dear. At some point in this self-imposed challenge to pursue more well-rounded answers to simple questions, some of us reveal that not only have we escaped the prisoners’ cave, but we’ve become prisoners in the philosopher’s cave. Few recognize the moment when their answers to the forms dancing on the wall reveal this, but those of us that have, have had an intruder inform us, “It’s a goat.”