Don’t Go Chasing Eel Testicles: A Brief, Select History of Sigmund Freud


We all envy those who knew, at a relatively young age, what they wanted to do for a living. Most of us experience some moments of inspiration that might lead us toward a path, but few of us ever read medical journals, law reviews, or business periodicals during our formative years. Most of the young people I knew preferred an NFL preview guide of some sort, teenage heartthrob magazines, or one of the many other periodicals that offer soft entertainment value. Most of us opted out of reading altogether and chose to play something that involved a ball instead. Life was all about playtime for the kids I grew up around, but there were other, more serious kids, who we wouldn’t meet until we were older. Few of them knew they would become neurosurgeons, but they were so interested in medicine that they devoted huge chunks of their young lives to learning everything their young minds could retain. “How is this even possible?” some of us ask. How could they achieve that level of focus at such a young age, we wonder. Are we even the same species?

At an age when so many minds are so unfocused, they claimed to have tunnel vision. “I didn’t have that level of focus,” some said to correct the record, “not the level of focus to which you are alluding.” They may have strayed from that central focus at times, but they had more direction than anyone I knew, and that direction put them on the path to what they ended up doing, even if it wasn’t as specific as I guessed.

The questions we have about what to do for a living have plagued so many for so long that comedian Paula Poundstone captured it with a well-placed joke, and I apologize, in advance, for the paraphrasing: “Didn’t you hate it when your relatives asked what you wanted to do for a living? Um, Grandpa, I’m five. I haven’t fully grasped the mechanics or the importance of brushing my teeth yet. Those of us of a certain age have now been on both sides of this question. We’ve been asking our nieces and nephews this question for years without detecting the irony. What do you want to do when you grow up? Now that I’ve been asking this question long enough, I’ve finally figured out why we ask it. Our aunts and uncles asked us this question because they were looking for ideas. I’m in my forties now, and I’m still asking my nieces and nephews these questions. I’m still looking for ideas.”

Pore over the annals of great men and women of history, and that research will reveal legions of late bloomers who didn’t accomplish anything of note until late in life. The researcher will also discover that most of the figures who achieved success in life were just as dumb and carefree as children as the rest of us were, until the seriousness of adulthood directed them to pursue a venture that would land them in the annals of history. Some failed more than once in their initial pursuits, until they discovered something that flipped a switch.

Those who know anything about psychology, and many who don’t, are familiar with the name Sigmund Freud. Those who know anything about Freud are aware of his unique theories about the human mind and human development. Those who know anything about his psychosexual theory know we are all repressed sexual beings plagued with unconscious desires to have relations with some mythical Greek king’s mother. What we might not know, because we consider it ancillary to his greater works, is that some of his theories might have originated from Freud’s pursuit of the Holy Grail of nineteenth-century science, the elusive eel testicles.

Although some annals state that an Italian scientist named Carlo Mondini discovered eel testicles in 1777, other periodicals state that the search continued up to and beyond an obscure 19-year-old Austrian’s attempt in 1876.[1] Other research states that the heralded Aristotle conducted his own research on the eel, and his studies resulted in postulations that the beings came either from the “guts of wet soil” or were born “of nothing”.[2] One could guess that these answers resulted from great frustration, since Aristotle was so patient with his deductions in other areas. On the other hand, he also claimed that maggots were born organically from a slab of meat. “Others, who conducted their own research, swore that eels were bred of mud, of bodies decaying in the water. One learned bishop informed the Royal Society that eels slithered from the thatched roofs of cottages; Izaak Walton, in The Compleat Angler, reckoned they sprang from the ‘action of sunlight on dewdrops’.”

Before laughing at any of these findings, one must consider the limited resources these researchers had at their disposal, given the science of their day. As is often said of young people, the young Freud might not yet have had the wisdom to know how futile this task would be when a nondescript Austrian zoological research station employed him. It was his first job, he was 19, and it was 1876. He dissected approximately 400 eels over a period of four weeks, “amid stench and slime for long hours,” as the New York Times described Freud’s working environment.[3] His ambitious goal was to write a breakthrough research paper on an animal’s mating habits, a subject that had confounded science for centuries. Conceivably, a more seasoned scientist might have considered the task futile much earlier in the process, but an ambitious 19-year-old looking to make a name for himself was willing to spend long hours slicing and dicing eels, hoping to arrive at an answer no one could disprove.

Unfortunately for the young Freud, but perhaps fortunately for the field of psychology, we now know that eels don’t have testicles until they need them. The eels Freud studied must not have needed them at the time, for Freud ended up writing that his total supply of eels was “of the fairer sex.” Freud eventually penned that research paper, but it detailed his failure to locate the testicles. Some have said Freud correctly predicted where the testicles should be and argued that the eels he received were not mature. Freud’s experiments ended in a failure to find the testicles, and he moved into other areas as a result. The question on this reader’s mind is how profound an effect this failure to find eel testicles had on his research into human sexual development.

In our teenage and young adult years, most of us had odd jobs that affected us, in a variety of ways, for the rest of our working lives. For most, these jobs were low-paying, manual labor jobs that we slogged through for the sole purpose of getting paid. Few of us pined over anything at that age, least of all a legacy we hoped might land us in the annals of history. Most of us wanted to do well in our entry-level jobs, to bolster our character, but we had no profound feelings of failure if we didn’t. We just moved on to other jobs that we hoped would prove more financially rewarding and fulfilling.

Was Freud’s search for eel testicles the equivalent of an entry-level job, or did he believe in the vocation so much that the failure devastated him? Did he slice the first 100 or so eels open and throw them aside with the belief that they were immature? Were there nothing but female eels around him, as he wrote, or was he beginning to see what had plagued the other scientists for centuries, including the brilliant Aristotle? There had to be a moment, in other words, when Sigmund Freud realized that they couldn’t all be female. He had to know, at some point, that he was missing the same something everyone else missed. He must have spent some sleepless nights struggling to come up with a different tactic. He might have lost his appetite at various points, and he may have shut out the world in his obsession to achieve renown in marine biology. He sliced and diced over 400 eels, after all. If even some of this is true, even if it only occupied his mind for four weeks of his life, we can feasibly imagine that the futile search for eel testicles affected Sigmund Freud in a profound manner.

If Freud Never Existed, Would There Be a Need to Create Him?

Every person approaches a topic of study from a subjective angle. It’s human nature. Few of us can view the people, places, or things in our lives with total objectivity. The topic we are least objective about, some say, is ourselves. Some add that we are the central subject of speculation whenever we theorize about humanity. All theories are autobiographical, in other words, and we pursue such questions in an attempt to understand ourselves better. Bearing that in mind, what was the subjective angle from which Sigmund Freud approached his most famous theory on psychosexual development in humans? Did he bring objectivity to his patients? Could he have been more objective, or did Freud have a blind spot that led him to chase the elusive eel testicles throughout his career in the manner Don Quixote chased windmills?

After his failure, Sigmund Freud switched his focus to a field of science that would later become psychology. Soon thereafter, patients sought his consultation. We know now that Freud viewed most people’s problems through a sexual lens, but was that lens tinted by the set of testicles he couldn’t find a lifetime ago? Did his inability to locate the eel’s reproductive organs prove so prominent in his studies that he saw them everywhere he went, in the manner that the owner of a rare car begins to see his car everywhere soon after driving it off the lot? Some say that if this is how Freud conducted his sessions, he did so in an unconscious manner, and others say this might have been the basis for his theory on unconscious actions. How different would Freud’s theories on development have been if he had found his Holy Grail, the Holy Grail of the science of his day? How different would his life have been? We could also wonder whether Freud would have switched his focus at all if his findings had brought him fame as a marine biologist.

How different would the field of psychology be today if Sigmund Freud had remained a marine biologist? Alternatively, if he still made the switch to psychology after achieving fame in marine biology as the eel testicle spotter, would he have approached the study of human development, and the human mind, from a less subjective angle? Would his theory on psychosexual development have occurred to him at all? If it didn’t, is it such a fundamental truth that it would’ve occurred to someone else over time, even without Freud’s influence?

We can state, without much fear of refutation, that Sigmund Freud’s psychosexual theory sexualized the beliefs many have about human development, a theory others now consider disproved. How transcendental was that theory, and how much subjective interpretation was involved in it? How much of that subjective interpretation derived from his inability to find the eel testicle? Put another way, did Freud ever reach a point where he began overcompensating for that initial failure?

Whether it’s an interpretive extension or a direct reading of Freud’s theory, modern scientific research theorizes that most men want some form of sexual experience with another man’s testicles. This theory, influenced by Freud’s theories, suggests that those who claim they don’t are lying in a latent manner, and the more a man says he doesn’t, the more repressed his homosexual desires are.

The Williams Institute at UCLA School of Law, a sexual orientation law think tank, released a study in April 2011 stating that 3.6 percent of males in the U.S. population are either openly gay or bisexual.[4] If these findings are even close to correct, this leaves 96.4 percent who are, according to Freud’s theory, closeted homosexuals of some sort. Neither Freud nor anyone else has been able to put even a rough estimate on the percentage of heterosexuals who harbor unconscious, erotic inclinations toward members of the same sex, but the very fact that the theory has achieved worldwide fame leads some to believe there is some truth to it. Analysis of some psychological studies on this subject yields quotes such as, “It is possible … Certain figures show that it would indicate … All findings can and should be evaluated by further research.” In other words, there is no conclusive data, and all findings and figures are vague. Some would suggest that these quotes are ambiguous enough to be used by those who would have their readers believe that most of the 96.4 percent who express contrarian views are actively suppressing a desire not just to support the view, but to involve themselves in that way of life.[5]

Some label Sigmund Freud history’s most debunked doctor, but his influence on the field of psychology, and on the ways society at large views human development and sexuality, is indisputable. The greater question, as it pertains specifically to Freud’s psychosexual theory, is whether Freud was a closeted homosexual, or whether his angle on psychological research was affected by his initial failure to find eel testicles. To put it more succinctly, which being’s testicles was Freud more obsessed with finding during his lifetime?

[1] https://en.wikipedia.org/wiki/Eel_life_history

[2] http://www.theguardian.com/environment/2010/oct/27/the-decline-of-the-eel

[3] http://www.nytimes.com/2006/04/25/health/psychology/analyze-these.html

[4] https://en.wikipedia.org/wiki/Demographics_of_sexual_orientation

[5] http://www.pbs.org/wgbh/pages/frontline/shows/assault/roots/freud.html

If you enjoyed this unique perspective on Sigmund Freud, you might also enjoy the following:

Charles Bukowski Hates Mickey Mouse

The History of Bloodletting by Mark Twain

The Perfect Imperfections of Franz Kafka’s Metamorphosis

James Joyce: Incomparable or Incomprehensible?

Rasputin I: Rasputin Rises


Octopus Nuggets II


Octopus Nuggets I discussed some unusual characteristics of our favorite cephalopod, including the idea that two-thirds of the octopus’s neurons are in its arms, the manner in which the three hearts of an octopus operate, some stories of their reproductive process, and the near-unprecedented loyalty a mother octopus extends to her offspring. We also discussed the ink cloud defense, and the fascinating pseudomorph the octopus creates when, presumably, a simple ink cloud doesn’t confuse the predator enough. If any of these characteristics fascinate the reader, I suggest they read that post first, as this second installment is an extension of that more elementary discussion of the characteristics of the octopus.


With the recent and largely refuted clickbait story that the octopus may have originated on another planet, my interest in the octopus was reborn. A word of caution here: the information in this second installment may blow your mind. I’m not going to suggest that you take a seat, as I am biologically predisposed to avoiding clichés of this stripe, but if anything happens to anyone while reading the final third of this piece, I hereby absolve myself of all responsibility.

Most of those who love stories regarding the surprisingly complex brain of the octopus have heard the myriad stories of the octopus’s ability to figure out puzzles and escape the best, most secure aquariums, and the tales of SCUBA divers playing hide and seek with an octopus. A writer for Wired, Katherine Harmon Courage, has presumably heard the same stories, and she has an interesting, provocative idea for why we should continue to explore the octopus through more research, as it might prove instrumental in developing a greater understanding of the human mind.

“If we can figure out how the octopus manages its complex feats of cognition, we might be closer to discovering some of the fundamental elements of thought – and to developing new ideas about how mental capacity evolved.”

As stated in the first installment, the octopus has more neurons in its arms than it does in its brain. I assume the arms and brain work in unison toward some sort of prime directive, but what if one of the arms disagrees? As Scientific American states, “Like a starfish, an octopus can regrow lost arms. Unlike a starfish, a severed octopus arm does not regrow another octopus.” So, if the brain directs the arm to perform a dangerous task, does an arm ever exhibit self-preservation instincts? Does an arm ever say something equivalent to, “I saw what you did to arm number four last week, and I witnessed you grow another arm, good as new, in a short time. I do not consider myself as expendable as arm number four was. I am a quality arm that has served you well over the years,” the arm says to the brain. “Why don’t you ask arm number seven to perform this function? We all know that arm is far less productive.” I realize this is largely a silly rhetorical question that no one can answer, but it gets to the core question: how much autonomy do the arms have?

Blue Blood: How many of us believed the tale that humans have blue blood, and that it only turns red when introduced to oxygen? The octopus actually does have blue blood, and as Laurie L. Dove writes in How Stuff Works, it’s crucial to its survival.

“The same pigment that gives the octopus blood its blue color, hemocyanin, is responsible for keeping the species alive at extreme temperatures. Hemocyanin is a blood-borne protein containing copper atoms that bind to an equal number of oxygen atoms. It’s part of the blood plasma in invertebrates.” She also cites a National Geographic piece by Stephan Sirucek when she writes, “[Blue blood] also ensures that they survive in temperatures that would be deadly for many creatures, ranging from temperatures as low as 28 degrees Fahrenheit (negative 1.8 degrees Celsius) to superheated temperatures near the ocean’s thermal vents.”

On the planning front, the Katherine Harmon Courage piece in Wired states that researchers have discovered that octopuses in Indonesia will gather coconut shell halves in preparation for stormy weather, then take shelter by climbing inside the two pieces of shell and holding them shut.

Courage’s Wired piece also suggested, “If you asked Jean Boal, a behavioral researcher at Millersville University about the inner life of octopuses, she might tell you that they are cognitive, communicative creatures. Boal attempted to feed stale squid to the octopuses in her lab and one cephalopod sent her a clear message: It made eye contact and used one of its arms to shove the squid down a nearby drain, effectively telling her that stale food would be discarded rather than being eaten.”

The freaky, almost unnerving element of this story, for me, lies in the details of Jean Boal’s account. The idea that an animal might exhibit a food preference suggests a certain level of intelligence, but I’m not sure that level of intelligence surpasses that of, say, the dog or the cat. The eerie part, for me, occurred in contemplating how the octopus relayed that message. Boal suggested that she fed the stale squid to a number of her octopus subjects, and when she returned to the first octopus in that line, it had waited for her to return. It looked her in the eye and shoved the stale squid down the drain, maintaining eye contact throughout the act. I wasn’t there, of course, and I can only speculate based on what Boal said occurred during this incident, but she made it sound like the octopus made a pointed effort to suggest that it not only didn’t want to eat what Boal served it, but was insulted by her effort to pass this stale squid off as quality food.

We all characterize our pets and other animals with human emotions and statements, but how many dogs and cats will do something more than sniff at the food and move along? How many will wait for a human to return, so they can be assured the message will be received, and how many will look the human in the eye before discarding the food in such an exclamatory manner? I don’t know if you’re anything like me, but the thought creeps me out, in the sense that I thought I had a decent framework for how intelligent these beings were.

The characteristics we’ve discussed in part I, and in the portion of part II you’ve read thus far, are fascinating to me, illuminating, and, as I say, unsettling to those of us who find comfort in the idea that humans are head and shoulders more intelligent than the other species. This next part may be where the reader reconsiders whether they should set up some reinforcements behind them.

Recent scientific discoveries suggest that the octopus can edit its ribonucleic acid (RNA). Boom! How are you doing? Did you forget to remove any sharp objects behind you? If the only thing keeping you upright is that you kind of, sort of don’t know what RNA is, don’t worry, I had to look it up too. The Google dictionary defines RNA as a nucleic acid that works with deoxyribonucleic acid (DNA) in that it “carries instructions from DNA for controlling the synthesis of proteins, although in some viruses RNA rather than DNA carries the genetic information.”

For those who don’t consider this a “Holy stuff!” fact, think about this: the next time you’re in your man cave engaged in a spider solitaire marathon, some octopus somewhere is in its cave reconfiguring its molecular structure to redefine its characteristics in a manner that will help it better escape a shark attack. One example might be the pseudomorph. The octopus may have sat in its cave one day realizing that the simple shot of ink was no longer confusing sharks the way it once did, so it reconfigured its typical ink-cloud-shooting abilities to produce a self-portrait that might confuse predators just long enough to secure its survival. As we will discuss later, octopus researchers aren’t sure why octopuses edit their RNA, but we have to assume it has something to do with predation, either surviving it or finding nuanced ways to perfect it. If you’re nowhere near as fascinated with this idea as I am at this point, you will have to excuse my crush on these cephalopods in the ensuing paragraphs.

An article from Business Insider further describes the difference between DNA and RNA, as it applies to editing them, by stating, “Editing DNA allows a species to evolve in a manner that is more permanent for future generations. This is how most species evolve and survive. When a being edits their RNA, however, they can essentially ‘try out’ an adaptation” to see if it works. One other note the authors of this piece make on this subject is that “Unlike a DNA adaptation, RNA adaptations are not hereditary.” Therefore, one can only guess that if an octopus discovers an RNA rewrite that proves successful for survival or predation, it can presumably teach this to its offspring, or pass the information along by whatever means an octopus passes along such information. (Octopuses are notorious loners that don’t communicate with one another well.)

A quote from within the article, from Professor Eli Eisenberg, puts it this way: “You can think of [RNA editing] as spell checking. If you have a Word document, if you want to change the information, you take one letter and you replace it with another.”
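
For readers who think in code, here is a minimal sketch of that spell-checking analogy, in Python. It is my own toy illustration, not anything taken from the cited articles, and every name and sequence in it is invented for the example; the point is only that a DNA edit changes the template every future copy is made from, while an RNA edit touches one copy and leaves the template alone.

    # Toy analogy only: RNA editing as "spell checking" a single transcript,
    # versus a DNA edit that every future transcript inherits.
    DNA_TO_RNA = str.maketrans("ACGT", "UGCA")  # crude, complement-style transcription

    def transcribe(dna):
        """Produce an RNA transcript from a DNA template (toy model)."""
        return dna.translate(DNA_TO_RNA)

    def edit_rna(rna, position, new_base):
        """Swap one letter in a single transcript; the DNA template stays intact."""
        return rna[:position] + new_base + rna[position + 1:]

    def edit_dna(dna, position, new_base):
        """Change the template itself; every later transcript carries the change."""
        return dna[:position] + new_base + dna[position + 1:]

    dna = "TACGGATTC"
    print(transcribe(dna))                     # the unedited transcript
    print(edit_rna(transcribe(dna), 2, "A"))   # a temporary tweak, "trying out" a change
    print(transcribe(edit_dna(dna, 2, "A")))   # a permanent change, inherited by new copies

In the analogy, the octopus spends its time in the second function, while most other species rely on the third.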

Research suggests that while humans have only about ten RNA editing sites, octopuses have tens of thousands. Current science is unable to explain why an octopus edits its RNA, or when the practice started in the species. I must also add here that I don’t know how researchers can determine with any certitude that an octopus edits its own RNA. I’m sure they examine the corpses of octopuses and compare them to others, but how can they tell that the octopus edits its RNA itself? How do they know, with this degree of certitude, that there aren’t many different strains of octopus with wide variations in their RNA strands? I’m sure someone will tell me that the process is far more elementary than I’m making it, and that I’m revealing my ignorance on this topic in this paragraph, but I’ve read numerous attempts to study the octopus, and almost all of them suggest that the live octopus is notoriously difficult to study. Some have described their rebellious attempts to thwart brain study as obnoxious. If that’s the case, then I have to ask whether the conclusions researchers reach are largely theoretical, based on studies of octopus corpses.

If it’s an embarrassing display of ignorance on my part to ask how we know that octopuses edit their RNA, is it more embarrassing to ask whether we know how they do it? For those who consider this a futile task, I again ask how we know that they do it in the first place. The answer to that question circles back to Katherine Harmon Courage’s provocative notion that “If we can figure out how the octopus manages its complex feats of cognition, we might be closer to discovering some of the fundamental elements of thought – and to developing new ideas about how mental capacity evolved.”

If we are able to do that, Gizmodo.com quotes scientists who suggest we might be able to root out a mutant RNA in our own strands and edit it in a manner that helps us cure a number of ailments heretofore considered incurable.

For those scientists who seek guidance on how to edit human RNA, the authors of the Business Insider piece cited above, David Anderson and Abby Tang, suggest that since scientists “have recently proven ways of using the [genome editing tool] CRISPR-Cas9 to edit RNA, perhaps they can learn a thing or two from the experts [octopuses].”

Scat Mask Replica III


1) The Rasputin Paradox. Are you involved in an enterprise in which one person’s alleged ineptitude is holding you back from realizing the vast potential of that enterprise? Is your enterprise one step away from removing that alleged ineptitude? Those who know the history of the Russian Empire know to be careful what they wish for. Some speculate that Grigori Yefimovich Rasputin had far less influence in the Russian Empire (circa WWI) than history details, and they double down on that by saying that the Romanovs would not refute what others said about Rasputin’s influence, because they enjoyed having Rasputin play the role of the scapegoat. If they did not know the level of blame others placed on Rasputin while he was alive, they definitely found out after his death, because once Rasputin was murdered, the focal point for the Empire’s ineptitude was gone. Those in politics, business, and personal crisis should note that casting blame on one particular person for the failure of an enterprise may prove cathartic in the short term, but once that person is gone, it might reveal more about the general ineptitude of that enterprise than any of the other players ever imagined.

2) “If you have the facts on your side, pound the facts. If you have the law on your side, pound the law. If you don’t have either, pound the table.” One of the more uncomfortable situations I’ve experienced involves someone pleading with me to accept them as a genuine person. It’s a gross oversimplification to suggest that anytime someone pounds the proverbial table to convince me of something, they’re lying. We’re all insecure about our presentations, and some of us pound the table even when we have the facts on our side. I know it’s easy to say, but those with facts on their side should relax and allow them to roll out as they may. The truth teller who finds it difficult to avoid pleading their case should also know that after we reveal enough supportive evidence, most will believe us, but some just enjoy watching us squirm.

3) Speaking of the genuine article, it has recently come to my attention that some pathetic soul stole at least two of the articles from this site. Some call this plagiarism, but I call it pathetic. If imitation is the sincerest form of flattery, I suppose I should consider it a compliment, but this is outright theft. It seems redundant to clarify the rules on this matter, but if a writer is going to “repost,” they are required to provide attribution. (For those unclear on the definition of this term, it means that a writer is supposed to inform their audience that they didn’t write the article.) Not only did this pathetic soul avoid attributing the article to me, but they also didn’t provide proper attribution for the quotes I used in the article they stole. So, this person (who provides no discernible path back to their identity) anonymously steals posts, presumably to receive checks from companies that pay writers to sport ads on their sites. I don’t care how much those sponsored ads pay; how does this person sleep at night knowing that the profession or hobby they chose is one in which they cannot produce their own quality material? If I were ever to stoop to such a desperate act, I would seek another profession or hobby.

4) The difference between selfishness and self-awareness. A common complaint about young men and women is that they’re too selfish. It’s the root of the problem, some suggest. I don’t know if it’s true, but if it is, I would suggest that those speaking out against it are delivering an incomplete message. My platform would suggest that these selfish types focus on self-awareness, and that they should seek it to achieve a level of fulfillment. We could view striving to achieve greater self-awareness as a selfish pursuit, but self-awareness can take several forms. Performing selfless acts, for example, can teach a person a lot about themselves, and it should be encouraged, as people performing many selfless acts can become more aware of themselves and more selfless. The process could lead to the opposite of the vicious cycle these complainers decry. If I had a pulpit, I would also declare that an individual can learn more about themselves through spirituality. I’ve been on both sides of the value of scripture, and I think this gives me greater perspective on the matter. I look at scripture and other Biblical teachings as a roadmap to personal happiness through reflection. Self-interest drives me to follow those teachings, because I believe it’s in my best interests to follow them. In short, I would play my sermon to the selfish predilections of the young. I hear sermons that suggest otherwise, and I can’t help but think that the priest is missing a beat.

5) As a former service industry employee, I’ve encountered my share of disgruntled customers. I could provide a list of examples, but the material of their complaints is irrelevant. Most experienced service industry employees know that the most disgruntled customers are the most disgruntled people. They might hate their kids, their spouse, and their life. Whatever the case, the discrepancy they find causes them to unload: “What kind of Mickey Mouse operation are you running here? Your ad says this item is on sale today for two bucks. If you think I’m going to pay more than that, you must think I’m stupid! Or are you singling me out based on my characteristics?” These statements are often a mere introduction to a heated exchange that reveals the effort of the disgruntled customer to achieve some satisfaction they can’t find elsewhere in life. A more confident customer would simply say, “Your ad says that this item is on sale today for two dollars.” Those of us who have experience in the service industry know how intimidating a confident presentation of the facts can be, especially from a more secure individual.

6) A new documentary captures an ant crawling down from a piece of cheesecake with a piece of it lodged in its mandibles. The makers of this documentary capture the ant’s progress in stop-action photography, which permits running commentary from various filmmakers on the brilliance of each segment. Where does the ant go, and what will it do with the small, round ball of cheesecake? This is the plotline of an amazing new documentary called Posterula. (Spoiler alert) The ant makes it off the plate, but the viewers don’t know if the ant ever takes the piece to the colony to feed the queen. This leads this viewer to believe that an as-yet-undisclosed sequel to this brilliant documentary is in the works.

Hi, I’m Rilaly, and if I were to take you on a tour of my young mind, this would be but an example of what you would read. Some suggest that such humor is too niche, and if that’s the case I would’ve niched my way out of the market. If I had one of my stories published, customers at bookstores would’ve walked past my serious pieces, thinking that I’m nuts, too far gone, and unserious. They probably still think that. I’m niche.

7) I landed upon the term “vague and flexible by design” the other day. The author of the term intended it as a compliment for the subject, but if they directed such a characterization at me, I would view it as an insult. I understand that we’re different people in different surroundings, and that we should all remain flexible with our ideals to prepare for new findings on the subject in question, but the “vague and flexible by design” compliment would register as a ‘no core’ insult to me.

8) What hotel or meeting space first decided to serve a ball of meat as a solitary entrée? Someone somewhere should’ve stepped in and said, “Whoops, you forgot the fixins.” Those who have attended more than twenty corporate galas, weddings, or other catered events are now more than accustomed to the items served in a buffet line. I now eat before I attend one of these functions, because I cannot eat another pinwheel, I’m burnt out on hot wings, and I hit my personal threshold on room-temperature potatoes au gratin somewhere around 2004. I am not a finicky eater, but I can no longer stomach this list of dietary choices. I will acknowledge that being American provides me the luxury of making odd and unreasonable dietary choices, but if I’m going to limit myself to one meal a day to maintain a plump figure, as opposed to a fat or obese one, I’m not going to eat something just because others provide it in a visually pleasing manner.

9) There is a difference between writing a grammatically correct sentence and quality writing. I took college classes on creative writing, I’ve read the MLA guides, and I’ve learned through word of mouth what leads to quality reading. I’ve fixed the passive-voice sentences, deleted the word “had” as often as possible, and tried to avoid what fellow writers call “the you-yous”. The goal for the writer is to adhere to the rules of writing while attempting to maintain a stream-of-consciousness style that makes for quality reading. It’s not considered grammatically incorrect to write that you may not enjoy this sentence, but writing that the reader may enjoy it, without the word you, is considered a more pleasant reading experience. I’ve also attempted to write “who” instead of “that”, and I’ve attempted to limit my need to use “that” too often. Example: “You don’t want to write that it was someone else that said something, when who said it is much more familiar to you.” In that sentence, fellow writers suggest using the word “Writers” to replace the first you, and “Readers” is an advisable replacement for the second you. Beta readers suggest that doing otherwise means the writer has a bad case of the you-yous. You is too familiar, and that is too unfamiliar, and you do not want to be too familiar or too unfamiliar. The first reason for following this rule is that the writer does not want to write in the manner they speak, because the way one speaks in one locale may not be as familiar to a reader in another locale. These standards set a common base for readers, free from colloquialisms. The you-yous also creep up on a writer in free flow, and they may not notice how redundant the use of the word is in their document. The question that haunts me is, do I want a perfect document to impress accomplished writers, or do I want to pleasure myself with a document that might have some flaws? The notion one writer lofted was that every writer makes mistakes, and we readers weave them into the cloth of our expectations, but is there a point when the mistakes distract from the whole?

10) “He’s such an idiot,” Teri said after her boyfriend left the party table to go to the bathroom. “He cheats on me all the time. For all I know, he’s arranged something in the bathroom. I’m serious. I can’t even trust him to go to the bathroom.” Such comments are so unexpected that they’re hilarious.

“Why the hell are you dating him then?” I asked. Room silencing, impulsive comments like these are my gift to the world. I can flatten the smile of any decent person from fifty yards with a single thought implanted in their brain.

The comment sat right with me, but the moment after I delivered it, I realized it was so loaded with complications that no one in their right mind would deliver it to a table of people gathered together for the sole purpose of mixing some laughter in with their fun. I thought it might add to the fun, or spur her into extensions on the joke, but I was wrong. I made her uncomfortable.

As soon as she recovered from the blow, aided by my discomfort, she revealed that she had locked herself into a certain cynical dynamic of life. She knew the world was full of it, and everyone around her was too, in one way or another, because she knew she was. She thought her beau was full of it too, but “He’s a nice guy…most of the time.” I didn’t know if that was her final answer, but I overemphasized my acknowledgement of it to suggest that was what I sought.

No matter how often I affirmed her answers, Teri kept coming at me with more. She said he was “funny and fun to be around.” She said he was good looking, and she said he did “sweet things” for her. I couldn’t get out of this uncomfortable spiral of my own making. I pretended to be interested, because I knew I had put her in the uncomfortable position of having to explain one of life’s most illustrative choices, but I was trying to end the episode with every word she said to me.

Most of us cannot explain our life-altering choices so well that we can weather interrogations. I knew this, but I thought I could explain most of my choices at the time. The question that even the most reflective must ask themselves is, is their base so solid that they make rational, informed choices in impulsive moments? I don’t think many reflective types would pass their own interrogations in the moment, for I think we color in the blanks later to make ourselves believe we made informed choices.

Teri told me he was a good man, with a good job, and he had an unusual curiosity about life that she found fascinating. I also learned that while it was obvious he had a restless, nervous energy about him, “He’s incredibly lazy. If he had his choice, he would spend his day on a couch.”

I still didn’t understand the dynamics of their relationship, even though she provided me numerous answers. I wouldn’t understand it for a while. I had no idea at the time that their relationship depended on the idea that she enjoyed playing the jealous girlfriend, because, I can only assume, she considered him worthy of her jealousy, and in a world of average men with no discernible qualities, that is something. He was the naughty boy, and he enjoyed that role. “We fight like cats and dogs,” she said with a gleam in her eye, “but then we have makeup sex.” I wondered if she had ever dated guys who wouldn’t cheat on her. I wondered if they wouldn’t fight with her. I wondered if they bored her. He provided her something to focus on other than herself. He was the dunce, but he was an amiable dunce. He provided her drama. He was always on the cusp of cheating on her. She also had a desire to date a guy she could be better than, and she wasn’t much. Either that, or there was a desire to care for something that could break. “He’s an idiot, he doesn’t know how good he has it,” she said more than twice. The guy was fulfilling the age-old male need to feel like a bad boy. Most guys need this coursing through their veins, and some girls apparently need a guy like this too.

11) Unhappy couples fascinate me. They don’t smile often, but smiles are a refuge of the simple-minded. They don’t hug, kiss, or touch very often, but they’re not that type of people. They’re emotionally distant people, and happy people make them sick. Do they have a greater understanding of who they are than we ever will, or are they jealous? She didn’t date in high school, and he was a broken boy. Death of a loved one breaks some, divorce breaks others, and still others experience a seismic betrayal that creates an irreparable break. Yet they found something in one another that they always wanted. As outsiders looking in, we can’t understand the allure, but the two of them have stayed together for years. Some stay in a job they hate because they fear the unknown. Do people stay in relationships for the same reason? He doesn’t speak often, and relatives find it difficult to strike up a conversation with him. He gives off the vibe that he’s not interested in what others have to say, and this affects the way others react to him.

My initial instinct was that he wasn’t interested in what I had to say, for reasons endemic to our relationship, until others informed me they’d had similar experiences with him. He’s more interesting when he drinks, but when the night is over, the participants realize he wasn’t really interesting in the truest sense of the word; he was just more interesting than they expected him to be. A couple of drinks loosen our inhibitions. A couple more might loosen them even more, until the potential exists for us to become interesting. That’s the mindset of the drinker anyway. I’m not sure if this is his mindset, but he does have a drinking problem. He is emotionally distant because those who formed him devastated him emotionally. Yet, in many ways, he appears satisfied with who he is.

12) No one is as boring as we think they are, but no one is as interesting as we think we are either. How many of us look back to our authentic years with the belief that we weren’t nearly as authentic as we are now, and how many of us will look back ten years from now with the same thought? One could say that the effort put into being authentic provides progressively diminishing returns. 

13) How many of us remember the first person who told us about America’s atrocities? Did they package it with a provocative statement such as, “This is something your moms and dads don’t want you to know about”? For those of us who are now parents, it’s probably been so long since someone introduced us to the dark side that we forget how intoxicating it was at the time. I don’t remember my first messenger, because I’ve heard about these atrocities so many times since that the retellings have all but drowned out that first voice. Thanks to a myriad of resources I’ve encountered since, I am now able to frame those atrocities against the virtuous acts America has performed throughout her history and arrive at the independent conclusion that America has been a noble nation overall. It did take me a while, however, to arrive at that conclusion.

Some might think that learning of the atrocities for the first time would leave the recipient feeling cold, disillusioned, and/or depressed that their parents sold them a pack of lies. In the combative environment of my youth, one of the many focal points of ridicule was naïveté. “Don’t tell me you believed all that baseball and apple pie crap?” someone would say in the aftermath of a discussion of America’s atrocities. I did, and those early messengers in my life provided me information to combat the characterization that I was naïve. I considered them more informed, brave, and righteous. I thought they were cooler than cool for speaking out against the marketing arm of America, and I thought they were treating me with the type of respect my dad never did.

Now that I’m a seasoned adult, I know my dad wasn’t necessarily lying to me, and he wasn’t withholding a truth, but he didn’t give me the whole picture either. He didn’t know some of the atrocities these messengers told me about, but there were incidents he did know about, and he neglected to tell me of them. Anyone who remembers their teenage mind knows how much we exaggerate the characterizations of our parents, especially when “truth tellers” package such information accordingly. Their presentations excited me in a way that’s tough to describe. I thought I was finally hearing the truth from someone.

A vital mindset for parents to have, while sharing our knowledge of American history, is that our kids are in a constant battle with their peers to avoid appearing naïve. For those worried about telling their children about the awful things the country has done, consider it ammunition to combat those stories with the stories of the country’s virtues. Our goal should be to instill a love of country in a comprehensive manner. Up to a certain point, we parents have told them what to think and how to think for so long that we may have a difficult time giving up those reins. On this particular subject, however, we need to present this information in a manner that allows them to decide, and we might even add that we understand it’s a lot to take in one sitting, so we should allow them to think about it.

If we don’t do this, the truth will rear its ugly head when we least expect it. Those who provide them this information will likely not frame it in the manner we think they should, and our kids might turn around and accuse us of lying, telling half-truths, and not trusting them enough to deal with such sensitive information. Whatever the case, we might never be able to win them back. My advice is that we teach them the virtues of this country and couple those with a healthy dose of the horrors some Americans have committed since the country’s birth. Do some research on the atrocities and prepare for the follow-up questions, because there will be questions. Once we’re done, we should repeat the cycle so often that by the time that cool, rebellious person tells our children “the things we don’t want them to hear,” they will turn on that person and say, “I’ve heard all of this a million times, and to tell you the truth, I’m sick of hearing about it.” If condemning your country in such a manner is difficult, much less teaching it to your child, ask yourself how you would prefer America’s atrocities be framed. Would you rather provide your child with a more comprehensive narrative, or would you rather someone who hates their country do it for you? One way or another, your child will learn this information.

14) I’m about 15 years into using devices to stream music on a daily basis at this point in my life, so it might seem a little odd to show appreciation now. Anytime I take a very short drive, I gain greater appreciation for the freedom technology has offered when I turn on my local FM stations and hear a DJ offer tidbits from their life. I’m not talking about morning show hosts, as I think I listened to one show decades ago, just to hear what everyone was talking about, and I never listened to another. When a DJ informs me about a day in their life, I switch the channel so hard my fingers hurt later. I don’t care about the private lives of celebrities, but I understand that some do. No one knows who these DJs are, and I think even fewer care. Yet, when they are on the clock, moving from one song to another, they tell us about their day. They tell us about a party they attended, a soup they enjoyed yesterday, and something their significant other said to them in the movie theater. Nobody cares! The only line we should hear from a radio DJ is, “That was one song, and here’s another.”

15) Most people have heard the quote, “The definition of insanity is doing the same thing over and over and expecting a different result.” The quote is widely attributed to Albert Einstein. Most people know this quote, but they only apply it to innovative qualities that appeal to them and their relative definitions of the status quo. When another is in the process of doing the same thing in a different way, their process receives scorn and ridicule. “Do you know the quote?” we ask. “Yes, but it doesn’t apply here. That just isn’t the way we do things.” Okay, but the way you do things hasn’t worked for decades now. The counterargument is that they’re on the cusp of it working and the new person could damage all the progress they’ve made. Again, they’ve been on the cusp for decades, and they might even recognize some merits of the innovative pursuit of the matter, but most innovators take arrows in the process.

Willie and Kenneth


“I have a death voice,” Kenneth Greene said after interrupting a conversation I was having with my fellow employees on break. Kenneth Greene was the manager of this restaurant, and the only time he interrupted our conversations in the breakroom was to inform us that the restaurant was so busy that we would have to cut our breaks short to help the staff out. When he first entered our breakroom, we thought that was what he was doing, but he looked so insecure about it.

Kenneth Greene operated from a baseline of insecurity. Kenneth didn’t think the staff took him seriously enough in the first few months of his tenure as our manager, so he grew a Fu Manchu. Kenneth’s Fu Manchu did not have handlebars, à la Salvador Dali; it was more late-’60s Joe Namath. Kenneth would never admit that he grew a Fu Manchu for the sole purpose of generating respect from his peers, but when that Fu Manchu grew to fruition, the psychological effect on him was all but emanating around his head. Kenneth Greene went from a greasy, overweight ginger with a mullet to a greasy, overweight ginger with a mullet and a Fu Manchu.

The psychological influence of the Fu Manchu became apparent when he progressed from a manager who asked his employees if they wouldn’t mind cutting their breaks short for business needs to a manager who instructed us to do so. Thus, when the new Kenneth Greene stepped into our breakroom, it appeared that the Fu Manchu might have lost its psychological influence. After a moment of hesitation, in which it appeared that Kenneth had something to say, he left without saying a word. When he returned, after apparently recognizing how vital this moment was to the new Kenneth Greene, he stared at me with renewed conviction.

“What’s up?” I asked.

“I have a death voice,” Kenneth Greene said.

“What’s a death voice?” I asked.

“I front a death metal band,” Kenneth said. “In my off time.”

Kenneth Greene’s goal, I can only assume, was to display a talent that matched the subjects of the discussion he interrupted. In that discussion, a friend and I spoke about the various artistic talents of those on the staff, and Kenneth Greene wanted us to know that he had a talent equivalent to those we were discussing. He wanted us to know that he was much more than the manager of a low-rent restaurant chain that would go out of business within a year, and he wanted us to know that this death voice was his gift and artistic calling.

‘Beauty is in the eye of the beholder’ is an expression that dates back, in various forms, to the Ancient Greeks. The reason such a notion exists, as Benjamin Franklin’s version of the expression suggests, is that at the core of one’s definition of beauty is an opinion.

I would never consider myself an arbiter of art, in other words, but I thought Kenneth Greene would have a tough road ahead of him if he hoped to convince those of us sitting in a restaurant break room that a skilled death voice belonged in our conversation about artistic talents. I was, as I always am, eager to have another prove me wrong.

I didn’t know what to do with this information, however, so I assumed that he wanted to show us. After several attempts to goad him into it, Kenneth decided against performing his death voice for us. I think he saw something in our faces that suggested that the moment after one lets loose a death voice in the middle of a restaurant breakroom, they become the person who let a death voice loose in the middle of a restaurant breakroom. When he invited us to hear it in person, at one of his shows, I could tell he knew we wouldn’t attend, but he needed to say something to get out of the uncomfortable situation he had created.

***

I thought Willie Bantner was a real character when I met him. Willie and I found that our backgrounds were similar, and I thought this was odd considering that our outlooks were so dissimilar. Willie’s worldview was foreign to my own, yet there was something about him I couldn’t quite put my finger on. This sense of familiarity became so hard to deny that it stirred feelings of déjà vu, until Willie revealed to me the actual character he was playing in life.

My initial inclination was that once one meets a significant number of odd characters in life, they begin to overlap. There are only so many odd characters out there, in other words, and I thought Willie reminded me of one of them.

These odd, weird sensibilities were the reason I was so fascinated with Willie Bantner. They were the reason I would go to him with very specific scenarios. I wanted to learn what he thought, why he thought what he did, and how someone could arrive at such a notion. The funny, thought-provoking things he said were the reasons we became friends. This friendship lasted for over ten years. Over the course of those ten years, I grew so familiar with Willie that his peculiarities were no longer so peculiar, but there was still that nagging sense of familiarity about him that plagued me.

When we began one of those lists that seem indigenous to the male gender, this one of the best television shows ever, we mentioned the usual shows we considered the best of their day. When we moved on to the honorable mentions, the list grew lengthy. I mentioned the show Family Ties. Willie agreed that the show should be on the list of honorable mentions. I added, “If nothing else, the show gave us Michael J. Fox, and the character Alex P. Keaton, and I think Alex P. Keaton was one of the best TV characters ever written.”

“I modeled my life after him,” he said. After some confusion, Willie clarified that he did not model his life after Michael J. Fox. He modeled his life after Alex P. Keaton.

Over the years, I’ve learned that one of the reasons young men swear so often is that they lack confidence. They don’t know how to articulate an opinion in a manner that will impress their peers. They are also unable, at this point in their lives, to provide detailed analysis of the subject of their opinion, so they coat those opinions in superlatives that they hope will provide cover for their unformed intellect. If one person says that Marlon Brando was the best actor of all time, another may agree with that person. Rather than enter into a detailed discussion of the sense of spontaneity Brando brought to his roles, or the fleshed-out nuances he brought to method acting that influenced a generation of actors, they say, “I’ve built a personal shrine to him in my bedroom.” When one person says that a movie was the scariest movie they’ve ever watched, another might say, “That movie was so scary that I didn’t sleep right for weeks.” In most cases, there were no shrines built or hours of sleep lost, but in the absence of detailed analysis, a young man thinks he has to say something over the top to pound the point home. I thought Willie was doing the same thing when he said he modeled his life after Alex P. Keaton. The more I chewed on it, however, the more I began to see a truth mixed into that admission.

I would watch him, going forward, with that admission in mind. The idea that the man modeled his reactions, his physical gestures, and his life after a situation comedy character became obvious once I had a conclusion to my search for that nagging sense of familiarity. Once I saw that elusive sense of déjà vu for what it was, I couldn’t believe I hadn’t seen it earlier.

I was also disappointed that my initial assessment of Willie Bantner proved so prescient. I thought he was a character, and he was, but not in the general sense I intended. I was disappointed to learn that individual experiences did not inform Willie Bantner’s personality as much as I thought, unless one considers tuning in to NBC’s early-to-mid-’80s Thursday night lineup at 7:30 Central to be an individual experience.

Willie Bantner made me think, he made me laugh, and I thought he earned it all with ingenious, individualistic takes. After his admission, I began to wonder how many of those comments were off the cuff, and how many of them he lifted from Family Ties scripts. The unique personality I wanted to explore became, to me, a carefully manufactured character created by some screenwriters in a boardroom on Melrose Avenue. The odd sense of familiarity plagued me, as I wrote above, but I can’t remember putting much effort into trying to pinpoint the core of Willie Bantner’s character. If I had, I probably would’ve overestimated what influenced his core personality, but that’s what young men do. Even if I had been able to temper my search to more reasonable concepts, I don’t think I would’ve considered something as banal as watching too much TV to be the sole influence on what I considered such a fascinating personality, until he admitted it.

Now, I have no illusions that I’ve scrubbed the influence of TV characters from my personality. I imagine I still have some remnants of the Fonz in my cavalcade of reactions, and I’m sure that Jack Tripper is in there somewhere. I also know that an ardent fan of David Letterman could spot his influence somewhere in how I react to the people, places, and things that surround me, and I think it’s almost impossible to develop a personality without some degree of influence from the shows we watched every week for years. To model one’s entire life on one fictional television character, however, speaks of a level of insecurity I think the American Psychiatric Association should consider in the next edition of the Diagnostic and Statistical Manual of Mental Disorders.

Philosophical Doubt versus the Certitude of Common Sense


If philosophy is “primarily an instrument of doubt”, as Scientific American contributor John Horgan writes in the fifth part of his series, and it counters our “terrible tendency toward certitude”, can that sense of doubt prevail to a point that it collides with the clarity of mind one achieves with common sense? In an attempt to provide further evidence for the proclamation that philosophy is an instrument of doubt, Horgan cites Socrates’ definition of wisdom as the knowledge one has of how little one knows. Horgan also cites Socrates’ allegory of the cave and its warning that we’re all prisoners of our own delusions.

“In Socrates’ Allegory of the Cave, Plato details how Socrates described a group of people who have lived chained to the wall of a cave all of their lives, facing a blank wall. The people watch shadows projected on the wall from objects passing in front of a fire behind them, and give names to these shadows. The shadows are the prisoners’ reality. Socrates explains how the philosopher is like a prisoner who is freed from the cave and comes to understand that the shadows on the wall are not reality at all, for he can perceive the true form of reality rather than the manufactured reality that is the shadows seen by the prisoners. The inmates of this place do not even desire to leave their prison; for they know no better life.”

“In the allegory, Plato (also) likens people untutored in the Theory of Forms to prisoners chained in a cave, unable to turn their heads. All they can see is the wall of the cave. Behind them burns a fire. Between the fire and the prisoners there is a parapet, along which puppeteers can walk. The puppeteers, who are behind the prisoners, hold up puppets that cast shadows on the wall of the cave. The prisoners are unable to see these puppets, the real objects, that pass behind them. What the prisoners see and hear are shadows and echoes cast by objects that they do not see.” 

What does Socrates’ cave symbolize? This allegory has probably been interpreted a thousand different ways over the thousands of years since Plato first relayed Socrates’ allegory. A strict reading suggests that the cave is a place where the uneducated are physically held prisoner. The people are also prisoners in a figurative sense, in that they’re prisoners of their own ideas about the world, formed from their narrow perspective. A strict reading would also note that the philosopher is the one person in the story free of the cave, and thus an enlightened man who now knows the nature of the forms. One could also say that various caves litter the modern era, and that philosophers have a cave of their own. One could also say that those who remain in the philosopher’s cave too long find that it, too, becomes an insular echo chamber in which they become prisoners.

Socrates bolstered this interpretation when he informed a young follower of his named Glaucon that:

“The most excellent people must follow the highest of all studies, which is to behold the Good. Those who have ascended to this highest level, however, must not remain there but must return to the cave and dwell with the prisoners, sharing in their labors and honors.”

A strict reading of this quote might suggest that the philosopher should return to the prisoners’ cave to retain humility. Another reading could lead the reader to believe Socrates is suggesting that it is the responsibility of the philosopher to share his new insight with the cave dwellers. A more modern interpretation might be that the philosopher must return to the cave to round out his newfound intelligence by commingling it with the basic, common sense of the other cave dwellers. Inherent in the latter interpretation is the idea that in the cave of philosophical thought, one might lose perspective and clarity and become a victim of the group’s collective delusions.

The philosopher could accept an idea as fact because the groupthink contained within the philosophical cave accepts it as such. This philosopher may surround themselves with like-minded people for so long that they no longer see their cave for what it is. The intellectual might also fall prey to the conceit that they’re the only one not living in a cave. The intellectual might see every other cave for what it is, but never their own, for theirs is the cave they call home. As Horgan says, citing the responses of “gloomy” students to the allegory of the cave, “If you escape one cave, you just end up in another.”

One of the only moral truths John Horgan allows in part five of his series, one that trends toward that “terrible tendency toward certitude,” is the argument that “ending war is a moral imperative.” This is not a particularly courageous or provocative point, as most cave dwellers have come to the same conclusion as Mr. Horgan. Most cave dwellers now view war as something to be used only as a last resort, if at all.

For whom are we issuing this moral imperative? That is the question I would ask if I were lucky enough to attend one of Mr. Horgan’s classes. If we were to issue the imperative to first world countries, I suggest we would have a very receptive audience, for most of the leaders of those nations would be receptive to our proposed solutions. If we were to send it out to the tyrannical leaders and oppressive governments of the third world, I am quite sure we would have an equally receptive audience, as long as our proposed solutions pertained to the actions of first world countries.

Former Beatle John Lennon engaged in a similar pursuit with his “make love not war” campaign, but Lennon directed that campaign almost exclusively at first world leaders. Some of us now view the venture as a colossal waste of time. If Lennon had directed his moral imperative at the third world, and its dictators had been genuinely receptive to it, Lennon could’ve changed the world. If those third world leaders had agreed to stop slaughtering and starving their own people, and had also agreed to avoid skirmishes with their neighbors, all of us would view John Lennon as a hero for achieving peace in our time. This scenario also presupposes that these notoriously dishonest leaders weren’t lying to Lennon for their own public relations purposes, and that they did their best to live up to such an agreement while quashing coups staged by tyrannical rivals with other plans. This is, admittedly, a mighty big asterisk and a relative definition of peace, but if Lennon had achieved even that, the praise he received would have been universal.

What Lennon did, instead, was direct the focus of his sit-ins and sleep-ins at the leaders of Britain and the United States. The question I would’ve had for John Lennon is this: how often, since World War II, have first world countries gone to war with one another? Unless one counts the Cold War as an actual war, or the brief conflict in Yugoslavia, there hasn’t been a great deal of military action between the first world and the second world since World War II either. Most of what accounts for the need for military action in modern times involves first world countries attempting to clean up messes that have occurred in third world countries.

If Lennon’s goals were as genuinely altruistic as some have suggested, and not a method of stealing some spotlight from his rival, Paul McCartney, as others have suggested, he would have changed the focus of his efforts. Does this suggest that Lennon’s sole purpose was publicity, or does it suggest that his worldview was born, or nurtured, in an echo chamber in which everyone he knew agreed that first world countries were the source of the problem when it came to the militaristic actions involved in war?

Those isolationists who acknowledge that most of the world’s problems occur in the third world suggest that if the United States and Britain would stop playing world police and let those third world countries clean up their own messes, we would achieve a form of peace. To those people, I would suggest that the world has a historical precedent for such inaction: Adolf Hitler. Some suggest that war with Hitler was inevitable. They declare that Hitler was such a bloodthirsty individual that he could not be appeased. Britain’s Prime Minister Neville Chamberlain did try, however, and the world trumpeted Chamberlain’s name for achieving “peace in our time”. Chamberlain’s nemesis in Parliament, Winston Churchill, suggested that Chamberlain tried so hard to avoid going to war that he made war inevitable. Churchill suggested that if Britain had engaged in more assertive diplomatic actions, actions Germany might have viewed as war-like, such as attempting to form a grand coalition of Europe against Hitler, war might have been avoided. We’ll never know the answer to that question, of course, but how many of those living in the caves of the idealistic utopia of ending war as we know it would’ve sided with Prime Minister Neville Chamberlain and against Churchill in the lead-up to, and after, the Munich Agreement? How many of them would’ve suggested that Hitler’s signing of the agreement meant he did not want war, and that heeding Churchill’s warnings would’ve amounted to a rush to war? Churchill stated, and some historians agree, that the year that passed between Munich and Britain’s declaration of war left Britain in a weaker position and led to a prolonged war. How many of those who live in anti-war caves would’ve been against the proposal to form a grand coalition of Europe against Germany, because it might make Germany angry, and Germany could use it as a recruiting tool?

The point of listing these contrarian arguments is not to suggest that war is the answer, for that would be a fool’s errand, but to suggest that even those philosophers who believe they have the strongest hold on a truth may want to give doubt a chance. It is also a sample of a larger argument: while the philosopher’s viewpoint is essential to anyone seeking a well-rounded perspective, philosophers are not the only people in need of one.

If the only people a person speaks to in a day confirm their bias, they may need to visit another cave for a day. They may not agree with the other cave dwellers, but they may hear different voices that influence their approach to problem solving. The point is this: if the only thing a student of philosophy hears in a day is doubt directed at the status quo, and that they must defeat its certitude, how far can that student venture down that road before they pass the tipping point of the fulcrum, where everything they learn beyond it progressively divorces them from common sense?

In the hands of quality teachers and writers, philosophy can be one of the most intoxicating disciplines to explore, and some are so fascinated that they choose to follow it as their life’s pursuit. Those of us who have explored the subject beyond Philosophy 101, on our own time, have learned to doubt our fundamental assumptions in ways we feel compelled to share. This period of discovery can lead some of us to question everything those who formed us hold dear. At some point in this self-imposed challenge to pursue more well-rounded answers to simple questions, some of us reveal that not only have we escaped the prisoner’s cave, but we’ve become prisoners in the philosopher’s cave. Few recognize when their answers to the forms dancing on the wall reveal this, but those of us who have, have had an intruder inform us, “It’s a goat.”

Unconventional Thinking vs. Conventional Facts


Unconventional thinking can be seductive, for it is alluring to believe we have gained more knowledge than someone else has. Most of us are skeptical when we hear conventional information. We consider the source, we frame it accordingly, and we fact-check. Yet some suggest we have an instinctive, emotional attachment to alternative theories, and that such notions exert a magnetic, gravitational pull on us that we must make a concerted effort to resist. Those who fall prey to the seduction should heed the warning that quantity does not always equal quality. There is only so much conventional knowledge available, but there are numerous avenues for those seeking unconventional answers to explore. Most of these avenues contain information that conventional thinkers have never considered, and in some cases those arguments deserve consideration, but in my experience most of them provide nothing more than provocative distractions and obfuscations from the core argument.

One of the universal truths I’ve discovered about unconventional thoughts is that they are not always true. This may seem like such an obvious truth that it’s a discussion hardly worth having, but how many people put so much stock into unconventional thinking that they consider conventional thinkers naïve for believing everything they’re told? Unconventional thinkers are more apt to believe an alternative truth is out there, and it’s their mission in life to find it. 

Police officers working a beat bring a modus operandi (M.O.) to their job: “Believe none of what you hear and half of what you see.” This is an ideal mindset for a police officer to have. Is it ideal, however, for a casual consumer of news, an employee who learns information regarding their employer, or a friend listening to another friend tell a story?

A top-shelf media personality suggested that skepticism of the press undermines its authority, but the vaunted role the press plays in our culture should require its members to endure constant, intense scrutiny, skepticism, and cynicism that makes them uncomfortable. Members of the media should conduct themselves in a manner that welcomes all of that from their audience and defeats it with a performance record they can point to whenever anyone questions them. Wouldn’t members of the media say the same of the subjects they cover?

There is a point, however, when a healthy sense of skepticism creeps into a form of cynicism that believes “none of what I hear and half of what I see.” Such cynicism breeds holes in people that “other” information rushes to fill.

As an individual with an insatiable curiosity for unconventional thinking, specific to human behavior, I’ve had friends introduce me to a wide array of alternative outlets. They’ve introduced me to everything from explanations of human psychology through astrology and numerology to witchcraft. One friend even introduced me to a book that suggested aliens from other planets could teach us a lot about ourselves.

Within this book were transmitted (or transmuted) messages from aliens of another planet to earthlings. A thread emerged from the messages: the tenets of my political ideology were wrong. The implicit idea was that while we humans can argue over such messages, who are we to argue with a superior life form? The first question this skeptic would love to ask an author of human psychology, by way of alien scripture, is why we assume that aliens from another planet are of superior intellect. The collective thought, in certain corners of human authority, holds not only that intelligent life is out there, but that it’s more intelligent than anything meager humans can conceive, rather like the unlimited omniscience the religious assign to their deity of choice. It is just as foolish to suggest that there are no superior intellects out there as it is to suggest that all other entities are of superior intellect, but those who suggest the latter often have an agenda for doing so.

What would be the point of worshiping a deity whose intelligence was equal to our own, and what would be the point of reporting on transmissions from space if the aliens were not of a superior intellect that could teach us a lot about human psychology? We should note that most alien transmissions align quite neatly with the views of the author of such a work.

The next time an alien transmits a message suggesting humans are of equal or inferior intellect (“We are in awe of the capabilities of the new iPhone X, and we have not found a way to duplicate that technology in our labs”) will be the first time I take an alien transmission seriously. The next time an alien transmits a compliment regarding human agricultural technology (“We find the techniques developed by Monsanto Co. to be awe-inspiring”) will be the first time I re-read an author’s interpretation of their message. For some reason, most aliens want us to know that the author of the piece characterizing their message is correct about the dystopian nature of human beings.

Another friend of mine has mined alternative outlets to such a degree that he thinks he’s found loopholes in our legal system, our financial system, and the systems we use to maintain our health. These arguments often devolve into him arguing from an inferior standpoint and me guarding against sounding too superior. I don’t consider myself superior to him in the strictest definition of the term, but when he informs me that he is going to risk it all based on the alternative information he has obtained, I feel the need to warn him. In that delicate warning, delivered with genuine intentions, however, I might sound superior.

My friend suffers from “dumb guy” disease. He did as poorly in school as I did, and he decided to educate himself in his adult life to try to catch up with all of those who were more interested and engaged in school. The difference between the two of us can be explained in a scenario. If a used car salesman approached me and began employing various sales tactics, I would either shut the conversation down in one way or another or walk away. My friend, however, would begin using all the resources he’s discovered over the years for outdoing a salesman at his own game. He would attempt to best the used car salesman, whereas I would recognize the limits of my intelligence on the salesman’s home turf. What I believe would happen to my friend, in this scenario, is that the used car salesman would turn my friend’s newfound confidence against him and flip the conversation into one centered on my friend’s intelligence, until my friend ended up paying more for the car.

The thing of it is that concerned parties cannot tell those who rely on alternative sources of information that they might be vulnerable to half-truths that lead them to put too much stock in unconventional beliefs. Many unconventional thinkers now consider themselves more knowledgeable than those who subscribe to conventional truths, because they have different knowledge that they believe equals more knowledge.

Another problem inherent in unconventional thinking is that its disciples fail to focus on results. How many outlets of this nature provide straight, verifiable points that pass peer review? How many of them can point to a verifiable track record of being correct, as opposed to the anecdotal evidence they promote? How many of their messages devolve into speculation about motives and roundabout claims that no one can refute? It’s that kind of information, in my opinion, that leads to confusion.

Those of us who subscribed to unconventional thoughts at one point in our lives began to see them for what they were, and we discovered that just because a thought is unconventional does not mean it’s correct. We enjoyed the offspring of the counterculture for what it was. We thought its purveyors were so hip that our interest led some TV programmers to identify and capitalize on the purveyors of unconventional thinking, until those thoughts seduced us into incorporating them into our conventional thinking on some matters.

Whether it is political, social, or any other avenue of thought, some people derive definition from fighting against the status quo, but the status quo is an ever-shifting target, and enough people can convert to such thoughts that they become the status quo, conventional thoughts.

I no longer buy a book of unconventional thinking, or befriend an unconventional thinker, with the hope of having my mind changed on a subject. If their ideas do change my mind, that’s gravy, but I have learned that such thoughts are often best used as a challenge to my current worldview, or to bolster my current view as I attempt to defeat them. I do not then write of this discovery with the intent of changing anyone else’s mind. I do enjoy, however, taking the conventional standpoint and melding it with unconventional thinking to arrive at what I consider a hybrid of the truth that neither party has considered before.

The best illustration of this methodology exists in a piece I wrote called He Used to Have a Mohawk. In this piece, I documented the conventional thinking regarding an individual who decides to have their hair cut into a thin strip upon their head. If that person grows the mohawk to eight inches and dyes it blue, conventional thinking suggests that person deserves any ostracizing they might receive. Unconventional thinking suggests that there’s nothing wrong with a person who decides to shave their head in such a manner. This mode of thought suggests that it’s on the observer to accept the mohawk wearer for who he or she is as a person. It also suggests that the observer might discover the limits of their preconceived notions, or conventional thoughts, of a person by finding out that a person who leaves a thin strip of hair on their head, grows it to eight inches, and dyes it blue is actually a beautiful person inside. The approach I took with this piece combined the two modes of thought and examined them through the prism of an individual who used to have such a mohawk.

What kind of person asks a hair stylist to cut their hair into a mohawk? What happens to them when they age and go back to having a more sensible haircut? Do others’ perceptions of them change? Do they miss the altered perceptions they used to experience when they had the haircut? Do they regret having the haircut in the first place?

One of my favorite critiques of this piece stated that its immediate components could lead a reader to take an instinctual, emotional offense, until a careful reading reveals the complex subtext beneath it. “I like the way you take a mohawk and turn it into something greater than just a simple hairstyle. You give it character that I feel not many others could appreciate,” Amanda Akers stated.

No matter where the reader stands on the conventional fulcrum with this subject, they must acknowledge that an individual who asks that their hair be cut into a mohawk does so to generate reactions, or different reactions, than a person with a more sensible haircut could procure on any given day. Some would say that mohawk wearers generate unwanted attention for themselves with such a haircut, but others could say that, for some, no attention is unwanted.

If a mohawk wearer detested those who judge them for such a haircut, he or she could let the hair lie flat. They don’t, I submit, because they enjoy detesting the strait-laced people who will never try to understand them as a person, they enjoy the bond they have with those who sympathize with their plight, and they bathe in the sheer number of reactions they’ve received since they made the decision to wear a mohawk.

The people at this wedding party stated that they had wanted to get to know the groom who used to have a mohawk, back when he had the mohawk, in part because he had a mohawk. As they learned more about him, to their apparent dismay, they discovered that he was a nice man. As an uninformed bystander, I considered the shock they displayed that a man with a mohawk could be nice a little condescending. I found it just as condescending that one man would say he wanted to get to know a man who wears a mohawk better, based solely on that man’s haircut. The groom appeared to bathe in all of it. I watched him react to these statements, and I couldn’t tell if he considered it a mark of his character that he had befriended people regardless of the haircut, or if he missed all the reactions that haircut used to generate for him. My money was on the latter.

The point, as I see it, is that conventional thinking has potholes, and we should remain skeptical of everything we see and hear, but those who put so much energy into unconventional thoughts often end up more confused on a given subject than enlightened. Forming a hybrid of sorts is the ideal plane for one to reach, as it suggests that while we should remain skeptical by nature, we should also maintain an equal amount of skepticism for enlightened, unconventional thoughts. Yet the seductive nature of unconventional thinking rarely calls for a ledger on which one can score one’s thoughts, theories, and ideas.

Most people hate being wrong, and one would think it incumbent on unconventional thinkers to establish their bona fides. What often happens is that the unconventional thinkers either make a linear adjustment to their way of thinking on the issue when the facts come out, or they move on. Those who move on to other alternative theories often do so without reservations or recalculations. They move on to the next conspiracy theory, or unconventional mode of thought, with the idea that comprehensive unconventional thought lends itself to invulnerability, for few in their audience are secure enough in their own being, or their own knowledge base, to ask them why they continue to believe in such things. Conventional thinkers, meanwhile, are forever vulnerable to the charge that they believe whatever they’re told.

It’s been my experience that if an unconventional thinker were able to turn off their susceptibility to unconventional thinking after charting and graphing their previous thoughts on such matters, they might not devote so much energy to being the smartest person in the room with the most knowledge. If they charted their hypothetical guesses, based on alternative thinking, against the time-tested and boring conventional thoughts their grandma taught them, I think they would find that, more often than not, the conventional, generalized thoughts their grandma believed are generally true.

Let Me Have Cake


An article I read detailed that eating food to sustain life was something of a miracle. For all the things we take for granted, sustained life has to be the most fundamental. Are you sustaining life as you read this? Have you ever considered the idea that food allows you to continue living?

An uncle of mine developed a muscular degenerative disease at a young age. Throughout the course of his life, the degeneration progressed until he lost almost all bodily function. He reached a point where he was no longer eating well. He had coughing fits in the course of digestion that caused concern. I saw these coughing fits, hundreds of them, and they were difficult to ignore. The coughing fits caused such concern among the workers at the care facility where he lived that they determined my uncle should no longer be fed orally. The determination was that he would be fed through a tube going forward. Uncle John was so crushed by this that he had a lawyer draw up a letter stating that neither John nor any of his remaining family members would hold the care facility liable for anything that happened as a result of oral feeding. But, the letter stated, he wanted to enjoy oral feeding once again. He also threatened, in that letter, to sue the care facility if they did not abide by his wishes. He then said, and this is the heartbreaking part, that “Eating is one of the last joys I have left, and I do not want this taken away from me.”

I had a boring, mindless job at the time. Throughout my time at this job, I rebelled. I talked to whomever I wanted, whenever I wanted. I did the work, and my scores were admirable, but management could not abide all the talking. I assumed, at one point, that management was either trying to drive me out, or that the job had become so awful I couldn’t maintain the illusion that it was a decent job. I was miserable. I obsessed over those who had no talent but were living the life I had always wanted to live.

A majority of my co-workers were obese. My first inclination was that these people ate the same as everyone else, but they were in a job that involved ten hours of sitting. My next guess was that eating was the only joy they/we had left. I, too, was gaining weight, and I was reaching a point where I didn’t care. I read an article that listed the heinous deeds of serial killer Jeffrey Dahmer. One of the accounts detailed how Dahmer opened a hole in a victim’s head and poured acid in. He wanted to kill the victim’s brain, or the part of it that produced such sedition, so that he could enjoy having relations with the victim without having to listen to complaints. How different, I wondered, was that from the day-to-day life in my current job? My inability to prove my worth to anyone, much less myself, had landed me in a job where creativity was not appreciated. “Just be happy you have a job,” was the mantra fellow employees screamed at the unhappy. “You’re in the greatest country in the history of the world, at what could be its greatest time, and you’re complaining? Just be happy that you can financially sustain life, and shut up.”

Routine has a way of killing the mind. Fear of the unknown has a way of convincing people that they are happy. Or they learn, over time, to just shut up!

Employers use fear as a motivator. They convince a person that they’re lucky to have a job, and they instill fear to keep them productive. How often have I been informed that I’m meeting the required goals? A number of times, but it’s always delivered in a lethargic manner. They would much rather inform their employees that they’re not meeting the goals, so that they’re motivated to do better. The one who achieves the goal is not the focus of concern, so he fades into the background. They allow their minions to focus on you and chip away at your self-worth with hypercritical edicts. Not only are you in a mindless job that eats away at whatever creativity you might have used to prosper in some other fashion, a path you could not find on your own as a non-self-starter, but you’re also told you’re not making the grade.

We were not allowed to speak, in a casual manner, to our co-workers. All conversations were required to be work-related. We were not allowed to email friendly messages to our friends, and our Instant Message system was taken away from us. Food was all we had left, and we were all gaining weight. We were being paid to do this mindless job, and we were using this money to feed ourselves food that was killing us.

When a person sits behind a computer for ten hours a day, four days a week, the clock is a cautious bitch that won’t turn right on red. She drives twenty to thirty miles an hour under the speed limit, and we can’t help but notice that the other lane contains free-flowing cars, speeding up to prevent entrance. We were in this position as a result of a lack of talent, a lack of drive, and an inability to take risks. We felt lucky to have a job in a country that provides ample opportunity for ambitious risk-takers with an idea, but with so much available it’s hard to pick one lane to drive in. The grass is always greener on the other side, of course, but I felt I was planted in a field of weeds that inhibited my own growth. The alternative was stagnancy.

The complaints that I have/had were all sourced from a first world, privileged background, but I saw those around me grow and prosper, and I reached a point of frustration that probably should’ve led to some counseling. I witnessed firsthand the end result of frustration so great that one doesn’t want to live anymore, but I have never been suicidal. I’ve always considered alternatives, and what greater alternative is there than change? I would explore my mind for anything and everything that could lead me to happiness. My definition of happiness, I calculated, could be attained. I could live free to explore my mind for every thought I had ever had. It was a privileged, first world avenue, but I had the means to do so. Why wouldn’t I take advantage of it?

People have definitions of the way one should conduct one’s life. If an individual doesn’t fit those parameters, he is cast out. He is condemned for not living life the way they think he should. How should he live? He made a mistake somewhere in the first thirty years of his life. He sustained life. He entered the workforce with few skills. He developed some. He developed a work ethic. He never called in sick, and after a time he became more serious and tardiness was no longer an issue. Once that was managed, he fell into the background, but he was still employed. Gainfully? That’s the question. Was he satisfied? No, he went to another place, and another place, and he discovered a cap on his abilities. He never interviewed well, his public speaking abilities were less than admirable, and he tested poorly. Analysis of his being made him so nervous that he developed a comprehensive form of test anxiety.

His role models in life were blue-collar workers who did their jobs, went home, drank too much, and complained about the awful responsibilities of life. These were people who focused on his shortcomings. “Where did you come up with that?” was a question they asked the aspiring young minds around them. I have gone back and forth on this relatively innocuous question. At the outset, one has to imagine that such a question arises in an adult mind when the child they’ve known for decades comes to them with a particularly ingenious thought. It has to be a surprise to that old mind to see a younger one outdo it, so one can forgive the reaction, even if it causes the young mind to question its base, but it also defines that young mind in a manner that suggests it should remember its station in life.

I’ve witnessed what I can only assume is the opposite of this rearing pattern. I witnessed young, ambitious, and adventurous minds believe in themselves. If they had questions about their ability to accomplish great things in life, their insecurities paled in comparison to mine. They had such belief in their abilities that when I showed them awe, they swatted it away, saying that their accomplishment was either not as awe-inspiring as I believed, or that it was merely a rung on a ladder to an accomplishment I couldn’t even fathom pursuing.

I considered some of these people so different that I wondered if we were even the same species. How can one put oneself on the line in such a fashion without giving due consideration to the fear of failure? They don’t mind the prospect of exposing themselves to ridicule. “What if it all comes crumbling down around you?” I wondered aloud. Their answer, in roundabout ways, was that they’d try something else. That wasn’t going to happen, however, for they believed in themselves. Where does this unbending faith in oneself come from? Answer: it’s bred into them. They’re not afraid to try, to risk it all on something that would keep me up at night.

At some point, after we had spent so much time together, getting drunk and what have you, they ventured out and pursued matters I didn’t have the confidence to pursue. They were self-starters, and they led, and they accomplished, while I looked forward to eating something different in a day. The meal of the day became something to look forward to, nothing more and nothing less than what my uncle had to threaten a lawsuit to keep in his life.

“Let them eat cake” is an old line, purported to have been delivered by the wife of King Louis XVI, Marie Antoinette, suggesting that the unhappiness of the French people in her kingdom could be quelled by allowing them to eat something delicious. Some have also interpreted it as an illustration of Marie Antoinette’s detachment from the common man, based on the idea that if they could not afford bread to sustain life, they should eat cake. Whether or not she actually delivered that line, the import is that we, the peasants, derive pleasure from food. Some of us hate our jobs, our families, and our lives, and if we can just find one semi-pleasurable meal, we can find some measure of happiness. If that single meal doesn’t do it for the talentless minions who neglected to develop an ambitious plan for life, we can look forward to the next day, and thus not only sustain life, through the miracle of food, but achieve some sort of sensorial pleasure through the routine of it.

Eating to sustain life. Eating for pleasure. Too much pleasure? Too much eating? What else do we have?