Scat Mask Replica II (20)


1) The Love Boat might have been one of the dumbest shows ever put on television. For those too young to remember that show, we watched a ton of crappy shows just because there wasn’t anything else on the other two channels. This show also had one of the dumbest theme songs ever made, yet I watched that show so often that that stupid song is so stuck in my cranium that I will never break free of it. Anytime someone says the word love, or exciting, it triggers the start of that song. “Love, exciting and new!” This show got cancelled over thirty years ago, and I still can’t get that stupid song out of my head. Our brains are the most complex, most sophisticated hard drive ever invented. Neurologists say that the term hard drive is not a perfect analogy for the human brain, as the brain can contain a vast quantity of memories without having to clear out old memories to welcome new ones. I understand it’s not a perfect analogy, but I have to think that some of that has to go on in our brains. If I’m anywhere close to being right, how much important data has The Theme of The Love Boat deleted or damaged to maintain space in my head? If I’m going to hell, and the guardians of hell determine an individual’s punishment on a case-by-case basis, they’re going to be playing The Theme of The Love Boat for me on a loop for the rest of eternity.

2) What emboldens those of us who publicly state that our belief system is superior? We all have our insecurities, and we join groups to align ourselves with an idea we consider superior, so we can mock and denigrate those who belong to the other group. Some of us need a proverbial podium for that mockery and denigration, so that our own group might view us as superior. Some view their presentation as bold, but I can’t help but wonder about the raging insecurities that drive a person to do this.

3) At the breakfast table, a five-year-old son speaks about the death of his father. The mother informs the son that he should hope the father lives long enough to teach him how to be a man. The son looks at the father: “Well, tell me.”

4) Analysts on financial/business networks often drop the term financial purgatory. Their context suggests that purgatory describes someone stuck in a state an uninformed viewer might mistake for abject misery. Those more familiar with their Catholic Catechism know that purgatory is a place between heaven and hell, a stasis reserved for those awaiting further judgment from the powers that be. A better description of financial purgatory might involve a child of a lower-middle-class upbringing finding a way to live among kids whose parents make real money, and all of the judgment that follows. This kid has no pressing needs, and his life is happy in all ways other than this talk of money. Most kids don’t care about money, but as kids begin to age, how much their parents make becomes a topic of conversation. It can lead him to recognize that while his family is not poor, they cannot afford to buy their way into money conversations. Some might dismiss this as a first-world problem and note that children adapt well, but any child who seeks entrée into the in-crowd knows that it feels like Armageddon in the moment. Depending on the kids around him, it can lead a kid to feel he doesn’t belong in a financial heaven or hell, and the subsequent, general idea that he doesn’t belong can last well into adulthood.

5) The horoscope for the new sign Ophiuchus: This will be another meaningless week in your otherwise meaningless life. If someone informs you that they have something meaningful to say about your life this week, walk away. Don’t check in with yourself this week, just go through the week on autopilot for all events and information you receive will be meaningless. Your lucky weather element is wind.

6) A writer arguing about the rules of usage is not only tedious, it’s an exercise in futility. Some writers pine for the age-old linguistic purity of Geoffrey Chaucer; others argue that we should strive to remain casual for greater readability among the masses. On the latter, I know that I might be banging my spoon on my high chair, but when I read the numerous ways professional writers overuse the word “had,” a layer of glaze coats my eyes. I know writing “I had biked over trails” is past perfect tense and “I biked over trails” is simple past tense, and grammatically these two forms of expressing action are perfectly acceptable, but I find one causes a brief interruption and the other flows so well that the reader doesn’t pause. There is ample middle ground for writers to explore between strict grammatical rules and readability, and most of them know it without knowing it, but a reading of Chaucer reminds one of the strict grammatical rules that have long since fallen out of favor in modern writing. On that note, I find “I had done” a most egregious violation of readability, as in “I had done my research before writing this paragraph.” It appears redundant and awkward to me, and when I read professional writers writing in such a manner, I wonder if they don’t pay their editors enough or if they overwork them.

7) Joe Theismann admitted that as a student/athlete at Notre Dame he allowed the university’s public relations department to change the pronunciation of his name from THEES-man to THIGHS-man. The pitch the PR department personnel made was that Theismann’s chances of winning college football’s most prestigious prize, the Heisman trophy, might increase if he changed the pronunciation of his name so that it rhymed with the name of the trophy. Even though Joe made this unusual sacrifice, he did not win the award. We can only guess that his family was against this PR push, and that they scorned him on some level for doing it during the decades they spent correcting people on how to pronounce their family name. We know why Joe did it, but why did he allow us to mispronounce his name for decades? Did he prefer the new pronunciation, was he embarrassed that the PR campaign failed to win him the Heisman and didn’t want to have to explain that over and over, or did he consider the pronunciation of his name relatively inconsequential? Whatever the case, Joe allowed the world to mispronounce his name for decades. The former football star is now a celebrity spokesman for a company that purports to aid aging men with prostate problems that cause them to urinate so often that it disrupts their lives. An ambitious member of this company’s marketing arm, one who knows about Theismann’s willingness to change the pronunciation of his name, should ask him to change the pronunciation of his name back, so that it rhymes with HE PEES-man.

8) What would you say if a grown man approached your table at an outdoor café and said, “Pardon the intrusion, but I have to say that I enjoy watching the way you eat a tortilla chip”?

9) By modern cultural standards, Joseph Hupfel is a creepy man. He is dirty, unshaven and generally unattractive. He eats a very clean BLT. Mayo. Toasted. Buttered lightly, immediately upon exiting the toaster. He enjoys the sedimentary layers of the sandwich. How many sedimentary layers of the man do we know? How much of a man lies on the surface? We know creepy when we see it, until we learn more about the man. How much will we never know about him? Modern man believes he has a decent feel for the history of mankind, but how many fact-finding missions uncover something revolutionary that puts everything we thought we knew in the rear-view mirror? Some speculate that there are miles upon miles of undiscovered artifacts lying under homeowners’ homes in Rome that could further explain the history of mankind, but the homeowners won’t let excavators unearth them. How many sedimentary layers of a seemingly simple man have we yet to unearth in our personal profile?

10) In the 1890 essay A Majestic Literary Fossil, author Mark Twain provides a hilarious condemnation of two thousand years of scientific theory from esteemed intellectuals in the field of medical science. Twain focuses the theme of this essay on the repudiation of the science behind the accepted medical practice of bloodletting. This practice relied on the accepted theory that blood doesn’t circulate in the body, it stagnates, and that to achieve proper health the patient needs to have old blood taken out on a regular basis to send a signal to the body that it’s time to regenerate new, healthier blood. The scientific community regarded blood as one of many humours in the body, and they believed that all humours required regular regulation. As such, they believed that a healthy patient would allow their doctor to bleed them on a regular basis, as a preventative measure. The import of Twain’s essay is not necessarily a condemnation of science, in my humble opinion, but of confirmation bias, and of the question of whether anyone should put stock in the consensus of science. As one who knows little to nothing about science, I do know that most scientists prefer not to use that word. For anyone who wants to argue that sound science is not susceptible to occasional flights of human error, remember that the belief in the virtues of bloodletting wasn’t a blip in human history; the consensus of the scientific community considered the science behind bloodletting so sound that medical practitioners relied on it for most of human history. The import of this essay also asks us to examine what we believe today, based on a consensus of scientific theory. If we were able to go back in time to Abraham Lincoln’s day, and we witnessed the archaic act of bloodletting, how would we try to disabuse them of their scientific findings? “You don’t believe in science?” is a question they might ask us. To which we would reply that we do believe in science, but we also know that some science, their science in particular, is wrong. “You realize that you’re arguing against 2,000 years of science. Why should we take your word for it even if, as you say, you’re from the future?”

If a person from the future were to travel back in time to our day, what would they ridicule us for believing? Which archaic rituals and procedures that we derive from our scientific findings would they mock? Would they laugh at us in the same manner we laugh at the scientists of Twain’s day? Our natural inclination will be to laugh with them, for we know all too well the foolish beliefs others in our era have, but will we stop laughing when they touch upon that which we believe, or will we continue to laugh with them under the soft lie that we were never that gullible?

11) I heard a cop once say that the rule of thumb for being a cop on the beat is to believe half of what you see and none of what you hear. Those who watch network television shows and major Hollywood movies should apply the same principle to their viewing habits.

12) Listening to one party’s version of a romantic breakup is always dicey. The listener knows they’re only hearing one side of the story, and they know where to get the other side if they’re feeling especially adventurous, curious, and nosey. We suspect that we will hear an equally partisan take on the situation from the other side, that both accounts might uncover some key discrepancies in the other’s account, and that we might be able to help both parties discover a truth that lies somewhere in the foggy middle. Before enlightening these two parties, however, the listener needs to consider the idea that their objective truth is just as subjective as that of the two parties concerned, and the crucial point is that what the listener might believe is true is not necessarily the truth. Just because a listener is a disinterested third party does not mean that they are objective.

13) If someone were to ask me for dating advice, based on my experiences, I would say the key to attracting a person is to try to be as genuine, and as normal, as possible on a date, unless those two characteristics conflict. The best dating experience of my life involved a woman who convinced me she was relatively normal. She went through some drama in her previous life, but she managed to extricate herself from that situation relatively normal. Everyone says that they managed to escape prior relationships unaffected, but if we try to be honest with ourselves, we recognize how impossible that is. One of her key selling points was convincing me that she did not attempt to influence the affected parties with intimate details of her ex’s past transgressions. Most people I know adopt the time-honored tradition of slash-and-burn politics to assure all parties concerned of their nobility, but thoughtful people know that nobility is a long-term value that will eventually reveal itself. She claimed that my greatest attribute was authenticity. I went through some stuff in my previous life, but I maintained whatever it was she sought in a man. If a person I knew was dating someone they feared was not normal, I would warn them that putting a best foot forward and creating a façade of normalcy is easy in short spurts. I would tell them to watch that person around their family and friends and pay special attention to the way they interact with the people with whom they’re most comfortable. Most people don’t want their friends and family to think that a boyfriend, or girlfriend, can change them, so they’re authentic around their friends. If that doesn’t work, take a long trip with that person. That prolonged involvement should reveal the characteristics of the other party and allow one to make a more informed decision about them.

14) “What do you believe in?” I’ve asked those who ridicule me for believing in a person, place, or thing that turns out to be wrong. These people inform me that I should’ve been more skeptical, and while that is true, my question to them is, “Have you ever believed in something, only to find out you’re wrong, to one degree or another?” The answer for some of them has often been no, because they’ve wrapped themselves in a cocoon of fail-safe contrarian thinking to avoid ridicule.

After the facts roll out, it’s easy for a cynic to say that they never believed in it in the first place, but there is a point shortly after one learns of a novel idea, or a new approach to solving humanity’s problems, when the new information excites the reader. This point, just before the reader can personally research the subject, defines them as a hopeful person who wants to believe in people, places and things. For the purpose of discussion, let’s say that we’ve just finished an intoxicating nonfiction book that espouses radical, new secular and apolitical ideas for solving one of the world’s many problems. Let’s also say that this book covers a subject the reader knows little to nothing about, by an author they’ve never heard of before. How does one react to the information in that book, before doing personal research on it?

Some of us are more inclined to believe in something if the presenter builds a solid case for it; cynics are more inclined to seek out refutation for any person, place, or thing before the facts roll out; and then there are those cynics who ridicule everyone who believes in anything before the facts roll out. They prefer to call it skepticism, but I call it cynicism. It’s in my nature to believe in people, places, and things, until the facts prove otherwise. I believe, for example, that for just about every tragic situation mankind faces there is an ingenious problem solver who will eventually solve it. In the court of public opinion, this mindset often places me in a vulnerable position for ridicule.

When I first read John Douglas’ Mindhunter decades ago, I was a believer. I believed that Douglas laid out a solid case for how, why, and where criminal profiling could provide useful tools to assist law enforcement in their efforts to locate a criminal. It was a temporary setback for me to discover how often profilers erred. The naysayers used those instances to claim that criminal profiling is essentially a form of confirmation bias that involves throwing out a bunch of commonalities that most serial killers share, for example, to form a standard profile for the next serial killer they profile. The naysayers further this repudiation by saying that after law enforcement captures the perpetrator, and the perpetrator confesses, the profiler then aligns the perpetrator’s characteristics with elements of the conclusions they made in their profile. The question these naysayers have for those who believed Douglas was, “How often was John Douglas wrong, and did he list those instances in his book?” It might have something to do with the idea that I was ready to canonize Douglas after reading his book, but the factual refutations of his work, by the naysayers, were eye-opening to me. Once I recovered from the setback, I discovered that while flawed, criminal profiling might be on par with all that informs a doctor’s profile of a patient before they reach a diagnosis of that patient’s ailments. In the back and forth on this issue, I began to question the effectiveness of criminal profiling more and more, but I also began to question the motives of the cynical naysayers. What drives an absolute cynic to tear down everything they read, hear and see? Dissecting any idea to locate truth is not only necessary, it’s admirable, but how they approach their research is fundamental to their being.

Believers might approach personal research of such matters in a cynical vein, but they only do so in the manner of the scientific method, attempting to disprove. Absolute cynicism is so foreign to my thought process that it’s difficult for me to portray without bias, but I think it’s a fail-safe, contrarian approach that some use to ward off ever being incorrect and enduring subsequent ridicule for their personal track record. When I learn of an interesting new concept, or problem-solving measure, it excites me until I learn that it is not as effective as it was in the author’s presentation. I view this belief as food for the mind, and I suspect that a person who doesn’t believe in anything might have a more difficult time achieving fulfillment, and again I’m reserving this space for secular, apolitical ideas and philosophies. It seems to me that those empty spaces in the mind of cynical contrarians cry out for sustenance in a manner equivalent to an empty belly crying out for food, and that those vacuous holes do get filled by the belief in something. That something, I’ve often found, is an alternative mode of thought that they consider almost impossible to refute.

15) Anytime I think I might be smart, I dip into a discussion involving the creation of our universe. One such discussion involved the space-time framework, another involved the idea that our universe is flat with a slight bend due to cosmic background radiation, and a third informed us that there are efforts now looking through the cosmic microwave background radiation for evidence that some other universe at one time collided with ours. I don’t know what these people are talking about, and I dare say most don’t. Most of us, even most scientists, prefer to argue about the knowable.

16) For most of my life, I’ve managed to avoid caring what happens to celebrities. I used to strive to know what was going on in their world if only to better understand the cultural references comedians drop. I’m to the point now that I don’t understand three-fourths of those references, and I don’t care as much as I should. I did manage, however, to arrive at a decade-old story involving the messy divorce between singer Shania Twain and the producer Mutt Lange. It appears that Mutt Lange had an affair with Twain’s best friend, and he eventually married that best friend. In a noteworthy turn of events, Twain ended up marrying that best friend’s ex-husband. Hollywood writers love to give cute names to marrying couples, like TomKat, Bennifer, and Brangelina. I suggest we call the Twain/Lange eventual arrangement getting Shlanged.

17) Every time I watch a professional athlete make a mistake, I empathize. I arrive at this empathy from a much smaller vantage point, as I didn’t engage in organized sports past junior high. I played intramural games and pickup games constantly throughout my youth, however, and I made errors ESPN might have added to their Not Top 10 had I committed them at a higher level. I have to think those laughing hardest at the foibles of professional athletes never played sports in their lives, or they’re seeking to diminish whatever laughable errors they made by laughing harder at others’ errors. What follows such laughter is some incarnation of the line, “I made some errors, sure, but I never would’ve done anything like that.” If I never committed an error similar to that one, I think of all the egregious errors I made that were as embarrassing, if not more so, and I follow that with the thought that at most, maybe twenty people witnessed my error. These professional athletes commit errors in front of millions, and sometimes hundreds of millions of people depending on how many times ESPN replays their errors for the enjoyment of those with no empathy.

18) We’ve all made mistakes large and small. Some of us have made life-altering mistakes, and some of us have made mistakes that affect others’ lives in a manner we have to live with, but few have made mistakes that change the course of history in the manner mapmaker Martin Waldseemuller did. Due to the popular observations of the Italian writer/explorer Americus Vespucci, the mapmaker named an entire continent after him. The general practice of naming continents involved leaders of expeditions, but Vespucci was more of an observer who wrote about the expeditions he took part in. Christopher Columbus led the expedition to find a new path to the East Indies. When he arrived back in Spain, the country that sponsored his voyage, he reported his findings. In the course of the confusion over what Columbus actually discovered, Vespucci wrote about his many expeditions to foreign lands, and conflicting accounts suggest Vespucci might have participated in Columbus’ expedition. Regardless of whether he participated in that particular expedition or not, Vespucci took part in expeditions following Columbus’, and he reported the discovery of a new continent. Amid the sensation of that report, Waldseemuller mistakenly labeled the new continent Amerigo’s land. The standard practice of the day also suggested that continents take feminine versions of their names, such as Asia, Africa, and Europa, so Waldseemuller took the feminine version of Americus’ name and called the land America. Some suggest that Waldseemuller attempted to correct this mistake by removing Amerigo Vespucci’s name from later editions of his maps, but it was too late to change it in the popular culture of the day. Spain refused to accept the name America for 200 years, saying its explorer should get credit for his accomplishment, not an Italian writer, but it couldn’t defeat the consensus on the topic. Thus, some suggest that Americans should call their homeland Columbia, the United States of Columbia, or the United States of Columbisia. From this, we can say that not only did America become a land of vagabonds, creeps, and castoffs, but we were mistakenly named after a writer who achieved some decent sales in his day, and the popular opinion derived from those sales defeated all attempts to correct the record.

19) Those who enjoy reading biographies as often as I do know how little the childhood chapter has to do with the overall narrative of the subject’s life. The childhood chapter deals with the subject’s relatively difficult childhood, the child’s genealogy, and some elements of their upbringing. Other than familiarizing the reader with the subject, the only reason to include the childhood chapter is to reveal the research the author has performed on the subject. Chekhov’s gun applies to writers of fiction, but it does not apply, unfortunately, to writers of biographies. I’ve decided to skip the passages that inform us that the subject played hopscotch, describe their relationships with peers and siblings, and reveal whether their parents encouraged them. I now start a biography at the subject’s first major accomplishment, and I find that I don’t miss anything I consider substantive.

20) Reading through the various portrayals of George Orwell, a reader finds a number of opinion makers claiming that Orwell loathed the idea that right-wingers adopted many of his political theories. He was, to his dying day, a libertarian socialist, these authors repeat at the end of every description. Some of his works, including Animal Farm and 1984, appear to denounce Stalin and the U.S.S.R., but Orwell didn’t limit his fears of totalitarian principles to particular locales or leaders. He feared the idea that too many citizens of the world were willing to give up their freedom for comfort, and he feared these susceptibilities were just as inherent in the people of Britain as in those of the United States. As we’ve witnessed, such fears can be defined and redefined by both parties, but I choose to view them as apolitical. I understand that when political opponents adopt the theories of esteemed intellectuals, the other side will mount a defense, but when those theories prove correct, there will be a cloistered mass of humanity vying for the peak. If a political opponent adopted one of my theories to explain their beliefs, we might find that we disagree on an endgame, but if we continued to find some agreement on a principle regarding fundamental elements of human nature, I would find that a compliment regardless of their political viewpoint.

The Debilitating Fear of Failure


“The reason we struggle with insecurity,” notes Pastor Steven Furtick, “is because we compare our behind-the-scenes with everyone else’s highlight reel.”

Some quotes educate us on matters we know nothing about, but the ones that stick take a matter we know everything about and put a clever twist on it that changes our perspective. We all know failure, or some level of it, at various points of our lives. Some of those failures have shaped us in such profound ways that we assume everyone remembers them the moment we enter a room, and some people will, but will they remember their own, or will they compare our failings to their highlight reels?

Pastor Steven Furtick

“Acknowledging failure,” Megan McArdle writes in the book The Up Side of Down: Why Failing Well Is the Key to Success, “is a necessary first step in learning from it.”

Some of us are old enough to remember the severe penalty for missing a rung on the monkey bars. An erroneous grab, at the very least, could land a victim center-of-attention status as we attempted to find our feet. At worst, it would cause the pack of onlookers to send an emissary to the office with a call for assistance. These everyone-is-looking-at-you moments are so immersed in embarrassment and pain that few can see any benefit to them.

Most of those liable for such situations have lowered the monkey bars and made the ground so forgiving that one would have to fall from a skyscraper to receive any pain. Thanks to these and other technological advances, fewer children get hurt on playgrounds, fewer playground manufacturers get sued, and everyone is much happier. There is one casualty, however: the pain of failure.

No one wants to see a child cry, and we should do everything we can to prevent it, but pain teaches us.

After a near fall in a supermarket, the checker complimented me on the agility and nimbleness I displayed to avoid hitting the ground. “It could be that,” I returned, “or it could be said that only someone so well-practiced in the art of falling knows how to avoid it.”

I eventually did touch ground a short time later, at a family reunion. I also touched a parked car, and then I touched the ground again. Among the lessons I learned is that pain hurts. Had it been a simple fall, it would be hardly worth noting. This was one of those by-the-time-this-ends types of falls: everyone will be looking, some will be concerned, and most will be laughing. I thought I corrected my trajectory a number of times, but I was moving too fast. By the time it was finally over, I had silenced just about everyone in the vicinity. The kids around me laughed, as kids will do when anyone falls, and my age-denying (Not Defying!) brother laughed, but if the Greg Giraldo line, “You know you’re getting old when you fall down, no one laughs, and random strangers come running over acting all concerned,” is true, then I am getting old.

Most lessons in life are learned the hard way, and they are often learned in isolation, in that even our closest friends and family members distance themselves from us in these moments, so that they have no association with them. These dissociations range from laughter to sympathy, but the latter can be just as dissociative as the former if it’s done right. The point is, no matter how we deal with these moments of failure, we usually end up having to deal with them alone.

The point is that the lessons learned through pain and embarrassment are lessened by lowering the monkey bars, providing a forgiving ground, and instituting zero-tolerance bullying campaigns. The point is that those of us who see little to no benefit derived from bullying, or who consider any benefits inconsequential when compared to the damage done by the bully, may eventually see that few lessons in life are learned by the individual until those kids enter adult arenas.

A quote like Pastor Steven Furtick’s also tells us the obvious fact that we’re not alone in having moments of failure, but that those who can keep them in the proper perspective might actually be able to use them to succeed on some level.

Artistic Creations

Any individual who attempts to create some form of art knows more than most about comparing another’s “highlight reel” to their own “behind-the-scenes” efforts.

How many times did Ernest Hemingway grow insecure when comparing his behind-the-scenes efforts to the shining lights that preceded him? How many times did he fail, how many times did he quit and give up under that personally assigned barometer, before finally finding a unique path to success?

Even in the prime of his writing career, Hemingway admitted that about 1 percent of what he wrote was usable. Think about that: 1 percent of what he wrote for The Old Man and the Sea was publishable, worth seeing, that which Hemingway considered worthy of the highlight reel we know as the thin book called The Old Man and the Sea. The other 99 percent of what he wrote proved to be unpublishable by Hemingway’s standards. Yet this highlight reel of the Old Man and the Sea writing sessions is what has inspired generations of writers to write, and frustrated those who don’t consider all of the behind-the-scenes writing that never made it into the book’s final form.

Mark Twain

“Most of what Mark Twain wrote was dreck,” writes Kyle Smith.{1}

Most of us know Tom Sawyer and Huckleberry Finn, the highlight reels of Mark Twain’s writing. We know the famous Twain quotes that occurred in the numerous speeches he gave, and the essays that he wrote, but it is believed that he wrote as many as 50,000 letters, 3,000 to 4,000 newspaper and magazine articles, and hundreds of thousands of words that were never published. Twain also wrote hundreds of literary manuscripts (books, stories, and essays) that he published, and then abandoned, or gave away. Almost all of it has been discovered over the last century and placed in a collection called The Mark Twain Papers.{2}

Very few of us are so interested in Mark Twain, or any of his writing, that we want to read his “dreck.” Very few of us are so fanatical about Twain that we want to know the material he, and his publishers, deemed unpublishable. Yet that “dreck” ended up fertilizing the foundation of his thought process so well that he churned out two highlight reels that many agree are historic in nature. Similarly, very few would want to watch a Michael Jordan, or a Deion Sanders, practice through the years to tweak and foster their athletic talent to the point that we now have numerous three-to-four-second highlight reels of their athletic prowess. Their behind-the-scenes struggles may provide some interesting insight into their process, but they’ve become a footnote at the bottom of the page of their story that no one wants to endure in total.

Kurt Cobain

When we hear the music contained on Nirvana’s Nevermind, we hear a different kind of genius at work. We hear their highlight reels. We don’t know, or care, about all of the “dreck” Kurt Cobain wrote in quiet corners. Most of us don’t know, or care, about the songs that didn’t make it onto Nevermind. Most of us don’t know, or care, about all the errors he committed, the refining, and the crafting that went into perfecting each song on the album, until the final form was achieved. We only want the final form, the highlight reels, and some of us only want one highlight reel: Smells Like Teen Spirit.

On an album prior to Nirvana’s Nevermind, called Bleach, Kurt Cobain penned a song called Floyd the Barber. “Where does the kernel of a song like that start?” Soundgarden’s Chris Cornell asked. Cornell may not have come from the exact same background as Cobain, and he may not have been influenced by the exact same artists as Cobain, but he presumably felt like his creative process was so close to Cobain’s that he couldn’t fathom how the man achieved such divergence from the norms of musical creation. Those familiar with Cobain’s story also know that he was heavily influenced by the music of Soundgarden, and that fact probably confused Cornell all the more.

Other than Soundgarden, Cobain also loved Queen, The Beatles, The Pixies, The Melvins, and a number of other lesser-known bands. How much of his early work was so similar to those artists that no one took him seriously? As I wrote earlier, attempting to duplicate influential artists in some manner is a major part of the artistic process that every artist goes through. It’s a step in the process of crafting original works. When an artist duplicates those who came before them often enough, the artist (almost accidentally) begins to branch off into building something different … if they have any talent for creation in the first place.

Divergence in the Artistic Process

Few artists can pinpoint the exact moment when they were finally able to break the shackles of their influences, for it happens so gradually that it’s almost impossible to pinpoint. Most artists do remember, however, the moment when that one, somewhat inconsequential person said that some aspect of their piece wasn’t half bad. At that point, the artist becomes obsessed with duplicating, or replicating, that nugget of an idea. Once that nugget is added to another nugget, those nuggets become a bold idea that wasn’t half bad. Once that is achieved, another bold idea is added, until it all equals a “halfway decent” compendium of ideas that may form something good. At that point, the artist believes he has something that others may consider unique enough to be called an artistic creation in its own right. When enough unique, artistic creations are complete, the artist may eventually achieve their own highlight reels.

When did Cobain finally begin to branch off? How did he become divergent, and creative, and different on a level that made him an organic writer to be reckoned with? How many casual statements, spray paintings on walls, and other assorted personal experiences had to occur before Kurt Cobain had the lyrics for Nevermind? How many different guitar structures did Cobain and company work through until he arrived at something usable? How many Nevermind songs lifted music or lyrics from other failed songs, casual strummings in a closet, and offshoots of other guitarists? What did Floyd the Barber, Come as You Are, and Pennyroyal Tea sound like in those moments when they first found their way from notepad to basement practice sessions? How many transformations did these songs go through in those practice sessions, until they were entirely original, and transformative, and legendary additions to the albums that included them? If Cobain were alive to answer the question, would he acknowledge that Nevermind is a 1% highlight reel of about a decade of work? Most of us don’t care; we only want to hear the highlight reels, so we have something to tap our fingers to on the ride home from work.

Cobain’s highlight reel, Nevermind, proved to be so popular that record execs and fans called for a B-list, in the form of the album Incesticide. That album proved Cobain’s B-list was better than most people’s A-list, but what about the D-list and E-list songs that proved to be so embarrassing that no one outside his inner circle ever knew they existed?

The point is that some of us are so influenced by an artist’s highlight reels that we want to replicate them, and duplicate them, until we become equally famous as a result, and when we don’t, we think there is something wrong with us. The point is that the difference between a Mark Twain, a Hemingway, a Cobain, and those who compare their behind-the-scenes work to an influential artist’s highlight reels is that while these artists recognized that most of what they did was “dreck,” they also knew that their behind-the-scenes struggles could be used as fertilizer to feed some flowers.

So, the next time you sit behind the scenes at your computer keyboard, tattered spiral notebook, or whatever your blank canvas is, remember that all of those geniuses who so inspired you to be doing what you are doing right now probably spent as many hours as you do staring at a blank page, or a blinking cursor, trying to weed through all the “dreck” that every artist creates, to create something different, something divergent from all those creations that inspired them to create. You now know that they succeeded in that endeavor, but you only know that because the only thing you want to see, hear, and read are their highlight reels.

{1}http://www.forbes.com/sites/kylesmith/2014/02/20/what-mark-twain-van-halen-and-dan-rather-teach-us-about-failure/

{2}http://www.marktwainproject.org/about_projecthistory.shtml

Details, Details, Details


Epiphanies, like women, can pop up when you least expect them, and they can free you from a troubling part of your life you didn’t recognize as a problem until it was revealed. Most of us learn if we’re multi-taskers, optimistic, outgoing, genuinely funny, and/or thick-skinned in the face of those who label us otherwise. Isn’t it interesting when a “That’s me!” pops up that teaches us more about ourselves than we knew before?

In a PBS documentary on Mark Twain, a number of incidents arose in the building of Twain’s home, and the construction team began “badgering” Twain with questions regarding how he wanted them handled. The questions regarded the construction of his home, a place the older Twain would presumably live in for the rest of his life, so the observer should forgive the construction crew chief for the badgering. The team didn’t know how he wanted some particulars of his home constructed, and they probably had hundreds of questions for him. What the team did not know, however, was that Twain had an oft-expressed aversion to details.

Twain

“That’s me,” I thought. If I were to construct my own home, I can see myself going “all-in” on the big, meaningful constructs. I can see myself all-in on the design, and some of the details. I can see myself all hopped up in the beginning, acutely focused, and knocking out every question with pinpoint answers. I would consider other perspectives, others’ advice. I would probably read books, watch YouTube videos, and gather as much information as possible to make an informed decision. At some point, and no one knows when this point hits, I would begin to shut down. It often happens soon after the this-and-that questions hit the floor.

“Do you want this or that?”

“I’ll take this.”

“Are you sure, because that offers a this and that.”

“OhmiGod, just gimme that then.”

“It’s your house,” they say. “We just want to make sure you’re getting what you want.”

“I understand. Give me that.”

Details, regarding otherwise inconsequential minutiae, make me feel stupid. These details start firing far too many neurons in my brain for me to handle, and I often get overwhelmed and exhausted by them. I know that I should be listening to every question, and I feel guilty for not being able to ponder all of the details they give me to come up with the ideal solution for my family, but my capacity for such matters is limited. When the flood of this-and-that questions hits, I’m completely out of gas. “Whatever, just get it done!” I’ll fall away from the creative to what is expected, and what it is that those still paying attention want. My answers going forward are autonomic. “Yes, that sounds fine,” I’ll say without knowing the question. I’ll just want the damn thing to be built already by that point, because I’m not a details-oriented guy. I’ll want to make the big decisions, but I’ll want to leave all of the “inconsequential” details-oriented questions to others.

I feel guilty. I want to be involved, informed, and constantly making acutely focused decisions throughout the process. I feel guilty when others start making the decisions that affect me, because I know I’m an adult now, and I should be making all these decisions. There is also some fear that drives me to constantly pretend that I’m in prime listening mode, based on the fact that I may not like the finished product if I’m not involved in every step. I may not like, for example, the manner in which the west wing juts out on the land and makes the home appear ostentatious, or obtuse, or less pleasing to the eye with various incongruities, and I’ll wish I hadn’t been so obvious with my “Whatever, just do it!” answers. Details exhaust me, though, and they embarrass me when I don’t know the particulars the other person is referencing.

I don’t know if the guilt is born of the fact that I know I’m an intelligent being, and I should be able to make these decisions in a more consistent manner, or if I’m just too lazy to maintain acute focus. I do have a threshold though, and I know how my brain works. I know that if there are seven ways to approach a given situation, I will usually select one of the first two options offered. I usually do this because I’m not listening after the second one. Everything beyond that involves the other party showing off the fact that they know more than I do. I know this isn’t always the case, but it’s the only vine I can cling to when having to deal with my limited attention span and the limited arsenal of my brain.

Knowing my deficiencies for retaining verbosity, I will ask for literature on the subject, something that gives the subject a tangible quality and can be consumed at my pace. If I do that, and I have, I will then pretend to read every excruciating word, but I will usually end up selecting one of the first two options offered. Companies know the predilection we have to choose the first one or two selections, and they pay search engines to optimize their place in searches. We might envy the person who knows enough to know that selection #7 is the ideal company for this job, but we know we’re not that guy.

I like to think I have a complex brain. I like to think that I display all that I’m about in my own way, but I’m always reminded of the fact that most of the people around me give full participation to the details of life no matter how overwhelming and exhausting they can be to me. It’s humbling to watch these brains, which I like to consider inferior, operate on planes of constant choices, and decisions, and retentions, and details I am incapable of retaining.

I have this daydream that I will one day be given an excuse for having such a limited brain by the relative brilliance I reveal to the world in the form of my book. I am interviewed in this dream, and I am asked, “So, what does it mean to you to have crafted such a fine book?” I am far wittier than reality would suggest in this dream when I reply: “It will help me deal with all of my faults better. The fact that I cannot fix my own plumbing can now be countered with ‘but I wrote a fine book.’ The fact that I cannot fix my own car, compete with my wife in certain areas of intelligence, or hold down a decent job can now be countered with ‘but I wrote a book that is held up as a fine book in certain quarters.’”

We’ve all heard the line “Everybody’s mind works differently,” but until we learn that the brilliant mind that composed Huckleberry Finn had similar deficiencies, we cannot help but feel guilty about ours. “Well, work on your deficiencies,” those around us suggest, and we do when that next project comes about. We’re out to prove ourselves in that next project. We answer every question, from the first few to the this-and-thats, with prolonged mental acuity. When that third and fourth project rolls around, however, we’ll revert back to those inferior brains that can’t retain details, and it is then that we’ll envy those “inferior” brains consistently showing their superiority. This could lead those of us who never knew we were suffering from such a recognized deficiency into feelings of incompletion, until someone like Mark Twain recognizes and vocalizes his deficiencies for us.