Know Thyself


“I do not know myself yet, so it seems a ridiculous waste of my time to be investigating other, irrelevant matters,” —Socrates, on the subject of studying mythology and other trivial concerns.

“Know thyself?” we ask. “What do you mean know thyself? I know myself. I know myself better than anyone else does. Why would I waste my time trying to understand why I do things when it’s all these other people who make no sense to me? I have no problem with me, and this idea of trying to know thyself better, to the level the ancient Greeks and Socrates suggest, seems to be nothing more than a selfish conceit for pointy-headed intellectuals who had far too much time on their hands.”

Philosophers suggest that the key to living the good life lies in self-examination and reflection. If we’re not where we thought we’d be at this point in our lives, and we want to change, any changes we might make will be pointless and unsustainable if we don’t have intimate knowledge of our strengths and weaknesses.

The most popular avenue for knowing thyself is through comparative analysis. We use others to understand how different, weird, or strange we are, and we derive feelings of superiority and inferiority in the process. This analysis also provides some relief when we examine ourselves against the freaks, creeps, and geeks. “At least I’m not that,” we say.

To put the idea of our comparative analysis into a visual, we might want to try using the Cartesian coordinate system we studied in high school algebra. Using this coordinate system might help us locate where we are compared to others. If we gauge our ideas of being normal on one axis and our resultant feelings of superiority and inferiority on the other axis, it might provide us some answers. If we find that we are not any more normal or abnormal than our peers, and we feel no subsequent feelings of superiority or inferiority, we would end up at the origin, the (0,0) point, of the (X,Y) plane. Any experiences that dictate we are more normal or more abnormal than our peers would exert a countervailing effect on the other axis, the axis of superiority and inferiority. We know comparative analysis is an inexact science, but it is the most common method we use to know ourselves better.

We’ve all met strange individuals who tend to be strange in a more organic manner, and we know we’re not that. Through comparative analysis, we might say that the strangest person we’ve met exists five increments to the left of the point of normalcy on the X axis of the Cartesian coordinate system, if being strange is a negative. The most normal would be five increments to the right.

The first question those of us who seek truth through comparative analysis should ask is whether we have a model for absolute normalcy. The second question regards the numerous ideas we all have about being normal, weird, and strange. Most consider these relative concepts nearly impossible to quantify, and they would surely argue against defining any one of us as the barometer by which all people striving for normalcy should be measured. Normal might be one of the most relative concepts there is, for we all define it internally and compare the rest of the world to our definition of it. How normal are we, and how normal is the most normal person we know?

If we prize normalcy, we might argue that for all of our eccentricities, we are quite normal. We might admit that a majority of people we run into are more normal than we are, but we also consider them just as boring. If we are able to admit that, we’re admitting that we are a two on the weird-to-normal axis. We can guess that our point on the X axis would have a corresponding effect on the Y axis if being normal has a corresponding relationship to self-esteem and the subsequent feelings of superiority. Through comparative analysis we could say, with some confidence, that we are probably a (2,2) coordinate, as compared to the rest of the normal, well-adjusted world.
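For readers who enjoy taking the metaphor literally, the comparative plot described above can be sketched in a few lines of code. This is a playful, purely illustrative sketch: the function name, the linear relationship between normalcy and esteem, and the clamping to five increments are all assumptions invented for this example, not anything the essay prescribes.

```python
# A playful, illustrative sketch of the comparative-analysis plot.
# Assumption: feelings of superiority/inferiority (y) track normalcy (x)
# linearly, and both axes run five increments in either direction.

def plot_point(normalcy, esteem_per_unit=1.0):
    """Return an (x, y) coordinate: x is normalcy from -5 (strangest)
    to +5 (most normal); y is the resulting feeling of superiority (+)
    or inferiority (-)."""
    x = max(-5, min(5, normalcy))   # clamp to the five-increment scale
    y = x * esteem_per_unit         # countervailing effect on the other axis
    return (x, y)

# The essay's self-assessment: "a two on the weird-to-normal axis"
print(plot_point(2))    # (2, 2.0)

# The strangest person we've met, five increments past the scale's edge,
# still lands at the leftmost point
print(plot_point(-7))   # (-5, -5.0)
```

Of course, as the essay goes on to argue, the plotting is less valuable than the questions we ask while deciding where the points belong.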

When plotting points in our personal ledger, most people don’t view themselves honestly, and that makes it difficult to compare ourselves to others. Too often, we instinctually eliminate the negative in our quest to accentuate the positive. Thus, if we are the ones introducing the variables to this equation, there will always be contradictions, and these contradictions lead to the answer “no solution.”

The true solution to finding out more about ourselves does not lie in comparative analysis, so everyone can put their pencils down. These ledgers are pointless. The solution to knowing more about oneself lies inside the analysis we perform when deciding our comparative plotting points. Most of us will not arrive at a definitive answer, but if the questions we ask ourselves lead to other questions, we are on the correct road to final analysis through self-reflection. Ask more questions, in other words, and the subject of the interrogation is destined to provide the interrogator more answers. The point plotter might never find the perfect question that leads to the truth of it all, but questions lead to answers, and answers provide other questions that we never asked before.

✽✽✽

The great philosophers spent a lifetime asking questions of themselves and their followers, yet many in the audience considered their philosophical tenets too general. Bothered by these complaints, some believed the ancient Greeks granted them a gift in the form of a maxim. Among the many things the ancient Greeks offered us was a simple inscription on the forecourt of the Temple of Apollo at Delphi, reported to the world by Pausanias. This gift was what modern-day philosophers might call the ancient philosophers’ “Holy stuff!” moment, and what a previous generation would call a “Eureka!” moment. To all philosophers since, it has become the foundation for all philosophical thought. For modern readers, the discovery may appear as vague as it has always been, but it is a comprehensive sort of vague that helped construct the science of philosophy. This simple, complex discovery was a Rosetta Stone for the human mind, human nature, and human involvement, and the ancient Greeks achieved it with two simple words, “Know thyself.”[1]           

Perhaps a modern translation or update of the ancient Greek maxim is necessary. Perhaps, today, we should say, “Keep track of yourself,” as that might be a better interpretation for those modern readers who are blessed and cursed with the many modern distractions that render such a task more difficult.

Although it could be said that mankind has found the investigation of other, more irrelevant matters far more entertaining for as long as we have occupied Earth, few would argue that we have more distractions from the central argument of knowing thyself than we have right now. Today, it is easier than ever to lose track of who we are, who we really are.

The Holy Grail for those who produce images for our numerous screens is to create characters the audience can identify with so well that we relate to them. Another goal is to create characters that we not only relate to but we attempt to emulate. Idyllic images litter this path to the Holy Grail, and we associate with them so often that we begin to incorporate the characters’ idealism into our personality. On a conscious level, we know they are fictional characters, yet they exhibit such admirable characteristics that we attempt to mimic them when we are among our peers. Somewhere along the path, who we are, who we really are, can get lost in the shuffle.

A decisive moment eventually arrives when we find that we’re having difficulty drawing a line of distinction between the subconscious incorporation of these fictional characteristics and the realization that we are not those characters. This decisive moment is often one of crisis, and it can lead to an identity crisis, because we always thought that when a moment of crisis arrived we would be able to handle it much better than we did.

When this crisis arrives, we might initially project an idyllic screen image version of us into reality. That version knows how to handle this crisis better than we ever will. Yet, it is not us, in the truest sense, but a different us, some fictional image we have created of us that handles pressure, conflict, and crisis so much better than we do. The trouble is, now that the reality of a real-world crisis stands before us, we cannot remember how that character that we resonated with did it.

In one distant memory, we were a swashbuckling hero who encountered a similar problem and dealt with it in a more heroic fashion. We might have encountered a verbal assault on our character in another distant, foggy episode, which we remember countering with a cynical, sardonic comeback that laid out our verbal assaulter. We cannot recall the specifics of these moments, now that we really need them, because we weren’t really doing them. On some level, we recognize that we’ve been fooling ourselves, but we’ve incorporated so many idyllic images of so many characters handling so many situations with such adept fluidity that we’ve incorporated those idyllic screen images into our image of ourselves.

Another idyllic image forms over time, in our interactions with peers. These images may be nothing more than a false dot matrix of tiny mental adjustments we’ve made over time to deal with situational crises that might otherwise have threatened our self-esteem, until we became the refined, sculpted specimen now capable of handling any situation that arises. These adjustments may be false interpretations of how we actually handled those previous confrontations, but we’ve preferred our rewrites for so long that they somehow became part of a narrative that we now believe.

We’ve all had to correct people at one time or another. It can be uncomfortable at times, but we’ve all done it. We’ve sat through their rendition of the past, and we’ve had to correct them. “I’m sorry, but that’s not the way it happened.” When they didn’t believe us, we invited others into the argument to augment our version with overwhelming corroborating evidence. We are shocked when our peer refuses to acknowledge their error, even in the face of the corroborated account. At that point, we fear our peer must be delusional, and the only sane thing to do is walk away.

If we know them well, and we know they’re not delusional people, we assume that they must be purposefully lying about the incident, spinning it to make themselves look better. We assume they need to colorize their role in it to boost their reputation and self-esteem. We think less of these confused, delusional, or lying individuals from a distance, and that distance suggests to us that we’ve achieved a place of honesty they never could.

After thoroughly condemning them, we encounter a similar scenario, only with the roles reversed. We won’t see it this way, of course, as a significant amount of time will pass between our confrontation and theirs, but my guess is that most of us who confront the delusional eventually meet someone who seeks to show us that we have similar holes in our memory. It can be an eye-opening experience for those of us who strive for objective honesty, if we are able to see it for what it is.

✽✽✽

Lurking in the fourth layer of Abraham Maslow’s Hierarchy of Needs, we find esteem. Maslow states, and I paraphrase, “This need for greater self-esteem, this need for respect, value, and acceptance by others is vital to one’s sense of fulfillment.”[2]

If esteem is so vital to our psychological makeup, what happens when we fail where others succeed? If we are able to convince ourselves that these successes are an exception to the rule, we find an excuse, but when these situations repeat so often that we can no longer find a suitable excuse, confusion and frustration set in. To avoid spiraling down further, we develop defense mechanisms.

Mental health experts say that if these defense mechanisms are nothing more than harmless delusions and illusions, they can actually be quite healthy. The alternative occurs when the reality of these repeated situations begins to overwhelm us. If this happens, we might begin to wonder: where is the dividing line between using delusions for greater mental health and becoming delusional?

If we attain what we seek from momentary delusional thoughts and we get away with it, what’s to stop us from using those excuses so often that we’re rewarded with a better perception among our peers, along with greater self-esteem? Why would we choose to moderate future delusions? What’s to stop us from continuing down our delusional paths, until we begin to lose track of who we are, who we really are?

Most historical research dedicated to the brain focuses on its miraculous power to remember, but some of the more recent research suggests that the power to forget and misremember seminal moments is just as fundamental to happiness and greater mental health.[3] The thesis suggests that the brain distills horrific memories and horrible choices out, and it eliminates them for the sake of better mental health, in a manner similar to how the liver distills impurities out for better physical health.

Thus, we could say our delusional peers might actually be recalling the incidents differently as an unconscious attempt to improve their mental health. Their account of what happened may not be true, but did they create it to deceive us? We don’t know the answer to that, and each situation calls for independent analysis, but experience with such matters and extensive reading on the subject has led me to believe they may just be deceiving themselves onto an idyllic path, the one they need for better mental health.

To take this theory to its natural conclusion, we could also say those in need of professional counseling might have opted for the bright and shiny delusional paths too often. They might subconsciously omit embarrassing details from their memory and forget some of the self-esteem-crushing decisions they’ve made along the way. Some might fill those gaps with actions or words from their favorite scripted screen characters. By replacing and redefining the embarrassing details and self-esteem-destroying decisions with idyllic images and positive reinforcements, they’ve spent a little too much time in those bright, shiny forests of positive illusions and delusions. The power of these idyllic images has become so ingrained that they now need a professional to take them by the hand and guide them back to the truth, which they’ve hidden so far back in the forest of their mind that they can no longer find it without assistance.

One of the therapist’s primary goals in such sessions is to teach clients how to know thyself better. In the vein we’re discussing here, the therapist assists the client in attempting to rid their mind of the accumulation of illusions and delusions the client used to create a sense of superiority. They attempt to remove the dot matrix of tiny adjustments and idyllic images we use to keep mental health issues at bay. To remove these subjective views, the therapist asks the client questions the client should’ve been asking themselves all along, to help them achieve some form of personal clarity.

Some of us are better able to keep track of ourselves, to gain personal clarity as we age and accumulate experiences, but clarity cannot occur without extensive reflection, and Abraham Maslow suggested that a mere 2 percent of the people in the world reflect enough to achieve self-actualization.[4] Personal clarity is not necessarily moral clarity, but without guiding principles it is impossible to achieve. Clarity serves as subtext for morality and vice versa.

Of course, no human being can achieve absolute clarity, as we are all unsure of ourselves in various moments and insecure by nature. Nevertheless, some submit the red herring argument that because absolute clarity is nearly impossible to achieve, it is pointless to strive for it. They also submit that because there are no absolutes, they don’t understand why anyone would attempt to achieve clarity on any matter. What if that reliance on anecdotal arguments invites the very confusion that inhibits progress toward clarity, and what if their insistence that a thoughtful person always focuses on anecdotal arguments simply permits them to avoid trying to achieve it?

The final hurdle in achieving clarity by knowing thyself arrives when we recognize that too much comparative analysis intrudes upon self-reflection. There’s nothing wrong with comparing oneself to others, of course, as it helps us clarify our progress and learn more about our identity. Too much of it, however, can distract us from who we really are, as we attempt to assimilate others’ characteristics into our own, and it can dilute the acute focus we need to jump through the hoops involved in knowing thyself better. At that point, it becomes counterproductive.

It is for these reasons that greater minds than ours have suggested that the path to greater knowledge, a better life, happiness, and more self-esteem exists somewhere on the path to knowing thyself better. They also suggest that too often, we spend too much time investigating superfluous minutiae. It’s a waste of time, they say, for people with too much time on their hands.

[1] https://thezodiac.com/soul/oracle/whentheoraclespoke.htm

[2] https://www.verywellmind.com/what-is-maslows-hierarchy-of-needs-4136760

[3] https://www.huffingtonpost.com/dr-judith-rich/the-power-of-conscious-fo_b_534688.html

[4] http://www.deepermind.com/20maslow.htm


The Epic Battle of Ayn Rand vs. Larry David


“Who would win in a fight, Godzilla or King Kong?” was a question just about every kid I knew growing up asked. “What about Batman versus Superman, or how about The Six-Million Dollar Man and Big Foot?” With that mindset forever entrenched in my skull, I was intrigued when I learned that one of our society’s most popular satirists would be taking on one of our most popular philosophers.

Larry David’s Clear History is a satirical comedy, not a philosophical treatise, so the movie should be given some artistic license when it attempts to deconstruct, refute, or simply poke fun at one of Ayn Rand’s most famous books, The Fountainhead. The question every viewer should ask is where that artistic license ends, and the requirement of factual refutation begins. As it has often been said, a satirist can be humorous when poking fun at various institutions, but he can be hilarious if he adds an element of truth to his satire. In this vein, Clear History is not hilarious.

Some would say that those who are so bothered by the content of a movie that they can’t enjoy something as simple as a comedy without analyzing it to death need to relax, get out more, or have more relations with the opposite sex. It’s a fair point, but isn’t it also a fair point that if these moviemakers are going to attempt to satirically refute one of the most famous books of all time (Ayn Rand’s The Fountainhead), the material might be more effective if they did so in a more accurate manner? Why even mention the book, much less make it an ongoing theme of the movie, if that wasn’t their goal? If the screenwriters simply wanted to provide light humor, why didn’t they just invent a book, and that book’s writer, for the Will Haney character’s inspiration? They could then more easily refute any claims of inaccuracy from those who believe they didn’t properly represent the book in question.

Even if the writers wanted to avoid the heavy-handed task of providing exact refutation, and their work of light humor was only going to trim the edges of Ayn Rand’s philosophy for the purpose of providing their audience a base from which light humor and sight gags could spring, we should require those satirists to get the subtext of her philosophy correct, for proper, albeit humorous, refutation. If the satire’s main character is going to portray an anti-Rand character (Nathan Fromm), shouldn’t we require his adversary (Will Haney) to properly represent the Rand character, if for no other reason than to have a proper adversarial relationship, humorous exchanges, or a subtext that hints at those philosophical differences?

There are moments in the movie where it appears as though the writers purposely avoided representing the Ayn Rand philosophy accurately, that they don’t understand the greater import of her message, or that they simply wanted to provide their “impossible to grasp” interpretation of it. One of the few direct interpretations of The Fountainhead’s main character, Howard Roark, involves a swear word that characterizes Roark as one of the meanest characters in the history of literature.

Teenagers use this swear word, in this manner, to provide their listeners with an all-encompassing dismissal of the chosen object of their scorn, and that’s all other teens need to follow a fellow teen’s dismissal of their subject. Adults often need more. Adults may allow the speaker to dismiss a person with a swear word, especially for the sake of humor, but they often require more if they are going to join the speaker in their attempts to dismiss a person, or an idea. Even if said adults aren’t willing to join the speaker in the condemnation of a subject, they usually enjoy the blows delivered in an epic battle, but even then, even for the purpose of satirical refutation, most adults prefer to have an element of truth added for added amusement.

When I learned that a mighty satirist would be taking on a mighty philosopher, I thought of all of those speculative epic battles that we talk about in our youth. When I saw my satiric hero had another character in the movie deliver a blow below the belt, characterizing Ayn Rand’s character Howard Roark with a swear word that was supposed to define him as one of the meanest characters in the history of literature, I knew this wouldn’t be a fair fight. Even though I knew that the protagonist’s adversary (Ayn Rand) in this epic battle was no longer alive to counter punch, I knew the fight would be called early.

It strikes me that when we create a satirical piece, we have one shot. We have to combine a substantive take with clever inserts of humor. It’s a juggling act that allows some room for error, as long as the premise is true. Doing otherwise leaves the audience thinking, “Ok, you don’t like Ayn Rand, or the Fountainhead. We got it. Now tell us why we shouldn’t.”   

Then, when I realized that this below-the-belt punch was going to be the best blow in the arsenal of one of my favorite satirists, watching the rest of Clear History proved as sad and depressing as watching Muhammad Ali battle Larry Holmes, or Mike Tyson battle Lennox Lewis, at the end of their careers. This isn’t to say that I think Clear History spells the end of Larry David’s career, or that he’s in any way past his prime, but that he had one awful match in which he proved to be out of his weight class.

The Debilitating Fear of Failure


“The reason we struggle with insecurity,” notes Pastor Steven Furtick, “is because we compare our behind-the-scenes with everyone else’s highlight reel.”

Some quotes educate us on matters we know nothing about, but the ones that stick take a matter we know everything about and put a clever twist on it that changes our perspective. We all know failure, or some level of it, at various points in our lives. Some of those failures have shaped us in such profound ways that we assume everyone remembers them the moment we enter a room, and some people will, but will they remember their own, or will they compare our failings to their highlight reels?

“Acknowledging failure,” Megan McArdle writes in The Up Side of Down: Why Failing Well Is the Key to Success, “is a necessary first step in learning from it.”

Some of us are old enough to remember the severe penalty for missing a rung on the monkey bars. An erroneous grab, at the very least, could land a victim center-of-attention status, as we attempted to find our feet. At worst, it would cause the pack of onlookers to send an emissary to the office with a call for assistance. These everyone-is-looking-at-you moments are so immersed in embarrassment, and pain, that few can see any benefit to them.

Most of those liable for such situations have lowered the monkey bars and made the ground so forgiving that one would have to fall from a skyscraper to feel any pain. Thanks to these and other technological advances, fewer children get hurt on playgrounds, fewer playground manufacturers get sued, and everyone is much happier. There is one casualty, however: the pain of failure.

No one wants to see a child cry, and we should do everything we can to prevent it, but pain teaches us.

After a near fall in a supermarket, the checker complimented me on the agility and nimbleness I displayed to avoid hitting the ground. “It could be that,” I returned, “or it could be said that only someone so well-practiced in the art of falling knows how to avoid it.”

I eventually did touch ground a short time later, at a family reunion. I also touched a parked car, and then I touched the ground again. Among the lessons I learned is that pain hurts. Had it been a simple fall, it would hardly be worth noting. This was one of those by-the-time-this-ends falls where everyone will be looking, some will be concerned, and most will be laughing. I thought I corrected my trajectory a number of times, but I was moving too fast. By the time it was finally over, I had silenced just about everyone in the vicinity. The kids around me laughed, as kids will do when anyone falls, and my age-denying (Not Defying!) brother laughed, but if the Greg Giraldo line, “You know you’re getting old when you fall down, no one laughs and random strangers come running over acting all concerned,” is true, then I am getting old.

Most lessons in life are learned the hard way, and they are often learned in isolation, in that even our closest friends and family members distance themselves from us in these moments, so that they have no association with them. These dissociations range from laughter to sympathy, but the latter can be just as dissociative as the former if it’s done right. The point is, no matter how we deal with these moments of failure, we usually end up having to deal with them alone.

The point is that the lessons learned through pain and embarrassment are lessened by lowering the monkey bars, providing a forgiving ground, and instituting zero-tolerance bullying campaigns. Those of us who see little to no benefit in bullying, or who consider any benefit inconsequential compared to the damage a bully does, may eventually concede that few of life’s lessons get learned at all until those kids enter adult arenas.

A quote like Pastor Steven Furtick’s also tells us the obvious fact that we’re not alone in having moments of failure, but that those who can view them in the proper perspective might actually be able to use them to succeed on some level.

Artistic Creations

Any individual who attempts to create some form of art knows more than most about comparing another’s “highlight reels” to their own “behind-the-scenes” efforts.

How many times did Ernest Hemingway grow insecure when comparing his behind-the-scenes efforts to the shining lights that preceded him? How many times did he fail, how many times did he quit under that personally assigned barometer, before finally finding a unique path to success?

Even in the prime of his writing career, Hemingway admitted that only about 1 percent of what he wrote was usable. Think about that: 1 percent of what he wrote for The Old Man and the Sea was publishable, worth seeing, worthy of the highlight reel we know as that thin book. The other 99 percent proved unpublishable by Hemingway’s standards. Yet this highlight reel of the Old Man and the Sea writing sessions is what has inspired generations of writers to write, and frustrated those who don’t consider all the behind-the-scenes writing that never made it into the book’s final form.

Mark Twain

“Most of what Mark Twain wrote was dreck,” writes Kyle Smith.{1}

Most of us know Tom Sawyer and Huckleberry Finn, the highlight reels of Mark Twain’s writing. We know the famous Twain quotes from the numerous speeches he gave and the essays he wrote, but it is believed that he wrote as many as 50,000 letters, 3,000 to 4,000 newspaper and magazine articles, and hundreds of thousands of words that were never published. Twain also wrote hundreds of literary manuscripts—books, stories, and essays—that he published, then abandoned, or gave away. Almost all of it has been recovered over the last century and placed in a collection called The Mark Twain Papers.{2}

Very few of us are so interested in Mark Twain, or any of his writing, that we want to read his “dreck”. Very few of us are so fanatical about Twain that we want to know the material he and his publishers deemed unpublishable. Yet that “dreck” fertilized the foundation of his thought process so well that he churned out two highlight reels many agree to be historic in nature. Similarly, very few would want to watch a Michael Jordan or a Deion Sanders practice through the years, tweaking and fostering their athletic talent to the point that we now have numerous three-to-four-second highlight reels of their athletic prowess. Their behind-the-scenes struggles may provide some interesting insight into their process, but they’ve become a footnote at the bottom of the page of their story that no one wants to endure in total.

Kurt Cobain

When we hear the music contained on Nirvana’s Nevermind, we hear a different kind of genius at work. We hear their highlight reels. We don’t know, or care, about all of the “dreck” Kurt Cobain wrote in quiet corners. Most of us don’t know, or care, about the songs that didn’t make it onto Nevermind. Most of us don’t know, or care, about all the errors he committed, the refining and the crafting that went into perfecting each song on the album, until the final form was achieved. We only want the final form, the highlight reels, and some of us only want one highlight reel: Smells Like Teen Spirit.

On an album prior to Nirvana’s Nevermind, called Bleach, Kurt Cobain penned a song called Floyd the Barber. “Where does the kernel of a song like that start?” Soundgarden’s Chris Cornell asked. Cornell may not have come from the exact same background as Cobain, and he may not have been influenced by the exact same artists as Cobain, but he presumably felt like his creative process was so close to Cobain’s that he couldn’t fathom how the man achieved such divergence from the norms of musical creation. Those familiar with Cobain’s story also know that he was heavily influenced by the music of Soundgarden, and that fact probably confused Cornell all the more.

Other than Soundgarden, Cobain also loved Queen, The Beatles, The Pixies, The Melvins, and a number of other lesser-known bands. How much of his early work was so similar to those artists that no one took him seriously? As I wrote earlier, attempting to duplicate influential artists in some manner is a major part of the process every artist goes through. It’s a step in the process of crafting original works. When an artist duplicates those who came before them often enough, the artist (almost accidentally) begins to branch off into building something different … if they have any talent for creation in the first place.

Divergence in the artistic process

Few artists can pinpoint the exact moment when they were finally able to break the shackles of their influences, for it happens so progressively that it’s almost impossible to isolate. Most artists do remember the moment when that one, somewhat inconsequential person said that some aspect of their piece wasn’t half bad, however. At that point, the artist becomes obsessed with duplicating, or replicating, that nugget of an idea. Once that nugget is added to another nugget, those nuggets become a bold idea that wasn’t half bad. Once that is achieved, another bold idea is added, until it all equals a “halfway decent” compendium of ideas that may form something good. At that point, the artist believes they have something that others may consider unique enough to be called an artistic creation in its own right. When enough unique, artistic creations are complete, the artist may eventually achieve their own highlight reels.

When did Cobain finally begin to branch off? How did he become divergent, and creative, and different on a level that made him an organic writer to be reckoned with? How many casual statements, spray paintings on walls, and other assorted personal experiences had to occur before Kurt Cobain had the lyrics for Nevermind? How many different guitar structures did Cobain and company work through, until he arrived at something usable? How many songs on Nevermind lifted music or lyrics from other failed songs, casual strummings in a closet, and the offshoots of other guitarists? What did Floyd the Barber, Come as You Are, and Pennyroyal Tea sound like in those moments when they first found their way from notepad to basement practice sessions? How many transformations did these songs go through in those practice sessions, until they were entirely original, and transformative, and legendary additions to the albums they appeared on? If Cobain were alive to answer the question, would he acknowledge that Nevermind is a 1% highlight reel of about a decade of work? Most of us don’t care; we only want to hear the highlight reels, so we have something to tap our fingers to on the ride home from work.

Cobain’s highlight reel, Nevermind, proved to be so popular that record execs, and fans, called for a B-list, in the form of the album Incesticide. That album proved Cobain’s B-list was better than most people’s A-list, but what about the D-list and E-list songs that proved so embarrassing that no one outside his inner circle ever knew they existed?

The point is that some of us are so influenced by an artist’s highlight reels that we want to replicate them, and duplicate them, until we become equally famous as a result, and when we don’t, we think there is something wrong with us. The point is that the difference between a Mark Twain, a Hemingway, a Cobain, and those who compare their behind-the-scenes work to an influential artist’s highlight reels is that while these artists recognized that most of what they did was “dreck”, they also knew that their behind-the-scenes struggles could be used as fertilizer to feed some flowers.

So, the next time you sit behind the scenes at your computer keyboard, tattered spiral notebook, or whatever your blank canvas is, remember that all of those geniuses —who so inspired you to be doing what you are doing right now— probably spent as many hours as you do staring at a blank page, or a blinking cursor, trying to weed through all the “dreck” that every artist creates, to create something different, something divergent from all those creations that inspired them to create. You now know that they succeeded in that struggle, but you only know that because the only things you want to see, hear, and read are their highlight reels.

{1}http://www.forbes.com/sites/kylesmith/2014/02/20/what-mark-twain-van-halen-and-dan-rather-teach-us-about-failure/

{2}http://www.marktwainproject.org/about_projecthistory.shtml