Historical Inevitability


The idea that history is cyclical has been put forth by many historians, philosophers, and fiction writers, but one Italian philosopher, Giovanni Battista Vico (1668-1744), wrote that a fall is an historical inevitability. In his book La Scienza Nuova, Vico suggested that evidence of this can be found by reading history from the vantage point of its rise-fall-rise, or fall-rise-fall, recurrences, as opposed to studying it in a straight line dictated by the years in which events occurred. By studying history in this manner, Vico suggested, the perspective of one’s sense of modernity is removed and these cycles of historical inevitability are revealed.

To those of us privy to the lofty altitude of the information age, this notion seems implausible to the point of being impossible. If we are willing to concede the probability of a fall, as a certain historical inevitability portends, we would only do so in a manner that suggests that if there were a fall, it would be defined relative to the baseline our modern advancements have created. For such people, an asterisk may be necessary in any discussion of cultures rising and falling in historical cycles. This asterisk would require a footnote suggesting that all eras have had creators lining the top of their era’s hierarchy, and those that feed upon their creations at the bottom. The headline-grabbing accomplishments of these creators might then define an era, in an historical sense, to suggest that the people of that era were advancing, but were the bottom feeders advancing on parallel lines? Or did the creators’ accomplishments, in some way, inhibit their advancement?

“(Chuck Klosterman) suggests that the internet is fundamentally altering the way we intellectually interact with the past because it merges the past and present into one collective intelligence, and that it’s amplifying our confidence in our beliefs by (a) making it seem like we’ve always believed what we believe and (b) giving us an endless supply of evidence in support of whatever we believe. Chuck Klosterman suggests that since we can always find information to prove our points, we lack the humility necessary to prudently assess the world around us. And with technological advances increasing the rate of change, the future will arrive much faster, making the questions he poses more relevant.” –Will Sullivan on Chuck Klosterman

My initial interpretation of this quote was that it sounded like a bunch of gobbledygook, until I reread it with the latest social issue of the day plugged into it. What did the person think about that particular social issue as far back as a year ago? Have they had their mind changed on the topic? Have they been enlightened, or have they been proved right on something they didn’t believe as far back as one year ago? If we do change our minds on an issue as quickly as Klosterman suggests, with the aid of our new information resources, are we prudently assessing these changes in a manner that allows for unforeseen consequences? This tendency we now have to change our minds quickly reminds me of the catchphrase mentality. When one hears a particularly catchy, or funny, catchphrase, they begin repeating it. When another asks that person where they first heard that catchphrase, the person that now often uses it, and didn’t start using it until a month ago, says that they’ve always been saying it.

Another way of interpreting this quote is that with all of this information at our fingertips, the immediate information we receive on a topic, in our internet searches, loses value. Who is widely considered the primary writer of the Constitution, for example? A simple Google search will produce a name: James Madison. Who was James Madison, and what were his influences in regard to the document called The Constitution? What was the primary purpose of this finely crafted document that provided Americans near unprecedented freedom from government tyranny, and rights that were nearly unprecedented when coupled with the amendments in the Bill of Rights? How much blood and treasure was spent to pave the way for the creation of this document, and how many voices were instrumental in the Convention that crafted and created this influential document?

Being able to punch these questions into a smartphone and receive the names of those involved can lend those names a static quality. The names James Madison, Gouverneur Morris, Alexander Hamilton, and the other delegates of the Constitutional Convention that shaped, crafted, and created this document could become an answer to a Google search, nothing more and nothing less. Over time, and through repeated searches, a Google searcher could accidentally begin to assign a certain historical inevitability to the accomplishments of these men. The notion being that if these names weren’t the answers, other names would be.

Setting aside, for just a moment, my personal opinion that Madison, Morris, Hamilton, and those at the Constitutional Convention composed a brilliant document, the question has to be asked: could the creation of Americans’ rights and liberties have occurred at any time, with any men or women in the history of our Republic? The only answer, as I see it, involves another question: How many politicians in the history of the world have voted to limit their present power, and any power they might attain if their aspirations achieve fruition? How many current politicians would vote for something like term limits? Only politicians that have spent half their life under what they considered tyrannical rule would fashion a document that could result in their own limitations.

How many great historical achievements, and people, have been lost to this idea of historical inevitability? Was it an historical inevitability that America would gain her freedom from Britain? Was the idea that most first world people would have the right to speak out against their government, vote, and thus have some degree of self-governance inevitable? How many of the freedoms, opportunities, and other aspects of American exceptionalism crafted in the founding documents are now viewed as so inevitable that someone, somewhere would’ve come along and figured out how to make them possible? Furthermore, if one views such people and such ideas as inevitable, how much value do they attach to them? If they attain a certain static inevitability, how susceptible are they to condemnation? If an internet searcher has a loose grasp of the comprehensive nature of what these men did, and the import of these ideas on the current era, will it become an historical inevitability that they’re taken away, in a manner that sets in motion philosopher Vico’s cycle of the historical inevitability of a fall?

I’ve heard it theorized that for every 600,000 people born, one will be a transcendent genius. I heard this secondhand, and the person that said it attributed it to Voltaire, but I’ve never been able to properly source it. The quote does provide a provocative point, however, one I interpret to mean that the difference between a person that achieves the stature of genius on a standardized Intelligence Quotient (IQ) test, by scoring high enough to reach that lofty plateau, and the transcendent genius lies in the area of application. We’ve all met extremely intelligent people in the course of our lives, in other words, and some of us have met others that qualify as geniuses, but how many of them figured out a way to apply that abundant intelligence in a productive manner? This, I believe, is the difference between what many have asserted is a genius, at a ratio of one in fifty-seven, and the one in 600,000 born. The implicit suggestion of this idea is that every dilemma, or tragedy, is waiting for a transcendent genius to come along and fix it. These are all theories of course, but they do raise the question of what happens to the 599,999 that feed off the ingenious creations and thoughts of others for too long. They also raise the question of whether, if the Italian philosopher Vico’s theories on the cyclical nature of history hold true, and modern man is susceptible to a great fall, there will be a transcendent genius able to fix the dilemmas and tragedies that await the victims of this great fall.

Why Adults Hate Their Parents


‘I am so glad I don’t have to go through all that anymore,’ is one of the first thoughts I have when I hear someone say they still hate their parents. That raging insecurity and confusion, projected onto the parents, was so painful at times that I’m glad I don’t have to deal with it anymore. When I hear someone say that their parents are bumbling fools, idiots, or backwater hicks from the 1950’s, I remember saying such things, and I do regret some of it. As has been said of regrets, there is little that we can do about them now. I have also heard some say that the struggle to correct past errors defines us.

The one question I would love to ask of those adults that continue to hate the ‘absolute morons’ that are their parents is, “Why is it so important to you that they be wrong?”

“I’m smarter than my dad,” writes a twenty-something blogger. “I really wish I wasn’t. It’s like finding out Santa isn’t real.” 

This isn’t an exact quote, but it is a summary of her snark. The blogger goes on to discuss how her intelligence, and cultural sensitivity, are a cross she must now bear in her discussions with her parents. She never states that she hates her parents. She states that she, in fact, loves them a great deal, but she characterizes her love with an element of pity, bordering on condescension, that appears to be endemic among twenty-somethings.

Some people hate their parents in their teens, and well into their twenties. The teen years are a period of cultivation, containing rebellion, learning, etc., that occurs before our minds fully form. As we age, our mind matures, and so does our rebellion, until it hardens into either full-fledged hatred or a condescending pity that recognizes their backwater modes of thought for what they are. This matured rebellion is also based on the fact that our parents still have some authority over us, which reminds us of those days when our parents had total authority over us, and how they “abused it to proselytize their closed-minded beliefs on us.”

When we finally reach that point where they’re no longer helping us pay for tuition, a car, or rent, and we’re able to flex some independent muscles, we spend the next couple of years fortifying this notion that they were wrong, all wrong, all along.

By the time we progress to our thirties, circumstances reveal to us the logic and wisdom our parents attempted to pass down, and that it does apply in some situations. (Some will never admit this. Some remain stuck at the peak of their rebellion.) Their advice may not have applied in all circumstances, but it may have applied in enough of them to force us to remove the ‘bumbling fool’ label. By the time we reach our forties, we begin to think that they’re idiots all over again.

I wrote the last line to complete a joke I heard elsewhere. It’s a funny line, because there is an element of truth in it, but in my experience the truth lies somewhere in the middle. The truth is a hybrid of the forty-something recognition of our parents’ failings, their solid points, and the respect we have for them now that we’re so independent of their authority that we begin to view them as fellow adults that were trying to lead their children down a path they thought was most conducive to success in life.

This specific timeline may not apply to everyone, as we all go through these stages on our own time, and the word ‘hate’ may be too stark a term for the adults still experiencing some animosity toward their parents, but anyone that has been on this roller coaster knows it is one hell of an emotional ride.

The key word in the timeline, for me, is ‘circumstances’. For until I reached my thirties, theory formed the foundation of my uninformed rebellion, and circumstances revealed to me that some of the “archaic and antiquated” advice my dad offered me had some merit.

These circumstances might include having children and protecting the sanctity of their childhood, in the same manner our parents attempted to protect ours. As evidence of this, I thought my dad committed an error, in some ways, by allowing me to lead a sheltered existence, until some know-it-all suggested that this might mean he did his job. “How so?” I asked, in a pseudo-confrontational manner that suggested Mr. Know-it-all knew nothing about my rearing. “By allowing your childhood to last as long as possible,” he said.

It might include circumstances we experience in the workplace, and all the ways in which we have learned to get along with our co-workers and appease our boss, and it might include the general experiences our parents specifically warned us could occur. The instinct we have is to believe that when they proved correct, it was a coincidence, for their warnings proved so prescient that the bumbling fools could’ve never known how right they were.

It’s not debatable to me that I was right about some of the things I planted a flag in, but I came to understand that my dad had lived a rich, full life by the time he became my mentor, and that some of my impulsive, theoretical thoughts about the world were, in fact, wrong. (Even after gaining some objectivity on my relationship with my dad, it still pains me to write that line.)

Having my own job, my own money, and my own car did a great deal to provide me the independence I desired, but I wanted more. Having my own home, and friends, and a life completely independent of my dad’s influence gained me even more, but it wasn’t enough.

I wanted to be free of those figurative shackles that being my dad’s child suggested. Every piece of information I received about history, the culture, and the world was exciting, and new, and mine, because it stood in stark contrast to everything my dad taught. The information I received that confirmed my dad’s wisdom bored me so much I dismissed it. The new age information I received coincided with everything I wanted to believe about the “new” world my dad knew nothing about, and it confirmed my personal biases.

When I was a twenty-something, I didn’t ask myself the question I posed to the blogger, about why I needed this to be so. I probably would not have had much of an answer, even if I had searched for it. I probably would have said something along the lines of, “Why is it so important to him that he clings to that age-old, traditional mode of thought?”

This redirect would not have been an attempt at deception or evasiveness. I just did not have the awareness necessary to answer such a question. Moreover, as a twenty-something, new age thinker, I was rarely called upon to establish my bona fides. All parties concerned considered me the righteous rebel, and the old guard was, by tradition, the party on trial. They often felt compelled to answer my questions, as opposed to forcing me to define my rebellion. The beautiful thing about this whole setup was that it allowed me to avoid answering questions, even of myself.

My twenty-something definition of intelligence relied on emotion, theory, and very little in the way of facts. I thought they were facts, however, and I had the evidence to back them up. I thought I was intelligent, more intelligent, but what is intelligence? It depends on whom you ask.

In Abraham Lincoln’s day, the ability to reference Shakespeare and The Bible in any given situation defined one’s intelligence level; to another generation it was the ability to quote Friends and Seinfeld, and to know the IMDb list of Bruce Willis appearances; and to the next generation it was something about Kim Kardashian and Lady Gaga. (I concede that the latter may be an epic fail on my part.)

My dad knew nothing of Seinfeld, or Bruce Willis, so he knew nothing as far as I was concerned. He knew nothing of computers, or devices, and he had just been introduced to gold records (They were CDs! LOL! Gold records?) shortly before his death. This lack of knowledge about pop culture and innovation transcended all matters, as far as I was concerned. I believed my dad was a traditionalist trapped in 1950’s modes of thought, and that he could’ve never survived in our current, more sensitive culture. He was a backwater hick, and whatever other names are applied to a man trapped in a time warp of the 1960’s, maybe the 70’s, but definitely not the 90’s and 00’s, much less the twenty-teens.

In the Workplace  

Much to my shock, I began quoting my dad when I was fully ensconced in the workplace, in my thirties:

“Everyone has a boss,” and “You can learn everything there is to know about the world from books, but the two words most conducive to success in life are going to be either ‘Yes sir!’ or ‘No sir.’”

I loathed these words for much of my young life, as they implied that even after escaping my dad’s management of my life –a level of authority that turned out to be far more macro than I ever considered possible– a portion of my life would always be managed. I would learn the difference between his level of macro and my boss’s definition of macro (hint: micro) when I was out on my own, and out from under his “totalitarian” thumb. I would also learn that others’ moods would dictate whether my day would be a good one or a bad one, in the same manner my life had been under my dad’s roof, only tenfold.

He based his advice on his own experience in the workplace, but that experience occurred in an era that required reverence toward a boss. Thanks to the new age ideas of boards and panels conducting arbitration cases for those that have been fired, and the various wrongful termination lawsuits, and threats thereof, that gave life to the Human Resources department, reverence was no longer as mandatory in my era.

I would also learn that my newfound freedom would contain a whole slew of asterisks that included the fact that no matter how much free time I had, I would spend a great portion of my life in a workplace, under the watchful eye of authority, compromising my definition of freedom every step of the way.

Throughout the course of my life, I’ve also met those that “never went through any of this.” If you find this as impossible to believe as I did, all I can tell you is I’ve met them. They say rational things like, “I never thought my parents were perfect, but I know that they always tried to steer me into what they believed to be the right course.”

After picking myself up off the floor from laughter, believing that I was on the receiving end of a comedic bit, I realized they were serious. The fact that their upbringing was so much healthier than mine caused me to envy them in some ways, but after chewing on that for some years I realized that all of the tumult I experienced, self-inflicted and otherwise, defined my character and my current, individual definition of independence.

We are our parents’ children, and at times, we feel trapped by this. As a result, we focus on the differences. We may mention some similarities, but we do the latter in a manner that we believe is ‘understood’ by all parties concerned. Even when we reach that point in life, somewhere in our thirties and forties, when we begin to embrace some elements of that trap, we cling to the idea that we’re so different. The answers as to why these dichotomies exist within us are as confusing to us as the fact that they are a fait accompli.

When immersed in the tumult of the younger brain, trying to make some sense of our world, we may fantasize, at times, about what it would be like to have other parents. Our friends’ parents seem so normal by comparison. We think most of our problems could be resolved if we had their parents, or any parents that are more normal. We may even have fantasized about what it might be like to be free of all patriarchal influence. We consider how liberating it could be, in some ways, to be an orphan, until we recognize how confusing it must also be. People without parents don’t have a framework, or a familiar foundation, from which to rebel, and when we consider this, we realize that our identity is wrapped up in that push and pull of acquiescence and rebellion toward our parents.

While there is some acknowledgement of the ‘the more things change, the more they stay the same’ dictum when we receive advice from our parents, our rebellion operates under the “It was the best of times, it was the worst of times” principle when we process that advice. When we reach an age where it dawns on us that knowledge of innovations and pop culture is as superfluous as it is, that removes a substantial plank of our rebellion, until politics takes its place. We then sit down at their dinner table to resolve the political and geopolitical problems of the day, for our nation, in a manner we deem substantial. It fires us up. We deliver nuke after nuke, until we realize that the effort to persuade our parents is futile, and that steeped in this effort was our juvenile, sometimes snarky need to prove them wrong.

While a more substantial plank than pop culture, political discussion can be just as silly for us as it was for our parents when they discussed such issues at their parents’ dinner table, when they considered their parents to be bumbling idiots that offered nothing new to the discussion and stubbornly resisted the winds of cultural change. The one import that they may have taken from those discussions with their parents, as we will with ours, over time, is that the more things change, the more they stay the same, and human nature doesn’t change as much as we may believe it does with innovations, cultural advancements, and social awareness. A kiss is still a kiss, a boss is still a boss, and the fundamental things still apply, as time goes by.

 

Find Your Own Truth


“Find your own truth,” was the advice author Ray Bradbury provided an aspiring, young writer on a radio call-in show.

Most people loathe vague advice. We want answers, we want that perfect answer that helps us over the bridge, and a super-secret part of us wants those answers to be easy, but another part of us knows that you get what you pay for in that regard. When we listen to a radio show featuring a master craftsman, however, we want some nugget of information that will explain how that man happened to carve out a niche in the overpopulated world of his craft. We want tidbits, words of wisdom about design, and/or habits that we can imitate and emulate, until we reach a point where we don’t have to feel so alone in our structure. Vague advice, and vague platitudes, feel like a waste of our time, especially when that advice comes so close to a personal core and stops.

Bradbury went on to define this relative vision of “the truth” as he saw it, but his definition didn’t step much beyond that precipice. I had already tuned him out by the time he began speaking of other matters, and I eventually changed the station. I may have missed some great advice, but I was frustrated.

If the reader is anything like me, they went back to doing what they were doing soon after hearing the advice, but the mark of deep, profound advice is that it starts popping up in the course of whatever a person does. It begins to apply so often that we begin chewing on it, and digesting it. Others may continue to find this vague advice about a truth to be nothing more than waste matter –to bring this analogy to its biological conclusion– but it begins to infiltrate everything an eager student does. If the advice is pertinent, the recipient begins spotting truths that should’ve been so obvious before, and they begin to see that what they thought was the truth –because it is for everyone else– is not as true for them as they once thought.

Vague advice may have no import to those that don’t bump up against the precipice, and for them a platitude such as, “Find your own truth” may have an of course suffix attached to it. “Of course an artist needs to find their own truth when approaching an artistic project,” they may say. “Isn’t that the very definition of art?” It is, but go ahead and ask an artist if the project they are currently working on is any closer to their truth than the past pieces they attempted. Then, once they’ve completed that project, go ahead and ask them if they’re any closer to their truth. The interrogator is likely to receive a revelation of the artist’s frustration in one form or another, as most art involves the pursuit of a truth coupled with an inability to capture it to the artist’s satisfaction. Yet, it could be said that the pursuit of artistic truth, and the frustration of never achieving it, may provide more fuel to the artist than an actual, final, arrived upon truth ever could.

Finding your truth, as I see it, involves intensive knowledge of the rules of a craft, locating the parameters of the artist’s ability, finding their formula within, and whittling. Any individual that has ever attempted to create art has started with a master’s template in mind. The aspiring, young artist tries to imitate and emulate that master design, and they wonder what the master of that design might do in moments of artistic turmoil. Can I do this, what would they do, should I do that, and is my truth nestled somewhere inside all of that, awaiting further exploration? At a further point in the process, the artist is hit by other truths, truths that contradict prior truths, and this begins to happen so often that everything the artist once believed to be a truth becomes an absolute falsehood, and this is where the whittling comes in.

In a manner similar to the whittler whittling away at a stick to create form, the storyteller is always whittling. He’s whittling when he writes. He’s whittling when he reads. He’s whittling in a movie theater, spotting subplots and subtext that no one else sees. He’s whittling away at others’ stories, down to what he believes to be the core of the story, a core that the author of the piece may not even see. Is he correct? It doesn’t matter, because he doesn’t believe that the author’s representation of the truth is a truth.

Once the artist has learned all the rules, defined the parameters, and found his own formula within a study of a master’s template, and all the templates that contradict that master template, it is time for him to branch out and find his own truth.

The Narrative Essay

Even while scouring the RIYL (read if you like) links provided at the bottom of the webpages of books I’ve enjoyed, I knew that the narrative essay existed. Just as I’ve always known that the strawberry existed, I knew about the form some call memoir, and others call creative non-fiction. The question I have is this: have you ever tasted a strawberry that caused you to flirt with the idea of eating nothing but strawberries for the rest of your life? If you have, I’m going to guess that it had more to do with your diet than with the actual taste of that strawberry. A person may go long stretches of time carelessly ignoring the nutrients that this gorgeous, little heart-shaped berry has in abundance. They may suffer from a vitamin C depletion, for example, in ways that were not apparent to them, until they took that first bite of this gorgeous, little heart-shaped berry.

That first bite caused a person inexplicable feelings of euphoria that they didn’t understand, until they learned of the chemicals of the brain, and the manner in which the brain rewards a person for fulfilling a biological need. The only thing they may have known at the time was that the strawberry tasted so glorious that they stood at the strawberry section of a buffet line gorging on strawberries while everyone behind them waited for them to start moving.

I am sure, at this point, that the reader would love to learn the title of that one gorgeous, little narrative essay that caused my feelings of creative euphoria. The only answer I can give is that if they’re suffering a depletion, one essay will not remedy it any more than one strawberry can. One narrative essay did not provide a eureka-style epiphany that led me to an understanding of all of the creative avenues worthy of exploration. One essay did not quench the ache my idea-depleted mind endured within the more traditional parameters, with the time-tested formulas and notions I had of the world of storytelling. I just knew that I needed more, and I read all the narrative essays I could find, with an effort equivalent to the one I put into exploring the maximum benefits the strawberry could provide, until a grocery store checker proclaimed that she had never witnessed one man purchasing so many strawberries at one time. She even called a fellow employee over to witness the spectacle I had laid out on her conveyor belt. The unspoken critique being that no wife would permit a man to purchase this many strawberries at once, so I must be single and self-indulgent.

An unprecedented number of strawberries didn’t provide me an unprecedented amount of euphoria, of course, as the brain appears to provide near-euphoric chemical rewards only for satisfying a severe depletion, but the chemical rewards my brain offered me for finding my own truth, in the narrative essay format, have proven almost endless. As have the rewards I’ve experienced reading others reach their creative peaks. As I’ve written, I knew narrative essays existed, but I considered most of them to be dry, personal essays that attempted to describe the cute, funny things that happened to their authors on their way to forty. I never thought of them as a vehicle for the exploration of unique creativity, until I found those authors that had.

It is difficult to describe an epiphany to a person that’s never had one. Even to those that have, I would say that the variables within an epiphany are so unique that they can be difficult to describe to a listener with an “of course” face on. I could inform them that, more often than not, an epiphany does not involve the single most unique thought ever considered, but a commonplace “of course” thought that the recipient has to arrive at of their own accord. When that doesn’t make a dent in their “of course” face, we can only concede that epiphanies are personal.

For me, the narrative essay was an avenue to the truth that my mind craved, and I may never have ventured down this path had Ray Bradbury’s vague four words failed to register. For those that stubbornly maintain their “of course” faces in the shadow of the maxim the late, great Ray Bradbury inscribed in the minds of all those that heard it, I offer another vague piece of advice, one the late, great Rodney Dangerfield offered to an aspiring, young comedian:

“You’ll figure it out.”

If a vague piece of advice, such as these two nuggets, appears so obvious that it’s hardly worth saying, or the recipient of such advice can’t understand how it might apply, no matter how often they think about it, do it, attempt to add to it, or whittle away at it to find a core worthy of exploration, then, I would add, you’ll either figure it out, or you won’t.

To worry, or too worried?


Nestled within the quest to be free and to experience life through the portal of YOLO (You Only Live Once), or FOMO (Fear of Missing Out), lies a fear, concern, and worry that we might be too free. This worry is born, if the thesis of Francis O’Gorman’s book holds, from a need to be led.

It may seem illogical to argue that we’re too free, in light of the technological and governmental advances that have led us to believe every move we make, and every thought we have, is monitored, infringed upon, and legislated against. Francis O’Gorman’s Worrying: A Literary and Cultural History is not a study of freedom, however, but one of the common man worrying about how the people, places, and things around him are affected by that freedom. Mr. O’Gorman makes this case, in part, by studying the literature of the day, and the themes of that literature. He also marks it with the appearance, and eventual proliferation, of self-help guides, to suggest that this greater sense of concern, or worry, led readers of another era to reward writers that provided them more intimate, more direct answers. This study leads Mr. O’Gorman to the conclusion that this general sense of worry is a relatively new phenomenon, as compared to even our recent ancestral history.

One fascinating concept Mr. O’Gorman introduces to this idea is that the general sense of worry appears to have a direct relation to the secularization of a culture. As we move further and further away from religious philosophies toward more individualistic ones, we may feel freer to do what we want to do, but we are also more worried about our susceptibility to the consequences of unchecked, mortal decision making. We humans have an almost inherent need to be led.

How often does a secular person replace religion with politics? Politics, at its core, is leadership, and in our dining room table discussions of politics, most of the conversation revolves around why one person is capable of leading our locale, our state, and our nation. It involves why one person’s idea of leadership may be inept, while another’s –one that abides by our principles– is more capable. As much as those adults that believe themselves fully capable of living without leadership would hate to admit it, all political thought revolves around the desire to be led.

Reading through the various histories of man, we have learned that our ancestors had more of a guiding principle, as provided by The Bible. The general theory, among those that preach the tenets of The Bible, is that man’s mental stability, and happiness, can be defined in direct correlation to his willingness to subordinate his will to God’s wishes. God gave us free will, they will further, but in doing so He also gave us guiding principles that would lead us to a path of righteousness and ultimate happiness.

If a man has a poor harvest –an agrarian analogy most preachers use to describe the whole of a man’s life– it is a commentary on how this man lived. The solution they provide is that the man needs to clean up his act and live in a Godlier manner. At this point in the description, the typical secular characterization of the devoutly religious comes to the fore, and their agreed upon truth has it that these people are unhappier because they are unwilling to try new things, and puritanical in a sense that leads them to be less free. The modern, more secularized man, as defined by the inverse characterization, has escaped such moral trappings, and he is freer, happier, and more willing to accept new ideas and try new things. If the latter is the case, why is he so worried?

We’ve all heard snide secularists say that they wish they could set aside their minds and just believe in organized religion, or, as they say, a man in the sky. It would be much easier, they say, to simply set their intelligence aside and believe. What they’re also saying, if Mr. O’Gorman’s thesis can be applied to them, is that it would give them some solace to believe that everything was in God’s hands, so that they wouldn’t have to worry all the time.

Like the child that rebels against authority, but craves the guidance that authority provides, the modern, enlightened man appears to reject the idea of an ultimate authority while secretly craving many of its tenets at the same time.  A part of them, like the child, craves the condemnation of immorality, a reason to live morally, and for some greater focus in general.  The randomness of the universe appears to be their concern.

One other cause for concern –that is not discussed in Mr. O’Gorman’s book– is that the modern man may have less to worry about.   If social commentators are to be believed, Americans have never been more prosperous:

“(The) poorest fifth of Americans are now 17 percent richer than they were in 1967,” according to the U.S. Census Bureau.

They also suggest that crime statistics are down, and that teenage pregnancy, drinking, and experimental drug use by young people are all down as well. If that’s the case, then we have less to worry about than we did even fifteen years ago. It’s a concern. It’s a concern in the same manner that a parent is most concerned when a child is at its quietest. It’s the calm before the storm.

Francis O’Gorman writes that the advent of this general sense of worry occurred in the wake of World War I. Historians may give these worriers some points for being prescient about the largely intangible turmoil that occurred in the world after the Great War, but World War I ended in 1918 and World War II didn’t begin until 1939, a gap of twenty-one years of people worrying about the silence and calm that precedes a storm. This may have propelled future generations into a greater sense of worry, after listening to their parents’ concerns over a generation, only to have those concerns proved right.

The idea that we worry about whether too much freedom –freedom from the guidelines and borders that religion, or God, can provide– can be had without consequences is taken up by The New Republic writer Josephine Livingstone in her review of Francis O’Gorman’s book:

“The political concept of freedom gets inside our heads.  It is a social principle, but it structures our interiority.  This liberty worries us; it extends to the realm of culture too, touching the arts as much as it touches the individual human heart and mind.

“In this way, O’Gorman joins the tide of humanities scholars linking their discipline with the history of emotion, sensory experience, and illness. It’s an approach to culture most interested in human interiority and the heuristics that govern the interpretation of experience: Happiness can be studied; sound can be thought through; feeling can be data.”

Ms. Livingstone furthers her contention by writing that the human mind can achieve worry-free independence, in a secular society, by studying select stories, from select authors:

“Worrying also fits into the tradition of breaking down myths and tropes into discrete units, a bit like Mircea Eliade’s Myth and Reality or C. S. Lewis’ Studies in Words. We care about these books because we need stories about the cultural past so that we might have a sense of ourselves in time. The real value of O’Gorman’s book lies, I think, in the way it flags the politics of the stories we tell ourselves. In its attribution of emotional drives to the ideas behind modernist culture and neoliberal politics alike, Worrying shows that their architects –writers, mostly– are as much victims of emotion as masters of thought. If we can see the emotional impulses behind our definitions of rationality, liberty, and literary craftsmanship, we can understand our own moment in cultural time more accurately and more fairly: Perhaps we can become our own gods, after all.”

One contradiction –not covered in the O’Gorman book, or the Livingstone review– is the trope that religious people are miserable in their constraints. This is ostensibly based on the premise that they fear the wrath of God so much that they’re afraid to live the life that the secular man does. Yet O’Gorman suggests that religious people tend to worry less, because they follow the guidelines laid out in The Bible, and they place their destiny, and fate, in the hands of God. The import of this is that for religious minds, the universe is less random. Ms. Livingstone’s review basically says that the secular life doesn’t have to be so random, and it doesn’t have to cause such concern. She states that if we study happiness as if it were an algorithm of either physical or aural data points, and incrementally form our thoughts around these findings, we can achieve happiness. She also states that through reading literature we can discover our own master plan, through the authors’ mastery of emotions, thoughts, and ideas.

On the latter point, I would stress –in a manner Ms. Livingstone doesn’t– that if you want to lead a secular life, there are ways to do so and still be worry free. The key words being if you want to. If you’re on the fence, however, a religious person could argue that all of the characteristics Ms. Livingstone uses to describe the virtues of the stories and the authors she considers masters of thought could also be applied to the stories, and writers, of The Bible, and the many other religious books. If her goal, in other words, is to preach to her choir, she makes an interesting, if somewhat flawed, case. (I’m not sure how a living, breathing human being could study a data sheet on happiness and achieve such a complicated and relative emotion.) If her goal, on the other hand, is to persuade a fence sitter that secularism is the method to becoming your own god, this reader doesn’t think she made a persuasive case.

An Intellectual Exercise in Exercising the Intellect


“There are no absolutes,” a friend of mine said in counterargument. My snap response was to counter her counter with one of a number of witty replies I had built up over the years for this statement. I decided, instead, to remain on topic, undeterred by her attempt to muddle the issue at hand, because I believe that this whole philosophy has, for most people, been whittled down to a counterargument tactic.

Whenever I hear the “No Absolutes” argument, I think of the initial stages of antimatter production. In order to get protons or electrons moving fast enough, a physicist needs to use a particle accelerator. The acceleration of these particles occurs in a magnetic tube that leads them to a target, upon which they are smashed to produce the final product: antiparticles, otherwise known as antimatter. The process is a lot more intricate and complex than that, but for the purpose of this discussion this simplified description can be used as an analogy for the “There are No Absolutes” argument, which is often introduced in an echo chamber of like-minded thinkers, until it is smashed upon a specific subject, and the subject matter at hand is then annihilated in a manner that produces intellectual antimatter in the minds of all parties concerned.

The “No Absolutes” argument is based on the post-structuralist idea that because we process, or experience, reality through language –and language can be declared unstable, inconsistent, and relative– nothing that is said, learned, or known can be said to be 100% true.

This degree of logic could be the reason that a number of philosophers have spent so much time studying what rational adults would consider “Of Course!” truths. One such example is the idea of presentism. Presentism, as presented by the philosopher John McTaggart Ellis McTaggart, could also be termed the philosophy of time. The central core of this thesis has it that the present is the lone timeframe that exists, and that the past and the future do not exist alongside it. The past has happened, he states, and the future will happen, but they do not exist in the sense that the present does. This philosophy is regarded in some circles (to the present day!) as so insightful that it is included in some compilations of brilliant philosophical ideas.

Anyone familiar with McTaggart’s philosophy can read through the description of the man’s theory a number of times without grasping what questions he was answering. His description of time is so elementary that the reader wonders more about the audience that needed it explained to them than about the philosophy of Mr. McTaggart. Was McTaggart arguing against the linguists’ attempts to muddle the use of language, or was he attempting to argue for the reinforcement of agreed upon truths? Regardless, the scientific community had problems with McTaggart’s statement, as depicted by one unnamed essayist:

“If the present is a point (in time) it has no existence; however, if it is thicker than a point then it is spread through time and must have a past and future and consequently can’t be classed as purely the present. The present is immeasurable and indescribable” because it is, we readers can only assume, too finite to be called a point.

Those that want to dig deep into the physicist’s definition of time, a camp to which this unnamed essayist seems to belong, will find that time is a measurement that humans have invented to aid them in their day-to-day lives, and that the essence of time cannot be measured. Time is not linear, and it cannot be seen, felt, or heard. They will argue that there is nothing even close to an absolute truth regarding time. Setting aside the physicists’ definition of time, however, humans do have an agreed upon truth of time, one that McTaggart appeared to want to bolster through elementary statements in order to thwart the confusion that sociolinguists with a political orientation introduced to susceptible minds.

There’s nothing wrong with a man of science, or math, challenging our notions, perceptions, and agreed upon truths. Some of these challenges are fascinating, intoxicating, and provocative, but some have taken them to another level, a “No Absolutes” level of challenging our belief system that has damaged our discourse, our sense of self, our free will, and a philosophy we have built on facts and agreed upon truths, in a manner that may lead some to question whether everything they believe in is a house of cards that can be blown over by even the most subtle winds of variance.

There was a time when I believed that most of the self-referential, circuitous gimmicks of sociolinguistics –those that ask you to question everything you and I hold dear– were little more than an intellectual exercise that professors offered their students to get them using their minds in a variety of ways. When I questioned the value of the subject of Geometry, my high school teacher informed me: “It is possible that you may never use any aspect of Geometry ever again, but in the course of your life you’ll be called upon to use your brain in ways you cannot now imagine. Geometry could be called a training ground for those times when others will shake you out of your comfort zone and require a mode of thinking that you may have never considered before, or may never use again.” This teacher’s sound logic left me vulnerable to the post-structuralist “No Absolutes” philosophy professors I would encounter in college. I had no idea what they were talking about, I saw no value in their lectures, and the ideas I was being introduced to, such as the nihilistic ideas of Nietzsche, always seemed to end up in the same monotonous place, but I thought their courses were an exercise in using my brain in ways I otherwise wouldn’t.

Thus, when I first began hearing purveyors of the “No Absolutes” argument use it in everyday life, for the purpose of answering questions of reality, I wanted to inform them that this line of thought was just an intellectual exercise reserved for theoretical venues, like a classroom.  It, like Geometry, had little-to-no place in the real world.  I wanted to inform them that the “No Absolutes” form of logic wasn’t a search for truth, so much as it was a counterargument tactic to nullify truths, or an intellectual exercise devoted to exercising your intellect.  It is an excellent method of expanding your mind in dynamic ways, and for fortifying your thoughts, but if you’re introducing this concept to me as evidence for how you plan on answering real questions in life, I think you’re going to find it an exercise in futility over time.

Even when a debate between two truth seekers ends in the amicable agreement that neither party can sway the other to their truth, the art of pursuing the truth seems to me a worthwhile pursuit. What would be the point of contention for two “No Absolutes” intellectuals engaging in a debate? Would the crux of their argument focus on pursuing the other’s degree of error, or their own relative definition of truth? If they pursued the latter, they would have to be careful not to proclaim their truths to be too true, for fear of being knocked back to the “There are No Absolutes,” “Go back to the beginning” square. Or would their argument be based on percentages: “I know there are no absolutes, but my truth is true 67% of the time, while yours is true a mere 53% of the time”? Or would they argue that their pursuit of the truth is less true than their opponent’s, to thereby portray themselves as true “No Absolutes” nihilists?

Some may argue that one of the most vital components of proving a theoretical truth in science is the attempt to disprove it, and others might argue that this is the greatest virtue of the “No Absolutes” argument. While we cannot dismiss this as a premise, purveyors of this line of thought appear to use it as nothing more than a counterargument to further the premise that neither party is correct. Minds that appear most confused by the facts find some relief in the idea that this argument allows them to introduce confusion to those minds that aren’t. Those that are confused by meaning, or intimidated by those that have a unique take on meaning, may also find some comfort in furthering the notion that life has no meaning, and nothing matters. They may also enjoy informing the informed that a more complete grasp on meaning requires a firmer grasp on the totality of meaninglessness. The question I’ve always had, when encountering a mind that has embraced the “No Absolutes” philosophy, is this: are they pursuing a level of intelligence I’m not capable of attaining, or are they pursuing the appearance of it?

Shame, Shame, Shame!


“You should be ashamed of yourself,” is a line all of us have heard at one time or another in our lives. The words have a powerful effect, no matter how much we hate to admit it. When said with a dash of harshness –one not harsh enough to provoke rebellion– these words can break us down, make us feel foolish, bad, and ashamed. Whether we are guilty or not, they can also touch such a sensitive core that we feel like children again, being scolded by our grandmother. We don’t like feeling this way, no one does, and we all know this when we use it on others.

Reveal to Justine Sacco, Lindsey Stone, Jonah Lehrer, and the cast of others that have recently experienced worldwide shaming, via the internet, the basic plot of the 1973 version of The Wicker Man, and how it involves (spoiler alert) villagers sacrificing a man, by burning him alive, to provide for the coming harvest, and they may tell you that they would not be able to sit through such a movie. The correlation may not be perfect, but if you replace the harvest with social order, and couple it with the proverbial act of condemning someone for the purpose of advancing that social order, those that have been treated as sacrifices by social media may experience such a wicked case of déjà vu that they physically shudder during the final scenes of that movie.

One of the first images that comes to mind when one hears about a group sacrificing a human for the common good is this Wicker Man image of a relatively primitive culture sacrificing one of their own to appease their gods or nature. We think of people dressed like pilgrims, we think of chanting, mind control, and individuals being shamed by the shameless. We think of arcane and severe moral codes, and the extreme manner in which they handled those that strayed from the collective ideal.

Members of those cultures might still stand by the idea that some of these ritualistic practices were necessary. They might concede that the whole idea of sacrificing humans for the purpose of yielding a better harvest was ill-conceived, especially if they were being grilled by a lawyer on their agricultural records, but burning people at the stake, hangings, and putting people in stocks, they might say, were punishments they provided to the truly guilty. And these were necessary, they might argue, to keep their relatively fragile communities in line. They might argue that such over-the-top displays of punishment were necessary to burn into the mind images of what could happen to those tempted to stray from the moral path. They might suggest that, because our law enforcement is so much more comprehensive nowadays, we cannot understand the omnipresent fear they had of chaos looming around the corner, and that shame and over-the-top punishments were the only measures they could conceive of to keep it at bay.

We may never cede these finer points to them, in light of the punishments they exacted, but as evidenced by the cases of the individuals listed in the second paragraph, the need for symbolic, town hall-style shaming has not died. Our punishments may no longer involve a literal sacrifice, as they did in that bygone era, but the need to shame an emblematic figure remains for those of us that feel a call to order justifies doing whatever it takes to keep total chaos at bay.

The conundrum we experience when trying to identify with how our ancestors acted is easier to grasp when we convince ourselves that these actions were limited to the leadership of those communities. We can still attribute such acts to a suspect politician, an inept town council, and a couple of corrupt and immoral judges, but when we learn that most of the villagers involved themselves in the group’s agreed upon extremes, we can only shake our heads in dismay.

Writers from that era, and beyond, describe the bloodlust that occurred among the spectators in the form of calls for someone’s head, and the celebratory shouts of “Huzzah!” that occurred immediately after the guillotine exacted its bloody form of justice on the alleged perpetrator. How could they all cheer this on? How could so many people be so inhumane?

Some would argue that the very idea that we read history from a distance –believing that the human being has advanced so far beyond such archaic practices that it’s tough for us now to grasp their motivations– while engaging in similar, but different behaviors, is what makes the study of group thought so fascinating.

In his promotional interview with Salon.com, for the book So You’ve Been Publicly Shamed, author Jon Ronson details the Twitter treatment he wrote about in that book, directed at a publicist named Justine Sacco. Justine Sacco took an eleven-hour plane trip to Africa. Before boarding the plane, she sent out a number of tweets to her 170 Twitter followers. Among those tweets was a now infamous one:

“Going to Africa.  Hope I don’t get AIDS.  Just kidding.  I’m White!”

No matter how one chooses to characterize this tweet, it’s tough to say that it’s the most inflammatory tweet ever put out on Twitter. For varying reasons, millions of people latched onto this statement and took this relatively unknown tweeter from 170 followers to the number one worldwide trend on Twitter, all while Ms. Sacco remained oblivious, in the air, en route to Africa. She received everything from death threats to people wishing that she would get AIDS as retribution for her heartlessness, while varying degrees of near-lustful excitement mounted among the villagers gathering around the intangible town square, imagining the look on her face when the lowering, technological guillotine finally became apparent to her upon landing, so they could all shout “Huzzah!” in unison.

“I’m dying to go home but everybody at this bar is so into #hasjustinelandedyet. Can’t leave til it’s over,” was a tweet Mr. Ronson found soon after the publication of his book to illustrate the excitement that had been building among those that couldn’t wait for Ms. Sacco to land and discover that the life she lived prior to that tweet was now over. 

Shaming in the Modern Era

Before purchasing Ronson’s So You’ve Been Publicly Shamed, one might be tempted to think that it is little more than a detailed list of those, like Ms. Sacco, that have committed purported transgressions. The fact that it is not is illustrated by the decision Mr. Ronson made to focus on incidents that would’ve been considered inconsequential were it not for the varying reactions observers had to them.

Ms. Sacco, for example, wasn’t implying that she hoped more black people would contract AIDS, or that she hoped the virus would continue to attack black people almost exclusively. One could say, reading her tweet literally, that she may have intended to speak out against an infection that appears racially biased. Perhaps it is the confusion regarding who, exactly, Ms. Sacco was condemning that led so many to fill in the blanks for their own purposes. Whatever the case, they did fill in those blanks, and the pack mentality framed that single tweet in a manner that encouraged tweeters, 24-7 news programs, and all of the other venues around the world to heap scorn and shame on her, leaving no observer with the belief that shaming is dead.

It could also be guessed that Ms. Sacco was attempting to provide her followers with poignant humor: an attempt to garner empathy for sufferers of a disease that appears unnaturally selective, and perhaps to spearhead some form of awareness among her 170 Twitter followers, without sufficient regard for how the tweet could be misinterpreted by those who would choose to misinterpret it. Those who responded on Twitter appeared to relish the opportunity not only to champion a cause, for greater definition among their peers, but to technologically burn whomever they had to burn to get there.

And while we can only guess that most of the offended knew Ms. Sacco wasn’t intentionally infringing on their ideological issue, the opportunities to prove one’s bona fides on an issue don’t come along very often, and when they do, they’re often limited to coffee shop and office water cooler conversations with two to four people. Those two to four people are often forced to soft-pedal their outrage, because they will have to work around, or otherwise be around, the target of their condemnation in the aftermath. Ms. Sacco, on the other hand, was an intangible victim that most of those in the intangible town square would never meet, so they didn’t have to worry about her feelings, and her tweet provided them the perfect venue to establish their bona fides on a worldwide stage.

“If we were in one of those Salem town squares witnessing a witch burning,” one of Ms. Sacco’s Twitter shamers might argue, “we would be shouting at the throng gathered around the witch, calling for them to be burned, and not the alleged witch. We’re not shaming with the sort of moral certitude of those people of a bygone era; we’re shaming the shamers here. It’s different!” They might also argue that their goal, in shaming the Justine Saccos of the world, is not only to redirect shame back onto the shamers, but to effectively eradicate the whole practice of shaming … unless it’s directed at those who continue to shame others.

On this revised act of shaming, Jon Ronson’s Salon.com interviewer, Laura Miller, provided the following summation:

“If you are a journalist or a commentator on Twitter or even just aspiring to that role, you have to build this fortress of ideology.  You have to get it exactly right, and when you do it becomes a hammer you can use against your rivals.  If you even admit that you could have possibly been wrong, that undermines both your armor and your weapon.  It’s not just something you got mad about on social media; it’s your validity as a commentator on society that’s at stake.”

If that’s true, then those who angrily wished death and disease on Justine Sacco weren’t truly angry; they simply felt a need to sound more brutal than anyone who had tweeted before them to establish their bona fides on the issue. They weren’t angrier than any of the previous tweeters, they were just late to the dog pile, and they felt a need to jump harder onto the pile to generate as much impact as those at the bottom had with their initial hits. The question of the target’s guilt, and the severity of that guilt, got lost in the mayhem. Each jumper became progressively more concerned with the impact their hit would make, and how it would define them, until they felt validated by the proverbial screams of the subject at the bottom of the pile.

US

If you’ve reached a point in this conversation where you recognize the different but similar shame tactics employed by primitive and advanced societies, you’ve probably also recognized the correlation, and you’re shaking your head at both parties. In his book, however, author Jon Ronson cautions us against doing so. It’s not about them, the central theme of his book suggests; it’s about you, him, and us. In one interview, he stated that he thought of pounding that point home by simply calling the book “Us”, but that he feared some might infer that he was referring specifically to the United States, or the U.S.

The subjects of shame, and the shamers who exacted their definition of justice upon them, he appears to be saying, are but anecdotal evidence of the greater human need to shame. It’s endemic to the human being, to us, and while the issues may change and evolve, and the roles may reverse over time to adapt to the social mores of the day, the art of shaming remains as prevalent among the modern man as it was during a B.C. stoning.

The elephant in the room that Mr. Ronson did not discuss in his book is the idea that the viciousness the modern-day shamed person experiences may have something to do with the vacuum created by the attempt to eradicate shame from our culture. Our grandmothers taught us this very effective tool, as I wrote. They used it to try to keep us on the straight and narrow, and to keep us from embarrassing ourselves. When we witnessed our childhood friends engaging in the very same behavior that we had been shamed into avoiding, thus displaying the fact that they hadn’t been properly shamed against such behavior, we stepped in to fill the void. We shamed them in the manner our grandmother had, using, as kids often will, the same words our grandmother had used. We then felt better about ourselves in the shadow of their shame.

As adults in a modern, enlightened era, we learned that we are no longer to use the tool of shame. The lessons our grandmother taught us, we’re now told, were either half-right, or so steeped in puritanical, traditional lines of thought that they no longer apply. Ours is an advanced, “do what you feel” generation that prefers to believe there is no right and wrong, unless someone gets hurt. The benefit, we hope, is that if we eradicate judgment and shame from our society, we can also be liberated from them. Yet there is a relative line in the sand beyond which attempting to avoid judgment and shaming will eventually, and incidentally, encourage that very activity.

We all know that this activity will eventually lead to internal decay and rot for the individual, and eventually for the culture, and we know that some judgment and some shaming is necessary to keep the framework intact. A secret part of us knows this, and the need to shame and judge gnaws at us in a manner we may never have known existed, until that perfect, agreed-upon transgression arises. When we finally find someone it’s safe to shame, the act fills that need, and that pressurized need, hidden so far back in the recesses of our minds in a quest to acquiesce to the new ways of thinking, explodes on that person, regardless of their degree of guilt.

Those of us who have learned some of the particulars of the Salem Witch Trials believe that, early on in the situation, there may have been a need for greater order. The fear of chaos probably prompted the townspeople to believe that some of the accused actually were witches, looking to infiltrate their youth with evil. As we all know, it eventually spiraled out of control to the point that people began randomly accusing others of being witches over property disputes and congregational feuds. One can also guess that many accusers leveled their accusations to attain some form of superiority over the accused that they could not attain otherwise. Those citizens of colonial Massachusetts eventually learned their lessons from the entire episode, and some would say their lesson is our lesson, as yet unlearned, as accusations of racism and anti-patriotism are leveled at those who may have been guilty of nothing more than a poorly worded joke or an ill-advised photograph, as in the case of Lindsey Stone. Our era is different, though. The lessons of the self-righteous, puritanical man do not apply today, and we don’t need to know the whole story before we make that leap to a defense of the social order that provides us the characterization we desire in the dog pile, do we?

Deserve vs. Earn


Read the various periodicals on the net, and you’ll find the words earn, deserve, and merit listed as interchangeable. Various periodicals conflate these three words so often that they might be some of the few words, in the usage war of words, that the prescriptive dictionaries (1) and the descriptive dictionaries (2) agree upon. Read those blogs that make a serious attempt at getting the definitions correct, and you’ll find that if the writer does not consider these three words synonyms, they consider them derivatives of one another.

This casual, but curious, observer of language would not go so far as to write that any of those authors are incorrect, but in the lexicon of the common man the words ‘deserve’ and ‘earn’ have grown so far apart as to be almost antonyms. When the office worker speaks of deserving a raise, even those who know the standard measurements of the company would not bring up the word earn, for fear that doing so might spark a confrontation that would forever alter the relationship. When the sports fan speaks of his team deserving a championship, it is only his antagonists who will mention that they haven’t earned it yet, and when the lovelorn and politicians speak of the deserving, it is an emotional appeal that cannot be countered without doing some damage to one’s public perception. In all walks of life, deserve is used to describe that which one is entitled to, as if by birthright, while merit has become the exclusive province of the word earn.

The definition of deserve, in the lexicon of the common man at the proverbial water cooler, has regressed to something succinct: “To have, or display, qualities that should result in one attaining rewards by natural means.” In this sense, deserve has come to be something of an adjective to describe those who should attain, and earn is more a verb to describe the hard work put into attaining a goal. Deserve is also a term used by those who feel they are owed something simply for being a good person, a human, or a human being who is alive.

All philosophical differences aside, this casual, but curious, observer can’t help but think that those who invest full emotions in this idea of being a deserving person, at the expense of earning, set themselves up for failure, heartache, and even diminished mental health when the reality of their circumstances continues to dispel their notions. One would think that, at some point, the confused would take a step back and reexamine the situation with rational objectivity, but for most that’s easier said than done, as it could lead them to believe that they’re a lot less deserving than they once believed.

Love is difficult to calculate by standard measurements, of course, and past behaviors do not dictate future success. As such, no person should ever say that they deserve to be loved, but it’s not something one can earn entirely by merit either. Love, we could say, is a complicated algorithm fraught with failure that begins with simple, intangible superficialities. These superficialities can be as simple as the way a person combs their hair, their scent, the clothes they wear, the way they smile when they see you coming down the aisle at Cracker Barrel, and all of the other, otherwise meaningless intangibles that form superficial attraction. Some could argue that the superficial nature of the early stages of love is nothing more than a crush, but a crush forms the fundamental layer of all that will spring from it. Love gains meaning as it progresses into shared values, complicated ideas, and philosophies, until it evolves from that initial, superficial attraction into the ultimate, comprehensive decision we make about another person that we call love. We earn love every day thereafter by maintaining the conditions that the other party lays out for us, in either overt or implicit ways, to form adult, conditional love.

“You think you should be afforded love simply by being?” I would ask the deserving person. “Do you think that you should be able to walk up to a total stranger on the street and inform them that you are a good person, and therefore deserving of love, and that they should do their civic duty, as a good citizen of the world, and love you? If that’s what you think, you’ll get the type of love you deserve.” 

The point is that claiming to be deserving opens up a whole can of whys for those who are asked to believe it. ‘Why do I deserve?’ should be the first question a person asks themselves, and ‘Why am I more deserving than anyone else?’ should be the next, and all of the answers should culminate in self-evident facts and figures that point back to the definitions of the words ‘merit’ and ‘earn’.

High-minded types would tell their audience that love is nothing more than a complex mixture of chemicals in the brain, and they do so under the theoretical umbrella that a human being is no more complex than, say, the penguin. They would suggest that certain animals, like some penguins, have long-term, monogamous relationships based on decision-making. If our decision-making abilities are no more complex than the penguin’s, and our drive to love and be loved is nothing more than a natural and primal need to procreate, then all humans do deserve to be loved by the primal, prospective mate that senses when we’re in heat. Alternatively, if the human’s senses are so inferior to the penguin’s that prospects can’t tell when we deserve love, we may want to develop a mating call that informs prospective mates when we feel ‘deserving’, to see what comes running down the alley to us.

Most of us prefer to believe that we earn the love we receive on a perpetual basis, a love that is more complex than the penguin’s, and that it progresses based on the variables we introduce to it on a day-to-day basis. If we settle on this primal, penguin definition of love, and we choose to believe that the love we earn should be nonjudgmental, lacking in morals and values, and nothing more than a stick that stirs the chemicals in our brain, the love we get will be as meaningless as the penguin’s, and exactly what we deserve.

  • (1) Prescriptive dictionaries are concerned with the formal, or correct, usage of words.
  • (2) Descriptive dictionaries are concerned with how language is being used in a more casual, less formal manner.