Scat Mask Replica III


1) The Rasputin Paradox. Are you involved in an enterprise in which one person’s alleged ineptitude is holding you back from realizing the vast potential of that enterprise? Is your enterprise one step away from removing that alleged ineptitude? Those who know the history of the Russian Empire know to be careful what they wish for. Some speculate that Grigori Yefimovich Rasputin had far less influence in the Russian Empire (circa WWI) than history details, and they double down by noting that the Romanovs would not refute what others said about Rasputin’s influence, because they enjoyed having Rasputin play the role of the scapegoat. If the Romanovs did not know how much blame others placed on Rasputin while he was alive, they definitely found out after his murder, because once Rasputin was gone, so was the focal point for the Empire’s ineptitude. Those in politics, business, and personal crisis should note that casting blame on one particular person for the failure of an enterprise may prove cathartic in the short term, but once that person is gone, it might reveal more about the general ineptitude of that enterprise than any of the other players ever imagined.

2) “If you have facts on your side, pound the facts. If you have the law on your side, pound the law. If you don’t have either, pound the table.” One of the more uncomfortable situations I’ve experienced involves someone pleading with me to accept them as a genuine person. It’s a gross oversimplification to suggest that anyone who pounds the proverbial table to convince me of something is lying. We’re all insecure about our presentations, and some of us pound the table even when we have the facts on our side. I know it’s easy to say, but those with facts on their side should relax and allow the facts to roll out as they may. The truth teller who finds it difficult to avoid pleading their case should also know that after they reveal enough supportive evidence, most people will believe them, but some just enjoy watching us squirm.

3) Speaking of the genuine article, it has recently come to my attention that some pathetic soul stole at least two of the articles from this site. Some call this plagiarism, but I call it pathetic. If imitation is the sincerest form of flattery, I suppose I should consider it a compliment, but this is outright theft. It seems redundant to clarify the rules on this matter, but if a writer is going to “repost,” they are required to provide attribution. (For those unclear on the definition of this term, it means that a writer is supposed to inform their audience that they didn’t write the article.) Not only did this pathetic soul avoid attributing the article to me, but they also didn’t provide proper attribution for the quotes I used in the article they stole. So, this person (who provides no discernible path back to their identity) anonymously steals posts, presumably to receive checks from companies that pay writers to sport ads on their site. I don’t care how much those sponsored ads pay; how does this person sleep at night knowing that the profession or hobby they chose is one in which they cannot produce their own quality material? If I were ever to sink to such a desperate act, I would seek another profession or hobby.

4) The difference between selfishness and self-awareness. A complaint about young men and women is that they’re too selfish. It’s the root of the problem, they suggest. I don’t know if it’s true, but if it is, I would suggest that those speaking out against it are delivering an incomplete message. My platform would suggest that these selfish types focus on self-awareness, and that they seek it to achieve a level of fulfillment. We could view striving to achieve greater self-awareness as a selfish pursuit, but self-awareness can take several forms. Performing selfless acts, for example, can teach a person a lot about themselves, and it should be encouraged, as people who perform many selfless acts can become more aware of themselves and more selfless. The process could lead to a virtuous cycle, the antonym of the vicious one these complainers decry. If I had a pulpit, I would also declare that an individual could learn more about themselves through spirituality. I’ve been on both sides of the value of scripture, and I think this gives me greater perspective on the matter. I look at scripture and other Biblical teachings as a roadmap to personal happiness through reflection. Self-interest drives me to follow those teachings because I believe it’s in my best interests to follow them. In short, I would play my sermon to the selfish predilections of the young. I hear sermons that suggest otherwise, and I can’t help but think that the priest is missing a beat.

5) As a former service industry employee, I’ve encountered my share of disgruntled customers. I could provide a list of examples, but the substance of their complaints is irrelevant. Most experienced service industry employees know that the most disgruntled customers are the most disgruntled people. They might hate their kids, their spouse, and their life. Whatever the case is, the discrepancy they find causes them to unload: “What kind of Mickey Mouse operation are you running here? Your ad says this item is on sale today for two bucks. If you think I’m going to pay more than that, you must think I’m stupid! Or, are you singling me out based on my characteristics?” These statements are often a mere introduction to a heated exchange that reveals the disgruntled customer’s effort to achieve some satisfaction they can’t find elsewhere in life. A more confident customer would simply say, “Your ad says that this item is on sale today for two dollars.” Those of us with experience in the service industry know how intimidating a confident presentation of the facts can be, especially from a more secure individual.

6) A new documentary captures an ant crawling down from a slice of cheesecake with a piece of it lodged in its mandibles. The makers of this documentary capture the ant’s progress in stop-action photography, as this permits running commentary from various filmmakers talking about the brilliance of each segment. Where does the ant go, and what will it do with the small, round ball of cheesecake? This is the plotline of an amazing new documentary called Posterula. (Spoiler alert) The ant makes it off the plate, but the viewers never learn if the ant takes the piece to the colony to feed the queen. This leads this viewer to believe that an as-yet-undisclosed sequel to this brilliant documentary is in the works.

Hi, I’m Rilaly, and if I were to take you on a tour of my young mind, this would be but an example of what you would read. Some suggest that such humor is too niche, and if that’s the case, I would’ve niched my way out of the market. If I had one of my stories published, customers at bookstores would’ve walked past my serious pieces, thinking that I’m nuts, too far gone, and unserious. They probably still think that. I’m niche.

7) I landed upon the term “vague and flexible by design” the other day. The author of the term intended it as a compliment for the subject, but if they directed such a characterization at me, I would view it as an insult. I understand that we’re different people in different surroundings, and that we should all remain flexible with our ideals to prepare for new findings on the subject in question, but the “vague and flexible by design” compliment would register as a ‘no core’ insult to me.

8) What hotel, or meeting space, first decided to serve a ball of meat as a solitary entrée? Someone somewhere should’ve stepped in and said, “Whoops, you forgot the fixins.” Those who have attended more than twenty corporate galas, weddings, or other catered events are now more than accustomed to the items served in a buffet line. I now eat before I attend one of these functions, because I cannot eat another pinwheel, I’m burnt out on hot wings, and I hit my personal threshold on room-temperature potatoes au gratin somewhere around 2004. I am not a finicky eater, but I can no longer stomach this list of dietary choices. I will acknowledge that being American provides me the luxury of making odd and unreasonable dietary choices, but if I’m going to limit myself to one meal a day to maintain a plump figure, as opposed to a fat or obese one, I’m not going to eat something just because others provide it in a visually pleasing manner.

9) There is a difference between writing a grammatically correct sentence and quality writing. I took college classes on creative writing, I’ve read the MLA guides, and I’ve learned through word of mouth what leads to quality reading. I’ve fixed the passive voice sentences, deleted the word “had” as often as possible, and I’ve tried to avoid what fellow writers call “the you-yous”. The goal for the writer is to adhere to the rules of writing while attempting to maintain a stream-of-consciousness style that makes for quality reading. It’s not considered grammatically incorrect to write that you may not enjoy this sentence, but writing that the reader may enjoy it, without the word you, is considered a more pleasant reading experience. I’ve also attempted to write “who” instead of “that”, and I’ve attempted to limit my use of “that”. Example: “You don’t want to write that it was someone else that said something, when who said it is much more familiar to you.” In that sentence, fellow writers suggest using the word “Writers” to replace the first you, and “Readers” is an advisable replacement for the second you. Beta readers suggest that doing otherwise means the writer has a bad case of the you-yous. “You” is too familiar to you, and “that” is too unfamiliar, and you do not want to be too familiar or too unfamiliar. The first reason for following this rule is that the writer does not want to write in the manner they speak, because the way one speaks in one locale may not be as familiar to a reader in another locale. These standards set a common base for readers, free from colloquialisms. The you-yous also creep up on a writer in free flow, and they may not notice how redundant the word has become in their document. The question that haunts me is, do I want a perfect document to impress accomplished writers, or do I want to pleasure myself with a document that might have some flaws? The notion one writer lofted was that every writer makes mistakes, and we readers weave them into the cloth of our expectations, but is there a point when the mistakes distract from the whole?

10) “He’s such an idiot,” Teri said after her boyfriend left the party table to go to the bathroom. “He cheats on me all the time. For all I know, he’s arranged something in the bathroom. I’m serious. I can’t even trust him to go to the bathroom.” Such comments are so unexpected that they’re hilarious.

“Why the hell are you dating him then?” I asked. Room-silencing, impulsive comments like these are my gift to the world. I can flatten the smile of any decent person from fifty yards with a single thought implanted in their brain.

The comment felt right to me, but the moment after I delivered it, I realized it was so loaded with complications that no one in their right mind would deliver it to a table of people gathered together for the sole purpose of mixing in some laughter with their fun. I thought it might add to the fun, or spur her into extensions on the joke, but I was wrong. I made her uncomfortable.

As soon as she recovered from the blow, aided by my discomfort, she revealed that she had locked herself into a certain cynical dynamic of life. She knew the world was full of it, and everyone around her was too, in one way or another, because she knew she was. She thought her beau was full of it too, but “He’s a nice guy…most of the time.” I didn’t know if that was her final answer, but I overemphasized my acknowledgement of it to suggest that was what I sought.

No matter how often I affirmed her answers, Teri kept coming at me with more. She said he was “funny and fun to be around.” She said he was good looking, and she said he did “sweet things” for her. I couldn’t get out of this uncomfortable spiral of my own making. I pretended to be interested, because I knew I had put her in the uncomfortable position of having to explain one of life’s most illustrative choices, but I was trying to end the episode with every word she said to me.

Most of us cannot explain our life-altering choices so well that we can weather interrogations. I knew this, but I thought I could explain most of my choices at the time. The question that even the most reflective must ask themselves is, is their base so solid that they make rational, informed choices in impulsive moments? I don’t think many reflective types would pass their own interrogations, in the moment, for I think we color in the blanks later to make ourselves believe we made informed choices.

Teri told me he was a good man, with a good job, and he had an unusual curiosity about life that she found fascinating. I also learned that while it was obvious he had a restless, nervous energy about him, “He’s incredibly lazy. If he had his choice, he would spend his day on a couch.”

I still didn’t understand the dynamics of their relationship, even though she provided me numerous answers. I wouldn’t understand it for a while. I had no idea at the time that their relationship depended on the idea that she enjoyed playing the jealous girl, because, I can only assume, she considered him worthy of her jealousy, and in a world of average men with no discernible qualities, that is something. He was the naughty boy, and he enjoyed that role. “We fight like cats and dogs,” she said with a gleam in her eye, “but then we have makeup sex.” I wondered if she ever dated guys who wouldn’t cheat on her. I wondered if they wouldn’t fight with her. I wondered if they bored her. He provided her something to focus on other than herself. He was the dunce, but he was an amiable dunce. He provided her drama. He was always on the cusp of cheating on her. She also had a desire to date a guy she could be better than, and she wasn’t much. Either that, or she had a desire to care for something that could break. “He’s an idiot, he doesn’t know how good he has it,” she said more than twice. The guy was fulfilling the age-old male need to feel like a bad boy. Most guys need this coursing through their veins, and some girls apparently need a guy like this too.

11) Unhappy couples fascinate me. They don’t smile often, but smiles are a refuge of the simple-minded. They don’t hug, kiss, or touch very often, but they’re not that type of people. They’re emotionally distant people, and happy people make them sick. Do they have a greater understanding of who they are than we ever will, or are they jealous? She didn’t date in high school, and he was a broken boy. Death of a loved one breaks some, divorce breaks others, and still others experience a seismic betrayal that creates an irreparable break. Yet, they found something in one another that they always wanted. As outsiders looking in, we can’t understand the allure, but the two of them stay together for years. Some stay in a job they hate, because they fear the unknown. Do people stay in relationships for the same reason? He doesn’t speak often, and relatives find it difficult to strike up a conversation with him. He gives off the vibe that he’s not interested in what others have to say, and this affects the way others react to him.

My initial instinct was that he wasn’t interested in what I had to say, for reasons endemic to our relationship, until others informed me they shared similar experiences with him. He’s more interesting when he drinks, but when the night is over, the participants realize he wasn’t really interesting in the truest sense of the word, just more interesting than they expected him to be. A couple of drinks loosen our inhibitions. A couple more might loosen them even more, until the potential exists for us to become interesting. That’s the mindset of the drinker anyway. I’m not sure if this is his mindset, but he does have a drinking problem. He is emotionally distant, because those who formed him devastated him emotionally. Yet, in many ways he appears satisfied with who he is.

12) No one is as boring as we think they are, but no one is as interesting as we think we are either. How many of us look back on our authentic years with the belief that we weren’t nearly as authentic as we are now, and how many of us will look back ten years from now with the same thought? One could say that the effort put into being authentic provides progressively diminishing returns.

13) How many of us remember the first person who told us about America’s atrocities? Did they package it with a provocative statement such as, “This is something your moms and dads don’t want you to know about”? For those of us who are now parents, it’s probably been so long since someone introduced us to the dark side that we forget how intoxicating it was at the time. I don’t remember my first messenger because I’ve heard about these atrocities so many times since that the repetition has all but drowned out that first telling. Thanks to a myriad of resources I’ve encountered since, I am now able to frame those atrocities with the virtuous acts America has performed throughout her history to arrive at the independent conclusion that America has been a noble nation overall. It did take me a while, however, to arrive at that conclusion.

Some might think that learning of the atrocities for the first time leaves the recipient feeling cold, disillusioned, and/or depressed that their parents sold them a pack of lies. In the combative environment of my youth, one of the many focal points of ridicule was naïveté. “Don’t tell me you believed all that baseball and apple pie crap?” someone would say in the aftermath of a discussion on America’s atrocities. I did, and those early messengers in my life provided me information to combat the characterization that I was naïve. I considered them more informed, brave, and righteous. I thought they were cooler than cool for speaking out against the marketing arm of America, and I thought they were treating me with the type of respect that my dad never did.

Now that I’m a seasoned adult, I know my dad wasn’t necessarily lying to me, and he wasn’t withholding a truth, but he didn’t give me the whole picture either. He didn’t know about some of the atrocities these messengers described, but there were incidents he did know about, and he neglected to tell me of them. Anyone who remembers their teenage mind knows how much we exaggerate the characterizations of our parents, especially when “truth tellers” package such information accordingly. Their presentations excited me in a way that’s tough to describe. I thought I was finally hearing the truth from someone.

A vital mindset for parents sharing their knowledge of American history is that their children are in a constant battle with their peers to avoid appearing naïve. For those worried about telling their children about the awful things the country has done, consider it ammunition, and combat these stories with the stories of the country’s virtues. Our goal should be to instill a love of country in a comprehensive manner. To a certain point, we parents have told them what to think and how to think for so long that we may have a difficult time giving up those reins. On this particular subject, however, we need to present this information in a manner that allows them to decide, and we might even add that we understand it’s a lot to take in one sitting, so we should allow them to think about it.

If we don’t do this, the truth will rear its ugly head when we least expect it. Those who provide them this information will likely not frame it in the manner we think they should, and our kids might turn around and accuse us of lying, telling half-truths, and not trusting them enough to deal with such sensitive information. Whatever the case is, we might never be able to win them back. My advice is that we teach them the virtues of this country and couple it with a healthy dose of the horrors some Americans have committed since the country’s birth. Do some research on the atrocities and prepare for the follow-up questions, because there will be questions. Once we’re done, we should repeat the cycle so often that by the time that cool, rebellious person tells our children the things we supposedly don’t want them to hear, they will turn on that person and say, “I’ve heard all of this a million times, and to tell you the truth I’m sick of hearing about it.” If condemning your country in such a manner is difficult, much less teaching it to your child, ask yourself how you would prefer America’s atrocities be framed. Would you rather provide your child with a more comprehensive narrative, or would you rather someone who hates their country do it for you? One way or another, your child will learn this information.

14) I’m about 15 years into using devices to stream music on a daily basis, so it might seem a little odd to show appreciation now. Anytime I take a very short drive and turn on my local FM stations, I hear a DJ offer tidbits from their life, and I gain greater appreciation for the freedom technology has offered. I’m not talking about morning show hosts, as I think I listened to one show decades ago, just to hear what everyone was talking about, and I never listened to another one. When a DJ informs me about a day in their life, I switch the channel so hard my fingers hurt later. I don’t care about the private lives of celebrities, but I understand that some do. No one knows who these DJs are, and I think even fewer care. Yet, when they are on the clock, moving from one song to another, they tell us about their day. They tell us about a party they attended, a soup they enjoyed yesterday, and something their significant other said to them in the movie theater. Nobody cares! The only line we should hear from a radio DJ is, “That was one song, and here’s another.”

15) Most people have heard the quote, “The definition of insanity is doing the same thing over and over and expecting a different result.” The quote is widely attributed to Albert Einstein. Most people know this quote, but they only apply it to innovative qualities that appeal to them and their relative definitions of the status quo. When another is in the process of doing the same thing in a different way, their process receives scorn and ridicule. “Do you know the quote?” we ask. “Yes, but it doesn’t apply here. That just isn’t the way we do things.” Okay, but the way you do things hasn’t worked for decades now. The counterargument is that they’re on the cusp of it working and the new person could damage all of the progress they’ve made. Again, they’ve been on the cusp for decades, and they might even recognize some merits of the innovative pursuit of the matter, but most innovators take arrows in the process.


Leonardo’s Lips and Lines


My takeaway from Walter Isaacson’s Leonardo da Vinci biography is that hypervigilance is not a switch an artist turns on to create. Artistic creations are often a display of one’s genuine curiosity about the world, a culmination of obsessive research into the minuscule details that others missed, and a portal through which the artist can reveal their findings. Did Leonardo da Vinci’s obsessions drive him to be an artist, or did he become obsessed with the small details of life to become a better artist?

Da Vinci might have started obsessively studying water, rock formations, and all of the other natural elements to inform his art, but he became so obsessed with his initial findings that he pursued them for reasons beyond art. He pursued them, the author states, for the sake of knowledge.

I don’t think I’ve ever read a book that captures an artist’s process as well as this one does. The thesis of the book is that da Vinci’s artistic creations were not merely the work of a gifted artist, but of an obsessive genius homing in on scientific discoveries to inform the minutiae of his process. Some reviews argue that this bio focuses too much on the minutiae of da Vinci’s work, but after reading the book, I don’t see how an author could capture the essence of what da Vinci accomplished without focusing on his obsessions, as focusing and obsessing on the finer details separated him from all of the brilliant artists who followed.

Some have alluded to the idea that da Vinci just happened to capture Lisa Gherardini, or Lisa del Giocondo, in the perfect smile for his famous painting The Mona Lisa. The implication is that da Vinci asked her to do a number of poses, and that his gift was merely in working with Lisa to find that perfect pose and then capture it, in the manner a photographer might. Such theories, Isaacson illustrates, shortchange the greatest work of one of history’s greatest artists.

Isaacson also discounts the idea that da Vinci’s finished products were the result of a divine gift, and I agree in the sense that suggesting his work was the result of a gift discounts everything da Vinci did to inform his work. There were other artists with similar gifts in da Vinci’s time, and there have been many more since, yet da Vinci’s work maintains a rarefied level of distinction in the art world.

As an example of Leonardo’s obsessiveness, he dissected cadavers to understand the musculature involved in producing a smile. Isaacson provides exhaustive details of Leonardo’s work, but writing about such endeavors cannot properly capture how tedious this research must have been. Writing that da Vinci spent years exploring cadavers to discover all the ways the brain and spine work in conjunction to produce expression, for example, cannot capture the trials and errors da Vinci must have experienced before finding the subtle muscular formations behind the famous, ambiguous smile that achieved his deliberate effect. (Isaacson’s description of all the variables that inform da Vinci’s process regarding The Mona Lisa’s ambiguous smile, a smile historians suggest da Vinci used more than once, is the best paragraph in the book.) One can only guess that da Vinci spent most of his time researching these artistic truths alone, and that even his most loyal assistants pleaded that he not put them on the insanely tedious lip detail.

Isaacson also goes to great lengths to reveal Leonardo’s study of light and shadow, in the sfumato technique, to provide the subjects of his paintings greater dimension and realistic, penetrating eyes. Da Vinci then spent years, sometimes decades, making changes to his “incomplete projects”. Witnesses said that he could spend hours looking at an incomplete project only to add one little dab of paint.

The idea of a gift implies that all an artist has to do is apply their gift to whatever canvas stands before them, and that they should do so as often as possible to pay homage to that gift until they achieve a satisfactory result. As Isaacson details, this doesn’t explain what separates da Vinci from other similarly gifted artists in history. The da Vinci works we admire to this day were but a showcase of his ability, his obsessive research on matters similarly gifted artists might consider inconsequential, and the application of the knowledge he attained from that research.

Why, for example, would one spend months, years, and decades studying the flow of water and its connections to the flow of blood in the heart? The nature of da Vinci’s obsessive qualities belies the idea that he did it for the sole purpose of fetching a better price for his art. He also, as the author points out, turned down more commissions than he accepted. Couple this with the fact that while he might have started an artistic creation on a commissioned basis, he often did not give the finished product to the one paying for it. As stated of some of his works, da Vinci hesitated to do this because he didn’t consider them finished, complete, or perfect. As anyone who understands the artistic process knows, deciding that a work of art cannot be improved upon is often more difficult for the artist than starting it. Some might suggest that achieving historical recognition drove him, but da Vinci had no problem achieving recognition in his lifetime, as most connoisseurs of art considered him one of the best painters of his era. We also know that da Vinci published little of what would’ve been revolutionary discoveries in his time, and he carried most of his artwork with him for most of his life, perfecting it, as opposed to selling it or seeking more fame with it.

After reading all that informed da Vinci’s process, coupled with the appreciation we have for the finished product, I believe we can now officially replace the meme that uses the Sgt. Pepper’s Lonely Hearts Club Band album to describe an artist’s peak with one that uses The Mona Lisa.

Historical Inevitability


The idea that history is cyclical has been put forth by many historians, philosophers, and fiction writers, but one Italian philosopher, Giovanni Battista Vico (1668-1744), wrote that a fall is also an historical inevitability. In his book La Scienza Nuova, Vico suggested that evidence of this can be found by reading history from the vantage point of the cyclical process of rise-fall-rise, or fall-rise-fall, recurrences, as opposed to studying it in a straight line dictated by the years in which events occurred. By studying history in this manner, Vico suggested, the perspective of one’s sense of modernity is removed and these cycles of historical inevitability are revealed.

To those of us who have been privy to the lofty altitude of the information age, this notion seems implausible to the point of impossible. If we are willing to concede the probability of a fall, as historical inevitability portends, we should only do so in a manner that suggests any fall would be defined relative to the baseline of our modern advancements. For us, an asterisk may be necessary in any discussion of cultures rising and falling in historical cycles. This asterisk would require a footnote suggesting that all eras have had creators lining the top of their era’s hierarchy, and those who feed upon their creations at the bottom. The headline-grabbing accomplishments of these creators might then define an era, in an historical sense, to suggest that the people of that era were advancing, but were the bottom feeders advancing on parallel lines? Or did the creators’ accomplishments, in some way, inhibit their advancement?

“(Chuck Klosterman) suggests that the internet is fundamentally altering the way we intellectually interact with the past because it merges the past and present into one collective intelligence, and that it’s amplifying our confidence in our beliefs by (a) making it seem like we’ve always believed what we believe and (b) giving us an endless supply of evidence in support of whatever we believe. Chuck Klosterman suggests that since we can always find information to prove our points, we lack the humility necessary to prudently assess the world around us. And with technological advances increasing the rate of change, the future will arrive much faster, making the questions he poses more relevant.” –Will Sullivan on Chuck Klosterman

My initial interpretation of this quote was that it sounded like a bunch of gobbledygook, until I reread it and plugged the change of the day into it. The person who works for a small, upstart company knows to pay attention to their inbox, for the procedures and methods of operation change by the day. Those who have worked for a larger company, on the other hand, know that change is a long, slow, and often grueling process. It’s the difference between changing the direction of a kayak versus a battleship.

The transformational changes we have experienced in the last ten years could be said to fill a battleship, yet they occur with the rapidity of a kayak’s change of direction. The question is, how do young minds adapt to this volume of change at this breakneck pace? Those who are forty and older often react slowly to change, particularly technological change, but teens and early twenty-somethings are more eager to incorporate the latest and greatest advancements, regardless of what it does to them and of the as-yet unforeseen, unintended consequences.

Had the rapid course of change of the last ten years occurred over 100 years, it would have characterized that century as one of rapid change. How many of us have changed so rapidly that we fondly recall the life before that change?

If we change our minds on an issue as quickly as Klosterman suggests, with the aid of our new information resources, are we prudently assessing these changes in a manner that allows for unforeseen and unintended consequences? Is this concern expressly devoted to technology, or does human nature change as a matter of course?

These rapid changes, and our adaptation to them, remind me of the catchphrase mentality. When one hears a particularly catchy, or funny, catchphrase, they begin repeating it. When another asks that person where they first heard the catchphrase, the person who now uses it as a matter of routine, but didn’t as far back as a month ago, says they don’t remember, and as far as they’re concerned, they’ve always been saying it. They subconsciously alter their memory in a way that suits them.

Another way of interpreting this quote is that with all of this information at our fingertips, the immediate information we receive on a topic, in our internet searches, loses value. One could say as much of any research, but in the past, searches required greater effort on the part of the curious. For today’s consumer of knowledge, just about every piece of information one can imagine is at their fingertips.

Who is widely considered the primary writer of the Constitution, for example? A simple Google search will produce a name: James Madison. Who was James Madison, and what were his influences in regard to the document called The Constitution? What was the primary purpose of this finely crafted document that assisted in providing Americans near-unprecedented freedom from government tyranny, and rights that were nearly unprecedented when coupled with the amendments in the Bill of Rights? How much blood and treasure was spent to pave the way for the creation of this document, and how many voices were instrumental in the Convention that crafted and created this influential document?

Being able to punch these questions into a smartphone and receive the names of those involved can give those names a static quality. The names James Madison, Gouverneur Morris, Alexander Hamilton, and all of the other delegates of the Constitutional Convention who shaped, crafted, and created this document could become an answer to a Google search, nothing more and nothing less. Over time, and through repeated searches, a Google searcher could accidentally begin to assign a certain historical inevitability to the accomplishments of these men. The notion being that if these answers weren’t the answers, other answers could have been.

Setting aside, for just a moment, my personal opinion of Madison, Morris, Hamilton, and those at the Constitutional Convention who composed the document, the question has to be asked: could the creation of Americans’ rights and liberties have occurred at any time, with any men or women, in the history of our Republic? The only answer, as I see it, involves another question: how many politicians in the history of the world have voted to limit the power they wield, and any power they might achieve in the future? How many current politicians, for example, vote for their own term limits? Only politicians who have spent half their lives under what they considered tyrannical rule would fashion a document that could result in their own limitations.

How many great historical achievements, and people, have been lost to this idea of historical inevitability? Was it an historical inevitability that America would gain her freedom from Britain? Was the idea that most first-world people would have the right to speak out against their government, vote, and thus have some degree of self-governance inevitable? How many of the freedoms, opportunities, and other aspects of American exceptionalism crafted in the founding documents are now viewed as so inevitable that someone, somewhere would’ve come along and made them possible? Furthermore, if one views such achievements as inevitable, how much value does one attach to the ideas, and ideals, they created? If those ideas attain a certain static inevitability, how susceptible are they to condemnation? If an internet searcher has a loose grasp of the comprehensive nature of what these men did, and of the import of their ideas for the current era, will it become an historical inevitability that those ideas are taken away, in a manner that might initiate philosopher Vico’s theory on the cyclical inevitability of a fall?

I’ve heard it theorized that for every 600,000 people born, one will be a transcendent genius. I heard this quote secondhand, and the person who said it attributed it to Voltaire, but I’ve never been able to properly source it. The quote does provide a provocative idea, however, which I interpret to mean that the difference between one who achieves the stature of genius on a standardized test, or an Intelligence Quotient (IQ) test, and the transcendent genius lies in the area of application. We’ve all met extremely intelligent people in the course of our lives, in other words, and some of us have met others who qualify as geniuses, but how many of them figured out a way to apply that abundant intelligence in a productive manner? This, I believe, is the difference between what many have asserted is a one-in-fifty-seven ratio of genius and the one in 600,000 born. The implicit suggestion of this idea is that every dilemma, or tragedy, is waiting for a transcendent genius to come along and fix it. These are all theories of course, but they do raise the question of what happens to the other 599,999 who feed off the ingenious creations and thoughts of transcendent geniuses for too long. They also raise the question of whether, if the Italian philosopher Vico’s theories on the cyclical nature of history hold true and modern man is susceptible to a great fall, there will be a transcendent genius able to fix the dilemmas and tragedies that await the victims of that fall.

Why Adults Hate Their Parents


‘I am so glad I don’t have to go through all that anymore,’ I think when I hear an adult say they still hate their parents. When they say it with such animosity and rage, I remember the raging insecurity and confusion that drove me to say such things, and I’m happy to be past all that. When I hear someone say that their parents are bumbling fools, idiots, or backwater hicks from the 1950s, I remember saying such things, and I regret some of it. As has been said of regrets, there is little we can do about them now. Yet, I have also heard others say that the struggle to correct past errors defines us.

The question I would love to ask of those adults who continue to hate the ‘absolute morons’ who happen to be their parents is, “Why is it so important to you that they still be wrong?”

“I’m smarter than my dad,” writes a twenty-something blogger. “I really wish I wasn’t. It’s like finding out Santa isn’t real.” 

That isn’t an exact quote, but it is a decent summary of her snarky blog. The blogger goes on to rap about how intelligence and cultural sensitivity are a cross that she must now bear in her discussions with her parents. She never states that she hates her parents. She states that she, in fact, loves them a great deal, but she characterizes that definition of love with an element of pity, bordering on condescension, that appears to be endemic in twenty-somethings.

Some carry this teenage hatred well into their twenties. The teen years are a period of cultivation, containing rebellion, learning, etc., that occurs before our minds fully form. As we age, our mind matures, and so does our rebellion, until it manifests as either full-fledged hatred or a condescending pity that recognizes our parents’ backwater modes of thought for what they are. This matured rebellion is also based on the fact that our parents still have some authority over us, and that reminds us of those days when our parents had total authority over us, and how they “abused it to proselytize their closed-minded beliefs on us.”

When we finally reach a point when they’re no longer helping us pay for tuition, a car, or rent, and we’re able to flex independent muscles, we spend the next couple of years fortifying this notion that they were wrong, all wrong, all along.

By the time we progress to our thirties, circumstances reveal to us some of the logic and wisdom our parents attempted to pass down to us, and the idea that it does apply in some situations. (Some will never admit this. Some remain stuck in a peak of rebellion.) Their advice may not have applied in all circumstances, of course, but it applied in so many that the prominent bumbling fool banner came down. Then, when we reach our forties, we begin to think that they’re idiots all over again.

I wrote the last line to complete a joke I read. It’s a funny line, because there is an element of truth in it, but in my experience the truth lies somewhere in the middle. The truth is a hybrid of the lifelong recognition we have of our parents’ failings combined with the points we begrudgingly give them on some matters. We also respect them in a manner we never did as kids, because we now have our own kids, and we view our parents as fellow parents who tried to lead us down a path most conducive to happiness and success in life.

This specific timeline may not apply to everyone, as we all go through these stages on our own time. The word hate may be too stark a term for the adults still experiencing some animosity toward their parents, but anyone who has been through the ride knows that the peaks and valleys can make for one hell of an emotional roller coaster.

Theory formed the foundation of much of my uninformed rebellion, and real-world circumstances revealed to me that some of the archaic and antiquated advice my dad offered me had some merit. These circumstances, as I said, included having my own child and my own attempts to protect the sanctity of his childhood, in the same manner my dad attempted to protect mine. As evidence of this, I once thought my dad committed some errors in raising me by sheltering me too much, until some know-it-all said that meant my dad did his job. “How so?” I asked. I was all ready to launch into a self-righteous screed about how he knew nothing about my childhood, until he said, “By allowing your childhood to last as long as possible.”

Another circumstance arrived when I tried to get along with my co-workers and appease my boss. My father warned me that this would be more difficult than I assumed, and he was right, but I regarded that as nothing more than an inconvenient coincidence on my path to individuality.

It’s not debatable to me that I was right about some of the things I planted a flag in, but these circumstances led me to understand that my dad had lived a rich, full life by the time he became my mentor, and some of my impulsive, theoretical thoughts about the world were, in fact, wrong. (Even after gaining some objectivity on this matter, it still pains me to write that line.)

Having my own job, my own money, and my own car did a great deal to provide me the independence I desired, but I wanted more. Having my own home, and friends, and a life completely devoid of my dad’s influence gained me even more, but it wasn’t enough.

I wanted to be free of the figurative shackles being my dad’s son implied. Every piece of information I received about history, the culture, and the world was exciting, and new, and mine, because it stood in stark contrast to everything my dad believed. The information that confirmed my dad’s wisdom bored me so much I dismissed it. The new age information coincided with everything I wanted to believe about the brave new world that my dad knew nothing about, and it confirmed my personal biases.

When I was a twenty-something, I didn’t ask myself the question I now pose to the blogger, regarding why I still needed my dad to be wrong. I probably would not have had much of an answer, even if I had searched for it. I probably would have said something along the lines of, “Why is it so important to him that he cling to that age-old, traditional mode of thought?”

This redirect would not have been an attempt at deception or evasiveness. I just did not have the awareness necessary to answer such a question. Moreover, as a twenty-something, new age thinker, I was rarely called upon to establish my bona fides. All parties concerned considered me a righteous rebel, and the old guard was, by tradition, the party on trial. They often felt compelled to answer my questions, as opposed to forcing me to define my rebellion, and I enjoyed that because on some level I knew I couldn’t answer those questions.  

My twenty-something definition of intelligence relied on emotion, theory, and very little in the way of facts. I thought they were facts, however, and I had the evidence to back them up. I thought I was intelligent, more intelligent than my dad was, but the question I did not ask was, what is intelligence? The answer is, it depends on whom you ask.

In Abraham Lincoln’s day, the ability to drop a pertinent reference from Shakespeare or The Bible in any given situation formed the perception of one’s intelligence level. My generation believed that dropping a well-timed, pertinent quote from Friends and Seinfeld defined intelligence, coupled with a thorough knowledge of Bruce Willis’s IMDb filmography. To the next generation, it has something to do with knowing more than your neighbor about Kim Kardashian and Lady Gaga. (I concede that the latter may be an epic fail on my part.)

My dad knew nothing of Seinfeld, or Bruce Willis, so he knew nothing as far as I was concerned. He knew nothing about computers, or devices, and a third party introduced him to gold records (these gold records were CDs, compact discs, LOL! Gold records?) shortly before his death. This lack of knowledge about pop culture and technological innovation transcended all matters, as far as I was concerned. I believed my dad was a bumbling fool of a traditionalist trapped in 1950s modes of thought, and that he could’ve never survived in our current, more sensitive culture. He was backwater, hick, and whatever other adjectives we apply to one trapped in a time warp of the sixties, maybe the seventies, but definitely not the nineties, the noughties, or the deccas.

The question that we in the smarter-than-our-parents contingent must ask ourselves is how much of the divide between our parents’ level of intelligence and ours is in service of anything. I, like the snarky and provocative blog writer, can say that I knew more about more than my dad did, but I defined that divide, and most of what I used to inform it involved inconsequential information that I will never use for any substantial purpose. The conditions of my dad’s life were such that he didn’t receive what most would call a quality education, but he used whatever he learned to prosper on a relative basis. One could say that the difference between my education and my dad’s, and the education of the snarky contingent versus her parents’, could be whittled down to quantity versus quality.

In the Workplace  

Much to my shock, I began quoting my dad to fellow tenured employees, well into my thirties:

“Everyone has a boss,” and “You can learn everything there is to know about the world from books, but the two words most conducive to success in life are going to revert to either ‘Yes sir!’ or ‘No sir.’”

I loathed those words for much of my young life, as they implied that even after escaping my dad’s management of my life (a level of authority that turned out to be far more macro than I ever considered possible), I would always have a boss. The bosses who followed my dad taught me the difference between his level of macro management and their definition (hint: micro) when I was out on my own, and out from under his totalitarian thumb. I would also learn that my boss’s moods would forever dictate whether my day would be a good one or a bad one, in the same manner my dad’s moods once affected me, only tenfold.

Dad’s advice derived from his experience in the workplace, but that experience occurred in an era that required reverence for a boss. Thanks to the new age ideas of boards and panels conducting arbitration cases for those who have been fired, the various wrongful termination lawsuits, and the threat thereof that gave life to the Human Resources department, reverence was no longer as mandatory in my era.

I would also learn that my newfound freedom would contain a whole slew of asterisks that included the idea that no matter how much free time I had, I would spend a great portion of my life in a workplace, under the watchful eye of authority, compromising my personal definition of freedom every step of the way.

Throughout the course of my life, I’ve met those who never went through these stages of rebellion. If you find this as incomprehensible as I did, all I can tell you is I’ve met them. They said rational things like this in their twenties: “I never thought my parents were perfect, but I know that they always tried to steer me onto what they believed to be the right course.”

As soon as I picked myself off the floor from laughter, believing that I was on the receiving end of a comedic bit, I realized they were serious. The fact that their upbringing was so much healthier than mine caused me to envy them in some ways, but after chewing on that for years, I realized that all of the tumult I experienced, self-inflicted and otherwise, defined my character and my current, individual definition of independence.

We are our parents’ children, and at times, we feel trapped by it. Therefore, we focus on the differences. We may mention some of the similarities, but we take those characteristics for granted, and we think all parties concerned do too. Even when we reach a stage in life when we begin to embrace some elements of that trap, somewhere in our thirties and forties, we cling to the idea that we’re so different. The answers as to why these dichotomies exist within us are as confusing to us as the fact that they are a fait accompli.

When immersed in the tumult of the younger brain, trying to make some sense of our world, we may fantasize about what it would be like to have other parents. Our friends’ parents seem so normal by comparison. We think most of our problems could be resolved if we had their parents, or any normal people, as parents. We might even fantasize about what it might be like to be free of all patriarchal influence. We consider how liberating it might be to be an orphan, until we recognize how confusing that must also be. Those without parents must lack a frame of reference, a substantial framework, or a familiar foundation from which to rebel. When we consider this, we realize that our whole identity involves pushes and pulls of acquiescence and rebellion to our parents.

While there is some acknowledgement of the dictum ‘the more things change, the more they stay the same’ when we receive advice from our parents, our rebellion operates under the “It was the best of times, it was the worst of times” principle when we process that advice and apply it to our era. When we concede that knowledge of innovations and pop culture is superfluous, we remove a substantial plank of our rebellion, until politics takes its place. We then sit down at our proverbial dinner table to resolve the political and geopolitical problems of the day, for our nation, in a manner we deem substantial. It fires us up. We deliver nuke after nuke, until we realize that the effort to persuade our parents is futile. We also recognize that nestled within this effort was our juvenile, sometimes snarky need to prove them wrong. While a more substantial plane than pop culture, political discussions can be just as silly for us as they were for our parents when they discussed such issues at their parents’ dinner table, considering their parents bumbling idiots who offered nothing new to the discussion and stubbornly resisted the winds of cultural change. The one import they may have taken from those discussions, as we will from ours, over time, is that the more things change, the more they stay the same, and human nature doesn’t change as much as we may believe it does with innovations, cultural advancements, and social awareness. A kiss is still a kiss, a boss is still a boss, and the fundamental things still apply, as time goes by.

***

One final piece of advice this former rebel turned individual offers to the provocative, parent-hating rebels is that we should all thank our parents for raising us. Thanking them could be one of the hardest things we ever do, as we may lose most of the provocative, parent-hating points we’ve spent our whole lives accumulating, but it might turn out to be one of the best things we ever do too.

I thanked my dad for everything he did for me, and I did not add all of the qualifiers and addenda I would have added years earlier. I managed to put all grievances behind me for the ten seconds it took to thank him.

Was it hard? I will not bore you with the details of my rearing, but suffice it to say my dad could be a difficult man, and he played a significant role in the anger, frustration, and the feelings of estrangement I felt for much of my life.

I could go into more detail to ingratiate myself further with those currently struggling with the idea that I don’t understand their dilemma. To display my empathy, I have a quote that served me well throughout the traumatic years: “Not every person who becomes a parent is a good person.” Parents are people too, and some of them are as misguided, confused, immoral, and selfish as the rest of us. Yet, we are people too, and some of us are susceptible to making the mistake of amplifying their faults in our myopic view of them. If we were able to shake that myopic view, I think most of us would see that our parents were essentially good people who tried to move past their limitations to make us better than they were.

I dedicate this addendum to those who acknowledge that there might be anecdotes in this post that provide clarity on this subject, and who might even admit that thanking their parents would be noble, but for whom the wound is too fresh and raw to forgive or thank them today. I empathize on a relative basis, but all I can tell my fellow angry offspring is that it would not have sat well with me if I had waited.

As I sat in a pew staring at the pine box, I realized that no matter how obnoxious, self-serving, and angry my father could be at times, he was a member of an endangered species composed of those who truly care what happens to me. How many people truly care what happens to us? Our closest friends may say they do, but they have their own lives to live. We know our parents care, but some of them show it by seeking constant updates, harping, and telling us how to live our lives, long after the tie that binds us has been broken. As impossible as this is to believe today, expressing some level of gratitude in whatever manner your relationship with your parents requires might be the best thing you ever do. We might not see it that way today, but my guess is that even the most obnoxious rebel will see it one day, and my hope is that this addendum will convince someone, somewhere that waiting one more day might be one day too late.

To worry, or too worried?


Nestled within the quest to be free and to experience life through the portal of YOLO (You Only Live Once), or FOMO (Fear of Missing Out), lies a fear, concern, and worry that we might be too free. That worry is born, if the thesis of Francis O’Gorman’s book holds, from a need to be led.

It may seem illogical to argue that we’re too free in light of the technological, and governmental, advances that have led us to believe every move we make, and every thought we have, is monitored, infringed upon, and legislated against. Francis O’Gorman’s Worrying: A Literary and Cultural History is not a study of freedom, however, but one of the common man worrying about how the people, places, and things around him are affected by that freedom. Mr. O’Gorman makes this proclamation, in part, by studying the literature of the day and the themes of that literature. He also marks it with the appearance, and eventual proliferation, of self-help guides, to suggest that this greater sense of concern, or worry, led readers of another era to reward writers who provided them more intimate, more direct answers. This study leads Mr. O’Gorman to the conclusion that this general sense of worry is a relatively new phenomenon, as compared to even our recent ancestral history.

One fascinating concept Mr. O’Gorman introduces to this idea is that the general sense of worry appears to have a direct relationship to the secularization of a culture. As we move further and further away from religious philosophies to more individualistic ones, we may feel freer to do what we want to do, but we are also more worried about our susceptibility to the consequences of unchecked, mortal decision making. We humans have an almost inherent need to be led.

How often does a secular person replace religion with politics? Politics, at its core, is leadership, and most of our dining room table discussions of politics revolve around why one person is capable of leading our locale, our state, and our nation. They involve why one person’s idea of leadership may be inept, while another’s –one that abides by our principles– is more capable. As much as those adults who believe themselves fully capable of living without leadership would hate to admit it, all political thought revolves around the desire to be led.

Reading through the various histories of man, we learn that our ancestors had more of a guiding principle, as provided by The Bible. The general theory, among those who preach the tenets of The Bible, is that man’s mental stability, and happiness, can be defined in direct correlation to his willingness to subordinate his will to God’s wishes. God gave us free will, they will add, but in doing so He also gave us guiding principles that would lead us to a path of righteousness and ultimate happiness.

If a man has a poor harvest –an agrarian analogy most preachers use to describe the whole of a man’s life– it is a commentary on how this man lived. The solution they provide is that the man needs to clean up his act and live in a Godlier manner. At this point in the description, the typical secular characterization of the devoutly religious comes to the fore, and their agreed-upon truth has it that these people are unhappier because they are unwilling to try new things, and puritanical in a sense that leads them to be less free. The modern, more secularized man, as defined by the inverse characterization, has escaped such moral trappings, and he is freer, happier, and more willing to accept new ideas and try new things. If the latter is the case, why are they so worried?

We’ve all heard snide secularists say that they wish they could set aside their minds and just believe in organized religion, or, as they say, in “a man in the sky.” It would be much easier, they say, to simply set their intelligence aside and believe. What they’re also saying, if Mr. O’Gorman’s thesis can be applied to them, is that it would give them some solace to believe that everything was in God’s hands, so that they wouldn’t have to worry all the time.

Like the child who rebels against authority but craves the guidance that authority provides, the modern, enlightened man appears to reject the idea of an ultimate authority while secretly craving many of its tenets. A part of him, like the child, craves the condemnation of immorality, a reason to live morally, and some greater focus in general. The randomness of the universe appears to be his chief concern.

One other cause for concern –one not discussed in Mr. O’Gorman’s book– is that the modern man may have less to worry about. If social commentators are to be believed, Americans have never been more prosperous:

“(The) poorest fifth of Americans are now 17 percent richer than they were in 1967,” according to the U.S. Census Bureau.

They also suggest that crime statistics are down, and that teenage pregnancy, drinking, and experimental drug use by young people are all down. If that’s the case, then we have less to worry about than we did even fifteen years ago. It’s a concern. It’s a concern in the same manner that a parent is most concerned when a child is at its quietest. It’s the calm before the storm.

Francis O’Gorman writes that the advent of this general sense of worry occurred in the wake of World War I. Historians may give these worriers some points for being prescient about the largely intangible turmoil that occurred in the world after the Great War, but World War I ended in 1918 and World War II didn’t begin until 1939, a gap of twenty-one years of people worrying about the silence and calm that precedes a storm. This may have propelled future generations into a greater sense of worry, after they listened to their parents’ concerns for a generation, only to have them proved right.

The worry that we might be too free –free of the guidelines and borders that religion, or God, can provide– does not come without consequences, writes The New Republic’s Josephine Livingstone in her review of Francis O’Gorman’s book:

“The political concept of freedom gets inside our heads.  It is a social principle, but it structures our interiority.  This liberty worries us; it extends to the realm of culture too, touching the arts as much as it touches the individual human heart and mind.

“In this way, O’Gorman joins the tide of humanities scholars linking their discipline with the history of emotion, sensory experience, and illness. It’s an approach to culture most interested in human interiority and the heuristics that govern the interpretation of experience: Happiness can be studied; sound can be thought through; feeling can be data.”

Ms. Livingstone furthers her contention by writing that the human mind can achieve worry-free independence in a secular society by studying select stories from select authors:

“Worrying also fits into the tradition of breaking down myths and tropes into discrete units, a bit like Mircea Eliade’s Myth and Reality or C. S. Lewis’ Studies in Words. We care about these books because we need stories about the cultural past so that we might have a sense of ourselves in time. The real value of O’Gorman’s book lies, I think, in the way it flags the politics of the stories we tell ourselves. In its attribution of emotional drives to the ideas behind modernist culture and neoliberal politics alike, Worrying shows that their architects –writers, mostly– are as much victims of emotion as masters of thought. If we can see the emotional impulses behind our definitions of rationality, liberty, and literary craftsmanship, we can understand our own moment in cultural time more accurately and more fairly: Perhaps we can become our own gods, after all.”

One contradiction –not covered in the O’Gorman book, or the Livingstone review– is the trope that religious people are miserable in their constraints. This is ostensibly based on the premise that they fear the wrath of God so much that they’re afraid to live the life the secular man does. Yet, O’Gorman implies that religious people tend to worry less, because they follow the guidelines laid out in The Bible, and they place their destiny, and fate, in the hands of God. The import of this is that for religious minds, the universe is less random. Ms. Livingstone’s review basically says that the secular life doesn’t have to be so random, and it doesn’t have to cause such concern. She states that if we study happiness as if it were an algorithm of either physical or aural data points, and incrementally form our thoughts around those findings, we can achieve happiness. She also states that through reading literature we can discover our own master plan, through the authors’ mastery of emotions through thoughts and ideas. On the latter point, I would stress –in a manner Ms. Livingstone doesn’t– that if you want to lead a secular life, there are ways to do so and still be worry-free. The key words being if you want to. If you’re on the fence, however, a religious person could argue that all of the characteristics Ms. Livingstone uses to describe the virtues of the stories and the authors she considers masters of thought could also be applied to the stories and writers of The Bible, and the many other religious books. If her goal, in other words, is to preach to her choir, she makes an interesting, if somewhat flawed, case. (I’m not sure how a living, breathing human being could study a data sheet on happiness and achieve so complicated and relative an emotion.) If her goal, on the other hand, is to persuade a fence-sitter that secularism is the method to becoming your own god, this reader doesn’t think she made a persuasive case.

An Intellectual Exercise in Exercising the Intellect


“There are no absolutes,” a friend of mine said in counterargument. My snap response was to counter her counter with one of a number of witty replies I had built up over the years for this statement. I decided, instead, to remain on topic, undeterred by her attempt to muddle the issue at hand, because I believe this whole philosophy has been whittled down to a counterargument tactic for most people.

Whenever I hear the “No Absolutes” argument, I think of the initial stages of antimatter production. To produce antimatter, a physicist uses a particle accelerator to bring charged particles –protons, typically– up to enormous speeds. Magnets guide the accelerated particles through a tube and into a target, and the collision produces, among other debris, antiparticles. The process is a lot more intricate and complex than that, but for the purpose of this discussion this simplified description can serve as an analogy for the “There are No Absolutes” argument, which is often accelerated in an echo chamber of like-minded thinkers until it is smashed upon a specific subject, and the subject matter at hand is then annihilated in a manner that produces intellectual antimatter in the minds of all parties concerned.
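
For readers who want to check the numbers behind the analogy, the standard fixed-target threshold calculation –a textbook aside of my own, not part of the “No Absolutes” debate– shows why the accelerator has to get the protons moving so fast before the smash produces anything. Conservation of baryon number forces the antiproton to appear alongside an extra proton:

\[ p + p \rightarrow p + p + p + \bar{p} \]

Setting the invariant mass squared of a beam proton (total energy $E_{\text{beam}}$) striking a stationary proton (mass $m_p$) equal to that of the four-particle final state gives

\[ s = 2m_p^2c^4 + 2E_{\text{beam}}m_pc^2 = (4m_pc^2)^2 \quad\Rightarrow\quad E_{\text{beam}} = 7m_pc^2, \]

so the beam proton needs a kinetic energy of $E_{\text{beam}} - m_pc^2 = 6m_pc^2 \approx 5.6$ GeV before a single antiproton can emerge from the collision.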

Tower of Babel

The “No Absolutes” argument is based on the post-structuralist idea that because we process, or experience, reality through language –and language can be declared unstable, inconsistent, and relative– nothing that is said, learned, or known can be said to be 100% true.

This degree of logic could be the reason a number of philosophers have spent so much time studying what rational adults would consider “Of course!” truths. One such example is the idea of presentism. Presentism, as presented by the philosopher John McTaggart Ellis McTaggart, could also be termed the philosophy of time. The central core of McTaggart’s thesis has it that the present is the lone timeframe that exists. The past has happened, he states, and the future will happen, but neither exists in the sense that the present does. This philosophy is regarded in some circles (to the present day!) as so insightful that it is included in some compilations of brilliant philosophical ideas.

Anyone who is familiar with McTaggart’s philosophy, or becomes familiar with it by clicking here, can read through the description of the man’s theory a number of times without grasping what questions he was answering. His description of time is so elementary that the reader wonders more about the audience that needed it explained than about the philosophy of Mr. McTaggart. Was McTaggart arguing against the linguists’ attempts to muddle the use of language, or was he attempting to argue for the reinforcement of agreed-upon truths? Regardless, the scientific community had problems with McTaggart’s statement, as depicted by the unnamed essayist writing in this article:

“If the present is a point (in time) it has no existence, however, if it is thicker than a point then it is spread through time and must have a past and future and consequently can’t be classed as purely the present. The present is immeasurable and indescribable,” because it is, we readers can only assume, too finite to be called a point.

Those who want to dig deep into the physicist’s definition of time, a party to which this unnamed essayist seems to belong, will find that time is a measurement humans invented to aid them in their day-to-day lives, and that the essence of time cannot be measured. Time is not linear, and it cannot be seen, felt, or heard. They will argue that there is nothing even close to an absolute truth regarding time. Setting aside the physicists’ definition of time, however, humans do have an agreed-upon truth of time, one McTaggart appeared to want to bolster through elementary statements of it, to thwart the confusion that politically oriented sociolinguists introduced to susceptible minds.

There’s nothing wrong with a man of science, or math, challenging our notions, perceptions, and agreed-upon truths. Some of these challenges are fascinating, intoxicating, and provocative, but some have taken them to another level, a “No Absolutes” level of challenging our belief systems. That challenge has damaged our discourse, our sense of self, our free will, and a philosophy we have built on facts and agreed-upon truths, in a manner that may lead some to wonder whether everything they believe in is built on a house of cards that can be blown over by even the most subtle winds of variance.

There was a time when I believed that most of the self-referential, circuitous gimmicks of sociolinguistics –those that ask you to question everything you and I hold dear– were little more than an intellectual exercise professors offered their students to get them using their minds in a variety of ways. When I questioned the value of the subject of Geometry, my high school teacher informed me: “It is possible that you may never use any aspect of Geometry ever again, but in the course of your life you’ll be called upon to use your brain in ways you cannot now imagine. Geometry could be called a training ground for those times when others will shake you out of your comfort zone and require a mode of thinking that you may have never considered before, or may never use again.” This teacher’s sound logic left me vulnerable to the post-structuralist, “No Absolutes” Philosophy professors I would encounter in college. I had no idea what they were talking about, I saw no value in their lectures, and the ideas I was being introduced to, such as the nihilistic ideas of Nietzsche, always seemed to end up in the same monotonous place, but I considered their courses an exercise in using my brain in ways I otherwise wouldn’t.

Thus, when I first began hearing purveyors of the “No Absolutes” argument use it in everyday life, for the purpose of answering questions of reality, I wanted to inform them that this line of thought was just an intellectual exercise reserved for theoretical venues, like a classroom. It, like Geometry, had little-to-no place in the real world. I wanted to inform them that the “No Absolutes” form of logic wasn’t a search for truth so much as it was a counterargument tactic to nullify truths, or an intellectual exercise devoted to exercising your intellect. It is an excellent method of expanding your mind in dynamic ways, and of fortifying your thoughts, but if you’re introducing this concept to me as evidence of how you plan on answering real questions in life, I think you’re going to find it an exercise in futility over time.

Even when a debate between two truth seekers ends in the amicable agreement that neither party can sway the other to their truth, the art of pursuing the truth seems to me a worthwhile pursuit. What would be the point of contention for two “No Absolutes” intellectuals engaging in a debate? Would the crux of their argument focus on the other’s degree of error, or on their own relative definition of truth? If they pursued the latter, they would have to be careful not to proclaim their truths too true, for fear of being knocked back to the “There are No Absolutes” square –the “Go back to the beginning” square. Or would their argument be based on percentages: “I know there are no absolutes, but my truth is true 67 percent of the time, while yours is true a mere 53 percent of the time”? Or would they argue that their pursuit of the truth is less true than their opponent’s, to portray themselves as a true “No Absolutes” nihilist?

Some may argue that one of the most vital components of proving a theoretical truth in science is the attempt to disprove it, and others might argue that this is the greatest virtue of the “No Absolutes” argument. While we cannot dismiss this as a premise, purveyors of this line of thought appear to use it as nothing more than a counterargument to further the premise that neither party is correct. Minds that appear most confused by the facts find some relief in the idea that this argument allows them to introduce confusion to the minds that aren’t. Those who are confused by meaning, or intimidated by those who have a unique take on meaning, may also find some comfort in furthering the notion that life has no meaning and nothing matters. They may also enjoy informing the informed that a more complete grasp on meaning requires a firmer grasp on the totality of meaninglessness. The question I’ve always had, when encountering a mind that has embraced the “No Absolutes” philosophy, is: are they pursuing a level of intelligence I’m not capable of attaining, or are they pursuing the appearance of it?

Shame, Shame, Shame!


“You should be ashamed of yourself” is a line all of us have heard at one time or another in our lives. The words have a powerful effect, no matter how much we hate to admit it. When said with a dash of harshness –one not harsh enough to provoke rebellion– these words can break us down, make us feel foolish, bad, and ashamed. Whether we are guilty or not, they can also touch so sensitive a core that we feel like children again, being scolded by our grandmother. We don’t like feeling this way, no one does, and we all know it when we use the line on others.

Reveal to Justine Sacco, Lindsey Stone, Jonah Lehrer, and the cast of others who have recently experienced worldwide shaming, via the internet, the basic plot of the 1973 version of The Wicker Man –how it involves (spoiler alert) villagers sacrificing a man, by burning him alive, to provide for the coming harvest– and they may tell you that they would not be able to sit through such a movie. The correlation may not be perfect, but if you replace the harvest with social order, and couple it with the proverbial act of condemning someone for the purpose of advancing that social order, those regarded as sacrificial by social media may experience such a wicked case of déjà vu that they physically shudder during the final scenes of that movie.

One of the first images that comes to mind when one hears about a group sacrificing a human for the common good is this Wicker Man image of a relatively primitive culture sacrificing one of their own to appease their gods or nature. We think of people dressed like pilgrims, we think of chanting, mind control, and individuals being shamed by the shameless. We think of arcane and severe moral codes, and the extreme manner in which those cultures handled those who strayed from the collective ideal.

Members of those cultures might still stand by the idea that some of these ritualistic practices were necessary. They might concede that the whole idea of sacrificing humans for the purpose of yielding a better harvest was ill-conceived, especially if they were being grilled by a lawyer on their agricultural records, but burning people at the stake, hangings, and putting people in stocks, they might say, were punishments they reserved for the truly guilty. And these were necessary, they might argue, to keep their relatively fragile communities in line. They might argue that such over-the-top displays of punishment were necessary to burn images into the mind of what could happen to those tempted to stray from the moral path. They might suggest that because our law enforcement is so much more comprehensive nowadays, we cannot understand the omnipresent fear they had of chaos looming around the corner, and that shame and over-the-top punishments were the only measures they could conceive to keep it at bay.

We may never cede these finer points to them, in light of the punishments they exacted, but as evidenced by the cases of the individuals listed in the second paragraph, the greater need for symbolic, town hall-style shaming has not died. Our punishments may no longer involve a literal sacrifice, as they did in that bygone era, but the need to shame an emblematic figure remains for those of us who feel a call to order justifies doing whatever it takes to keep total chaos at bay.

The conundrum we experience when trying to identify with how our ancestors acted is easier to grasp when we convince ourselves that these actions were limited to the leadership of those communities. We can still identify with a suspect politician, an inept town council, and a couple of corrupt and immoral judges, but when we learn that most of the villagers involved themselves in the group’s agreed-upon extremes, we can only shake our heads in dismay.

Writers from that era, and beyond, describe the bloodlust that occurred among the spectators, in the form of shouts for someone’s head and the celebratory cries of “Huzzah!” that went up immediately after the guillotine exacted its bloody form of justice on the alleged perpetrator. How could they all cheer this on? How could so many people be so inhumane?

Some would argue that the very idea that we read history from a distance –believing that the human being has advanced so far beyond such archaic practices that it’s tough for us now to grasp their motivations– while engaging in similar, but different behaviors, is what makes the study of group thought so fascinating.

In his promotional interview with Salon.com for the book So You’ve Been Publicly Shamed, author Jon Ronson details the Twitter treatment, directed at a publicist named Justine Sacco, that he wrote about in the book. Justine Sacco took an eleven-hour plane trip to Africa. Before boarding the plane, she sent out a number of tweets to her 170 Twitter followers. Among them was a now-infamous one:

“Going to Africa.  Hope I don’t get AIDS.  Just kidding.  I’m White!”

No matter how one chooses to characterize this tweet, it’s tough to say it’s the most inflammatory tweet ever put out on Twitter. For varying reasons, millions of people latched onto this statement and took this relatively unknown tweeter from 170 followers to the number-one worldwide trend on Twitter, all while Ms. Sacco remained oblivious, in the air, en route to Africa. She received everything from death threats to wishes that she contract AIDS as retribution for her heartlessness, while varying degrees of near-lustful excitement mounted among the villagers gathering around the intangible town square, imagining the look on her face when the lowering, technological guillotine finally became apparent to her upon landing, so they could all shout “Huzzah!” in unison.

“I’m dying to go home but everybody at this bar is so into #hasjustinelandedyet. Can’t leave til it’s over,” was a tweet Mr. Ronson found to illustrate the excitement that had been building among those who couldn’t wait for Ms. Sacco to land and discover that the life she lived prior to that tweet was now over.

Shaming in the Modern Era

Before purchasing Mr. Ronson’s So You’ve Been Publicly Shamed, one might be tempted to think it little more than a detailed list of those, like Ms. Sacco, who have committed purported transgressions. That it is not is illustrated by Mr. Ronson’s decision to focus on incidents that would have been considered inconsequential were it not for the varying reactions observers had to them.

Ms. Sacco, for example, wasn’t implying that she hoped more black people would contract AIDS, or that she hoped the AIDS virus would continue to attack black people almost exclusively. One could say, reading her tweet literally, that she may have intended to speak out against the infection for being racially biased. Perhaps it is the confusion regarding whom, exactly, Ms. Sacco was condemning that led so many to fill in the blanks for their own purposes. Whatever the case, they did fill in those blanks, and the pack mentality framed that single tweet in a manner that encouraged tweeters, 24-7 news programs, and all of the other venues around the world to heap scorn and shame on her in a manner that could leave no observer with the belief that shaming is dead.

One could also guess that Ms. Sacco was attempting to provide her followers poignant humor. Her tweet was, presumably, an attempt to garner empathy for sufferers of a disease that appeared unnaturally selective, and to spearhead some form of awareness among her 170 Twitter followers, without sufficient regard for how it could be misinterpreted by those who would choose to misinterpret it for the purpose of spearheading a movement of their own. Those who responded on Twitter appeared to relish the opportunity not only to champion a cause, for greater definition among their peers, but to technologically burn whomever they had to in order to get there.

And while we can only guess that most of the offended had to know that Ms. Sacco wasn’t intentionally infringing on their ideological issue, the opportunities to prove one’s bona fides on an issue don’t come along very often, and when they do, they’re often limited to coffee shop and office water cooler conversations with two-to-four people. And those two-to-four people are often forced to soft-pedal their outrage, because they will have to work around, or otherwise be around, the target of their condemnation in the aftermath. Ms. Sacco, on the other hand, was an intangible victim that most of those in the intangible town square would never meet, so they didn’t have to worry about her feelings, and her tweet provided them the perfect venue to establish their bona fides on a worldwide stage.

“If we were in one of those Salem town squares witnessing a witch burning,” one of Ms. Sacco’s Twitter shamers might argue, “we would be shouting at the throng gathered around the witch, calling for them to be burned, and not the alleged witch. We’re not shaming with the sort of moral certitude of those people of a bygone era; we’re shaming the shamers here. It’s different!” They might also argue that their goal, in shaming the Justine Saccos of the world, is not only to redirect shame back on the shamers, but to effectively eradicate the whole practice of shaming … unless it’s directed at those who continue to shame others.

On this revised act of shaming, the Salon.com interviewer of Jon Ronson, Laura Miller, provided the following summation:

“If you are a journalist or a commentator on Twitter or even just aspiring to that role, you have to build this fortress of ideology.  You have to get it exactly right, and when you do it becomes a hammer you can use against your rivals.  If you even admit that you could have possibly been wrong, that undermines both your armor and your weapon.  It’s not just something you got mad about on social media; it’s your validity as a commentator on society that’s at stake.”

If that’s true, then no one angrily wished death and disease on Justine Sacco; they just felt a need to sound more brutal than any who had tweeted before them to establish their bona fides on the issue. They weren’t angrier than any of the previous tweeters; they were just late to the dog pile, and they felt a need to jump harder on top of it to generate as much impact as those on the bottom had with their initial hits. The idea of the target’s guilt, and the severity of her guilt, kind of got lost in all of the mayhem. Each jumper became progressively more concerned about the impact their hit would make, and how it would define them, until they felt validated by the proverbial screams of the subject at the bottom of the pile.

US

If you’ve reached a point in this conversation where you recognize the different, but similar, shame tactics employed by primitive and advanced societies, you’re probably shaking your head at both parties. In his book, however, author Jon Ronson cautions us against doing so. It’s not about them, the central theme of his book suggests; it’s about you, him, and us. In one interview, he stated that he thought of pounding that point home by simply calling the book “Us”, but he feared some might infer that he was specifically referring to the United States, or the U.S.

The subjects of shame, and the shamers who exacted their definition of justice on them, he appears to be saying, are but anecdotal evidence of the greater human need to shame. It’s endemic to the human being, to us, and while the issues may change and evolve, and the roles may reverse over time to adapt to the social mores of the day, the art of shaming remains as prevalent among the modern man as it was during a B.C. stoning.

The elephant in the room that Mr. Ronson does not discuss in his book is the idea that the viciousness the modern-day shamed person experiences may have something to do with the vacuous hole created by the attempt to eradicate shame from our culture. Our grandmothers taught us this very effective tool, as I wrote above. They used it to try to keep us on the straight and narrow, and they did it to keep us from embarrassing ourselves. When we witnessed our childhood friends engaging in the very same behavior we had been shamed into avoiding –thus displaying the fact that they hadn’t been properly shamed against such behavior– we stepped in to fill the void. We shamed them in the manner our grandmother had, using –as kids often will– the same words our grandmother had. We then felt better about ourselves in the shadow of their shame.

As adults in a modern, enlightened era, we learned that we are no longer to use the tool of shame. The lessons our grandmother taught us, we’re now told, were either half-right or so baked in puritanical, traditional lines of thought that they no longer apply. Ours is an advanced, “do what you feel” generation that prefers to believe there is no right and wrong, unless someone gets hurt. The benefit, we hope, is that if we eradicate judgment and shame from our society, we can also be liberated from them. Yet, there is a relative line in the sand where attempting to avoid judgment and shaming will eventually, and incidentally, encourage that very activity.

We all know that this activity will eventually lead to internal decay and rot for the individual, and eventually the culture, and we know that some judgment and some shaming are necessary to keep the framework intact. A super-secret part of us knows this, and the need to shame and judge gnaws at us in a manner we may never know existed until that perfect, agreed-upon transgression arises. When we finally find someone it is safe to shame, the act fills that need, and that pressurized need –hidden so far back in the recesses of our minds in a quest to acquiesce to the new ways of thinking– explodes on that person, regardless of their degree of guilt.

Those of us who have learned some of the particulars of the Salem Witch Trials believe that early on there may have been a need for greater order. The fear of chaos probably prompted the townspeople to believe some of the accused actually were witches, looking to infiltrate their youth with evil. As we all know, it eventually spiraled out of control, to the point that people began randomly accusing others of being witches over property disputes and congregational feuds. One can also guess that many accusers leveled their accusations for the purpose of attaining some form of superiority over the accused that they could not attain otherwise. Those citizens of colonial Massachusetts eventually learned their lesson from the entire episode, and some would say their lesson is one we have yet to learn, as accusations of racism and anti-patriotism are leveled at those who may have been guilty of nothing more than a poorly worded joke, or participating in an ill-advised photograph, as in the case of Lindsey Stone. Our era is different though. The lessons of the self-righteous, puritanical man do not apply today, and we don’t need to know the whole story before we make that leap to a defense of the social order that provides us the characterization we desire in the dog pile … do we?