You Are Not So Dumb


“You are what we call a processor,” my boss said in a one-on-one meeting. “You study the details of a question before you answer. It might take you more time to arrive at a conclusion, but once you do, you come up with some unique, creative thoughts. There’s nothing wrong with it. We just think differently, and when I say we,” Merri added to soften the blow, “I include myself, for I am a bit of a processor too. So, it takes one to know one.”
Merri added some personal anecdotes to elucidate her point, but the gist of her comment appeared to spring from the fact that she was a quality manager who knew I was struggling under the weight of a quick-thinking co-worker she considered a marvel. I may be speculating here, but I think Merri knew that the best way to get the most out of me was to sit me down and inform me that, in my individual manner, I was a quality employee too. That woman just called me slow, I thought as she continued. She may have dressed her analysis up with a bunch of pretty adjectives, but the gist of it was that I was a slow learner. I tried to view the comment objectively, but our sociocultural barometers list a wide array of indicators of intelligence, and foremost among them are speed and quickness. She had just informed me that I was the opposite of that, so I considered her analysis the opposite of a compliment. I also tried to come up with some compelling evidence to defeat her analysis of me. Yet, every anecdote I came up with only proved her point, so I chose to focus on how unfair it was that those of us who analyze the situations before us, to the point of over-analyzing, and at times obsessing over them, receive less recognition for the final solutions we find. We receive some praise, of course, when we develop a solution, but it pales in comparison to the praise showered on those who “Boom!” the room with a quick formulation of the facts followed by a quick solution. Even on those occasions when my superiors eventually deemed my solution the better one, I didn’t receive as much praise as the person who came up with a quick, quality one in the moment.

I don’t know how long Merri spoke, or how long I debated my response internally, but I changed my planned response seven or eight times based on what she was saying. Two things dawned on me before Merri’s silence called for a response. The first was that any complaint I had about the reactions people have to deep, analytical responses, as opposed to superficial, quick thoughts, was a complaint about human nature. The second was that any response I gave her would be a well thought out, thoroughly vetted response that would only feed into her characterization. I figured she might even respond, “And that’s exactly what I’m talking about.”

Putting those complaints about human nature aside for a moment, Merri’s characterization of my thinking pattern was spot on. It took me a while to appreciate the depth of her comment, and that probably proves her point, but she didn’t really know me well enough to make such a characterization. I think it was a guess on her part that just happened to be more right than she’ll ever know. Merri’s characterization gradually evolved my thinking about thinking, and it led me to know a little bit more about knowing than I did before my one-on-one with her. Her comment also led me to be a little more aware of how I operated. Before I sat down with her, I knew I thought differently. I went through a variety of different methods to pound facts home in my head, but I never considered the totality of what she was saying before. This was my fault for the most part, but I had never met a person who thought about the thinking process in this manner before. 
They may have dropped general platitudes on thinking, with regard to visual, auditory, and kinesthetic learning styles, but no one ever sat me down and said, “You’re not a dumb guy, you just need to learn how you think.” Merri’s commentary on my thinking process was an epiphany in this regard, for it led to a greater awareness about my sense of awareness, or what psychologists call metacognition. The first level of knowledge occurs when we receive information, the second regards how we process it in a manner that reaches beyond memorization to application, and the third might be achieving a level of awareness of how we do all of the above. When she opened my mind’s eye to the concept of processing speeds, I began to see commentary on it everywhere. I witnessed some characterize it as ‘deep thinking’. This might be true in a general sense, but I am inclined to view this as a self-serving term. Slow processors have endured so much abuse over the years that we consider this re-characterization a subtle form of revenge against those who have called us slow. When a person informed me that I might be a deep thinker, I loved it so much that I wanted to repeat it, but I cringed every time I felt the urge, because I think we should leave such characterizations to others. There is an element of truth to it, however, and it arrives soon after a processor begins to believe he’s incompetent, slow, or dumb. Most reflective processors are former dumb people. Intelligent people may disagree, but if most theories are autobiographical, then we must factor my intelligence into the equation.

My autobiographical theory goes something like this. I spent my schooling years trying to achieve the perception of a quick thinker, and I failed miserably. When the teacher asked a question, I would raise my hand. My answers were wrong so often that a fellow student said, “Why do you keep raising your hand? You’re always wrong.” I would also endure groans, ridicule, and embarrassment for incorrect answers in other classes, until I was so intimidated that I didn’t answer questions anymore. The byproduct of this was that I began considering my answers to the questions more often, until it achieved a cumulative effect on my thinking processes. Before Merri provided my thought process a much-needed title, I assumed I didn’t know enough to know enough. I took this perspective into everyday situations. I didn’t just consider other, more knowledgeable perspectives to resolve my dilemmas; I relied on them for answers. The cumulative effect of this approach led me to begin processing information more and more often, until I gathered enough information to achieve some level of knowledge on a given subject.

In my search to find intellectuals who could conceptualize this notion in different ways, I discovered the term ‘down the stairs’ thinking. If a ‘down the stairs’ thinker attends a corporate meeting in which an idea, or concept, is introduced, the supervisor will conclude that meeting by asking if anyone has any questions or input they would like to add. The processor says nothing, because he can’t think of anything while in the moment. The meeting ends, and he walks back to his desk (down the proverbial stairs), when an idea hits him. I write that specific timeline to stay true to the analogy, but my ideas unfortunately do not occur that quickly. I often have to chew on the problem at hand for far too long, and the cliché ‘let me sleep on it’ definitely applies to my thinking type. 
This dilemma might lead one to ask, if an idea is good enough, who cares when it hits as long as it hits? The processor who wants the perception of being quick cares. He wants others to marvel at his intellect in the moment. The seeds of frustration and confusion are borne here, until someone comes along and clarifies the matter for us. A college professor once praised a take-home, assigned essay I wrote on some required reading. She claimed that the ideas I expressed in that essay were “unique and insightful,” and she wrote that she wanted me to participate more in in-class discussions, because she thought I could add something to them. My wrong answers in high school and the resultant teasing all but beat class participation out of me, but I wanted to live up to her compliments. I did try to participate more often in that college class the next day, but the experience only reiterated why I shouldn’t be answering questions in class. I was so wrong so often that she gave me a worried look. The final in this class involved an in-class essay on another book. The teacher watched me in the manner a shop owner might watch a suspected shoplifter. I think she suspected that I cheated on the take-home essay, and she wanted to see if I could provide an equal performance on an in-class essay. I received the same grade on that final, and many of the same comments followed that grade. She and I both walked away from that experience with the knowledge that no matter how hard one tries to promote it, or affect it, we all think differently. There are quick-thinking, reactive brains that can process information quickly and instinctively produce an answer, in the manner a knee pops up when a doctor hits it with one of those rubber hammers. Others require some slow roasting, and while it may be embarrassing and frustrating for those who can’t come up with a quick answer, once they learn how they learn, think about how they think, and become more comfortable with the way in which they operate, it can liberate them from the idea that they’re as dumb as they once feared.

The theme of David McRaney’s You Are Not So Smart was obviously that we are not as smart as we think we are. The various essays in that book describe why we do the things we do, and how various psychological mechanisms condition us to do the things we do. I loved that book so much that I’ve written probably thirty of my own articles on the theme. This particular article is the antithesis of that book, and its purpose is to provide some relief for the confusion and frustration some have regarding their thinking style. If the information in this article spares one person from the decades of frustration I experienced in this regard, I might even consider it the best article I’ve ever written. I would do so without ego, for I am merely passing information along. If the reader identifies with the characterizations we’ve outlined here, I do have one note of caution: You may never rid yourself of the notion that you’re less intelligent than the firecracker over there in the corner, but if you can come to grips with the manner in which you think, process information, and arrive at an answer, without all of the frustration you experience when everyone else is shouting answers out, I think you might be able to achieve some surprising results. 
You might never reach a point of bragging about it, for I don’t know how anyone would, but attaining knowledge of self can go a long way toward understanding how we operate, and it’s our job to take such information and use it accordingly. 

Scat Mask Replica III


1) The Rasputin Paradox. Are you involved in an enterprise in which one person’s alleged ineptitude is holding you back from realizing the vast potential of that enterprise? Is your enterprise one step away from removing that alleged ineptitude? Those who know the history of the Russian Empire know to be careful what they wish for. Some speculate that Grigori Yefimovich Rasputin had far less influence in the Russian Empire (circa WWI) than history details, and they double down on that by saying that the Romanovs would not refute what others said about the levels of Rasputin’s influence, because they enjoyed having Rasputin play the role of the scapegoat. If they did not know the level of blame others placed on Rasputin while he was alive, they definitely found out after his death, because once Rasputin was murdered, the focal point for the Empire’s ineptitude was gone. Those in politics, business, and personal crisis should note that casting blame on one particular person for the failure of your enterprise might prove cathartic in the short term, but once that person’s gone, it might reveal more about the general ineptitude of that enterprise than any of the other players ever imagined.

2) “If you have facts on your side, pound the facts. If you have the law on your side, pound the law. If you don’t have either, pound the table.” One of the more uncomfortable situations I’ve experienced involved someone pleading with me to accept them as a genuine person. It’s a gross oversimplification to suggest that anyone who pounds the proverbial table to convince me of something is lying, but experience informs me that the more someone pounds the table, the more insecure they are about the information they’re trying to pound into my head. We’re all insecure about our presentations, and some of us pound the table even when we have the facts on our side. I know it’s easy to say, but those with facts on their side should relax and allow them to roll out as they may. The truth teller who finds it difficult to avoid pleading their case should also know that after we reveal enough supportive evidence, most will believe us, but some just enjoy watching us squirm.

3) Speaking of the genuine article, it has recently come to my attention that some pathetic soul stole at least two of the articles from this site. Some call this plagiarism, but I call it pathetic. If imitation is the sincerest form of flattery, I suppose I should consider it a compliment, but this is outright theft. It seems redundant to me to clarify the rules on this matter, but if a writer is going to “repost,” they are required to provide attribution. (For those unclear on the definition of this term, it means that a writer is supposed to inform their audience that they didn’t write the article.) Not only did this pathetic soul avoid attributing the article to me, but they also didn’t provide proper attribution for the quotes I used in the article they stole. So, this person (who provides no discernible path back to their identity) anonymously steals posts, presumably to receive checks from companies that pay writers to sport ads on their site. I don’t care how much those sponsored ads pay; how does this person sleep at night knowing that the profession or hobby they chose is one in which they cannot produce their own quality material? If I were ever to sink to such a desperate act, I would seek another profession or hobby. 

4) The difference between selfishness and self-awareness. A complaint about young men and women is that they’re too selfish. It’s the root of the problem, they suggest. I don’t know if it’s true, but if it is, I would suggest that those speaking out against it are delivering an incomplete message. My platform would suggest that these selfish types channel their focus on themselves into self-awareness, and that they should seek it to achieve a level of fulfillment. We could view striving to achieve greater self-awareness as a selfish pursuit, but self-awareness can take several forms. Performing selfless acts, for example, can teach a person a lot about themselves, and it should be encouraged, as people who perform many selfless acts can become more aware of themselves and more selfless. The process could lead to the opposite of the vicious cycle these complainers decry. If I had a pulpit, I would also declare that an individual could learn more about themselves through spirituality. I’ve been on both sides of the value of scripture, and I think this gives me greater perspective on the matter. I look at scripture and other Biblical teachings as a roadmap to personal happiness through reflection. Self-interest drives me to follow those teachings, because I believe it’s in my best interests to follow them. In short, I would play my sermon to the selfish predilections of the young. I hear sermons that suggest otherwise, and I can’t help but think that the priest is missing a beat.

5) As a former service industry employee, I’ve encountered my share of disgruntled customers. I could provide a list of examples, but the substance of their complaints is irrelevant. Most experienced service industry employees know that the most disgruntled customers are the most disgruntled people. They might hate their kids, their spouse, and their life. Whatever the case is, the discrepancy they find causes them to unload: “What kind of Mickey Mouse operation are you running here? Your ad says this item is on sale today for two bucks. If you think I’m going to pay more than that, you must think I’m stupid! Or, are you singling me out based on my characteristics?” These statements are often a mere introduction to a heated exchange that reveals the effort of the disgruntled customer to achieve some satisfaction they can’t find elsewhere in life. A more confident customer would simply say, “Your ad says that this item is on sale today for two dollars.” Those of us who have experience in the service industry know how intimidating a confident presentation of the facts can be, especially from a more secure individual.

6) A new documentary captures an ant crawling down from a piece of cheesecake with a piece of it lodged in its mandibles. The makers of this documentary capture the ant’s progress in stop-action photography, which permits running commentary from various filmmakers on the brilliance of each segment. Where does the ant go, and what will it do with the small, round ball of cheesecake? This is the plotline of an amazing new documentary called Posterula. (Spoiler alert) The ant makes it off the plate, but the viewers never learn if the ant takes the piece back to the colony to feed the queen. This leads this viewer to believe that an as-yet-undisclosed sequel to this brilliant documentary is in the works.

(Hi, I’m Rilaly, and if I were to take you on a tour of my young mind, this would be but an example of what you would read. Some suggest that such humor is too niche, and if that’s the case, I would’ve niched my way out of the market. If I had one of my stories published, customers at bookstores would’ve walked past my serious pieces, thinking that I’m nuts, too far gone, and unserious. They probably still think that. I’m niche.)

7) I landed upon the term “vague and flexible by design” the other day. The author of the term intended it as a compliment for the subject, but if they directed such a characterization at me, I would view it as an insult. I understand that we’re different people in different surroundings, and that we should all remain flexible with our ideals to prepare for new findings on the subject in question, but the “vague and flexible by design” compliment would register as a ‘no core’ insult to me.

8) What hotel, or meeting space, first decided to serve a ball of meat as a solitary entrée? Someone somewhere should’ve stepped in and said, “Whoops, you forgot the fixins.” Those who have attended more than twenty corporate galas, weddings, or other catered events are now more than accustomed to the items served in a buffet line. I now eat before I attend one of these functions, because I cannot eat another pinwheel, I’m burnt out on hot wings, and I hit my personal threshold on room-temperature potatoes au gratin somewhere around 2004. I am not a finicky eater, but I can no longer stomach this list of dietary choices. I will acknowledge that being American provides me the luxury of making odd and unreasonable dietary choices, but if I’m going to limit myself to one meal a day to maintain a plump figure, as opposed to a fat or obese one, I’m not going to eat something just because others provide it in a visually pleasing manner.

9) There is a difference between writing a grammatically correct sentence and quality writing. I took college classes on creative writing, I’ve read the MLA guides, and I’ve learned through word-of-mouth what leads to quality reading. I’ve fixed the passive-voice sentences, deleted the word “had” as often as possible, and I’ve tried to avoid what fellow writers call “the you-yous”. The goal for the writer is to adhere to the rules of writing while attempting to maintain a stream-of-consciousness style that makes for quality reading. It’s not considered grammatically incorrect to write that you may not enjoy this sentence, but writing that the reader may enjoy it, without the word you, is considered a more pleasant reading experience. I’ve also attempted to write “who” instead of “that”, and I’ve attempted to limit my tendency to use “that” too often. Example: “You don’t want to write that it was someone else that said something, when who said it is much more familiar to you.” In that sentence, fellow writers suggest using the word “Writers” to replace the first you, and “Readers” is an advisable replacement for the second you. Beta readers suggest that doing otherwise means the writer has a bad case of the you-yous. You is too familiar to you, and that is too unfamiliar, and you do not want to be too familiar or too unfamiliar. The first reason for following this rule is that the writer does not want to write in the manner they speak, because the way one speaks in one locale may not be as familiar to a reader in another locale. These standards set a common base for readers, free from colloquialisms. The you-yous also creep up on a writer in free flow, and they may not notice how redundant the use of the word is in their document. The question that haunts me is, do I want a perfect document to impress accomplished writers, or do I want to please myself with a document that might have some flaws? The notion one writer lofted was that every writer makes mistakes, and we readers weave them into the cloth of our expectations, but is there a point where the mistakes distract from the whole?

10) “He’s such an idiot,” Teri said after her boyfriend left the party table to go to the bathroom. “He cheats on me all the time. For all I know, he’s arranged something in the bathroom. I’m serious. I can’t even trust him to go to the bathroom.” Such comments are so unexpected that they’re hilarious.

“Why the hell are you dating him then?” I asked. Room-silencing, impulsive comments like these are my gift to the world. I can flatten the smile of any decent person from fifty yards with a single thought implanted in their brain.

The comment sat right with me when I said it, but the moment after I delivered it, I realized it was so loaded with complications that no one in their right mind would deliver it to a table of people gathered together for the sole purpose of mixing some laughter in with their fun. I thought it might add to the fun, or spur her into extensions on the joke, but I was wrong. I made her uncomfortable.

As soon as she recovered from the blow, aided by my discomfort, she revealed that she had locked herself into a certain, cynical dynamic of life. She knew the world was full of it, and everyone around her was too, in one way or another, because she knew she was. She thought her beau was full of it too, but “He’s a nice guy…most of the time.” I didn’t know if that was her final answer, but I overemphasized my acknowledgement of it to suggest that it was what I sought.

No matter how often I affirmed her answers, Teri kept coming at me with more of them. She said he was “Funny and fun to be around.” She said he was good looking, and she said he did “Sweet things for her.” I couldn’t get out of this uncomfortable spiral of my own making. I pretended to be interested, because I knew I had put her in the uncomfortable position of having to explain one of life’s most illustrative choices, but I was trying to end the episode with every word she said to me.

Most of us cannot explain our life-altering choices so well that we can weather interrogations. I knew this, but I thought I could explain most of my choices at the time. The question that even the most reflective must ask themselves is, is our base so solid that we make rational, informed choices in impulsive moments? I don’t think many reflective types would pass their own interrogations in the moment, for I think we color in the blanks later to make ourselves believe we made informed choices.

Teri told me he was a good man, with a good job, and he had an unusual curiosity about life that she found fascinating. I also learned that while it was obvious he had a restless, nervous energy about him, “He’s incredibly lazy. If he had his choice, he would spend his day on a couch.”

I still didn’t understand the dynamics of their relationship, even though she provided me numerous answers. I wouldn’t understand it for a while. I had no idea at the time that their relationship depended on her playing the jealous girl, because, I can only assume, she considered him worthy of her jealousy, and in a world of average men with no discernible qualities, that is something. He was the naughty boy, and he enjoyed that role. “We fight like cats and dogs,” she said with a gleam in her eye, “but then we have makeup sex.” I wondered if she had dated guys who wouldn’t cheat on her. I wondered if they wouldn’t fight with her. I wondered if they bored her. He provided her something to focus on other than herself. He was the dunce, but he was an amiable dunce. He provided her drama. He was always on the cusp of cheating on her. She also had a desire to date a guy she could be better than, and she wasn’t much. Either that, or there was a desire to care for something that could break. “He’s an idiot, he doesn’t know how good he has it,” she said more than twice. The guy was fulfilling the age-old male need of feeling like a bad boy. Most guys need this coursing through their veins, and some girls apparently need a guy like this too.

11) Unhappy couples fascinate me. They don’t smile often, but smiles are a refuge of the simple-minded. They don’t hug, kiss, or touch very often, but they’re not that type of people. They’re emotionally distant people, and happy people make them sick. Do they have a greater understanding of who they are than we ever will, or are they jealous? She didn’t date in high school, and he was a broken boy. Death of a loved one breaks some, divorce breaks others, and still others experience a seismic betrayal that creates an irreparable break. Yet, they found something in one another that they always wanted. As outsiders looking in, we can’t understand the allure, but the two of them stay together for years. Some stay in a job they hate, because they fear the unknown. Do people stay in relationships for the same reason? He doesn’t speak often, and relatives find it difficult to strike up a conversation with him. He gives off the vibe that he’s not interested in what others have to say, and this affects the way others react to him.

My initial instinct was that he wasn’t interested in what I had to say, for reasons endemic to our relationship, until others informed me they had shared similar experiences with him. He’s more interesting when he drinks, but when the night is over, the participants realize he wasn’t really interesting in the truest sense of the word, just more interesting than they expected him to be. A couple of drinks loosen our inhibitions. A couple more might loosen them even more, until the potential exists for us to become interesting. That’s the mindset of the drinker anyway. I’m not sure if it’s his mindset, but he does have a drinking problem. He is emotionally distant, because those who formed him devastated him emotionally. Yet, in many ways he appears satisfied with who he is.

12) No one is as boring as we think they are, but we’re not as interesting as we think we are either. How many of us look back on our authentic years with the belief that we weren’t nearly as authentic as we thought we were, especially compared to the level of authenticity we’ve currently achieved? How many of us will look back ten years from now with the same thought? One could say that the level of effort put into being authentic provides a corresponding level of diminishing returns.

13) How many of us remember the first person who told us about America’s atrocities? Did they package it with a provocative statement such as, “This is something your moms and dads don’t want you to know about”? For those of us who are now parents, it’s probably been so long since someone introduced us to the dark side that we forget how intoxicating it was at the time. I don’t remember my first messenger, because I’ve heard about these atrocities so many times since that the repetition has all but drowned out that first telling. Thanks to a myriad of resources I’ve encountered since, I am now able to frame those atrocities with the virtuous acts America has performed throughout her history, and arrive at the independent conclusion that America has been a noble nation overall. It did take me a while, however, to arrive at that conclusion.

Some might think that learning of the atrocities for the first time might leave the recipient feeling cold, disillusioned, and/or depressed that their parents sold them a pack of lies. In the combative environment of my youth, one of the many focal points of ridicule was naïveté. “Don’t tell me you believed all that baseball and apple pie crap?” someone would say in the aftermath of a discussion on America’s atrocities. I did, and those early messengers in my life provided me information to combat the characterization that I was naïve. I considered them more informed, brave, and righteous. I thought they were cooler than cool for speaking out against the marketing arm of America, and I thought they were treating me with the type of respect that my dad never did.

Now that I’m a seasoned adult, I know my dad wasn’t necessarily lying to me, and he wasn’t withholding a truth, but he didn’t give me the whole picture either. He didn’t know about some of the atrocities these messengers told me, but there were incidents he did know about, and he neglected to tell me about them. Anyone who remembers their teenage mind knows how much we exaggerate the characterizations of our parents, especially when “truth tellers” package such information accordingly. Their presentations excited me in a way that’s tough to describe. I thought I was finally hearing the truth from someone.

A vital mindset for parents to have, while sharing our knowledge of American history, is that our children are in a constant battle with their peers to avoid appearing naïve. For those worried about telling their children about the awful things the country has done, consider such information ammunition for them, a way to combat these stories with stories of the country’s virtues. Our goal should be to instill a love of country in a comprehensive manner. To a certain point, we parents have told them what to think and how to think for so long that we may have a difficult time giving up those reins. On this particular subject, however, we need to present this information in a manner that allows them to decide, and we might even add that we understand it’s a lot to take in one sitting, so we should allow them to think about it.

If we don’t do this, the truth will rear its ugly head when we least expect it. Those who provide them this information will likely not frame it in the manner we think they should, and our kids might turn around and accuse us of lying, telling half-truths, and not trusting them enough to deal with such sensitive information. Whatever the case is, we might never be able to win them back. My advice is that we teach them the virtues of this country and couple those lessons with a healthy dose of the horrors some Americans have committed since the country’s birth. Do some research on the atrocities and prepare for the follow-up questions, because there will be questions. Once we’re done, we should repeat the cycle so often that by the time that cool, rebellious person tells our children “the things we don’t want them to hear,” they will turn on that person and say, “I’ve heard all of this a million times, and to tell you the truth, I’m sick of hearing about it.” If condemning your country in such a manner is difficult, much less teaching it to your child, ask yourself how you would prefer America’s atrocities be framed. Would you rather provide your child with a more comprehensive narrative, or would you rather someone who hates their country do it for you? One way or another, your child will learn this information.

14) I’m about 15 years into using devices to stream music on a daily basis at this point in my life, so it might seem a little odd to show appreciation now. Anytime I take a very short drive, turn on my local FM stations, and hear a DJ offer tidbits from their life, I gain greater appreciation for the freedom technology has offered. I’m not talking about morning show hosts, as I think I listened to one morning show decades ago, just to hear what everyone was talking about, and I never listened to another one. When a DJ informs me about a day in their life, I switch the channel so hard my fingers hurt later. I don’t care about the private lives of celebrities, but I understand that some do. No one knows who these DJs are, and I think even fewer care. Yet, when they are on the clock, moving from one song to another, they tell us about their day. They tell us about a party they attended, a soup they enjoyed yesterday, and something their significant other said to them in the movie theater. Nobody cares! The only line we should hear from a radio DJ is, “That was one song, and here’s another.”

15) Most people have heard the quote, “The definition of insanity is doing the same thing over and over and expecting a different result.” The quote is widely attributed to Albert Einstein. Most people know it, but they only apply it to innovative qualities that appeal to them and their relative definitions of the status quo. When an innovator sticks their nose out and tries to revamp other things that might not fit within the established definition of change, they receive nothing but scorn and ridicule. “Do you know the quote?” we ask.

“Yes,” they reply, “but it doesn’t apply here. This proposed new way of doing things just isn’t the way we do things.” Okay, but the way we do things hasn’t worked for decades now. The counterargument is that we’re on the cusp of it working, and they provide some details of that progress. Those details are often talking points, and they don’t detail, in any meaningful way, actual progress. They then conclude that this new person, with all of their new ways of thinking, might damage all of the progress we’ve made. Again, we’ve been on the cusp of their way working for decades, and it hasn’t worked. Why shouldn’t we try a new way? Because that isn’t how things are done?

The thing that bothers me is that we’ve been lopping off innovative noses for decades, and it leads me to believe that many innovators have shied away from the spotlight, because they like their noses, and future innovators will be just as shy. We might even recognize some of the merits of a proposed solution, but we will defer to the better minds and continue to do things as they’ve always been done, because that’s the way we’ve always done it.

Leonardo’s Lips and Lines


My takeaway from Walter Isaacson’s Leonardo da Vinci biography is that hypervigilance is not a switch an artist turns on to create. Artistic creations are often a display of one’s genuine curiosity about the world, a culmination of obsessive research into the minuscule details that others missed, and a portal through which the artist can reveal their findings. Did Leonardo da Vinci’s obsessions drive him to be an artist, or did he become obsessed with the small details of life to become a better artist?

Da Vinci might have started obsessively studying water, rock formations, and other natural elements to inform his art, but he became so obsessed with his initial findings that he pursued them for reasons beyond art. He pursued them, the author states, for the sake of knowledge.

I don’t think I’ve ever read a book that captures an artist’s process as well as this one does. The thesis of the book is that da Vinci’s artistic creations were not merely the work of a gifted artist, but of an obsessive genius homing in on scientific discoveries to inform the minutiae of his process. Some reviews argue that this bio focuses too much on the minutiae of da Vinci’s work, but after reading the book, I don’t see how an author could capture the essence of what da Vinci accomplished without focusing on his obsessions, as focusing and obsessing on the finer details separated him from all of the brilliant artists who followed.

Some have alluded to the idea that da Vinci just happened to capture Lisa Gherardini, or Lisa del Giocondo, in the perfect smile for his famous painting The Mona Lisa. The inference is that da Vinci asked her to do a number of poses, and that his gift was merely in working with Lisa to find that perfect pose and then capture it, in the manner a photographer might. Such theories, Isaacson illustrates, shortchange the greatest work of one of history’s greatest artists.

Isaacson also discounts the idea that da Vinci’s finished products were the result of a divine gift, and I agree in the sense that suggesting his work was a result of a gift discounts everything da Vinci did to inform his work. There were other artists with similar gifts in da Vinci’s time, and there have been many more since, yet da Vinci’s work maintains a rarified level of distinction in the art world.

As an example of Leonardo’s obsessiveness, he dissected cadavers to understand the musculature involved in producing a smile. Isaacson provides exhaustive details of Leonardo’s work, but writing about such endeavors cannot properly capture how tedious this research must have been. Writing that da Vinci spent years exploring cadavers to discover all the ways the brain and spine work in conjunction to produce expression, for example, cannot capture the trials and errors da Vinci must have experienced before finding the subtle muscular formations that produce the famous, ambiguous smile, the deliberate effect he was trying to achieve. (Isaacson’s description of all the variables that informed da Vinci’s process regarding The Mona Lisa’s ambiguous smile, a smile historians suggest da Vinci used more than once, is the best paragraph in the book.) One can only guess that da Vinci spent most of his time searching for these artistic truths alone, and that even his most loyal assistants pleaded that he not put them on the insanely tedious lip detail.

Isaacson also goes to great lengths to reveal Leonardo’s study of light and shadow, in the sfumato technique, to provide the subjects of his paintings greater dimension and realistic, penetrating eyes. Da Vinci then spent years, sometimes decades, making changes to his “incomplete projects”. Witnesses say that he could spend hours looking at an incomplete project only to add one little dab of paint.

The idea of a gift implies that all an artist has to do is apply their gift to whatever canvas stands before them, and that they should do it as often as possible to pay homage to that gift until they achieve a satisfactory result. As Isaacson details, this doesn’t explain what separates da Vinci from other similarly gifted artists in history. The da Vinci works we admire to this day were but a showcase of his ability, his obsessive research on matters similarly gifted artists might consider inconsequential, and the application of the knowledge he attained from that research.

Why, for example, would one spend months, years, and decades studying the flow of water, and its connections to the flow of blood in the heart? The nature of da Vinci’s obsessive qualities belies the idea that he did it for the sole purpose of fetching a better price for his art. He also, as the author points out, turned down more commissions than he accepted. Couple this with the fact that while he might have started an artistic creation on a commissioned basis, he often did not give the finished product to the one paying for it. With some of his works, da Vinci hesitated to do this because he didn’t consider them finished, complete, or perfect. As anyone who understands the artistic process knows, deciding that a work has reached a point where it cannot be improved upon is often more difficult for the artist than starting one. Some might suggest that achieving historical recognition drove him, but da Vinci had no problem achieving recognition in his lifetime, as most connoisseurs of art considered him one of the best painters of his era. We also know that da Vinci published little of what would’ve been revolutionary discoveries in his time, and he carried most of his artwork with him for most of his life, perfecting it, as opposed to selling it, or seeking more fame with it.

After reading all that informed da Vinci’s process, coupled with the appreciation we have for the finished product, I believe we can now officially replace the meme that uses the Sgt. Pepper’s Lonely Hearts Club Band album to describe an artist’s artistic peak with The Mona Lisa.

Historical Inevitability


The idea that history is cyclical has been put forth by many historians, philosophers, and fiction writers, but one Italian philosopher, Giovanni Battista Vico (1668-1744), wrote that a fall is also an historical inevitability. In his book La Scienza Nuova, Vico suggested that evidence of this can be found by reading history from the vantage point of the cyclical process of rise-fall-rise, or fall-rise-fall, recurrences, as opposed to studying it in a straight line, dictated by the years in which events occurred. By studying history in this manner, Vico suggested, the perspective of one’s sense of modernity is removed and these cycles of historical inevitability are revealed.

To those of us who have been privy to the lofty altitude of the information age, this notion seems implausible to the point of impossible. If we are willing to concede the probability of a fall, as it portends a certain historical inevitability, we should only do so in a manner that suggests that if there were a fall, it would be defined relative to the baseline of our modern advancements. To these people, an asterisk may be necessary in any discussion of cultures rising and falling in historical cycles. This asterisk would require a footnote suggesting that all eras have had creators lining the top of their era’s hierarchy, and those who feed upon their creations at the bottom. The headline-grabbing accomplishments of these creators might then define an era, in an historical sense, to suggest that the people of that era were advancing, but were the bottom feeders advancing on parallel lines? Or did the creators’ accomplishments, in some way, inhibit their advancement?

“(Chuck Klosterman) suggests that the internet is fundamentally altering the way we intellectually interact with the past because it merges the past and present into one collective intelligence, and that it’s amplifying our confidence in our beliefs by (a) making it seem like we’ve always believed what we believe and (b) giving us an endless supply of evidence in support of whatever we believe. Chuck Klosterman suggests that since we can always find information to prove our points, we lack the humility necessary to prudently assess the world around us. And with technological advances increasing the rate of change, the future will arrive much faster, making the questions he poses more relevant.” –Will Sullivan on Chuck Klosterman

My initial interpretation of this quote was that it sounded like a bunch of gobbledygook, until I reread it and plugged the changes of the day into it. The person who works for a small, upstart company pays acute attention to their inbox, for the procedures and methods of operation change by the day. Those who have worked for a larger company, on the other hand, know that change is a long, slow, and often grueling process. It’s the difference between changing the direction of a kayak versus a battleship. 

When the transformational changes we have experienced in the last ten years could be said to fill a battleship, yet they occur with the rapidity of a kayak’s change of direction, how do we adapt to them at such a breakneck pace? Those who are 40 years old and older often react slowly to change, particularly technological change, but teens and early twenty-somethings are quicker and more eager to adapt and incorporate the latest and greatest advancements, regardless of the unforeseen, and unintended, consequences.

Had the rapid course of change over the last 10 years occurred over 100 years, it would’ve characterized that century as one of rapid change. Is it possible for us to change as quickly and fundamentally, or is there some methodical lag time that we all factor in?

If we change our minds on an issue as quickly as Klosterman suggests, with the aid of our new information resources, are we prudently assessing these changes in a manner that allows us to examine and process unforeseen and unintended consequences before making a change? How does rapid adaptation to technological change affect human nature? Does it change as quickly, and does human nature change as a matter of course, or does human nature require a more methodical hand?

These rapid changes, and our adaptation to them, remind me of the catchphrase mentality. When one hears a particularly catchy, or funny, catchphrase, they begin repeating it. When another asks that person where they first heard the catchphrase, the person, who now uses it so often that it has become routine, says they don’t remember where they heard it. Even if they began using it less than a month ago, they believe they’ve always been saying it. They subconsciously adapted to it and altered their memory in a way that suits them.

Another way of interpreting this quote is that with all of this information at our fingertips, the immediate information we receive on a topic, in our internet searches, loses value. One could say as much of any research, but in the past such research required greater effort on the part of the curious. For today’s consumer of knowledge, just about every piece of information we can imagine is at our fingertips. 

Who is widely considered the primary writer of the Constitution, for example? A simple Google search will produce a name: James Madison. Who was James Madison, and what were his influences in regard to the document called The Constitution? What was the primary purpose of this finely crafted document that assisted in providing Americans near-unprecedented freedom from government tyranny, and rights that were nearly unprecedented when coupled with the amendments in the Bill of Rights? How much blood and treasure was spent to pave the way for the creation of this document, and how many voices were instrumental in the Convention that crafted and created this influential document?

Being able to punch these questions into a smartphone and receive the names of those involved can give them a static quality. The names James Madison, Gouverneur Morris, Alexander Hamilton, and all of the other delegates of the Constitutional Convention who shaped, crafted, and created this document could become nothing more than answers to a Google search. Over time, and through repeated searches, a Google searcher could accidentally begin to assign a certain historical inevitability to the accomplishments of these otherwise disembodied answers. The notion being that if these answers weren’t the correct ones, others would’ve been.

Removing my personal opinion of Madison, Morris, Hamilton, and those at the Constitutional Convention who composed the document, for just a moment, the question has to be asked: could the creation of Americans’ rights and liberties have occurred at any time, with any men or women, in the history of our Republic? The only answer, as I see it, involves another question: How many politicians in the history of the world have voted to limit the power they wield, and any future power they might attain through future endeavors? How many current politicians, for example, are likely to vote for their own term limits? Only politicians who have spent half their life under what they considered tyrannical rule would fashion a document that could result in their own limitations.

How many great historical achievements, and people, have been lost to this idea of historical inevitability? Was it an historical inevitability that America would gain her freedom from Britain? Was the idea that most first world people would have the right to speak out against their government, vote, and thus have some degree of self-governance inevitable? How many of the freedoms, opportunities, and other aspects of American exceptionalism crafted in the founding documents are now viewed as so inevitable that someone, somewhere would’ve come along and figured out how to make that possible? Furthermore, if one views such actions as inevitable, how much value do they attach to the ideas, and ideals, created by them? If the answers to these questions attain a certain static inevitability, how susceptible are they to condemnation? If an internet searcher has a loose grasp of the comprehensive nature of what these men did, and the import of these ideas on the current era, will it become an historical inevitability that they’re taken away in a manner that might initiate philosopher Vico’s theory on the cyclical inevitability of a fall?

I’ve heard it theorized that for every 600,000 people born, one will be a transcendent genius. I heard this quote secondhand, and the person who said it attributed it to Voltaire, but I’ve never been able to properly source it. The quote does provide a provocative idea, however, that I interpret to mean that the difference between one who achieves the stature of genius on a standardized test, or Intelligence Quotient (IQ) test, and the transcendent genius lies in this area of application. We’ve all met extremely intelligent people in the course of our lives, in other words, and some of us have met others who qualify as geniuses, but how many of them figured out a way to apply that abundant intelligence in a productive manner? This, I believe, is the difference between the 1 in 57 ratio that some have asserted is the genius ratio and the 1 in 600,000 born. The implicit suggestion of this idea is that every dilemma, or tragedy, is waiting for a transcendent genius to come along and fix it. These are all theories, of course, but they do raise the question of what happens to the other 599,999 who feed off the ingenious creations and thoughts of transcendent geniuses for too long. They also raise another question: if the Italian philosopher Vico’s theories on the cyclical nature of history hold true, and modern man is susceptible to a great fall, will there be a transcendent genius able to fix the dilemmas and tragedies that await the victims of the next great fall? 

Why Adults Hate Their Parents


‘I am so glad I don’t have to go through all that anymore,’ I think when I hear an adult say they still hate their parents. When they say it with such animosity and rage, I remember the raging insecurity and confusion that drove me to say such things, and I’m happy to be past all that. When I hear someone say that their parents are bumbling fools, idiots, or backwater hicks from the 1950s, I remember saying such things, and I regret some of it. As has been said of regrets, there is little we can do about them now. Yet, I have also heard others say that the struggle to correct past errors defines us.

The question I would love to ask of those adults who continue to hate the ‘absolute morons’ that happen to be their parents is, “Why is it so important to you that they still be wrong?”

“I’m smarter than my dad,” writes a twenty-something blogger. “I really wish I wasn’t. It’s like finding out Santa isn’t real.” 

That isn’t an exact quote, but it is a decent summary of her snarky blog. The blogger goes on to rap about how intelligence and cultural sensitivity are a cross she must now bear in her discussions with her parents. She never states that she hates her parents. She states that she, in fact, loves them a great deal, but she characterizes that definition of love with an element of pity, bordering on condescension, that appears to be endemic in twenty-somethings.

Some carry this teenage hatred well into their twenties. The teen years are a period of cultivation, containing rebellion, learning, etc., that occurs before our minds fully form. As we age, our mind matures, and so does our rebellion, until it manifests as either full-fledged hatred or a condescending pity that recognizes their backwater modes of thought for what they are. This matured rebellion is also based on the fact that our parents still have some authority over us, which reminds us of those days when our parents had total authority over us, and how they “abused it to proselytize their closed-minded beliefs on us.”

When we finally reach a point when they’re no longer helping us pay for tuition, a car, or rent, and we’re able to flex independent muscles, we spend the next couple of years fortifying this notion that they were wrong, all wrong, all along.

By the time we progress to our thirties, circumstances reveal to us some of the logic and wisdom our parents attempted to pass down to us, and the idea that it does apply in some circumstances. (Some will never admit this. Some remain stuck in a peak of rebellion.) Their advice may not have applied in all circumstances, of course, but it applied in so many that the prominent bumbling fool banner came down. Then, when we reach our forties, we begin to think that they’re idiots all over again.

I wrote the last line to complete a joke I read. It’s a funny line, because there is an element of truth in it, but in my experience the truth lies somewhere in the middle. The truth is a hybrid of the lifelong recognition we have of our parents’ failings combined with the points we begrudgingly give them on some matters. We also respect them in a manner we never did as kids, because we now have our own kids, and we view them as fellow parents who tried to lead us down a path most conducive to happiness and success in life.

This specific timeline may not apply to everyone, as we all go through these stages on our own time. The word hate may be too stark a term for the adults still experiencing some animosity toward their parents, but anyone who has taken the ride knows that the peaks and valleys can be one hell of an emotional roller coaster.

Theory formed the foundation of much of my uninformed rebellion, and real-world circumstances revealed to me that some of the archaic and antiquated advice my dad offered me had some merit. These circumstances, as I said, included having my own child and my own attempts to protect the sanctity of his childhood, in the same manner my dad attempted to protect mine. As evidence of this, I once thought my dad committed some errors in raising me by sheltering me too much, until some know-it-all said that meant my dad did his job. “How so?” I asked. I was all ready to launch into a self-righteous screed about how he knew nothing about my childhood, until he said, “By allowing your childhood to last as long as possible.”

Another circumstance arrived when I tried to get along with my co-workers, and I tried to appease my boss. My father warned me that this would be more difficult than I assumed, and he was right, but I regarded that as nothing more than an inconvenient coincidence in my path to individuality.   

It’s not debatable to me that I was right about some of the things I planted a flag in, but these circumstances led me to understand that my dad lived a rich, full life by the time he became my mentor, and some of my impulsive, theoretical thoughts about the world were, in fact, wrong. (Even after gaining some objectivity on this matter, it still pains me to write that line.)

Having my own job, my own money, and my own car did a great deal to provide me the independence I desired, but I wanted more. Having my own home, and friends, and a life completely devoid of my dad’s influence gained me even more, but it wasn’t enough.

I wanted to be free of the figurative shackles being my dad’s son implied. Every piece of information I received about history, the culture, and the world was exciting, and new, and mine, because it stood in stark contrast to everything my dad believed. The information that confirmed my dad’s wisdom bored me so much I dismissed it. The new-age information coincided with everything I wanted to believe about the brave new world my dad knew nothing about, and it confirmed my personal biases.

When I was a twenty-something, I didn’t ask myself the question I now pose to the blogger: why did I still need my dad to be wrong? I probably would not have had much of an answer, even if I had searched for it. I probably would have said something along the lines of, “Why is it so important to him that he cling to that age-old, traditional mode of thought?”

This redirect would not have been an attempt at deception or evasiveness. I just did not have the awareness necessary to answer such a question. Moreover, as a twenty-something, new-age thinker, I was rarely called upon to establish my bona fides. All parties concerned considered me a righteous rebel, and the old guard was, by tradition, the party on trial. They often felt compelled to answer my questions, as opposed to forcing me to define my rebellion, and I enjoyed that, because on some level I knew I couldn’t answer those questions.

My twenty-something definition of intelligence relied on emotion, theory, and very little in the way of facts. I thought they were facts, however, and I thought I had the evidence to back them up. I thought I was intelligent, more intelligent than my dad, but the question I did not ask was: what is intelligence? The answer depends on whom you ask.

In Abraham Lincoln’s day, the ability to drop a pertinent reference from Shakespeare or The Bible in any given situation formed the perception of one’s intelligence. My generation believed that dropping a well-timed, pertinent quote from Friends or Seinfeld defined intelligence, coupled with a thorough knowledge of Bruce Willis’s IMDb filmography. To the next generation, it has something to do with knowing more than your neighbor about Kim Kardashian and Lady Gaga. (I concede that the latter may be an epic fail on my part.)

My dad knew nothing of Seinfeld or Bruce Willis, so he knew nothing as far as I was concerned. He knew nothing about computers or devices, and a third party introduced him to gold records (these gold records were CDs, compact discs, LOL! Gold records?) shortly before his death. This lack of knowledge about pop culture and technological innovation transcended all matters, as far as I was concerned. I believed my dad was a bumbling fool, a traditionalist trapped in 1950s modes of thought, who could never have survived in our current, more sensitive culture. He was a backwater hick, and whatever other adjectives we apply to someone trapped in a time warp of the sixties, maybe the seventies, but definitely not the nineties, the noughties, or the decades that followed.

The question that we in the smarter-than-our-parents contingent must ask ourselves is how much of the divide between our parents’ level of intelligence and ours is in service of anything. I, like the snarky and provocative blog writer, can say that I knew more about more things than my dad did, but I defined that divide, and most of what informed it was inconsequential information that I will never use for any substantial purpose. The conditions of my dad’s life were such that he didn’t receive what most would call a quality education, but he used whatever he learned to prosper on a relative basis. One could say that the difference between my education and my dad’s, and between the snarky contingent’s education and her parents’, could be whittled down to quantity versus quality.

In the Workplace  

Much to my shock, I began quoting my dad to fellow tenured employees, well into my thirties:

“Everyone has a boss,” and “You can learn everything there is to know about the world from books, but the two words most conducive to success in life will always be ‘Yes sir!’ and ‘No sir.’”

I loathed those words for much of my young life, as they implied that even after escaping my dad’s management of my life – a level of authority that turned out to be far more macro than I ever considered possible – I would always have a boss. The bosses who followed my dad taught me the difference between his level of macro management and their definition (hint: micro) once I was out on my own, and out from under his totalitarian thumb. I would also learn that my boss’s moods would forever dictate whether my day would be a good one or a bad one, in the same manner my dad’s moods once dictated mine, only tenfold.

Dad’s advice derived from his experience in the workplace, but that experience occurred in an era that required reverence of a boss. Thanks to the new-age ideas of boards and panels conducting arbitration cases for those who have been fired, the various wrongful-termination lawsuits, and the threat thereof that gave life to the Human Resources department, that reverence was no longer mandatory in my era.

I would also learn that my newfound freedom would contain a whole slew of asterisks that included the idea that no matter how much free time I had, I would spend a great portion of my life in a workplace, under the watchful eye of authority, compromising my personal definition of freedom every step of the way.

Throughout the course of my life, I’ve met those who never went through these stages of rebellion. If you find this as incomprehensible as I did, all I can tell you is that I’ve met them. In their twenties, they said rational things like, “I never thought my parents were perfect, but I know that they always tried to steer me onto what they believed to be the right course.”

As soon as I picked myself up off the floor from laughter – believing I was on the receiving end of a comedic bit – I realized they were serious. The fact that their upbringing was so much healthier than mine caused me to envy them in some ways, but after chewing on that for years, I realized that all of the tumult I experienced, self-inflicted and otherwise, defined my character and my current, individual definition of independence.

We are our parents’ children, and at times we feel trapped by it. Therefore, we focus on the differences. We may mention some of the similarities, but we take those characteristics for granted, and we assume all parties concerned do too. Even when we reach a stage in life when we begin to embrace some elements of that trap, somewhere in our thirties and forties, we cling to the idea that we’re so different. The answers as to why these dichotomies exist within us are as confusing to us as the fact that they are a fait accompli.

When immersed in the tumult of the younger brain, trying to make some sense of our world, we may fantasize about what it would be like to have other parents. Our friends’ parents seem so normal by comparison. We think most of our problems could be resolved if we had their parents, or any normal people, as parents. We might even fantasize about what it might be like to be free of all patriarchal influence. We consider how liberating it might be to be an orphan, until we recognize how confusing that must also be. Those without parents must lack a frame of reference, a substantial framework, or a familiar foundation from which to rebel. When we consider this, we realize that our whole identity involves the pushes and pulls of acquiescence and rebellion toward our parents.

While there is some acknowledgement of the ‘the more things change, the more they stay the same’ dictum when we receive advice from our parents, our rebellion operates under the “It was the best of times, it was the worst of times” principle when we process that advice and apply it to our era. When we acknowledge that knowledge of innovations and pop culture is superfluous, it removes a substantial plank of our rebellion, until politics takes its place. We then sit down at our proverbial dinner table to resolve the political and geopolitical problems of the day, for our nation, in a manner we deem substantial. It fires us up. We deliver nuke after nuke, until we realize that the effort to persuade our parents is futile. We also recognize that nestled within this effort was our juvenile, sometimes snarky need to prove them wrong.

While a more substantial plank than pop culture, political discussion can be just as silly for us as it was for our parents when they discussed such issues at their parents’ dinner table, considering their parents bumbling idiots who offered nothing new to the discussion and stubbornly resisted the winds of cultural change. The one import they may have taken from those discussions, as we will from ours, over time, is that the more things change, the more they stay the same, and human nature doesn’t change as much as we may believe it does with innovations, cultural advancements, and social awareness. A kiss is still a kiss, a boss is still a boss, and the fundamental things still apply, as time goes by.

***

One final piece of advice this former rebel turned individual offers to the provocative, parent-hating rebels is that we should all thank our parents for raising us. Thanking them could be one of the hardest things we ever do, as we may lose most of the provocative, parent-hating points we’ve spent our whole lives accumulating, but it might turn out to be one of the best things we ever do too.

I thanked my dad for everything he did for me, and I did not add all of the qualifiers and addenda I would have added years earlier. I managed to put all grievances behind me for the ten seconds it took to thank him.

Was it hard? I will not bore you with the details of my rearing, but suffice it to say my dad could be a difficult man, and he played a significant role in the anger, frustration, and feelings of estrangement I felt for much of my life.

I could go into further detail to ingratiate myself with those currently struggling with the idea that I don’t understand their dilemma. To display my empathy, I have a quote that served me well throughout the traumatic years: “Not every person who becomes a parent is a good person.” Parents are people too, and some of them are as misguided, confused, immoral, and selfish as the rest of us. Yet we are people too, and some of us are susceptible to the mistake of amplifying their faults in our myopic view of them. If we were able to shake that myopic view, I think most of us would see that our parents were essentially good people who tried to move past their limitations to make us better than they were.

I dedicate this addendum to those who acknowledge that there might be anecdotes in this post that provide clarity on this subject, and who might even admit that thanking their parents would be noble, but for whom the wound is too fresh and raw to forgive or thank them today. I empathize on a relative basis, but all I can tell my fellow angry offspring is that it would not have sat well with me if I had waited.

As I sat in a pew staring at the pine box, I realized that no matter how obnoxious, self-serving, and angry my father could be at times, he was a member of an endangered species comprised of those who truly cared what happened to me. How many people truly care what happens to us? Our closest friends may say they do, but they have their own lives to live. We know our parents care, but some of them show it by seeking constant updates, harping, and telling us how to live our lives, long after the tie that binds us has been broken. As impossible as it may be to believe today, expressing some level of gratitude, in whatever manner your relationship with your parents requires, might be the best thing you ever do. We might not see it that way today, but my guess is that even the most obnoxious rebel will see it one day, and my hope is that this addendum will convince someone, somewhere, that waiting one more day might be one day too late.

To worry, or too worried?


Nestled within the quest to be free and to experience life through the portal of YOLO (You Only Live Once), or FOMO (Fear of Missing Out), lies a fear, a concern, a worry that we might be too free. Born, if the thesis of Francis O’Gorman’s book holds, from a need to be led.

It may seem illogical to argue that we’re too free, given that technological and governmental advances have led us to believe every move we make, and every thought we have, is monitored, infringed upon, and legislated against. Francis O’Gorman’s Worrying: A Literary and Cultural History is not a study of freedom, however, but of the common man worrying about how the people, places, and things around him are affected by that freedom. Mr. O’Gorman builds this case, in part, by studying the literature of various eras and its themes. He also marks the appearance, and eventual proliferation, of self-help guides to suggest that this greater sense of concern, or worry, led readers of another era to reward writers who provided them more intimate, more direct answers. This study leads Mr. O’Gorman to the conclusion that this general sense of worry is a relatively new phenomenon, even compared to our recent ancestral history.

One fascinating concept Mr. O’Gorman introduces is that this general sense of worry appears to have a direct relation to the secularization of a culture. As we move further and further away from religious philosophies toward more individualistic ones, we may feel freer to do what we want to do, but we are also more worried about our susceptibility to the consequences of unchecked, mortal decision-making. We humans have an almost inherent need to be led.

How often does a secular person replace religion with politics? Politics, at its core, is leadership, and our dining-room-table discussions of politics revolve mostly around why one person is capable of leading our locale, our state, or our nation. They involve why one person’s idea of leadership may be inept, while another’s – one that abides by our principles – is more capable. As much as those adults who believe themselves fully capable of living without leadership would hate to admit it, all political thought revolves around the desire to be led.

Reading through the various histories of man, we learn that our ancestors had more of a guiding principle, as provided by The Bible. The general theory among those who preach the tenets of The Bible is that man’s mental stability and happiness correlate directly with his willingness to subordinate his will to God’s wishes. God gave us free will, they will add, but in doing so He also gave us guiding principles that would lead us down a path of righteousness and ultimate happiness.

If a man has a poor harvest – an agrarian analogy most preachers use to describe the whole of a man’s life – it is a commentary on how this man lived. The solution they provide is that the man needs to clean up his act and live in a Godlier manner. At this point in the description, the typical secular characterization of the devoutly religious comes to the fore, and its agreed-upon truth has it that these people are unhappier because they are unwilling to try new things, and puritanical in a sense that leads them to be less free. The modern, more secularized man, by the inverse characterization, has escaped such moral trappings, and he is freer, happier, and more willing to accept new ideas and try new things. If the latter is the case, why is he so worried?

We’ve all heard snide secularists say that they wish they could set aside their minds and just believe in organized religion, or, as they put it, in “a man in the sky.” It would be much easier, they say, to simply set their intelligence aside and believe. What they’re also saying, if Mr. O’Gorman’s thesis can be applied to them, is that it would give them some solace to believe that everything was in God’s hands, so that they wouldn’t have to worry all the time.

Like the child who rebels against authority but craves the guidance that authority provides, the modern, enlightened man appears to reject the idea of an ultimate authority while secretly craving many of its tenets. A part of him, like the child, craves the condemnation of immorality, a reason to live morally, and some greater focus in general. The randomness of the universe appears to be his concern.

One other cause for concern – one not discussed in Mr. O’Gorman’s book – is that the modern man may have less to worry about. If social commentators are to be believed, Americans have never been more prosperous:

“(The) poorest fifth of Americans are now 17 percent richer than they were in 1967,” according to the U.S. Census Bureau.

They also note that crime statistics are down, and that teenage pregnancy, drinking, and experimental drug use by young people are all down. If that’s the case, then we have less to worry about than we did even fifteen years ago. And yet it’s a concern. It’s a concern in the same manner that a parent is most concerned when a child is at its quietest. It’s the calm before the storm.

Francis O’Gorman writes that this general sense of worry arose in the wake of World War I. Historians may give these worriers some points for being prescient about the largely intangible turmoil that followed the Great War, but World War I ended in 1918 and World War II didn’t begin until 1939 – a gap of twenty-one years of people worrying about the silence and calm that precedes a storm. This may have propelled future generations into a greater sense of worry, after listening to their parents’ concerns for a generation, only to have those concerns proved right.

The idea that we worry about too much freedom – as in freedom from the guidelines and borders that religion, or God, can provide – is taken up by The New Republic writer Josephine Livingstone in her review of Francis O’Gorman’s book:

“The political concept of freedom gets inside our heads.  It is a social principle, but it structures our interiority.  This liberty worries us; it extends to the realm of culture too, touching the arts as much as it touches the individual human heart and mind.

“In this way, O’Gorman joins the tide of humanities scholars linking their discipline with the history of emotion, sensory experience, and illness. It’s an approach to culture most interested in human interiority and the heuristics that govern the interpretation of experience: Happiness can be studied; sound can be thought through; feeling can be data.”

Ms. Livingstone furthers her contention by writing that the human mind can achieve worry-free independence in a secular society by studying select stories from select authors:

“Worrying also fits into the tradition of breaking down myths and tropes into discrete units, a bit like Mircea Eliade’s Myth and Reality or C. S. Lewis’ Studies in Words. We care about these books because we need stories about the cultural past so that we might have a sense of ourselves in time. The real value of O’Gorman’s book lies, I think, in the way it flags the politics of the stories we tell ourselves. In its attribution of emotional drives to the ideas behind modernist culture and neoliberal politics alike, Worrying shows that their architects –writers, mostly– are as much victims of emotion as masters of thought. If we can see the emotional impulses behind our definitions of rationality, liberty, and literary craftsmanship, we can understand our own moment in cultural time more accurately and more fairly: Perhaps we can become our own gods, after all.”

One contradiction – not covered in the O’Gorman book or the Livingstone review – is the trope that religious people are miserable in their constraints. This is ostensibly based on the premise that they fear the wrath of God so much that they’re afraid to live the life that the secular man does. Yet O’Gorman implies that religious people tend to worry less, because they follow the guidelines laid out in The Bible, and they place their destiny, and their fate, in the hands of God. The import of this is that for religious minds, the universe is less random. Ms. Livingstone’s review basically says that the secular life doesn’t have to be so random, and it doesn’t have to cause such concern. She states that if we study happiness as if it were an algorithm of physical or aural data points, and incrementally form our thoughts around those findings, we can achieve happiness. She also states that through reading literature we can discover our own master plan, through its authors’ mastery of emotions through thoughts and ideas.

On the latter point, I would stress – in a manner Ms. Livingstone doesn’t – that if you want to lead a secular life, there are ways to do so and still be worry-free. The key words being if you want to. If you’re on the fence, however, a religious person could argue that all of the characteristics Ms. Livingstone uses to describe the virtues of the stories and the authors she considers masters of thought could also be applied to the stories and writers of The Bible, and of the many other religious books. If her goal, in other words, is to preach to her choir, she makes an interesting, if somewhat flawed, case. (I’m not sure how a living, breathing human being could study a data sheet on happiness and achieve that complicated and relative emotion.) If her goal, on the other hand, is to persuade a fence-sitter that secularism is the method of becoming your own god, this reader doesn’t think she made a persuasive case.

An Intellectual Exercise in Exercising the Intellect


“There are no absolutes,” a friend of mine said in counterargument. My snap response was to counter her counter with one of a number of witty replies I had built up over the years for this statement. I decided, instead, to remain on topic, undeterred by her attempt to muddle the issue at hand, because I believe that for most people this whole philosophy has been whittled down to a counterargument tactic.

Whenever I hear the “No Absolutes” argument, I think of the initial stages of antimatter production. To produce antiparticles, physicists use a particle accelerator: particles are accelerated through a ring of magnets to enormous speeds and smashed into a target, and antiparticles emerge from the debris of the collision. The process is far more intricate than that, but for the purpose of this discussion the simplified description can serve as an analogy for the “There are No Absolutes” argument, which is often accelerated in an echo chamber of like-minded thinkers until it is smashed upon a specific subject, and the subject matter at hand is annihilated in a manner that produces intellectual antimatter in the minds of all parties concerned.

Tower of Babel

The “No Absolutes” argument is based on the post-structuralist idea that because we process, or experience, reality through language – and language can be declared unstable, inconsistent, and relative – nothing that is said, learned, or known can be said to be 100% true.

This degree of logic could be the reason a number of philosophers have spent so much time studying what rational adults would consider “Of course!” truths. One such example is the idea of presentism. Presentism, as presented by the philosopher John McTaggart Ellis McTaggart, could also be termed the philosophy of time. The core of McTaggart’s thesis has it that the present is the lone timeframe that exists: the past has happened, he states, and the future will happen, but neither exists in the sense that the present does. This philosophy is regarded in some circles (to the present day!) as so insightful that it is included in some compilations of brilliant philosophical ideas.

Anyone familiar with McTaggart’s philosophy can read through the description of the man’s theory a number of times without grasping what questions he was answering. His description of time is so elementary that the reader wonders more about the audience that needed it explained to them than about the philosophy of Mr. McTaggart. Was McTaggart arguing against the linguists’ attempts to muddle the use of language, or was he attempting to reinforce agreed-upon truths? Regardless, the scientific community had problems with McTaggart’s statement, as one unnamed essayist depicts:

“If the present is a point (in time) it has no existence; however, if it is thicker than a point then it is spread through time and must have a past and future and consequently can’t be classed as purely the present. The present is immeasurable and indescribable,” because it is, we readers can only assume, too finite to be called a point.

Those who want to dig deep into the physicist’s definition of time, to which this unnamed essayist seems to be a party, will find that time is a measurement humans invented to aid them in their day-to-day lives, and that the essence of time cannot be measured. Time is not linear, and it cannot be seen, felt, or heard. They will argue that there is nothing even close to an absolute truth regarding time. Setting the physicists’ definition aside, however, humans do have an agreed-upon truth of time, one McTaggart appeared to want to bolster through elementary statements, to thwart the confusion that politically oriented sociolinguists introduced to susceptible minds.

There’s nothing wrong with a man of science, or math, challenging our notions, perceptions, and agreed-upon truths. Some of these challenges are fascinating, intoxicating, and provocative. But some have taken these challenges to another level, a “No Absolutes” level, challenging our belief systems in ways that have damaged our discourse, our sense of self, our free will, and a philosophy we have built on facts and agreed-upon truths, and that may lead some to wonder whether everything they believe is a house of cards that can be blown over by even the most subtle winds of variance.

There was a time when I believed that most of the self-referential, circuitous gimmicks of sociolinguistics – the ones that ask you to question everything you and I hold dear – were little more than an intellectual exercise that professors offered their students to get them using their minds in a variety of ways. When I questioned the value of the subject of Geometry, my high school teacher informed me: “It is possible that you may never use any aspect of Geometry ever again, but in the course of your life you’ll be called upon to use your brain in ways you cannot now imagine. Geometry could be called a training ground for those times when others will shake you out of your comfort zone and require a mode of thinking that you may have never considered before, or may never use again.” That teacher’s sound logic left me vulnerable to the post-structuralist, “No Absolutes” Philosophy professors I would encounter in college. I had no idea what they were talking about, I saw no value in their lectures, and the ideas they introduced me to, such as the nihilistic ideas of Nietzsche, always seemed to end up in the same monotonous place, but I treated their courses as an exercise in using my brain in ways I otherwise wouldn’t.

Thus, when I first began hearing purveyors of the “No Absolutes” argument use it in everyday life, for the purpose of answering questions of reality, I wanted to inform them that this line of thought was just an intellectual exercise reserved for theoretical venues, like a classroom. It, like Geometry, had little to no place in the real world. I wanted to inform them that the “No Absolutes” form of logic wasn’t a search for truth so much as a counterargument tactic to nullify truths, or an intellectual exercise devoted to exercising one’s intellect. It is an excellent method of expanding your mind in dynamic ways, and of fortifying your thoughts, but if you introduce this concept to me as evidence of how you plan to answer real questions in life, I think you’re going to find it an exercise in futility over time.

Even when a debate between two truth-seekers ends in the amicable agreement that neither party can sway the other, the art of pursuing the truth seems to me a worthwhile endeavor. What would be the point of contention for two “No Absolutes” intellectuals engaging in a debate? Would the crux of their argument focus on the other’s degree of error, or on their own relative definitions of truth? If the latter, they would have to be careful not to proclaim their truths to be too true, for fear of being knocked back to the “There are No Absolutes,” go-back-to-the-beginning square. Or would their argument be based on percentages: “I know there are no absolutes, but my truth is true 67 percent of the time, while yours is true a mere 53 percent of the time”? Or would each argue that their pursuit of the truth is less true than their opponent’s, to portray themselves as the truer “No Absolutes” nihilist?

Some may argue that one of the most vital components of proving a theoretical truth in science is the attempt to disprove it, and others might argue that this is the greatest virtue of the “No Absolutes” argument. We cannot dismiss that premise, but purveyors of this line of thought appear to use it as nothing more than a counterargument to suggest that neither party is correct. Minds that appear most confused by the facts find some relief in the idea that this argument allows them to introduce confusion to the minds that aren’t. Those who are confused by meaning, or intimidated by those who have a unique take on it, may also find some comfort in furthering the notion that life has no meaning and nothing matters. They may also enjoy informing the informed that a more complete grasp of meaning requires a firmer grasp on the totality of meaninglessness. The question I’ve always had, when encountering a mind that has embraced the “No Absolutes” philosophy, is: are they pursuing a level of intelligence I’m not capable of attaining, or are they pursuing the appearance of it?