Scat Mask Replica II (20)


1) What does it say that I still manage to work The Theme of The Love Boat into everyday situations in life? I know our brains are more complex than the most sophisticated hard drive ever invented, but it seems only natural to me that when we introduce new data, some of the old data will get lost or stored in some other, internal file that I won’t access for decades. If any of this is remotely close to true, how much important data have I sacrificed, or damaged, to save space for the lyrics to The Theme of The Love Boat?

2) What emboldens those of us who publicly state that our belief system is superior? We all have our insecurities, and we join groups to align ourselves with an idea we consider superior, so we can mock and denigrate those who belong to another group. Some of us need a proverbial podium from which to mock and denigrate that other group, so that our own group might view us as superior. Some view this presentation as bold, but I can’t help wondering about the raging insecurities that drive a person to do it.

3) At the breakfast table, a five-year-old son asks about the death of his father. The mother informs the son that he should hope his father lives long enough to teach him how to be a man. The son looks at the father and says, “Well, tell me.”

4) Analysts on financial/business networks often drop the term financial purgatory. Their context suggests they use purgatory to describe someone stuck in a misery an uninformed viewer might mistake for abject ruin. Those more familiar with their Catholic Catechism know that purgatory is a place between heaven and hell, a stasis reserved for those awaiting further judgment from the powers that be. A better description of financial purgatory might involve a child of a lower-middle-class upbringing finding a way to live among kids whose parents make real money, and all of the judgment that follows. This kid has no pressing needs, and his life is happy in all ways other than this talk of money. Most kids don’t care about money, but as kids begin to age, how much their parents make becomes a topic of conversation. It can lead him to recognize that while his family is not poor, they cannot afford to buy their way into money conversations. Some might dismiss this as a first-world problem and insist that children adapt well, but any child who seeks entrée into the in-crowd knows that it feels like Armageddon in the moment. Depending on the kids around him, it can lead a kid to feel he doesn’t belong in a financial heaven or hell, and the subsequent, general idea that he doesn’t belong can last well into adulthood.

5) The horoscope for the new sign Ophiuchus: This will be another meaningless week in your otherwise meaningless life. If someone informs you that they have something meaningful to say about your life this week, walk away. Don’t check in with yourself this week; just go through the week on autopilot, for all events and information you receive will be meaningless. Your lucky weather element is wind.

6) A writer arguing about the rules of usage is not only tedious, it’s an exercise in futility. Some writers pine for the age-old linguistic purity of Geoffrey Chaucer; others argue that we should strive to remain casual for greater readability among the masses. On the latter, I know that I might be banging my spoon on my high chair, but when I read the numerous ways professional writers overuse the word “had,” a layer of glaze coats my eyes. I know that “I had biked over trails” is past perfect tense and “I biked over trails” is simple past tense, and grammatically these two forms of expressing action are perfectly acceptable, but I find one causes a brief interruption while the other flows so well that the reader doesn’t pause. There is ample middle ground for writers to explore between strict grammatical rules and readability, and most of them know it without knowing it, but a reading of Chaucer reminds one of the strict grammatical rules that have long since fallen out of favor in modern writing. On that note, I find “I had done” a most egregious violation of readability, as in “I had done my research before writing this paragraph.” It appears redundant and awkward to me, and when I read professional writers writing in such a manner, I wonder if they don’t pay their editors enough or if they overwork them.

7) Joe Theismann admitted that, as a student/athlete at Notre Dame, he allowed the university’s public relations department to change the pronunciation of his name from THEES-man to THIGHS-man. The pitch the PR department personnel made was that Theismann’s chances of winning college football’s most prestigious prize, the Heisman Trophy, might increase if he changed the pronunciation of his name so that it rhymed with the name of the trophy. Even though Joe made this unusual sacrifice for the award, he did not win. We can only guess that his family was against this PR push, and that they scorned him for it on some level during the decades they spent correcting people on how to pronounce their family name. We know why Joe did it, but why did he allow us to mispronounce his name for decades? Did he prefer the new pronunciation, was he embarrassed that the PR campaign failed to win him the Heisman and he didn’t want to have to explain that over and over, or did he consider the pronunciation of his name relatively inconsequential? Whatever the case, Joe allowed the world to mispronounce his name for decades. The former football star is now a celebrity spokesman for a company that purports to aid aging men with prostate problems that cause them to urinate so often that it disrupts their lives. An ambitious member of the marketing arm of this company, one who knows about Theismann’s willingness to change the pronunciation of his name, should ask him to change it again, so that it rhymes with HE PEES-man.

8) What would you say if a grown man approached your table at an outdoor café and said, “Pardon the intrusion, but I have to say that I enjoy watching the way you eat a tortilla chip”?

9) By modern cultural standards, Joseph Hupfel is a creepy man. He is dirty, unshaven, and generally unattractive. He eats a very clean BLT. Mayo. Toasted. Buttered lightly, immediately upon exiting the toaster. He enjoys the sedimentary layers of the sandwich. How many sedimentary levels of the man do we know? How much of a man lies on the surface? We know creepy when we see it, until we learn more about the man. How much will we never know about him? Modern man believes he has a decent feel for the history of mankind, but how many fact-finding missions uncover something revolutionary that puts everything we thought we knew in the rear-view mirror? Some speculate that there are miles upon miles of undiscovered artifacts lying under homeowners’ homes in Rome that could further explain the history of mankind, but the homeowners won’t let excavators unearth them. How many sedimentary levels of a seemingly simple man have we yet to unearth in our personal profile?

10) In the 1890 essay A Majestic Literary Fossil, author Mark Twain provides a hilarious condemnation of two thousand years of scientific theory from esteemed intellectuals in the field of medical science. Twain focuses the theme of this essay on the repudiation of the science behind the accepted medical practice of bloodletting. This practice relied on the accepted theory that blood doesn’t circulate in the body but stagnates, and that to achieve proper health the patient needs to have old blood taken out on a regular basis to send a signal to the body that it’s time to regenerate new, healthier blood. The scientific community regarded blood as one of many humours in the body, and they believed that all humours required regular regulation. As such, they believed that a healthy patient would allow their doctor to bleed them on a regular basis as a preventative measure. The import of Twain’s essay, in my humble opinion, is not necessarily a condemnation of science but a caution about confirmation bias, and about the idea that anyone should put stock in the consensus of science. As one who knows little to nothing about science, I do know that most scientists prefer not to use that word. For anyone who wants to argue that sound science is not susceptible to occasional flights of human error, remember that the belief in the virtues of bloodletting wasn’t a blip in human history; the consensus of the scientific community considered the science behind bloodletting so sound that medical practitioners relied on it for most of human history. The import of this essay also asks us to examine what we believe today, based on a consensus of scientific theory. If we were able to go back in time to Abraham Lincoln’s day, and we witnessed the archaic act of bloodletting, how would we try to disabuse them of their scientific findings? “You don’t believe in science?” is a question they might ask us. To which we would reply that we do believe in science, but we also know that some science, their science in particular, is wrong. “You realize that you’re arguing against 2,000 years of science. Why should we take your word for it even if, as you say, you’re from the future?”

If a person from the future were to travel back in time to our day, what would they ridicule us for believing? Which archaic rituals and procedures derived from our scientific findings would they mock? Would they laugh at us in the same manner we laugh at the scientists of Twain’s day? Our natural inclination will be to laugh with them, for we know all too well the foolish beliefs others in our era hold, but will we stop laughing when they touch upon that which we believe, or will we continue to laugh with them under the soft lie that we were never that gullible?

11) I heard a cop once say that the rule of thumb for being a cop on the beat is to believe half of what you see and none of what you hear. Those who watch network television shows and major Hollywood movies should apply the same principle to their viewing habits.

12) Listening to one party’s version of a romantic breakup is always dicey. The listener knows they’re only hearing one side of the story, and they know where to get the other side if they’re feeling especially adventurous, curious, and nosey. We suspect that we will hear an equally partisan take on the situation from the other side, that both accounts might uncover some key discrepancies in the other’s account, and that we might be able to help both parties discover a truth that lies somewhere in the foggy middle. Before enlightening these two parties, however, the listener needs to consider the idea that their objective truth is just as subjective as that of the two parties concerned; the crucial point is that what the listener believes is true is not necessarily the truth. Just because a listener is a disinterested third party does not mean that they are objective.

13) If someone were to ask me for dating advice, based on my experiences, I would say the key to attracting a person is to try to be as genuine, and as normal, as possible on a date, unless those two characteristics conflict. The best dating experience of my life involved a woman who convinced me she was relatively normal. She went through some drama in her previous life, but she managed to extricate herself from those situations and remain relatively normal. Everyone says that they managed to escape prior relationships unaffected, but if we’re honest with ourselves, we’ll recognize how impossible that is. One of her key selling points was convincing me that she did not attempt to influence the affected parties with intimate details of her ex’s past transgressions. Most people I know adopt the time-honored tradition of slash-and-burn politics to assure all parties concerned of their nobility, but thoughtful people know that nobility is a long-term value that will eventually reveal itself. She claimed that my greatest attribute was authenticity. I went through some stuff in my previous life, but I maintained whatever it was she sought in a man. If someone I knew was dating a person they feared was not normal, I would warn them that putting a best foot forward and creating a façade of normalcy is easy in short spurts. I would tell them to watch that person around their family and friends and pay special attention to the way they interact with the people they’re most comfortable with. Most people don’t want their friends and family to think that a boyfriend, or girlfriend, can change them. If that doesn’t work, take a long trip with that person. That prolonged involvement should reveal the characteristics of the other party and allow one to make a more informed decision about them.

14) “What do you believe in?” I’ve asked those who ridicule me for believing in a person, place, or thing that turns out to be wrong. These people inform me that I should’ve been more skeptical, and while that is true, my question to them is, “Have you ever believed in something, only to find out you were wrong, to one degree or another?” The answer for some of them has often been no, because they’ve wrapped themselves in a cocoon of fail-safe contrarian thinking to avoid ridicule.

After the facts roll out, it’s easy for a cynic to say that they never believed in it in the first place, but there is a point shortly after one learns of a novel idea, or a new approach to solving humanity’s problems, when the new information excites the reader. This point, just before the reader can personally research the subject, defines them as a hopeful person who wants to believe in people, places, and things. For the purpose of discussion, let’s say that we’ve just finished an intoxicating nonfiction book that espouses radical, new secular and apolitical ideas for solving one of the world’s many problems. Let’s also say that this book covers a subject the reader knows little to nothing about, by an author they’ve never heard of before. How does one react to the information in that book, before doing personal research on it?

Some of us are more inclined to believe in something if the presenter builds a solid case for it; cynics are more inclined to seek out refutation of any person, place, or thing before the facts roll out; and then there are those cynics who ridicule everyone who believes in anything before the facts roll out. They prefer to call it skepticism, but I call it cynicism. It’s in my nature to believe in people, places, and things, until the facts prove otherwise. I believe, for example, that for just about every tragic situation mankind faces there is an ingenious problem solver who will eventually solve it. In the court of public opinion, this mindset often places me in a vulnerable position for ridicule.

When I first read John Douglas’ Mindhunter decades ago, I was a believer. I believed that Douglas laid out a solid case for how, why, and where criminal profiling could provide useful tools to assist law enforcement in their efforts to locate a criminal. It was a temporary setback for me to discover how often profilers erred. The naysayers used those instances to claim that criminal profiling is essentially a form of confirmation bias that involves throwing out a bunch of commonalities that most serial killers share, for example, to form a standard profile for the next serial killer they profile. The naysayers further this repudiation by saying that after law enforcement captures the perpetrator, and the perpetrator confesses, the profiler then aligns the perpetrator’s characteristics with elements of the conclusions they made in their profile. The question these naysayers have for those who believed Douglas was, “How often was John Douglas wrong, and did he list those instances in his book?” It might have something to do with the idea that I was ready to canonize Douglas after reading his book, but the factual refutations of his work by the naysayers were eye-opening to me. Once I recovered from the setback, I discovered that criminal profiling, while flawed, might be on par with all that informs a doctor’s profile of a patient before they reach a diagnosis of that patient’s ailments. In the back and forth on this issue, I began to question the effectiveness of criminal profiling more and more, but I also began to question the motives of the cynical naysayers. What drives an absolute cynic to tear down everything they read, hear, and see? Dissecting any idea to locate truth is not only necessary, it’s admirable, but how they approach their research is fundamental to their being.

Believers might approach personal research of such matters in a cynical vein, but they only do so in the scientific spirit of attempting to disprove. Absolute cynicism is so foreign to my thought process that it’s difficult for me to portray without bias, but I think it’s a fail-safe, contrarian approach that some use to ward off ever being incorrect and enduring subsequent ridicule for their personal track record. When I learn of an interesting new concept, or problem-solving measure, it excites me until I learn that it is not as effective as it was in the author’s presentation. I view this belief as food for the mind, and I suspect that a person who doesn’t believe in anything might have a more difficult time achieving fulfillment, and again I’m reserving this space for secular, apolitical ideas and philosophies. It seems to me that the empty spaces in the minds of cynical contrarians cry out for sustenance in a manner equivalent to an empty belly crying out for food, and that those vacuous holes do get filled by belief in something. That something, I’ve often found, is an alternative mode of thought that they consider almost impossible to refute.

15) Anytime I think I might be smart, I dip into a discussion involving the creation of our universe. One such discussion involved the space-time framework, another involved the idea that our universe is flat with a slight bend due to cosmic background radiation, and a third informed us of efforts now under way to search the cosmic microwave background radiation for evidence that some other universe at one time collided with ours. I don’t know what these people are talking about, and I dare say most don’t. Most of us, even most scientists, prefer to argue about the knowable.

16) For most of my life, I’ve managed to avoid caring what happens to celebrities. I used to strive to know what was going on in their world, if only to better understand the cultural references comedians drop. I’m to the point now that I don’t understand three-fourths of those references. I did manage, however, to land on a decade-old story involving the messy divorce between singer Shania Twain and the producer Mutt Lange. It appears that Mutt Lange had an affair with Twain’s best friend, and he eventually married that best friend. In a noteworthy turn of events, Twain ended up marrying her best friend’s ex-husband. Hollywood writers love to give cute names to marrying couples, like TomKat, Bennifer, and Brangelina. I suggest we call the Twain/Lange eventual arrangements getting Shlanged.

17) Every time I watch a professional athlete make a mistake, I empathize. I arrive at this empathy from a much smaller vantage point, as I didn’t engage in organized sports past junior high. I played intramural games and pickup games constantly throughout my youth, however, and I made errors ESPN might have added to their Not Top 10. I have to think those laughing the hardest at the foibles of professional athletes never played sports in their lives, or they’re seeking to diminish whatever laughable errors they made by laughing harder at others’ errors. What follows such laughter is some incarnation of the line, “I made some errors, sure, but I never would’ve done anything like that.” If I didn’t commit an error similar to that one, I think of all the egregious errors I made that were as embarrassing, if not more so, and I follow that with the thought that, at most, maybe twenty people witnessed my error. These professional athletes commit errors in front of millions, and sometimes hundreds of millions, of people, depending on how many times ESPN replays their errors for the enjoyment of those without empathy.

18) We’ve all made mistakes large and small. Some of us have made life-altering mistakes, and some of us have made mistakes that affect others’ lives in a manner we have to live with, but few have made mistakes that change the course of history in the manner mapmaker Martin Waldseemuller did. Due to the popular observations of the Italian writer/explorer Americus Vespucci, the mapmaker named an entire continent after him. The general practice of naming continents involved leaders of expeditions, but Vespucci was more of an observer who wrote about the expeditions he took part in. Christopher Columbus led the expedition to find a new path to the East Indies. When he arrived back in Spain, the country that sponsored his voyage, he reported his findings. In the course of the confusion over what Columbus actually discovered, Vespucci wrote about his many expeditions to foreign lands, and conflicting accounts suggest Vespucci might have participated in Columbus’ expedition. Regardless of whether he participated in that particular expedition or not, Vespucci took part in expeditions following Columbus’, and he reported the discovery of a new continent. Amid the sensation of that report, Waldseemuller mistakenly labeled the new continent Amerigo’s land. The standard practice of the day also suggested that continents have feminine names, such as Asia, Africa, and Europa, so Waldseemuller took the feminine version of Americus’ name and called the land America. Some suggest that Waldseemuller attempted to correct this mistake by removing Amerigo Vespucci’s name from later editions of his maps, but it was too late to change it in the popular culture of the day. Spain refused to accept the name America for 200 years, saying their explorer should get credit for his accomplishment, not an Italian writer, but they couldn’t defeat the consensus on the topic. Thus, some suggest that Americans should call their homeland Columbia, the United States of Columbia, or the United States of Columbisia. From this, we can say that not only did America become a land of vagabonds, creeps, and castoffs, but we were mistakenly named after a writer who achieved some decent sales in his day, and the popular opinion derived from those sales defeated all attempts to correct the record.

19) Those who enjoy reading biographies as much as I do know how little the childhood chapter has to do with the overall narrative of the subject’s life. The childhood chapter deals with the subject’s relatively difficult childhood, the child’s genealogy, and some elements of their upbringing. Other than familiarizing the reader with the subject, the only reason to include the childhood chapter is to reveal the research the author has performed on the subject. Chekhov’s gun applies to writers of fiction, but it does not apply, unfortunately, to writers of biographies. I’ve decided to skip the passages that inform us that the subject played hopscotch, describe their relationships with peers and siblings, and note whether their parents encouraged them or not. I now start a biography at the subject’s first major accomplishment, and I find that I don’t miss anything I consider substantive.

20) Reading through the various portrayals of George Orwell, a reader finds a number of opinion makers claiming that Orwell loathed the idea that right-wingers adopted many of his political theories. He was, to his dying day, a libertarian socialist, these authors repeat at the end of every description. Some of his works, including Animal Farm and 1984, appear to denounce Stalin and the U.S.S.R., but Orwell didn’t limit his fears of totalitarian principles to particular locales or leaders. He feared the idea that too many citizens of the world were willing to give up their freedom for comfort, and he feared these susceptibilities were just as inherent in the people of Britain as in those of the United States. As we’ve witnessed, such fears can be defined and redefined by both parties, but I choose to view them as apolitical. I understand that when political opponents adopt the theories of esteemed intellectuals, the other side will mount a defense, but when those theories prove correct, there will be a cloistered mass of humanity vying for the peak. If a political opponent adopted one of my theories to explain their beliefs, we might find that we disagree on an end game, but if we continued to find some agreement on a principle regarding fundamental elements of human nature, I would find that a compliment regardless of their political viewpoint.

An Intellectual Exercise in Exercising the Intellect


“There are no absolutes,” a friend of mine said in counterargument. My snap response was to counter her counter with one of a number of witty replies I had built up over the years for this statement. I decided, instead, to remain on topic, undeterred by her attempts to muddle the issue at hand, because I believe this whole philosophy has, for most people, been whittled down to a counterargument tactic.

Whenever I hear the “No Absolutes” argument, I think of the initial stages of antimatter production. To produce antimatter, a physicist uses a particle accelerator to bring charged particles, such as protons or electrons, up to enormous speeds. The acceleration occurs inside a ring of magnets that guides the particles toward a target, into which they are smashed to produce the final product. The process is a lot more intricate and complex than that, but for the purpose of this discussion this simplified description can serve as an analogy for the “There Are No Absolutes” argument, which is often introduced in an echo chamber of like-minded thinkers until it is smashed upon a specific subject, and the subject matter at hand is then annihilated in a manner that produces intellectual antimatter in the minds of all parties concerned.

Tower of Babel

The “No Absolutes” argument is based on the post-structuralist idea that because we process, or experience, reality through language, and language can be declared unstable, inconsistent, and relative, nothing that is said, learned, or known can be said to be 100% true.

This degree of logic could be the reason that a number of philosophers have spent so much time studying what rational adults would consider “Of course!” truths. One such example is the idea of presentism. Presentism, as presented by the philosopher John McTaggart Ellis McTaggart, could also be termed the philosophy of time. The central core of McTaggart’s thesis has it that the present is the lone timeframe that exists, and that the past and the future do not exist alongside it. The past has happened, he states, and the future will happen, but they do not exist in the sense that the present does. This philosophy is regarded in some circles (to the present day!) as so insightful that it is included in some compilations of brilliant philosophical ideas.

Anyone who is familiar with McTaggart’s philosophy can read through the description of the man’s theory a number of times without grasping what questions he was answering. His description of time is so elementary that the reader wonders more about the audience that needed it explained to them than about the philosophy of Mr. McTaggart. Was McTaggart arguing against linguists’ attempts to muddle the use of language, or was he attempting to argue for the reinforcement of agreed-upon truths? Regardless, the scientific community had problems with McTaggart’s statement, as depicted by one unnamed essayist:

“If the present is a point (in time) it has no existence, however, if it is thicker than a point then it is spread through time and must have a past and future and consequently can’t be classed as purely the present. The present is immeasurable and indescribable,” because it is, we readers can only assume, too finite to be called a point.

Those who want to dig deep into the physicist’s definition of time, a party to which this unnamed essayist seems to belong, will find that time is a measurement that humans have invented to aid them in their day-to-day lives, and that the essence of time cannot be measured. Time is not linear, and it cannot be seen, felt, or heard. They will argue that there is nothing even close to an absolute truth regarding time. Setting aside the physicists’ definition of time, however, humans do have an agreed-upon truth of time, one that McTaggart appeared to want to bolster through elementary statements in order to thwart the confusion that politically oriented sociolinguists introduced to susceptible minds.

There’s nothing wrong with a man of science, or math, challenging our notions, perceptions, and agreed-upon truths. Some of these challenges are fascinating, intoxicating, and provocative, but some have taken these challenges to another level, a “No Absolutes” level of challenging our belief system that has damaged our discourse, our sense of self, our free will, and a philosophy we have built on facts and agreed-upon truths, in a manner that may lead some to question whether everything they believe in is built on a house of cards that can be blown over by even the most subtle winds of variance.

There was a time when I believed that most of the self-referential, circuitous gimmicks of sociolinguistics, the ones that ask you to question everything you and I hold dear, were little more than an intellectual exercise that professors offered their students to get them using their minds in a variety of ways. When I questioned the value of the subject of Geometry, my high school teacher informed me: “It is possible that you may never use any aspect of Geometry ever again, but in the course of your life you’ll be called upon to use your brain in ways you cannot now imagine. Geometry could be called a training ground for those times when others will shake you out of your comfort zone and require a mode of thinking that you may have never considered before, or use again.” This Geometry teacher’s sound logic left me vulnerable to the post-structuralist “No Absolutes” Philosophy professors I would encounter in college. I had no idea what they were talking about, I saw no value in their lectures, and the ideas I was being introduced to, such as the nihilistic ideas of Nietzsche, always seemed to end up in the same monotonous place, but I thought their courses were an exercise in using my brain in ways I otherwise wouldn’t.

Thus, when I first began hearing purveyors of the “No Absolutes” argument use it in everyday life, for the purpose of answering questions of reality, I wanted to inform them that this line of thought was just an intellectual exercise reserved for theoretical venues, like a classroom. It, like Geometry, had little-to-no place in the real world. I wanted to inform them that the “No Absolutes” form of logic wasn’t a search for truth so much as it was a counterargument tactic to nullify truths, or an intellectual exercise devoted to exercising your intellect. It is an excellent method for expanding your mind in dynamic ways and for fortifying your thoughts, but if you’re introducing this concept to me as evidence for how you plan on answering real questions in life, I think you’re going to find it an exercise in futility over time.

Even when a debate between two truth seekers ends in the amicable agreement that neither party can sway the other to their truth, the art of pursuing the truth seems to me worthwhile. What would be the point of contention for two “No Absolutes” intellectuals engaging in a debate? Would the crux of their argument focus on pursuing the other’s degree of error, or their own relative definition of truth? If they pursued the latter, they would have to be careful not to proclaim their truths to be too true, for fear of being knocked back to the “There are No Absolutes,” “Go back to the beginning” square. Or would their argument be based on percentages: “I know there are no absolutes, but my truth is true 67% of the time, while yours is true a mere 53% of the time.” Or would they argue that their pursuit of the truth is less true than their opponent’s, to thereby portray themselves as a true “No Absolutes” nihilist?

Some may argue that one of the most vital components of proving a theoretical truth in science is the attempt to disprove it, and others might argue that this is the greatest virtue of the “No Absolutes” argument. While we cannot dismiss this as a premise, purveyors of this line of thought appear to use it as nothing more than a counterargument to further the premise that neither party is correct. Minds that appear most confused by the facts find some relief in the idea that this argument allows them to introduce confusion to those minds that aren’t. Those who are confused by meaning, or intimidated by those who have a unique take on meaning, may also find some comfort in furthering the notion that life has no meaning and nothing matters. They may also enjoy informing the informed that a more complete grasp on meaning requires one to have a firmer grasp on the totality of meaninglessness. The question I’ve always had, when encountering a mind that has embraced the “No Absolutes” philosophy, is whether they are pursuing a level of intelligence I’m not capable of attaining, or pursuing the appearance of it.

The Usage War: The Undermining of American Values


When I first heard the name Noam Chomsky, I learned that some regarded him as the father of modern linguistics, and I learned that he was considered a powerful force in America. How a man whose sole concern was language could have power outside the halls of academe confused me, and I dismissed him shortly thereafter. The subject of linguistics seemed a narrow conceit with a narrow appeal. As my knowledge of political science grew, I learned of the power of language, the power of this seemingly inconsequential subject, and how it has led to the least talked-about “war” of our times.

The late author David Foster Wallace called it a usage war, and he stated that it has been occurring since the late ’60s. Wallace’s primary concern was not the narrow definition of politics. Rather, he was concerned with the use of language, and the interpretation of it. This usage war is a war between two factions that Philip Babcock Gove, the editor-in-chief of the controversial Webster’s Third New International Dictionary, {1} described as descriptivists and prescriptivists.

“The descriptivists,” Gove writes, “are concerned with the description of how language is used, and the prescriptivists are concerned with how the language should be used.”

The late lexicographer Robert Burchfield furthered this description thusly: “A prescriptivist by and large regards (any) changes in the language as dangerous and resistible, and a descriptivist identifies new linguistic habits and records these changes in dictionaries and grammars with no indication that they might be unwelcome or at any rate debatable.” {2}

The descriptivists say that language is elastic, and it should bend to individual interpretations. Language, they say, should largely be without rules.

“Virtually all English language dictionaries today are descriptive. The editors will usually say that they are simply recording the language and how its words are used and spelled. Most Merriam-Webster dictionaries will note if certain words are deemed nonstandard or offensive by most users; however, the words are still included. Of modern dictionaries, only the Funk and Wagnall’s contains a certain amount of prescriptive advice. All the major dictionary publishers – Merriam-Webster, Times-Mirror, World Book, and Funk and Wagnall’s – will tell you that they are primarily descriptive.”{3}

Early on in life, we learned that if we were going to succeed in school, we would have to perfect our spelling and grammar. After we entered the real world, we learned that if we were going to succeed we would have to take it a step further and correct our speech to the codes of the politically correct lexicon.

We can guess that nearly everyone has learned, at some point in time, the relative machinations of acceptable discourse. We can guess that anyone who has spoken, or written, on a professional level has learned of the gains in perception one can accumulate, and lose, through the use and misuse of language. We can also guess that most realize how others manipulate their audience through language. The latter may be the key to the descriptivist movement in linguistics today.

Our introduction to manipulated perceptions often occurs when we enter the workforce. We may see these perceptions portrayed in movies and television, but we don’t experience them firsthand until we enter the workplace and they directly affect us. At that point, it becomes clear how others use language to shift the power of daily life.

If this form of manipulation were limited to the workplace, that would be one thing. It would be powerful, but that power would be limited to that particular environment. As we have all witnessed, when one successfully manipulates language, it doesn’t end when we clock out for the day. We accidentally, or incidentally, take these rules of usage, or speech codes, out of the workplace and into our everyday lives. David Foster Wallace catalogued these incremental actions and reactions in the book Consider the Lobster. It details the fact that lexicographers, like Philip Babcock Gove, have used dictionaries, and other methods, as a foundation for a usage war that has been occurring in America since the late ’60s.

How many of us have used incorrect terminology that violates the current rules of usage? How many of us have used the words “reverse discrimination” as opposed to the more politically correct term “affirmative action”? How many of us have called an illegal immigrant an illegal immigrant, only to be corrected with the term “undocumented worker”? How many of us have had a boss, or a member of the Human Resources department, tell us, in so many words, “I understand you have personal beliefs on this topic, but I hope you can see that they have no place in the workplace. You don’t want to offend anyone, I know that. You’re a nice guy.”

Most of us are nice people, and we don’t seek to offend the people we work with, our neighbors, or anyone else for that matter. To avoid giving offense, we follow the speech codes handed down from the Human Resources department to help us get along with other people. We then, unconsciously, take those speech codes to the bar, to family functions, and to our homes, until we find ourselves assimilated to the point that we’re correcting our friends.

“It’s a peccadillo,” they say, “a very slight sin, or offense; it’s not sexual relations with an intern. It’s a fib,” they say. “It’s not perjury before a grand jury.” It’s “environmentalist” not “anti-corporate socialist”. It’s a “feminist” not a “man hating female who can find no other way to succeed”, “multiculturalist” not “racial quota advocate”, “rainforest” not “gathering of trees”, “sexually active” not “promiscuous”, “economic justice” not “socialism”, “fairness” not “socialism”. It’s “giving back” not “class envy”, and it’s “community organizer” not “radical agitator”. This is the war, and these are the little battles within that war that the descriptivists and the liberals have been waging against the “normal” prescriptive American lexicon for generations, and they have succeeded beyond their wildest dreams.

This desire to be nice to other people, and to understand other cultures, is one of the advantages the descriptivists/liberals have in manipulating the language and winning the usage wars. When we find a person who may be different from us in some manner, we want to know how best to get along with them. We want to know their sensitivities, in other words, so we do not accidentally violate them. The question we should bring to the debate more often is how people learn the sensitivities of others. Are these sensitivities internal, or are they taught to us through repeated messaging? Most people are insecure, and they don’t know how to demand satisfactory treatment, but they can learn. An individual can learn that something is offensive, and they can learn how to communicate that offense.

“What’s wrong with that?” is a common reply to this notion. “What’s wrong with teaching people how they should be treated? We all just want to get along with one another.”

Prescriptivists would tell you that buried beneath all this “well-intentioned” manipulation of usage is the general loss of language authority. Prescriptivists ache over the inconsistencies brought to our language through slang, dialect, and other purposeful displays of ignorance regarding how the language works. They labor over the loss of standardized language, such as that in the classical works of a Geoffrey Chaucer. Most of them do not necessarily call for a return to Chaucer-era usage, but they are offended when we go to the opposite pole and allow words like “heighth” and “irregardless” into modern dictionaries. They also grow apoplectic when terms such as “you is” and “she be” become more acceptable in our descriptivist lexicon. They hide in a hole when standards of modernity allow sentences to begin with a conjunction, such as “and”, and they weep for the soul of language when casual conversation permits a sentence to end with a preposition such as “to”.

Language provides cohesion in a society, and it provides rules that foster like-mindedness in a people who want to get along. It’s fine to celebrate individuality, and some of the differences inherent in a melting pot as large as America’s, but if you have nothing to bind people together, the result can only be a degree of chaos.

Members of the descriptivism movement, on the other hand, celebrate the evolution of language:

Frank Palmer wrote in Grammar: “What is correct and what is not correct is ultimately only a matter of what is accepted by society, for language is a matter of conventions within society.”

John Lyons echoed this sentiment in Language and Linguistics: “There are no absolute standards of correctness in language.”

Henry Sweet said of language that it is “partly rational, partly irrational and arbitrary.”

It may be arbitrary in Sweet’s theoretical world of linguists seeking either to ideologically change the culture or to update it to allow for vernaculars in the current social mores, but in the real world of America today, are we doing our students, our language, or our culture any favors by constantly redefining usage? If our primary motivation for teaching arbitrary methods of usage is sensitivity to intellectual capacity, different cultures, and self-esteem, is the culture as a whole made better in the long run?

On the ideological front, the descriptivism movement has successfully implemented a requirement that all writers now use the pronouns “they” and “he or she” when seeking a general description of what a general person may do, or think. Repeated use of the general pronoun “he”, without qualifying it with the balanced usage of “she”, “they”, or “he or she”, is seen not only as antiquated, but as sexist and incorrect. The reason it is antiquated, those of the descriptivism movement say, is that it harkens back to a patriarchal, White Anglo-Saxon Protestant (WASP) society.

If you work in an office, and you send out any form of communication to a team of people, you know how successful the descriptivism movement has been in infiltrating our language in this regard. Yet there was a point in our history, a point in the not-so-distant past, when no one knew enough to be offended by the repeated use of “he” as a general pronoun. No one that I know of regarded this as improper, much less incorrect. Years of repeated messaging have created ‘gender neutral’ solutions to the point that schools, workplaces, and friends in our daily lives suggest that using “he” as a general pronoun is not just sexist, it is incorrect usage. Yet they deem using the pronoun “she” an acceptable alternative. If this complaint were limited to the narrow prism of politics, one could dismiss it as a member of the losing team’s hysteria, but we’re talking about the politics of language usage.

A political science professor once told our class that, in his opinion, lawbreaking became a little more acceptable when the federal government lowered the speed limit to fifty-five in 1974. His theory was that the fifty-five-mile-per-hour speed limit seemed arbitrarily low to most people, and they considered it unreasonable. Most people, he argued, were generally more law-abiding in the ’50s and, “regardless what you’ve read,” in the ’60s, but in the ’70s more people found the general idea of breaking the law acceptable, and he deemed this unreasonable 1974 limit on speed to be the antecedent. No one person, he concluded, no matter how powerful their voice is in a society as large as ours, could successfully encourage more people to break the law; only the society could do this, by creating a law that was seen as not only unreasonable but a little foolish.

Whether or not his theory is correct, it illustrates the idea that seemingly insignificant issues can change minds en masse. Could one person, no matter how powerful they may be in a society, teach people to be offended more often for more power in that society? Can political linguists dictate a certain form of usage by suggesting that anyone who doesn’t assimilate is acting with ulterior motives? Could it be said that Human Resources videos, which anyone who has ever been employed has spent countless hours watching, are being used not only to teach people how to get along with people different from them, but to teach those different people how they should be offended?

“Why does that person continue to use the general pronoun “he” instead of “he or she” or “they”? Are they trying to offend all the “shes” in the room?”

Everything stated thus far is common knowledge to those of us who operate in public forums in which we interact with a wide variety of people. What some may not know is that this “usage war” for the hearts and minds of all language users extends to the production of dictionaries.

If this is true, how can a dictionary be ideological? There are prescriptivist dictionaries that call for “proper” interpretations and use of language, and there are descriptivist dictionaries that evolve with common use. “Usage experts”, such as David Foster Wallace, consider the creation of these two decidedly different kinds of dictionaries salvos in the Usage Wars “that have been under way ever since an editor named Philip Babcock Gove first sought to apply the value-neutral principles of structural linguistics to lexicography in the 1961 Webster’s Third New International Dictionary of the English Language.”

Gove’s response to the outrage expressed by those prescriptivist conservatives who howled at his inclusion of “OK” and “ain’t” in Webster’s Third was: “A dictionary should have no truck with artificial notions of correctness or superiority. It should be descriptive and not prescriptive.” {4}

One of the other reasons that descriptivism eventually took hold is that it allowed for more “free form” writing. Descriptivism allows a writer to get their words down on paper without an overriding concern for proper communication, and it allows for expression without concern for proper grammar or a more formal lexicon. It allowed a writer to brainstorm, free-form, and journal without a “fussbudget” teacher correcting these thoughts into proper usage.

This was a relief to those who enjoyed expression without having to answer to a teacher who informed them they weren’t expressing themselves correctly. “How can one ‘express correctly’?” those of us who enjoyed expression asked. Without too much fear of refutation, I think we can say that the descriptivism movement won this argument for the reasons those who enjoyed creative expression brought forth. When one of my professors told me to get the expression down and that we would correct the spelling and grammar later, I considered myself liberated from what I considered the tyrannical barrier of grammatical dictates. It wasn’t too many professors later that I discovered teachers who went beyond “correcting the spelling and grammar later” to the belief that the self-esteem of the writer was paramount. If the student doesn’t get discouraged, this theory on usage suggested, they are more apt to express themselves more often. They are more inclined to sign up for a class that doesn’t “badger” a student with constant concerns of systematic grammar, usage, semantics, rhetoric, and etymology. One argument states that colleges based this lowering of standards on economics as much as on encouraging the student. Personal experience with this, along with the other examples listed above, paved the way for the descriptivism movement to move the language, and the culture, away from the prescriptivist rules of usage.

Some have said that the motivation of those in the descriptivism movement is not nearly as nefarious as those in the prescriptivism movement would have one believe. Descriptivists would have one believe that their goal is more an egalitarian attempt at inclusion and assimilation. They would have us believe that the prescriptivists’ grammar requirements, and lexicography, are exclusionary and elitist, but can we take these descriptivist interpretations and nuances into a job interview, a public speech, a formal letter, or even into a conversation among peers we hope to impress? Can we succeed in the current climate of America today with language usage that is wide open to a variety of interpretations?

An English as a Second Language (ESL) teacher once informed me that the “impossibly high standards” President George W. Bush, and his librarian wife, placed on her students made her job more difficult. I conceded that I was an outsider looking in, listening to her complaints, and that I didn’t know the standards she had to deal with on a daily basis. “But,” I said, “if we’re looking at the intention behind these impossibly high standards, could we say that they were put in place to help these non-English speakers learn the language at a level high enough for them to succeed in America?” This ESL teacher then complained that the standards didn’t take into account the varying cultures represented in her classroom. I again conceded to her knowledge of the particulars of these standards, but I added, “Your theoretical recognition of other cultures is wonderful, and it has its place in our large multicultural society, but when one of your students sits for a job interview, what chance do they have when competing against someone like the two of us, who are well-versed in the ‘impossibly high prescriptivist, standard white English, and WASP’ grammar and usage standards we were forced to learn in our classes?”

{1} http://en.wikipedia.org/wiki/Philip_Babcock_Gove

{2} http://stancarey.wordpress.com/2010/02/16/descriptivism-vs-prescriptivism-war-is-over-if-you-want-it/

{3} http://englishplus.com/news/news1100.htm

{4} Wallace, David Foster. Consider the Lobster. New York, NY: Little, Brown and Company, a Hachette Book Group, 2005. eBook.