Scat Mask Replica VIII


How much money does the average Fortune 500 company spend learning the mind of the consumer? How many psychologists, linguists, and marketers do their preferred research firms and marketing agencies consult before starting production on a commercial? Their job is to know what makes us laugh, what makes us cry, and what intrigues us long enough to pitch a product or idea. They also have the unenviable chore of finding a way to keep us from fast-forwarding through commercials. The average commercial is thirty seconds long, so advertisers need to pack a lot into a tight space. With all the time, money, and information packed into one thirty-second advertisement, one could say that commercials are better than any other medium at informing us where our culture is. One could even go so far as to say that each commercial is a short, detailed report on the culture. If that’s true, all one needs to do is watch commercials to know that the art of persuasion has altered dramatically in our post-literate society.

Booksellers argue that we don’t live in a post-literate society, as their quarterly reports indicate that books are selling better than ever. I don’t question their accounting numbers, but some of the commercials big corporations use to move product are so dumbed down and condescending that I wonder if fewer and fewer people are buying more and more books.

When advertisers make their pitch, they go to great pains, financially and otherwise, to craft wonderful messages. They then hire a wonderful actor, or spokesman, to be the face of the company. The companies that employ these advertising agencies want, of course, for the consumer to find them just as wonderful. If you’re not a wonderful person, their carefully tailored message suggests you can be if you follow their formula. If I am forced, for whatever reason, to watch a commercial, I find these pitches so condescending that they almost make me angry.

Thirty seconds is not a lot of time when it comes to the art of persuasion, so advertising agencies take shortcuts to appeal to us. These shortcuts often involve quick emotional appeals. The problem is that people who watch commercials adopt these shortcuts and begin using them in everyday conversation.

I find the quick, emotional appeals these research and marketing firms dig up so appalling that I avoid commercials as much as possible. I find the opposite so appealing, in comparison, that I probably give attempts at fact-based, critical thinking more credit than they deserve. I walk away thinking, “Hey, that’s a good idea!” whether it’s actually a good idea or not, because I appreciate the thought they put into making a rational appeal.

Some quick, emotional appeals add crying to their art of persuasion. “Don’t cry,” I say. “Prove your point.” A picture says a thousand words, right? Wrong. We’ve all come to accept the idea that powerful figures and companies require an array of consultants to help them tailor their message for greater appeal. Yet, if one has facts on their side, they shouldn’t need to cry. They shouldn’t need to hire consultants, they shouldn’t need attractive spokesmen, and the idea that they “seem nice and wonderful” shouldn’t matter either. I know it’s too late to put the genie back in the bottle, but I think the art of persuasion should be devoid of superficial and emotional appeal.

***

Marketing firms and their research arms also spend an inordinate amount of time discussing “the future”. Some ads deliver their pitch in foreboding tones, and some discuss the future with excitement. Our knowledge of the future depends on our knowledge of the past. As evidence of that, we look to our senior citizens. They don’t pay attention to the present, because they find it mostly redundant. “What are you kids talking about these days?” they ask. We inform them. “That’s the same thing we were talking about 50 years ago.” Impossible, we think, we’re talking about the here and now. They can’t possibly understand the present. They can, because it’s not as different from the past as we want to believe. The one element that remains a constant throughout is human nature.

You’re saying that all the change we’ve been fighting for will amount to nothing? It depends on the nature of your fight. Are you fighting to change human nature? If so, there’s an analogy that suggests, if you’re trying to turn a speedboat, all you have to do is flick a wrist. If you’re trying to change the direction of a battleship, however, you should prepare for an arduous, complicated, and slow turn. My bet is that once we work through the squabbles and internecine battles of the next fifty years, the future will not change as much as these doomsayers want it to, and if it does, it will probably be for the better.       

***

Brian Dettmer

How many people truly want to create works of art? “I would love to write a book,” is something many people say. How many want it so badly that they’re willing to endure the trial and error involved in the process of getting to the core of a unique, organic idea? How many of us know firsthand what a true artist has to go through? If others knew what artists have to go through, I think they would say, “Maybe I don’t want it that badly.”

We prefer quick, emotional appeals. How many overnight geniuses are there? How many artists write one book, record one album, or paint one painting to mass appeal? How many of them were able to generate long-term appeal? We should not confuse appeal with best-seller status. Becoming a best seller, or attaining market appeal, is, to some degree, not up to the artist. They might have a hand in the marketing process, but appeal is largely up to the consumer. The only thing an artist can do is create the best product possible in the large and small ways an artist creates. In this vein, creating art involves a process so arduous that it would intimidate most people.

On the flip side, some say that there are artistic, creative types, and there are the others. There’s no doubt that there are varying levels of talent, but I believe that with enough time and effort most people could create something beautiful and individualistic.

Leonardo da Vinci was a talented artist who painted some of the greatest pieces of art in world history. From what I’ve read about the man, however, he achieved so much in the arts that it began to bore him. After working through his apprenticeship and establishing himself as one of the finest painters of his day, he received numerous commissions for various works of art from the wealthy people and government officials around him. He turned some down, never started others, and failed to complete a whole lot more. One theory I’ve heard on da Vinci is that if he had a starving-artist period, he probably created hundreds of thousands of pieces in that period, but a vast majority of those pieces were destroyed or otherwise lost to history. By the time he achieved a level of stature where those in his day wanted to preserve his work, painting bored him so much that he created comparatively few pieces. Either that, or in the course of his attempts to create that elusive “perfect piece” da Vinci began studying the sciences to give his works greater authenticity. In the course of those studies, he became more interested in the sciences than he was in painting. These are just theories on why we only have seventeen confirmed pieces from Leonardo da Vinci, but they sound plausible to me.

***

There is a hemispheric divide between creative types and math and science types. One barometer I’ve found to distinguish the two is the Beatles. So many types love the Beatles that we can tell what type of brain we’re dealing with by asking them which Beatles era they prefer. With the obvious distinctions in style, we can break the Beatles down into two distinct eras: the moptop era, which includes everything they did before Sgt. Pepper’s Lonely Hearts Club Band, and the “drug-induced” era that followed. Numbers-oriented people generally love the moptop era more, and the creative, more right-brain thinkers tend to prefer Sgt. Pepper’s and everything that followed. The moptop-era fans believe the Beatles were a better band during the moptop era, because “they were more popular before Sgt. Pepper’s. Back then,” they say, “the Beatles were a phenomenon no one could deny.” Moptop-era fans often add that “the Beatles got a little too weird for my taste in the ‘drug-induced’ albums that followed.” Although there is some argument over which album sold the most at the time of release, it is generally argued that the latter half of their discography actually sold more than the first half. Numbers-oriented people should recognize that the latter albums were bound to sell more, if for no other reason than the moptop Beatles built a fan base who would purchase just about anything the band created after the moptop era. Those who lived during the era, however, generally think that the Beatles were less controversial and thus more popular during their moptop era, and if you’ve ever entered into this debate you know it’s pointless to argue otherwise. We creative types would never say that the pre-Sgt. Pepper’s Beatles didn’t have great singles, and Revolver and Rubber Soul were great albums, and we understand that those who lived during the era have personal, romantic attachments to their era of Beatles albums, but we can’t understand how they fail to recognize the transcendental brilliance of the latter albums. We think the brilliance and the creativity the Beatles displayed on Sgt. Pepper’s and everything that followed provided a continental divide no one can dispute.

Further evidence of the popularity of the latter half of the Beatles catalog arrived in 1973, when two greatest-hits compilations were released simultaneously for fans who weren’t around for the Beatles during their era. The blue album, which covers the 1967-1970, post-Sgt. Pepper’s era, has sold 17 million copies to date, while the red, 1962-1966, moptop-era album has sold 15 million. As anyone who has entered into this debate knows, however, it’s an unwinnable war.

Consider the Lobster: A Review


This book starts out the way most brilliant pop psychology books do: from an angle you may never have considered before. Since this book is a collection of divergent essays, it should be reviewed chapter by chapter and essay by essay. The first essay, “Big Red Son,” involves comedic talk of the porn industry. To be fair to the author, David Foster Wallace, this essay was first written in 1998, and some may conclude it unfair to declare it dated, but I didn’t read it until 2012, so I am forced to say that this material had been mined for all it was worth by the time of my reading. (See Chuck Palahniuk’s Snuff.) The second chapter, “Some Remarks on Kafka’s funniness…”, whets the appetite. The general idea behind this chapter, that humor is not very sophisticated today, has been mined by those of us obsessed with pop culture, but Wallace does get some points for listing the specific problems with a current sense of humor that doesn’t understand the sophisticated and subtle humor of author Franz Kafka. He says: “Kafka’s humor has almost none of the particular forms and codes of contemporary US amusement.” This launches Wallace into a detailed list of complaints about contemporary humor brought to the homes of TV watchers.

David Foster Wallace

“Kafka’s humor has almost none of the particular forms and codes of contemporary U.S. amusement. There’s no recursive wordplay or verbal stunt-pilotry, little in the way of wisecracks or mordant lampoon. There is no body-function humor, nor sexual entendre, nor stylized attempts to rebel by offending convention. No slapstick with banana peels or rogue adenoids. There are none of the ba-bing ba-bang reversals of modern sitcoms; nor are there precocious children or profane grandparents or cynically insurgent coworkers. Perhaps most alien of all, Kafka’s authority figures are never just hollow buffoons to be ridiculed, but they are always absurd and scary and sad all at once.”

The point that Wallace attempts to make is that his students don’t understand Kafka’s absurdist wit, because they are more accustomed to being spoon-fed their entertainment. They’re not used to having to think through something as complex as Kafka’s central joke:

“That the horrific struggle to establish a human self results in a self whose humanity is inseparable from that horrific struggle. That our endless and impossible journey toward home is in fact our home.”

The chapter is worth reading not for its “When I was a kid, we had to walk ten miles to school” style of complaining about the youth of the day, but for the illustrative manner in which Wallace complains about humor in general, a complaint that, this author suspects, may not be merely generational.

The fourth chapter may be the selling point for this book. In it, Wallace describes a war that has been occurring within the English language for a couple of generations now. Wallace calls it a Usage War. One side, the more traditional side, AKA the prescriptive side, pleads for a return to traditional English. The other, more modern side, which describes itself as a more scientific study of the language, seeks to update our usage on a more inclusive plane. The latter, called the descriptive side, calls for more political correctness in its language. It calls for a more comprehensive list of words and usages that incorporates styles of language such as Ebonics and words that are in common use, such as “irregardless”. Before reading this, I had heard that tired phrase “everything is political”, but I had no idea that phrase could be extended to dictionaries. The author’s reporting on this subject is excellent. It is informative without being biased, and it presents both viewpoints with enough objectivity to allow you to decide which side is more conducive to progress in our language.

Wallace is not as unbiased in his John McCain chapter, however. He makes sure, in the opening portions of an article paid for by the unabashedly liberal periodical Rolling Stone, that his colleagues know he is not a political animal (i.e., he is stridently liberal). He lets them know he voted for Bill Bradley. Other than the requisite need a writer of a Rolling Stone article feels to display his liberal bona fides, it’s not clear why Wallace would include his opinion in a piece that purports to cover an election campaign. If I were granted the honor of being paid to cover a Nancy Pelosi campaign, for example, I would not begin the piece with a couple of paragraphs describing how I feel about her politics, but such is the state of journalism in America today…particularly in the halls of the unabashedly liberal Rolling Stone.

To have such an article begin with a political screed different from mine would normally turn me off, but I’ve grown used to it. (I know, I know, there is no bias.) The real turnoff occurs after the reader wades through the partisan name-calling to the languid dissertation on the minutiae involved on a campaign bus. If you’re ever aching to know what goes on in a political campaign, I mean really aching to know, this is the chapter for you. I would say that most are curious about the machinations that occur behind the scenes, but I would also say that most of those same people would have their curiosity tested by Wallace’s treatment here. He says that the editors at Rolling Stone edited the piece, and that he always wanted to provide his loyal readers a director’s cut. After reading through the first twenty pages of this chapter, I was mentally screaming for that editor to step in and assist me through the piece. It’s not that it’s poor writing, nor that it’s entirely without merit, but you REALLY have to be one who is aching to know the inner workings of a campaign. You have to want to know about bathroom difficulties (such as keeping a bathroom door closed on a tour bus); you have to want to know what reporters eat, why they eat it, and when. You have to want your minutiae wrapped in minutiae, until your eyes bleed with detail. It’s a cardinal rule of mine to never skip passages. I live with the notion that I can learn something from just about everything an author I deem worthy writes, and I deem Wallace to be a quality writer with an adept and varying intellect, but I had to break my cardinal rule with this chapter. It was too painful a slog.

As for the chapter on Tracy Austin, Wallace laments the fact that championship-level athletes aren’t capable of achieving the degree of articulation he wants when he purchases one of their autobiographies. Tracy Austin, for those who don’t know, was a championship-level tennis player. Wallace purchased her autobiography hoping that, as an adult long since removed from the game of tennis, Austin would be able to elucidate the heart of a champion. He hoped that Austin would be able to describe for us what went through her mind at the moment she achieved the pinnacle of her career, and he wanted to know what she thought about the accident that led to her premature retirement. He wasn’t just disappointed, he writes, in the manner that he is disappointed with sideline interviews that are loaded with “we give it 110%, one game at a time, and we rise and fall as a team” style clichés. He sums up his disappointment with the following:

“It may well be that we spectators, who are not divinely gifted as athletes, are the only ones able truly to see, articulate, and animate the experience of the gift we are denied. And that those who receive and act out the gift of athletic genius must (out of necessity) be blind and dumb about it—and not because blindness and dumbness are the price of the gift, but because they are its essence.”

In other words, we are able to express these ideas because we concentrate on the arena of the mind, while their concentration lies in physical prowess. We, non-athlete types, think about the things they do, we fantasize about them, and they do them. We think about how glorious it would be to sink a championship-winning basket over Bryon Russell; Michael Jordan just does it. We think about, and write about, that incredibly perfect and physically impossible baseline shot of a Tracy Austin; Tracy just does it. We see the replays of their exploits endlessly repeated on SportsCenter, and we hear almost as many different analyses of them. We then think about these plays from all the varied angles that are provided, and we project ourselves onto that platform. We don’t think about all the rigorous hours a Michael Jordan spent in gyms preparing for that moment; we simply think about that moment, and what it would mean to us to have conquered such a moment. So, when one of these athletes steps away from that stage to offer us a few words about that moment, and those few words center around the “I just did it” meme, we are profoundly disappointed. To paraphrase Yoda, “They don’t think, they do, or they do not.” They use the force granted to them through spending a greater percentage of their lives in gyms, on tennis courts, and in weight rooms. They concentrate on muscle memory to prevent the mind from interfering with their eventual completion of the act. If we, non-athlete types, were in a similar situation, we would think about the significance of the history of the game, the profundity of the moment, how this moment may affect the rest of our lives, how many people are watching us, whether Bryon is a better athlete than we are and whether he will block our shot, and what the fellas are going to say about this play after the game, and we would become so immersed in the enormity of the moment that we would probably think too much to make the shot. The point is that they’ve made that shot so many times, in so many different ways and games, that they simply rely on muscle memory to make the championship shot. They may think about that shot for as long as it takes them to project it, but once they step on the court, they go on auto-pilot and complete the mission. They would love to give Hemingway-esque descriptions of their game that satisfy us all and land them a weekly spot on ESPN’s Pardon the Interruption, but for all the reasons described here, they are simply not capable of it.

I used to wonder what announcers were talking about when they said, “He’s too young to understand what this means.” This kid, as you call him, has been playing this game his whole life, and he’s lived the life of the championship-level athlete, which means sacrificing the norms of daily life his peers knew, and he’s done all that for “this” moment. What do you mean he doesn’t know what it means? It dawned on me, after a couple of struggles with it, that “this kid” doesn’t know what this moment would mean to that announcer … or to those of us at home watching. In that post-game interview, then, we’re looking for something, some little nugget we can identify with. When we get phrases from the cliché vault, we’re so disappointed that they didn’t put more effort into identifying with our sense of their glory. We’re frustrated that they couldn’t reach us on our level. Yet, as Wallace states, it is the essence of a championship-level athlete to be “blind and dumb” during the moments that define them, and we all know this to one degree or another. We’ve all seen these championship-level athletes interviewed about their individual moments thousands of times, so why do we continue to be so frustrated with them, and does this continued sense of frustration say more about them or about us?

The Usage War: The Undermining of American Values


When I first heard the name Noam Chomsky, I learned that some regarded him as the father of modern linguistics, and I learned that he was considered a powerful force in America. How a man whose sole concern was language could have power outside the halls of academe confused me, and shortly thereafter I dismissed him. The subject of linguistics seemed a narrow conceit with a narrow appeal. As my knowledge of political science grew, however, I learned of the power of language, the power of this seemingly inconsequential subject, and how it has led to the least talked about “war” of our times.

The late author David Foster Wallace called it a usage war, and he stated that it has been occurring since the late ’60s. Wallace’s primary concern was not the narrow definition of politics. Rather, he was concerned with the use of language and the interpretation of it. This usage war is a war between two factions that Philip Babcock Gove, the editor-in-chief of the controversial Webster’s Third New International Dictionary, {1} described as a battle between descriptivists and prescriptivists.

“The descriptivists,” Gove writes, “are concerned with the description of how language is used, and the prescriptivists are concerned with how the language should be used.”

The late lexicographer Robert Burchfield furthered this description thusly: “A prescriptivist by and large regards (any) changes in the language as dangerous and resistible, and a descriptivist identifies new linguistic habits and records these changes in dictionaries and grammars with no indication that they might be unwelcome or at any rate debatable.” {2}

The descriptivists say that language is elastic, and it should bend to individual interpretations. Language, they say, should largely be without rules.

“Virtually all English language dictionaries today are descriptive. The editors will usually say that they are simply recording the language and how its words are used and spelled. Most Merriam-Webster dictionaries will note if certain words are deemed nonstandard or offensive by most users; however, the words are still included. Of modern dictionaries, only the Funk and Wagnall’s contains a certain amount of prescriptive advice. All the major dictionary publishers – Merriam-Webster, Times-Mirror, World Book, and Funk and Wagnall’s – will tell you that they are primarily descriptive.”{3}

Early on in life, we learned that if we were going to succeed in school, we would have to perfect our spelling and grammar. After we entered the real world, we learned that if we were going to succeed we would have to take it a step further and conform our speech to the codes of the politically correct lexicon.

We can guess that nearly everyone has learned, at some point in time, the relative machinations of acceptable discourse. We can guess that anyone who has spoken, or written, on a professional level has learned of the perceptive gains one can accumulate, and lose, with the use, and misuse, of language. We can also guess that most realize how others manipulate their audience through language. The latter may be the key to the descriptivist movement in linguistics today.

Our introduction to manipulated perceptions often occurs when we enter the workforce. We may see these perceptions parlayed in movies and television, but we don’t experience them firsthand until we enter the workplace and they directly affect us. At that point, it becomes clear how others use language to shift the power of daily life.

If this form of manipulation were limited to the workplace, that would be one thing. It would be powerful, but that power would be limited to that particular environment. As we have all witnessed, when one successfully manipulates language, it doesn’t end when we clock out for the day. We accidentally, or incidentally, take these rules of usage, or speech codes, out of the workplace and into our everyday lives. David Foster Wallace catalogued these incremental actions and reactions in the book Consider the Lobster. It details how lexicographers like Philip Babcock Gove have used dictionaries, and other methods, as a foundation for a usage war that has been occurring in America since the late ’60s.

How many of us have used incorrect terminology that violates the current rules of usage? How many of us have used the term “reverse discrimination” as opposed to the more politically correct term “affirmative action”? How many of us have called an illegal immigrant an illegal immigrant, only to be corrected with the term “undocumented worker”? How many of us have had a boss, or a member of the Human Resources department, tell us, in so many words, “I understand you have personal beliefs on this topic, but I hope you can see that it has no place in the workplace. You don’t want to offend anyone, I know that. You’re a nice guy.”

Most of us are nice people, and we don’t seek to offend the people we work with, our neighbors, or anyone else for that matter. To do this, we follow the speech codes handed down from the Human Resources department to help us get along with other people. We then, unconsciously, take those speech codes to the bar, to family functions, and to our home, until we find ourselves assimilated to the point that we’re correcting our friends.

“It’s a peccadillo,” they say, “a very slight sin, or offense; it’s not sexual relations with an intern. It’s a fib,” they say. “It’s not perjury before a grand jury.” It’s “environmentalist” not “anti-corporate socialist”. It’s “feminist” not “man-hating female who can find no other way to succeed”, “multiculturalist” not “racial quota advocate”, “rainforest” not “gathering of trees”, “sexually active” not “promiscuous”, “economic justice” not “socialism”, “fairness” not “socialism”. It’s “giving back” not “class envy”, and it’s “community organizer” not “radical agitator”. This is the war, and these are the little battles within that war that the descriptivists and the liberals have been waging against the “normal” prescriptive American lexicon for generations, and they have succeeded beyond their wildest dreams.

This desire to be nice to other people, and to understand other cultures, is one of the advantages the descriptivists/liberals have in manipulating the language, and winning the usage wars. When we find a person who may be different from us in some manner, we want to know how best to get along with them. We want to know their sensitivities, in other words, so we do not accidentally violate them. The question we should bring to the debate more often is: how do people learn the sensitivities of others? Are these sensitivities internal, or are they taught to us through repeated messaging? Most people are insecure, and they don’t know how to demand satisfactory treatment, but they can learn. An individual can learn that something is offensive, and they can learn how to communicate that offense.

“What’s wrong with that?” is a common reply to this notion. “What’s wrong with teaching people how they should be treated? We all just want to get along with one another.”

Prescriptivists would tell you that buried beneath all this “well-intentioned” manipulation of usage is the general loss of language authority. Prescriptivists ache over the inconsistencies brought to our language through slang, dialect, and other purposeful displays of ignorance regarding how the language works. They labor over the loss of standardized language, such as that in the classical works of a Geoffrey Chaucer. Most of them do not necessarily call for a return to Chaucer-era usage, but they are offended when we go to the opposite pole and allow words like “heighth” and “irregardless” into modern dictionaries. They also grow apoplectic when terms such as “you is” and “she be” become more acceptable in our descriptivist lexicon. And they hide in a hole when standards of modernity allow sentences to begin with a conjunction, such as “and”, and they weep for the soul of language when casual conversation permits a sentence to end with a dangling “to”.

Language provides cohesion in a society, and it provides rules that foster like-mindedness in a people who want to get along. It’s fine to celebrate individuality, and some of the differences inherent in a melting pot as large as America’s, but if you have nothing to bind people together, the result can only be a degree of chaos.

A member of the descriptivism movement, on the other hand, celebrates the evolution of language:

Frank Palmer wrote in Grammar: “What is correct and what is not correct is ultimately only a matter of what is accepted by society, for language is a matter of conventions within society.”

John Lyons echoed this sentiment in Language and Linguistics: “There are no absolute standards of correctness in language.”

Henry Sweet said of language that it is “partly rational, partly irrational and arbitrary.”

It may be arbitrary in Sweet’s theoretical world of linguists seeking to either ideologically change the culture or update it to allow for vernaculars in the current social mores, but in the real world of America today, are we doing our students, our language, or our culture any favors by constantly redefining usage? If our primary motivation for teaching arbitrary methods of usage is sensitivity to intellectual capacity, different cultures, and self-esteem, is the culture as a whole made better in the long run?

On the ideological front, the descriptivism movement has successfully implemented a requirement that all writers now use the pronouns “they” or “he or she” when seeking a general description of what a person may do, or think. Repeated use of the general pronoun “he” without qualifying it with the balanced usage of “she”, “they”, or “he or she” is seen not only as antiquated, but as sexist and incorrect. The reason it is antiquated, those of the descriptivism movement say, is that it harkens back to a patriarchal, White Anglo-Saxon Protestant (WASP) society.

If you work in an office, and you send out any form of communication to a team of people, you know how successful the descriptivism movement has been in infiltrating our language in this regard. Yet there was a point in our history, a point in the not-so-distant past, when no one knew enough to be offended by the repeated use of “he” as a general pronoun. No one I know of regarded this as improper, much less incorrect. Years of repeated messaging have created ‘gender neutral’ solutions, to the point that schools, workplaces, and friends in our daily lives suggest that using “he” as a general pronoun is not just sexist; it is incorrect usage. Yet they deem using the pronoun “she” an acceptable alternative. If this complaint were limited to the narrow prism of politics, one could dismiss it as a member of the losing team’s hysteria, but we’re talking about the politics of language usage.

A political science professor once told our class that, in his opinion, law breaking became a little more acceptable when the federal government lowered the speed limit to fifty-five in 1974. His theory was that the fifty-five-mile-per-hour speed limit seemed arbitrarily low to most people, and they considered it unreasonable. His theory was that most people were generally more law-abiding in the ’50s, and (“regardless of what you’ve read”) in the ’60s, but in the ’70s more people found the general idea of breaking the law more acceptable, and he deemed this unreasonable 1974 limit on speed to be the antecedent. His theory was that no one person, no matter how powerful their voice is in a society as large as ours, could successfully encourage more people to break the law, and that only the society could do this by creating a law that was seen as not only unreasonable, but a little foolish.

Whether or not his theory is correct, it illustrates the idea that seemingly insignificant issues can change minds en masse. Could one person, no matter how powerful they may be in a society, teach people to be offended more often for more power in that society? Can political linguists dictate a certain form of usage by suggesting that anyone who doesn’t assimilate does so with ulterior motives? Could it be said that Human Resources videos (which anyone who has been employed has spent countless hours watching) are not only being used to teach people how to get along with people different from them, but how those different people should be offended?

“Why does that person continue to use the general pronoun ‘he’ instead of ‘he or she’ or ‘they’? Are they trying to offend all the ‘shes’ in the room?”

Everything stated thus far is common knowledge to those of us who operate in public forums in which we interact with a wide variety of people. What some may not know is that this “usage war” for the hearts and minds of all language users extends to the production of dictionaries.

How can a dictionary be ideological? There are prescriptivist dictionaries that call for “proper” interpretations and use of language, and there are descriptivist dictionaries that evolve with common use. “Usage experts”, such as David Foster Wallace, consider the creation of these two decidedly different dictionaries salvos in the Usage Wars “that have been under way ever since an editor named Philip Babcock Gove first sought to apply the value-neutral principles of structural linguistics to lexicography in the 1961 Webster’s Third New International Dictionary of the English Language.”

Gove’s response to the outrage expressed by those prescriptivist conservatives who howled at his inclusion of “OK” and “ain’t” in Webster’s Third was: “A dictionary should have no truck with artificial notions of correctness or superiority. It should be descriptive and not prescriptive.” {4}

One of the other reasons that descriptivism eventually took hold is that it allowed for more “free-form” writing. Descriptivism allows a writer to get their words down on paper without an overriding concern for proper communication. It allows for expression without concern for proper grammar or a more formal, proper lexicon. It allows a writer to brainstorm, free-write, and journal without a “fussbudget” teacher correcting these thoughts into proper usage.

This was a relief to those of us who enjoyed expression without having to answer to a teacher who informed us we weren’t expressing ourselves correctly. “How can one ‘express correctly’?” those of us who enjoyed expression asked. Without too much fear of refutation, I think we can say that the descriptivism movement won this argument for the reasons those who enjoyed creative expression brought forth. When one of my professors told me to get the expression down and we would correct the spelling and grammar later, I considered myself liberated from what I considered the tyrannical barrier of grammatical dictates. It wasn’t too many professors later that I discovered teachers who went beyond “correcting the spelling and grammar later” to the belief that the self-esteem of the writer was paramount. If the student doesn’t get discouraged, this theory on usage suggests, they are more apt to express themselves more often. They are more inclined to sign up for a class that doesn’t “badger” a student with constant concerns of systematic grammar, usage, semantics, rhetoric, and etymology. One argument states that colleges based this lowering of standards on economics, as much of what they did was geared toward encouraging the student. Personal experience with this, along with the other examples listed above, paved the way for the descriptivism movement to move the language, and the culture, away from the prescriptivist rules of usage.

Some have said that the motivation for those in the descriptivism movement is not nearly as nefarious as those in the prescriptivism movement would have one believe. Descriptivists would have one believe that their goal was more an egalitarian attempt at inclusion and assimilation. They would have one believe that the prescriptivists’ grammar requirements, and lexicography, are exclusionary and elitist, but can we take these descriptivist interpretations and nuances into a job interview, a public speech, a formal letter, or even into a conversation among peers we hope to impress? Can we succeed in the current climate of America today with language usage that is wide open to a variety of interpretations?

An English as a Second Language (ESL) teacher once informed me that the “impossibly high standards” President George W. Bush, and his librarian wife, placed on her students made her job more difficult. I conceded that I was an outsider looking in, listening to her complaints, and that I didn’t know the standards she had to deal with on a daily basis. “But,” I said, “if we’re looking at the intention behind these impossibly high standards, could we say that they were put in place to assist these non-English speakers in learning the language at a level high enough for them to succeed in America?” This ESL teacher then complained that the standards didn’t take into account the varying cultures represented in her classroom. I again conceded to her knowledge of the particulars of these standards, but I added, “Your theoretical recognition of other cultures is wonderful, and it has its place in our large multicultural society, but when one of your students sits for a job interview, what chance do they have when competing against someone like the two of us, who are well-versed in the ‘impossibly high prescriptivist, standard white English, and WASP’ grammar and usage standards we were forced to learn in our classes?”

{1} http://en.wikipedia.org/wiki/Philip_Babcock_Gove

{2} http://stancarey.wordpress.com/2010/02/16/descriptivism-vs-prescriptivism-war-is-over-if-you-want-it/

{3} http://englishplus.com/news/news1100.htm

{4} Wallace, David Foster. Consider the Lobster. New York, NY: Little, Brown and Company, a Hachette Book Group, 2005. eBook.

 

Scat Mask Replica (20)


1) I never noticed how profoundly TV affects the culture until I stopped watching it as often. I now hear people repeating common phrases I’ve never heard before. I hear people laughing at the same jokes, gesticulating and posturing in similar ways when they tell jokes, and they may start laughing at jokes others tell before the joke is even finished. They seem to know the same stories and the same jokes. They seem to have the same rhythm to their jokes, and they all land on the same note when they hit their punch line. It gives us all comfort to hear a story or a joke that we know, and to know where it’s headed. Our brain rewards us with a shot of dopamine when we figure out the pattern of a story, joke, or song before it’s concluded. Dopamine makes us feel good for a moment, so we all watch the same TV shows and listen to the same songs over and over again, because we know where they’re headed. We hang out with people who say “okay, right” and tell the same jokes in the same manner and land on the punch line in the same note, because they make us feel intelligent and funny, and we get our dopamine rewards. We couldn’t do it without them, because we are a complex species that needs companionship.

2) Movie studios spend big money to put attractive people in movie roles, and we pay big money to watch them walk and talk with one another on screen. It’s not about being gorgeous, however, because audiences often spend time trying to spot the flaws in the flawless. Those who appear on screens most often have a quality about them that we enjoy watching for 90 minutes. Part of this quality is beauty, but another part of it is that elusive, indefinable “I don’t know, but I know it when I see it” quality.

3) As a ten-year-old, I was able to fool most of the adults most of the time. I played the role of the innocent child who didn’t know any better. More often than not, I did know better. My peers knew that, but the adults were bent on understanding me better and being sympathetic. My fellow ten-year-olds would scoff in my general direction. We adults should be scoffing, but we don’t. We don’t because we want to be viewed as intelligent and sympathetic individuals. We want to understand criminals, but more than that we want to be seen as individuals who are trying to understand. We don’t want to believe in absolutes. We say there are no absolutes, and this makes us feel like our thinking is complex. Maybe there aren’t any absolute, 100% truths, but isn’t a truth that is true 50.1% of the time enough to act on? The ten-year-old mind deals more in absolutes than the progressive, complex mind of the adult, but there are times when the absolutes are a lot closer to the truth.

4) I had a friend who described himself as “very sincere some of the times.” How can one be “very sincere” some of the times? I can see how a person would be very sincere about some things and insincere about others, but how can one characterize themselves as very sincere some of the times in a general manner?

5) There is a struggle in every mind to be intellectual. There is also a resultant struggle to be perceived as an intellectual. Unfortunately, many forgo the former, internal struggle and place too much value in the latter.

6) I’m toilet trained, but every once in a while I imagine what I would look like if I suckled breasts as big as mountains. Would I have crooked teeth and mongoloid eyes?

7) Some people complain that they have no choice in life. This is a fallacy, for the most part, but they lean on this to explain why they are not doing what they want to do in life. If it is true, in the present, the only reason they have no choice is because of the decisions they made in the past. This is true of most people, and you are not an exception to this rule.

8) Relaxing the mind during the respites reserved for artistic venues (i.e., movies) can produce a Chinese water torture effect. What starts out as meaningless drips hitting your forehead can incrementally evolve into accepting ideas that we would not otherwise consider.

9) I’ve tried being one of those guys that changed his underwear every day. It never got me anywhere.

10) If all theory is based on autobiography, then what does it say about those who pose theories on why and how people think? On that note, what’s the most terrifying motive for slaughtering a bunch of people? Nothing. We search for motive, because we need a motive, and the thought that a person could kill people for the thrill of the kill might prevent us from leaving the home as often as we do.

11) I walk into a department store and I see aisles upon aisles of things I’ll never need, yet some of them are red and sparkly. I wonder if these products could change my life. What will happen if I don’t purchase this latest, greatest, top-of-the-line product that has resulted in happiness and peace on earth for those who weren’t afraid to purchase now at a new, low price? Would my stubborn decision not to purchase such products result in me being forever portrayed in black and white, with a miserable face that conveys the complete anguish and degree of dissatisfaction in life that the rest of the human species was in before they decided to indulge in this incredible convenience? I need to be in color again. I need to be the guy in the after picture with a smile so bright he doesn’t mind the backbreaking work of this task anymore. This guy in black and white suggests a certain nutrient depletion that I simply can’t go back to. Look at the scowls that guy makes as he works with the product that has served me well for so many years. Was I ever that miserable? I don’t want to be miserable anymore. Look at that guy. He looks like the most miserable guy since that feller who had his chest picked at by the bird in Greek mythology.

12) Pet peeve: people who quote Hollywood stars and give that star sole credit for the quote. “You know, it’s like Jack Nicholson says …” If that quote came from a movie, I want to say, it’s likely that quote wasn’t a Nicholson creation. More often than not, it was a line a screenwriter wrote for him. The primary reason this bothers me (other than the fact that few put any effort into finding the actual writer of that quote, and even fewer will give that writer the credit he has earned) is that when a naïve, moronic star (not Nicholson) says something political, we listen. Why do we listen to them? Because if that star is smart enough, or lucky enough, or in the right place at the right time often enough, he can compile enough lines over time to achieve a certain degree of credibility with us, a credibility he can take off screen with him. After they deliver enough of these lines over the years, in movies and TV, our conditioning might be such that we believe these stars are smart based on lines written for them by other people.

13) Some people look at total strangers and think they’re total idiots. Others look at total strangers and think they have life all figured out. I got a little secret for you though. Something that may change your life: Most of us aren’t looking back at you. Most of us don’t care about you. So move on. Live your life and deal with it as it is. Quit worrying if anyone’s impressed with you or onto you. We don’t care about you.

14) One of the worst things Jerry Seinfeld and Larry David brought to the American conversation is the hygienic conversation. I heard these conversations sporadically before Seinfeld hit the air, but in the aftermath of the great show it seems every fifth conversation I hear involves the minutiae of cleanliness. People now proudly proclaim to their friends that they not only wash their hands, but they use the paper towel to open the door. “Oh, I know it!” their listener proclaims proudly. “It’s gross!” I have no problem with exaggerated methods of cleanliness, but to have non-stop conversations about it? Last week, I saw two fellas form a friendship on the basis that they both used disposable paper towels to open public bathroom doors. They both respected one another’s bathroom ethics, and they are now friends. It’s all a little silly at some point.

15) Another aspect of life we waste a lot of conversational time on is cell phones. We talk about our cell phone plans in a competitive manner. We talk about the ‘Gigs’ on our cell phone, the time it takes us to pull information off the web, the portability, the horrors of our prior plan, the ease with which we can text, and the apps our service offers, and we do it all with personal pride. We tell our peers that our phones are superior, as if we had something to do with their creation. We may not know where we stand on the various totem poles of life, and we may still have no idea what Nietzsche was going on about, but we know that our cell phone is superior to yours, and some of the times that’s enough.

16) Talking head types love to be unconventional as long as it ticks off the right people. I’ve always thought there was something conventionally unconventional about that.

17) I’ve always wanted to have a name like Bert Hanratty. When I do something wrong, my boss could scream: “Hanratty!” I would then walk to the boss’s desk like a ’70s sitcom star who is always messing things up in a comical way. My current last name has two syllables, and there’s nothing funny about two syllables.

18) The anti-religious don’t have to think objectively, for they are objectivity personified by the fact that they are objectively objective.

19) What would you do if you scratched an itch on the back of your neck, and your hand came back with a tiny screaming alien on it? What would you do if another alien was perched on your other shoulder, and that alien said: “Quit living your life in preparation of disaster.”

20) The other day I laughed at the antics of our local radio station’s morning program. Scared the hell out of me. I ran to the bathroom and looked in the mirror, and I confirmed that I was, in fact, laughing. I cannot remember what it was that caused the laughter, but whatever it was, I hurried up and shut the damn thing off. I picked up Finnegans Wake and read a few pages. This is my usual punishment for enjoying the idiotic humor of zany morning radio stars.