Impulsive vs. Reflective


I have learned, the hard way, to resist the urges that drive one to make impulsive purchases. I have learned to define my desire for a product through separation. Take a step away, strip the newness from the product, and imagine it on you, in you, or under you one month from now. The problem with these impulses of mine is that they drive me to purchase shocking, ridiculous, and useless products that satisfy a short-term desire to be different.

Craftsmanship means as much to me as it does to anyone else, but when it comes time to purchase products, the subtlety of a craftsman’s curve in a rocking chair has never spoken to me on a personal level. I much prefer a new-age piece of furniture that has some innovative sex appeal with a couple of exclamation points behind it. I want a piece that causes people to ask questions that have no suitable answers.

Had I followed the impulses that have controlled me at various points in my life, I would now be driving a bright orange Jeep with black trim. I might even have a bright yellow living room with equally bright orange furniture, and some kind of multicolored carpet that accentuates the overall theme. I might also have a visually striking painting of a screeching, gargantuan gold eagle, with beaming, blood-red eyes, flying above shadowed villagers scampering to safety on a red felt background. Those products would fulfill a definition I have for the immediate, shocking, and discomfiting elements of beauty. It’s a definition of who I was, and who I am, that I know would shock my visitors into thinking there might be something we need to sit down and discuss before it gets out of hand.

Two things prohibit me from following these impulses: a wife and a child. A wife, or any person close enough to see inside, tempers such impulses with rational refutation. When a single man with no children follows his impulses, people sort through the psychological damage he must have accrued throughout his life, and they laugh it off as a bachelor pad. When that same man has a child, however, that child has extended family members who care about the child and worry about the child’s well-being when they see that one of his primary role models has created a living room that requires sunglasses. When one of that child’s primary role models also has a painting of a bloodthirsty eagle flying above doomed villagers hanging above the hearth, they might question his ability to raise a proper child.

The other thing that prevents me from following these impulses now is that I’ve been there and done that. I’ve been the person others tried to understand, and I’ve been the person others gave up trying to understand, until they conceded that the person they thought they knew was a lot weirder than they ever imagined. I once purchased a shockingly bright, baby blue pair of shoes that I considered an expression of my personal definition of beauty. I tore that shocking, baby blue pair of shoes off the shelf on sight and without thought. I figured I was in for a psychological pummeling from those who consider anything different a source of ridicule, but I was willing to ride it out for the effect I thought the pair of shoes would have on my essence.

Others echoed these fears by informing me that I should expect the worst from my classmates if I had the temerity to wear these shoes to school. “People do wear such shoes,” they warned, “when they work out. They don’t wear them at work, in school, or on the path to and from.”

Hindsight may be 20/20 in this case, but I remember tingling with anticipation over the effect I thought the shoes would have on my classmates. I couldn’t wait to introduce them to the new me. I then made a statement about the old me by throwing away my old, sensible shoes.

Those who tried to prepare me for the psychological pummeling that would follow would have been shocked at how successful my attempt to shock people was. I lost loyal friends over it, as they attempted to distance themselves from me to avoid having shrapnel rain down upon them. The experience was such that I considered writing a short story called The Boy with the Bright, Baby Blue Shoes. I remembered a nature documentary in which a pack of hyenas brought a zebra down bite-by-bite, and my sympathy for that beast churned to empathy after this moment in my life. For those who abhor judgments of any kind and seek karmic justice against those who make them, this was one of many such moments for me. It did not feel good, and the pain I experienced changed me. If you’re going to judge others, you should prepare for them to judge you.

I did not have the confidence, or the temerity, necessary to stare these people down back then, and they broke me. I did learn that when one dares to be different, there are a whole bunch of guidelines and borders, and most of them are superficial. I also learned one golden rule of life that I would pursue for much of my life before arriving at a final answer: most people consider it a worthy goal to dismiss as many people as possible in life. A wearer of bright, baby blue shoes remains merely a wearer of such shoes, for example, until that person becomes a barometer of agreed-upon truths, truths that need to be agreed upon in the most brutal fashion possible.

At some point, I did find the subtle beauty of a craftsman’s curve in the gaps of others’ writings, in certain lyrical phrases, and in the margins of dialogue and characterization. I discovered something in the intended, and unintended, philosophical truths of various artistic expressions of organic craftsmen. In those phrases, lines, paragraphs, and comprehensive thoughts, I discovered a shockingly different beauty that replaced my need for superficially shocking modifications.

My need for character-defining purchases also led me to be a sucker for innovation. My impulses drive me to purchase the latest and greatest technology my fellow man creates for my convenience, and they have led me to spend a great deal of money in the “As Seen on TV” aisles of prominent stores and the “As Seen on TV” stores in malls. I purchase these products in the hope that they will simplify otherwise arduous and mundane tasks, but I’ve purchased these types of products so often that I now know that whatever short-term convenience they provide pales in comparison to their suspect long-term durability. These innovations do sell, of course, because people like me get amped up on the idea that a collapsible garden hose will free up so much space on the back patio. The question I ask myself now, when wrestling with the impulses that drive me to purchase anything that will make my life easier, is this: if this “new and improved way of doing things” product were in fact better than the more traditional products in the main aisle, wouldn’t it replace the traditional products my dad and my grandfather used? If the new and improved products are as great as the manufacturers claim, it shouldn’t take long for them to replace the old, traditional products, but for reasons endemic to this article, they never do.

For those who still can’t rationalize their impulses away, I have one piece of advice for defining your desire by separation. That bright, baby blue pair of shoes that looks so deliciously freakish sitting in the aisle will eventually become nothing more than a pair of shoes. A Jeep will become nothing more than a mode of transportation, and a chair will eventually become nothing more than something to sit in, once the effect of being shocking wears off. The person who makes these impulsive purchases also comes to realize that these products provide onlookers with data about the purchaser, data the purchaser will likely regret providing long term. I hoped that by purchasing a pair of bright, baby blue tennis shoes I would make a statement that no one in my vicinity would soon forget, and they didn’t forget, but I also realized that I had allowed them to dismiss me as the person who wore bright, baby blue shoes. I learned that everyday beauty requires a study of the subtle forms of beauty that grow on a person, and when the otherwise impulsive learn this, they will decide to purchase the white Jeep with black rims.

It’s ‘Okay to Like’ Guilty Pleasures


“It’s okay to like your favorite shows again, even if they have no cultural value or societal significance,” a person informed my friend.  “As long as the preference for the show is characterized as a guilty pleasure.”

After receiving permission to enjoy the show she once so enjoyed, my friend began binge-watching it on Netflix. She watched the show in the manner of one catching up with an old friend after a prolonged absence. She knew the show was a silly sitcom, and she also knew that its premise –though somewhat relevant in its era– had become dated and insignificant. So, even though she had always loved the show, she stopped watching it, even in private, until that friend of hers ‘gave her permission’ to end that prolonged exile, informing her that ‘it is now okay’ to enjoy the show again.

As with ubiquitous idioms of this sort, I had heard the terms ‘given permission,’ ‘guilty pleasure,’ and ‘it’s okay to like’ before. When everyone begins saying such things, however, I’m left wondering where I was in the gestation cycle of the phrase. I didn’t find the phrases funny when I first began hearing them, if they were even intended to be funny. I didn’t think they provided an interesting twist on the art of decision-making, and I didn’t think that I would ever incorporate them into my decision-making process, or into the explanations I give others regarding my choices.

I just thought it was an odd way for one person to frame the dietary decisions she had made, and that’s where it started for me. I’d heard people, largely women, framing dietary cheats this way. ‘I’ve been good,’ they would say before they took a bite of something they knew damaged the discipline they had exhibited to that point. They then gave themselves permission to eat what they wanted based on that established discipline, and they called those cheats guilty pleasures. At some point, these phrases crossed over into other decisions, until people began framing all of their decisions with these qualifiers. They also began informing me that I should frame my decisions in this manner, that I should give it a spin, as it were, and that with these qualifiers, I could make my decisions free from the guilt associated with prying eyes.

“Why wouldn’t it be okay for me to like the television shows I enjoy?” I would ask when the phrases began crossing over into entertainment choices. At this point in the gestation cycle of these phrases, it was obvious that something had already happened. I didn’t know if it happened in the shows I never watch, in some movie I missed, or if the phrase had been repeated in a commercial, or a number of commercials, but some vehicle had imprinted these phrases so deeply into the craniums of the people I speak with that they were using them without knowing why. I’ve often found that the best way to cut to the heart of the matter is to ask a question so obvious that no one ever thought of it before. ‘Why isn’t it okay for me to like what I like?’ and ‘Why am I required to qualify my choices in a manner that prevents you from thinking less of me?’ I began asking variations of these questions of those that posed these notions to me, and as with most idioms of this sort, no one knew why. They just heard other people framing their decisions in this manner, until they found themselves doing it.

After questioning a number of these people, I made the mistake of dismissing these phrases on the basis that no one understood why they used them, and I assumed they would have a very short shelf life. Then everyone I knew began repeating these phrases in almost the same context, and Google searches began revealing websites built around the idea that ‘it’s okay to like’ this today and ‘it’s okay to dislike’ that. I even found a Twitter page that gave its visitors permission to like some things and to like other people that like other things. It’s difficult to determine how tongue-in-cheek these grants of permission are, or whether these people simply enjoy being on the cutting edge of cultural trends.

Then I heard that my friend was binge-watching her favorite show of all time again, and characterizing it as her ‘one guilty pleasure’. She dropped that phrase, I could only assume, to prevent me from thinking less of her for watching such a dated, irrelevant show. She cared what I thought of her, in that instance, and I rationalized that unless we have a master plan of dropping out of the human race, we all care what others think, to the point where we need to develop some kind of shield to protect our inner sanctum from prying eyes. Those that have attempted to loft the very high-school-era idea that they don’t care what anyone thinks of them have inevitably run into the ‘thou doth protest too much’ wall that reveals they probably care more than anyone else. One could say that this ‘guilty pleasure’ allowance has not only ‘given us permission’ to enjoy the shows we enjoyed so much at one time, it gave rise to an industry in which cable channels like TV Land could prosper, Netflix was born, and the whole idea of binge-watching became a permissible and acceptable guilty pleasure.

The first question I would’ve asked this ‘guilty pleasure’ friend of my friend, the one who granted her permission to like her favorite show again, is, ‘How many guilty pleasures is one person permitted, and what happens to the person that violates the excessive quantity principle of the lack of quality edict?’ One would assume that the term guilty pleasure is intended to be exclusive to one, or at least a few, products. Are these guilty pleasures exclusive within industries? Can one have more than one television show one considers a guilty pleasure, and if so, is it specific to genre? If one has more than one ‘60’s-era silly sitcom that one characterizes as a guilty pleasure, is that a violation of the guilty pleasure principle, and if a person has too many guilty pleasures, will they end up spending so much time pleasuring themselves that they find themselves walking around with burdensome guilt? Would that person be deemed unimportant, and would that lead to their being ostracized from the hip, in-touch groups in a manner reminiscent of a Nathaniel Hawthorne novel? Who are these social architects that dictate to society what is and what isn’t okay to watch? And how did this need for the ‘guilty pleasure’ qualifier come about, so that we can watch what we want without undue scrutiny?

We’ve all been informed by these people that The Brady Bunch and Gilligan’s Island are okay to dislike, because these shows, and all shows like them, are impossible to take seriously. They say that these shows depict a silly and foolish era that we’ve all moved beyond, and ‘good riddance!’ they often add. At some point, however, they decide there is some quaint, retro glory in these shows, and they decide that ‘it’s now okay’ to go back and like them again, as long as the individual qualifies those viewings as a guilty pleasure. I would not listen to these people, regardless of how prestigious others deem them to be, but to those that do listen, I would ask, ‘What gives them the credibility to decide for you?’ It would seem to me that they gain their bona fides solely by claiming to know what is ‘okay to like’, what is not, and what should be listed as a guilty pleasure.

***

My lifelong enjoyment of Gilligan’s Island could be called a ‘guilty pleasure’, if the term is defined as: “Something, such as a movie, television program, or piece of music, that one enjoys despite feeling that it is not generally held in high regard.” I know how dumb and silly the show is. I also know that in the broad, cultural sense it has no redeeming qualities. Yet, I do not feel guilty about any association I may have had, or will continue to have, with the show, and I have no problem floating back to that time when I watched Gilligan’s Island every day for years.

This leads to that silly argument of extension that suggests that anything one is not ashamed of must be something one holds in such pride that one should be willing and able to defend it, and that those who do neither are taking the spineless, Switzerland position of critiquing both sides while trying to avoid vulnerability on the point. I understand that complaint, but remember that we are talking about television shows here, and if I were forced to mount a defense of this television show, to avoid the spineless Switzerland position, it would be a defense of silliness.

Gilligan’s Island was silly and dumb, as I’ve said, but so was one of the most celebrated, critically acclaimed, and award-winning shows of our time: Seinfeld. If we were to break that brilliant show down to its core, we would find silliness. The keys to Seinfeld’s success, it would seem to me, lay in its creative way of turning a phrase and its ability to capture a comprehensive thought with creative brevity. The writers were also hell-bent on making a story flow through an arc and return to the theme of an episode with a “no hugging and no learning” resolution.

Gilligan’s Island could be said to be one of the predecessors of this “no hugging and no learning” theme that would later be employed by Seinfeld. It could also be argued that most of the shows of that era were based on this theme, and that the cultural relevance brigade, with their “applause ready” soundbites and “poignant, thought-provoking, and very special” plot lines, with lots of hugging, and learning, and crying, came later. It could also be argued that Seinfeld, with its “no hugging and no learning” theme, was a return to that era when sitcoms didn’t try to be more than they were. They just wanted to make people laugh, in an era when no one felt guilty about doing just that.

If the reader knows anything about Gilligan’s Island, and a growing number of people do not, they know that Gilligan’s Island would never be confused with having anything to do with cultural relevance. The creator of the show, Sherwood Schwartz, stated as much when he said that if there was anything political about the show it existed in an intended apolitical theme. His exact quote, as listed in a Mental Floss piece on the show, was that Gilligan’s Island represented, “A metaphorical shaming of world politics in the sense that when necessary for survival, yes we can all get along.”

As a political person who has been reminded, throughout my life, how divisive politics can be, I think we could all benefit from more “no hugging and no learning” shows. The problem with such shows is that no one feels important watching them, and we all have a need to feel important. Some of us even strive so hard for importance that we claim to watch shows we never watch, read books we have not read, and listen to important music that no one listens to. Silly shows will never make a person feel important. They will not win awards, TV critics won’t talk about them, and water cooler speakers don’t often talk about “no hugging and no learning” shows, or if they do, it’s not reported on by the TV critics that consider these types of shows guilty pleasures.

Seinfeld is the exception to all of these statements, of course, but that show developed such a groundswell of popularity that it caught people by surprise. The quality of the writing on the show was never in question, but there was never a “very special” plot line that critics could wrap their arms around. Critics sought a seminal episode to explain the show’s ethos and the manner in which it intertwined with the culture, explained it, or rose above it. When none of that happened, they decided to ‘give us permission’ to like the show based on the ‘guilty pleasure’ of watching a show about nothing.

The problem for the other silly, non-award-winning shows panned by TV critics is that quality writers don’t want to write for them, as most formulaic shows that eschew politics with their “no hugging and no learning” apolitical themes offer little in the way of sprucing up resumes.

What’s hilarious about the world these cultural doyens draw up, with their ‘it’s okay to like’ and ‘it’s okay to dislike’ parameters, is that they’re often aghast when a cultural figure from the other side of the aisle decrees that there are shows ‘it’s okay to like’ and shows ‘it’s okay to dislike’, based on that cultural figure’s political and psychological underpinnings. With no objective understanding of what they do themselves, the cultural doyens chastise the cultural figure for having the temerity to suggest that they can dictate what anyone should or shouldn’t watch. These people then ask us to join them in directing a “very special” finger at the dastardly decision makers, as though they alone should be granted exclusive rights to that finger. Yet, I believe if we viewed these arguments in an objective manner, we should be able to find a “very special” place in our hearts to provide both sides that finger.

As Jennifer Szalai details in her New Yorker piece, the term guilty pleasure is almost exclusively American. She provides an example by way of a Frenchman interviewing for a job in America, who was asked what his guilty pleasures were. The Frenchman was confused. He claimed that he had never heard the term, and that the best translation he could find applied to matters no one he knew talked about. If a Gilligan’s Island were popular among the cultural elites in France, in other words, no one would know it, because they wouldn’t talk about it. In America, on the other hand, it’s something we enjoy talking about almost as much as we enjoy watching the shows.

“You make sure to talk about (your guilty pleasures) –which is why the term exudes a false note, a mix of self-consciousness and self-congratulation. Aside from those actively seeking out public debasement, if you felt really, truly ashamed of it, you probably wouldn’t announce it to the world, would you? The guilt signals that you’re most comfortable in the élite precincts of high art, but you’re not so much of a snob that you can’t be at one with the people. So you confess your remorse whenever you deign to watch (a show like Gilligan’s Island) implying that the rest of your time is spent reading Proust.”

Rock and Roll is Dead!


“Rock and Roll is dead!” is a line most of us have heard for most of our lives. From the anthemic screams of punk rockers to the classic rockers suggesting, “Today’s music ain’t got the same soul,” everyone has enjoyed repeating a version of this line. For most of our lives, however, it has been little more than snarky criticism of the current status quo, based on the idea that our favorite strain of rock is no longer prominent, that we don’t appreciate the new direction rock is headed in, or that we have simply aged out of it. Looking at it from a rational perspective, rock and roll has always survived on young individuals crafting creative derivatives of what came before, derivatives that develop into movements that lead to greater sales and continued power for rock in the music industry. On both fronts, it does appear that rock music is either in a severe and prolonged downtrend or that it may, in fact, be dead as a powerful force in the music industry.

“For generations, rock music was always there, and it always felt like it would come back, no matter what the current trend happened to be,” Eddie Van Halen told Chuck Klosterman in a 2015 interview. “For whatever reason, it doesn’t feel like it’s coming back this time.”

As Klosterman writes in his book But What If We’re Wrong?, Eddie Van Halen said this at sixty years old:

“So some might discount (Eddie Van Halen’s) sentiments as the pessimistic opinion of someone who’s given up on music. His view, however, is shared by rock musicians who were still chewing on pacifiers when Van Halen was already famous.”

Matt Bellamy, the thirty-seven-year-old singer of the band Muse, echoed Eddie’s statement, saying:

“We live in a time where intelligent people –or creative, clever people– have actually chosen computers to make (sic) music. They’ve chosen (sic) to work in tech. There’s an exhaustion of intelligence which has moved out of the music industry and into other industries.”

Chuck Klosterman then adds:

“We’ve run out of teenagers with the desire (and potential) to become (the next) Eddie Van Halen. As far as the mass culture is concerned that time is over.”   

If the reader is as shocked as I was to read a high-profile, classic hard rock performer, a more modern artist, and a rock enthusiast on par with Chuck Klosterman discuss the end of an era in such a rational and persuasive manner, you’re not alone. It does not appear to me that these individuals were trying to be provocative. They were suggesting that those of us who proclaimed, “Rock and Roll will never die!” were wrong, and that historians may view rock and roll as nothing more than a “prolonged, influential, and cultural trend”. That trend involved rock artists often coming out with a new album every two years, and there were so many rock artists that we usually had about one new rock album a week at the peak, and a rock album a month at various other times. When a new rock album comes out now, it’s often greeted with cries of “Keeping Rock and Roll alive!” There will always be some musicians who make rock music and albums, but the powerful force that once helped form the backbone of America might be over. That trend may have felt like a permanent staple, but only because it has been around longer than most of us have been alive. Yet, if we are able to remove the emotion we have invested in the art form and examine it from the perspective of creativity and album sales, it is more than likely that hundreds of years from now historians will view rock and roll as a trend that began in the mid-to-late fifties and ended somewhere around 2010.

The Creative Power 

The one thing an interested reader will learn about the man from Bob Dylan’s memoir Chronicles, more than any other aspect of his life, is how much depth went into his artistic creations. Dylan writes about the more obvious, influential artists who affected him, such as Woody Guthrie, but he also writes about the obscure musicians he encountered on his path, who affected him in ways large and small. He also writes about the manner in which reading literature informed his artistic persona, as he read everyone from prominent poets and fiction writers to the Ancient Greek philosophers, and he finally tells us how other experiences in his life shaped him. The reader will close the book with the idea that the young Dylan wasn’t seeking a road map to stardom so much as he was learning the art of craftsmanship.

On this subject of craft, as it pertains to the death of rock and roll, the bassist from KISS, Gene Simmons, informed Esquire:

“The craft is gone, and that is what technology, in part, has brought us. What is the next Dark Side of the Moon? Now that the record industry barely exists, they wouldn’t have a chance to make something like that. There is a reason that, along with the usual top-40 juggernauts, some of the biggest touring bands are half old people, like me.”

On the subject of craft and being derivative, we could argue that Dark Side of the Moon was derivative. We could also argue that Elvis Presley, Bob Dylan, The Beatles, The Rolling Stones, and Led Zeppelin were all derivative. We could argue that rock and roll, itself, derived from rhythm and blues, and that rhythm and blues derived its sound from the blues, jazz, and swing music. There is no sin in being derivative, in other words, as most artists derived something from another influence, but the question of how derivative they were has often haunted artists who derived their craft from other, more obscure artists. The question most artists have had to ask, internally and otherwise, is how much personal innovation they added to their influences. Perhaps more important to this discussion is the question of how much room was left in the zeitgeist for variation on the theme their influences created. To quote the cliché, a time will arrive in any art form when a future artist is attempting to squeeze blood out of a turnip, and while the room for derivatives and variations on the broad theme of rock and roll seemed so vast at one time, every art form eventually runs into a wall.

We could say that the first wave of rock and roll that didn’t spend too much time worrying about how derivative it was, was the heavy metal era of 80’s hair metal bands. We could also say that they didn’t have to search too deep, at that time, because the field still yielded such a bountiful harvest. All they had to do was provide a decent derivative of a theme some 70’s bands derived from some 60’s bands that were derivatives of 50’s bands, and so on and so forth. There was still something so unique at the heart of what they were doing, in that space in time, that they could develop what amounted to a subtle variation of a theme and still be considered somewhat unique.

At some point in this chain of variations and strains, the wellspring dried up for 80’s hair metal bands, and they became a mockery of former artists, until rock and roll was in dire need of a new template. At this point, right here, many proclaimed the death of rock and roll. They claimed that rock and roll was now more about hairspray, eyeliner, and MTV than actual music. Into that void stepped Guns N’ Roses, Faith No More, The Red Hot Chili Peppers, and others, who provided unique variations at the tail end of the 80’s and early 90’s. At various points in the timeline, a variation has always stepped up to keep the beast alive, but hindsight informs us that rock and roll was, indeed, on life support at this time. Hindsight also informs us that when the 90’s Seattle bands, and The Smashing Pumpkins, stepped to the fore, their derived variation on the theme was, in essence, a reset of the template that had been lost somewhere in the late 80’s, and they brought rock back to the early KISS, Black Sabbath, Led Zeppelin, and Aerosmith records of the 70’s.

This raises the question: would Nirvana have been as huge as they were if they had appeared on the scene around 1983-1984, or would that have been too early for them? Are musical waves little more than a question of timing? Did Nirvana hit the scene at a time when the desire to recapture whatever was lost in the late 80’s was widespread? The Nevermind album may have been so good that it would’ve sold in just about any rock era, but would Nevermind have outsold Quiet Riot’s Metal Health and Twisted Sister’s Stay Hungry, or would it have been too derivative of an era we had just experienced? Would Nevermind have been the ten-million-copies-shipped phenomenon it was if it had hit the scene in 1984, or was it a valiant attempt to recapture what was lost in rock, arriving exactly when we needed it?

Most of the musicians in what rock critics called the grunge movement had varied tastes, and some of their favorite artists were more obscure than the general public’s, but the basic formula for what critics would call grunge could be found in those four groups of musicians from the 70’s, bands that had deep and varied influences. The grunge era, we could say, was the last innovative, derivative movement of nuanced rock.

Talk to just about any young person in America today, and they may list off some modern artists and groups that they listen to, but most rock connoisseurs will list “classic rock” as one of their favorite genres. When someone my age hears the term classic rock, they’re more prone to think of one of the 70’s bands mentioned earlier, but these young people are referring to bands that were brand new to me somewhere around yesterday, yesterday being twenty years ago.

I know I run the risk of being dismissed as an old fogy when I declare rock dead, or when I say something along the lines of “Today’s music ain’t got the same soul,” but there is something missing. In fairness to modern artists, and in full recognition of my old codger perspective, I have to ask how the “next big thing” will pop out, right now, in 2016, and offer the world a perspective on rock that no one has ever considered before. Such a statement does undercut the creative brilliance that young minds have to offer, but to those of us who have listened to everyone from top of the line artists to some of the more obscure artists in recording history, it seems that every genre, subgenre, experimentation, and variation has been covered to this point.

Gene Simmons asked where the next Dark Side of the Moon is going to come from. I ask where the next Mellon Collie and the Infinite Sadness is, and it may be the same question that led those of another era to ask which artist is the modern day equivalent of The Carter Family. I never thought I’d be this guy, but most modern rock music sounds uninformed and lacking in the foundation that previous generations had. I know this is largely incorrect, but when I listen to the rock bands of the current era, I don’t hear the long, varied search for influence that Bob Dylan undertook. I don’t hear artists harkening back to the rich and varied tradition that old blues singers, folk musicians, and country artists learned from their family and friends in gospel songs at church, at campfires, and at night before going to bed. It might be there, but I don’t hear an informed artistic persona. Their music lacks some of the organic funk R&B musicians brought to the fold, and the intricate instrumentation that the 50’s and 60’s jazz musicians left for their successors to mine.

Some consider this entire argument moot, however. They say that the nature of music, and of art in general, suggests that there will always be an innovative, up-and-coming star who develops variations and derivatives of former artists, as long as there is money in it. Naysayers would echo their favorite artists and say it’s not about the money, and true art never should be. While that may be true, it is also true that when the money is removed, as the Gene Simmons quote below states, there may not be people in the upper reaches of the chain (the corporate types) who are willing to spend all of the money necessary to help develop that talent, and soon after, the whole model is thrown into chaos and its structure is destroyed.

It’s About the Money. It’s Always About the Money.

“You’re [now considered] a sucker if you pay for music,” one of my friends informed me at what was, for me, the advent of file sharing.

My friend did not say the words “now considered,” but that was the import of his statement. I was no Luddite. I knew about the file sharing sites, such as Napster, but for me, Napster was a place to find obscure throwaways, bootlegged versions of the songs I loved, and cover songs by my favorite artists. I learned of the Metallica lawsuit against Napster, and of some talk of file sharing among the young, but I had no idea that the crossover to file sharing had already begun for music enthusiasts until my friend dropped this line on me.

The line did not inform me of the new way of attaining music, as I already knew it was out there, but it did inform me of the new mindset regarding access to music. After scouring these sites for my favorite songs, albums, and artists (and finding them, waiting to be downloaded for free), one thing became crystal clear: this was going to change everything. I read of the music industry hauling young people into court for illegally downloading music, but my astute, file sharing friend said he believed that the music industry was desperate, and that they were trying to scare people. He correctly predicted that the music industry would stop trying to prosecute downloaders and simply give in. He said that the industry should’ve done something long before this point (and this point was very early on in the age of file sharing) to cash in on the file sharing wave, because there were simply too many people, in his small corner of the world alone, downloading music for free for the music industry to prosecute them all.

File sharing, say some, may have spelled the true death of rock and roll as a profitable, cultural force in America today. I write this as a qualifier for those who insist that the idea of a bunch of kids sitting in a garage developing a new sound will never die. It may not, but reading through Gene Simmons’ interview in Esquire, a reader learns of the type of support new musicians need from execs in the upper echelon of the music industry to help them progress from garage rockers to a cultural force in America, and that that part of the structure has been destroyed by file sharing. To belabor this point for just a moment: we would all prefer to believe that our favorite musicians have little regard for money, or for corporate influence, but how much of the sound of an album was tweaked, finessed, and completed by industry money? Listen to insiders speak of a final product, and they’ll tell you that the album doesn’t sound anything like it would have without a high quality producer, and that it doesn’t sound anything like it did before the corporate mixer came in and put in some of the finishing touches that those of us in the audience don’t care to know anything about.

Rock and roll has always been a young person’s game, however, and that makes most of the derivative argument moot. Most young people live in the now, and young people have never cared that their favorite artist just happened to be a hybrid of The Rolling Stones and Aerosmith, at least not to the point that they wouldn’t buy their favorite artists’ albums. As far as they were concerned, their favorite band’s sound, and look, was fresh, original, and theirs.

“My sense is that file sharing started in predominantly white, middle- and upper-middle-class young people who were native-born, who felt they were entitled to have something for free, because that’s what they were used to,” Gene Simmons also said in the Esquire interview. “If you believe in capitalism — and I’m a firm believer in free-market capitalism — then that other model is chaos. It destroys the structure.”

Death of the Album

Scouring these file sharing sites, and creating personalized playlists, I also sensed the death of the album. As an album-oriented listener, I always thought one could arrive at the artistic persona of a musician in the deeper cuts of an album. My philosophy was fleshed out by Sting and his quote, “Anyone can write a hit, but it takes an artist to write an excellent album.” Yet I was affected by the new file-sharing mindset almost immediately, as I began to consider it a waste of time to listen to the various Queen Jane Approximately cuts when I could create a playlist filled with top shelf, Like a Rolling Stone cuts from various artists.

The self-directed, playlist mindset developed somewhere around the advent of the cassette tape, an era that predated me, but the full album managed to maintain most of its glory throughout that era. For most of my life, the power of a quality single led concertgoers to leap out of their seats and rush the stage. With all these new tools, however, a person no longer has to stand around for an hour waiting for the band to take the stage. They no longer have to sit through mind-numbing guitar solos and witty banter from the lead singer to get to the one song they enjoy from that artist. They can now go to a site like YouTube and watch their favorite singer perform that quality single.

I still think that the lack of depth in most of the products current artists put out is a factor in the demise of rock as a force in the industry. I am persuaded otherwise, however, by the idea that young people know as little about the history of their music as their favorite artists do, and that what little they do know is superseded by how little they care about it.

I am also convinced that file sharing has had an effect, if not a devastating effect, on the structure from top to bottom. Another writer had an interesting take on this matter, stating that the file sharing mindset may have something to do with young people growing up watching their favorite artists display their wares on shows like MTV’s Cribs. Shows like these may have led young people to think that their favorite artist has enough money as it is, and thus to download the music for free without guilt, which, in turn, led them to believe that anyone who plops down money for music is an absolute sucker for supporting the artist’s hedonistic lifestyle.

“They’re not going to miss any meals if I deprive them of my $9.99,” they might say. That may be true in the case of one individual, but what happens when millions begin sharing this mindset? What happens is that when we begin removing the $9.99 bricks that formed the foundation of the industry, we destroy the industry as we knew it. Labels will sign fewer rock artists, they will no longer hire all those little guys who helped finish the product, and they will no longer provide support or promotion to an album that would’ve garnered it before, because there’s little-to-no money in it for any of the players, on any level.

“You might be right. Rock and Roll might be dead in terms of a business enterprise that brings in new artists and fosters their careers, but fans can always go back and listen to the music of your era,” young people might argue. “What the idea of Rock and Roll never dying means to me is that once an artist puts their music out, we can, and will, listen to that music forever.” This is obviously true. If we read the reports from the companies that track download sales, the dinosaur rockers are almost as popular as they’ve ever been, and in some cases more popular. If we look at concert receipts, they’re doing as well as, and in some cases better than, ever.

The spirit of rock and roll will never die, but it does appear that the starving artist walking around with nothing more than a guitar strapped to his back will have a much tougher time receiving institutional support, financial and otherwise, from the corporations that brought all of the cultural icons to the mainstream, and the consequences of that could run deep for a culture that has subsisted on the philosophical foundation of rock music for as long as most of us have been alive.