Friday, December 11, 2009

Pain English


I suspect the reason business people so often resort to gobbledygook in their writings is not just laziness or habit, but that they're afraid specific language is too confining and restrictive. Ambiguity seems more inclusive. It relies on the reader to fill in the blanks of meaning. These would-be communicators fear that explicit, precise, and detailed writing might not pique every reader's interest or encompass every possible interpretation.

Problem is, it doesn't say anything either.

You don't have to look hard for examples. As I started to write this, IBM spammed me with an email they hope will draw my attention to an article in their newsletter. They say the piece is about how that company helps "organizations make strategic decisions that enhance competitive advantages, create new sources of value, improve revenue growth and develop the change programs necessary to meet business or mission objectives."

Well, who wouldn't be interested in all—or at least some part of—that? Aren't those things that all businesses want to do? But the email doesn't give me a clue what IBM's consultants really offer, or how they deliver it, or what they've done for somebody else.

On the contrary, the message I get—loud and clear—from their spam mail is that IBM doesn't know a thing about my business or me. They think of Bob Kalsey and Bravura Films as just another organization with problems and issues no different from those faced by any other. By trying to offer everything, they offer nothing at all. That's the ROI for ambiguity: nothing.

These thoughts are inspired by a question today on LinkedIn, where I occasionally try to contribute to the conversation. Loren Hicks referred there to a help-wanted ad that includes the phrases “The role will leverage all aspects of the offer matrix” and “ … will include presenting and evangelizing xxx’s offering …”

Loren asks how people respond to such a thing and whether they'd apply for the job—whatever it is.

My response was:

I would be pleased to apply for a role that leverages all aspects of the offer matrix and proud to present and evangelize the company's offerings, especially if the aspects are of the unique, cost-effective and robust next-generation aspect type.

At the end of the day, though, the matrix would have to be flexible, scalable and optimized in terms of metrics that deliver value added outcomes, and the offerings would, hopefully, be easy to use, world-class and unique—as well as focused on high performance innovations from a leading provider of new and improved feature sets. Heck, the bottom line is, I'd give 110 percent commitment—or more—to such a win-win partnership of all stakeholders. Wouldn't you?


I have to give credit for many of the buzzwords I used to David Meerman Scott and Dow Jones, who created and provide (under Creative Commons) a list of the things. Just in case you've forgotten some of them and don't want to leave any out of your next spam.

Scott is as ferocious about blather and baloney as I am, and you might want to read his Gobbledygook Manifesto, in which he points out another reason business writers default to those words: they don't really understand their products, or how customers use them.

Shame on them.

Wednesday, November 18, 2009

The Business of Music




The music industry is in crisis. The fundamental cause is the unbundling of creative content from physical media: the same phenomenon that is behind the troubles of the newspaper publishing business.



At one time, music was not bound to physical media at all but was part of the oral/performance tradition. Artists supported themselves through patronage and direct payments by their listeners.

Musical notation was the first technology to change the music business. It arose not as a means of publication and distribution, but for the preservation of music and the convenience of its performers—and music could now be fixed, in some respects, to a physical medium. Various musicologists improved and standardized notational methods through the 16th century, concurrent with the printing press boom, and music publishing became a viable business.

Music flourished as an industry due to the entrepreneur's ability to produce and sell copies of sheet music and collections—to fasten creative content to salable, durable physical products. "Pirating," though, was a commonly accepted business practice in the United States, until Congress extended copyright protection to music in 1831.

Before the publishing revolution, music was not so much a business as an art, and musicians were independent of commercial specialists: lawyers, agents, publishers, printers and resellers. They were beholden only to the makers of instruments, their fellow artists, and the gratitude and generosity of their admirers. That changed when composers made a Faustian bargain with publishers. It seemed like a good idea at the time and, for centuries, it was.

After 1878, with the perfection of the first practical sound recording and reproduction systems, recorded music followed the same strategy as music publishing. Not just notations, but actual musical performances could be fixed to mass-reproducible physical media. The advent of the phonograph brought a crisis to the music publishing industry's reliance on sheet music sales, but publishers (and the composers they represented) adapted their revenue model to draw income in the form of royalties, on the strength of copyright protection.

Radio broadcasting was initially seen as a threat to the recording industry because it made music "free" to the consumer—first with the support of the radio set manufacturing industry and, eventually, of advertisers. The music industry demanded—and received—compensation through broadcast performance licensing, but its income from that source paled in comparison to that from record sales. Eventually it became clear that radio broadcasting was an effective music-marketing tool because of its reach and influence. Radio was not an on-demand medium, so consumers who wished to hear a particular performance at a particular time still needed to purchase it in its physical form.

Digital technology changed everything, and that is where we are today. Music and its performance are no longer bound to salable physical media. Individual consumers are able to cheaply copy, store, and distribute music. Whether those practices are legal or not is almost irrelevant, because in practical terms the copyright laws that protect the rights of creators, performers, and their licensees are unenforceable.

The business models are broken. The industry faces the prospect of a return to its roots: the support of musicians through patronage and the gratitude and generosity of their admirers.

The good news is that digital technologies are available to musicians, as well as to their audiences. Composers, arrangers, and musicians can create, perform, distribute, and monetize their products with as much independence from the traditional middlemen and facilitators as they are willing and able to claim. If one wants to go that way, one can. But the upside potential for the independent artist is limited; the rewards are different, and of a smaller scale, than those possible for musicians who choose the traditional route to success.

The music industry has long offered the alluring potential of quick and possibly obscene profit. That is not and has never been the principal motivation for every musician or music business person, to be sure, but it has always lurked in the wings behind the dream of artistic acceptance: the hot score, the hit single, the platinum album, the world tour, the merchandise, and the star on the walk of fame. If not the goal, that elusive dream has always been the ultimate verification of one's value. But to be honest, achieving that dream was never the pay-off for talent alone, or talent and hard work. It was the product of a machine, and artists never ran that machine.

Those at its controls are struggling today with the new digital reality. They are re-jiggering the machine, changing its components, re-tooling to maximize and maintain the profitability that fuels it. Music served on a platter is no longer the product, but the music and those who make it are—once again.

In a way, the machine too is returning to music's roots—to the song, the personalities, and the performance, rather than the page, the disk, or the cassette. It returns with the benefits of the contraption it has evolved to become: its profit motive, its managerial skills, its media reach and influence, its marketing power, its packaging abilities, its synergistic properties and its business relationships. Of course, it also arrives with the burdens of its artistic timidity, its pandering to public preferences, and its reputation for greed intact. You can learn everything you need to know about the machine from "American Idol."

Musicians, since their Faustian bargain with the Mephistophelean machine, have always been only a part of the system, and they are finding that their relationships—and their place in the revenue stream—are changing. Ironically, as the performers and their performances are increasingly the salable product, their share of the revenue will decline. There is no other way to feed the beast.

Musicians have two choices, and every imaginable opportunity in between: to go it alone, or to ride with the machine. Either way, the future is uncertain. But the future always is.


Wednesday, October 14, 2009

That Nobel Prize


Peace is more than a state of warlessness—a state, it should be noted, the world has never known. It is, as importantly, a process and an attitude.

The Norwegian Nobel Committee has conferred the Peace Prize on 97 individuals and 20 organizations. In only three instances was it awarded to individuals who were actually involved in the direct brokering of peace accords between nations.

Theodore Roosevelt won in 1906 for his successful mediation of an end to the Russo-Japanese war and other contributions. Henry Kissinger was honored in 1973 for his work on the Paris agreement that led to the final cease-fire in Vietnam and the withdrawal of American forces. And Anwar Sadat and Menachem Begin shared the prize in 1978 for the Camp David Agreement, which led to a negotiated peace between Egypt and Israel. (Despite the persistence of animosity, hostility, and bloodshed between Israel and its neighbors, that country and Egypt remain—after a fashion—at peace with one another.)

Some winners have helped to reduce tensions and bloodshed within their nations. The Committee honored Nelson Mandela and Frederik Willem de Klerk in 1993 for terminating the apartheid regime in South Africa, and John Hume and David Trimble for their efforts to find a peaceful solution to the conflict in Northern Ireland.

But, mostly, the prize has historically been awarded to people and institutions who have worked toward the process of conflict resolution and aided or inspired humanitarian efforts. The International Committee of the Red Cross has won three times, the UN High Commissioner for Refugees, twice.

Of all the Nobel prizes, the Peace Prize is generally the most controversial. Were there, in each year, an individual or organization conspicuously responsible for bringing peace to some corner of the world, there would likely be no controversy; the Prize would be a slam-dunk. Sadly, that has seldom happened, but happily there are always folks grinding away at the process, forcing attitudes and postures to change, making contributions. They might work for decades, on their own initiative, with little recognition. Others might just be at the right place when history comes to call. In every case, though, they are—more than anything—an inspiration to others.

Geir Lundestad, the Secretary of the Norwegian Nobel Committee, has said that President Barack Obama will receive the honor this year for his creation of a new climate in international politics. The cable TV political bloviators—and the Chairman of the Republican Party—think the prize is pretty much a joke; that Obama is undeserving, the prize diminished by his selection, and the award no more than a slap at former President Bush and evidence of an international socialist conspiracy at work.

President Obama richly deserves the Nobel Peace Prize. Even before taking office, he revived the world’s long-dormant sense of hope for peace and positive change. People around the globe see him as a transformative figure and are inspired by his message of optimism, his call for mutual respect, and his promise of progress. They wear Obama t-shirts in Udaipur these days, and gave this American visitor high-fives in Delhi last December as they grinned and shouted, “Obama! Obama!”

As a presidential candidate, Barack Obama convinced the majority of American voters to choose peace and to reject the previous administration’s policies of hostility and confrontation; to vote for rationality and against rigidity and blind ideology; to support a foreign policy based on respect, rather than arrogance; and to believe, once again, that it’s okay to look after one another—here at home and around the world.

None of the above, none of the genuine feelings of so many citizens of the world, is welcome news to those who reflexively oppose the President no matter what he does or tries to do or what honors or endorsements he receives. Nothing in the revitalized aspirations of many millions around the world will reverse the hostility of those who hope and pray for our President to fail (as though his failures would not be our own), who are irreversibly angry about his election victory, and who still can’t believe that most American voters don’t agree with them.

Those Americans should get over it, and get with the program—or get out of the way.

Rachel Maddow concluded an excellent MSNBC broadcast on the matter, saying, “The American president just won the Nobel Peace Prize—by any reasonable measure, all Americans should be proud.”

Indeed.

Monday, September 14, 2009

Social Networking


Names are kind of funny. We like to name things, because it gives us the illusion of understanding them and the hope we will ultimately control them. When we name a disease, for example, we begin to think that one cure – if we're lucky or smart enough to find it – will remedy all instances of the malady. Unfortunately we often mistake symptoms for diseases and forget that a symptom may have many causes. Cancer comes to mind. Or the common cold. We forget, too, that the disease might be entirely imaginary – caused by mass hallucination or hysteria.

"Social networking" has a lot in common with diseases in those respects. It isn't a single thing, nor is there a single way to deal with its many instances. Giving it a serious-sounding and techno-babbly kind of name may make us feel as though it's one thing and that we understand it, but those impressions are false. It might even be imaginary, brought on by exposure to the radiation of computer monitors and Blackberry LCD screens.

We like to categorize things, too: also so we can understand and control them. Ever since Linnaeus we've tried to categorize the flora and fauna of the Earth, for instance. But we've embarrassed ourselves many times because the categories we've invented have sometimes turned out to be meaningless and those in which we've chosen to place a thing have not always been the most appropriate.

"Social networking" is a whole jungle of creatures and they don't all belong in the same part of the zoo. Facebook and Flickr, YouTube and Twitter may be cousins, branches on the Internet family tree, but should they be in the same cage and fed the same diet? Maybe they're all in the kingdom "Digital," the phylum "Internet," and the class "Social," but are they all in the order "Advertising Medium"?

Marketers are some of the most dedicated namers of things. One of them came up with a condition known as "halitosis" in order to sell an elixir for bad breath. Many folks who sell advertising and technology consulting services have latched onto "social networking" as a way to foment a profitable combination of greed and fear, dread and avarice in the marketplace for their wares. It helps that nobody really knows what "social networking" is (it can mean anything you want it to mean), but everybody wants to turn it to his own advantage or save himself from its potential ravages.

We sure do like to try to turn everything we encounter to our advantage, and that can bring good results or otherwise. Thankfully some guy long ago saw a spiny lobster crawling around and said "I don't care what it looks like, I'm gonna eat the thing." But it's not a good idea to leave infants unattended around cans of paint thinner.

Many thirsty folks these days are thinking seriously about swallowing the social networking Kool-Aid. I guess we'll find out how that turns out.

Wednesday, August 26, 2009

Senator Kennedy


Shakespeare observed remorsefully, "The evil that men do lives after them; the good is oft interred with their bones." Tragic but true, it speaks of our pettiness that we are so often blinded and imprisoned by our hostility.

Certainly Edward Kennedy was merely a man and had his faults as all men do. Many disagreed with his politics and with his beliefs about what makes for a civil society. But we needn't foam at the mouth at the mere mention of his name. To do so is unseemly, unfair, unhealthy, and makes us unworthy.

I, for one, will remember him as he remembered his brother Robert – as "a good and decent man, who saw wrong and tried to right it, saw suffering and tried to heal it, saw war and tried to stop it." And as a man who both respected and served humankind with a sincerity, grace and eloquence rarely seen these days.

Adam Clymer, former Washington correspondent for The New York Times and author of "Edward M. Kennedy: A Biography," is answering questions about Senator Kennedy today on The Times' Politics and Government blog. He writes, in part:

"If you voted at 18 or were served Meals on Wheels or took advantage of a Medicare drug benefit, he helped get you there. Cheap college loans, children’s health insurance, aid to the disabled and a variety of civil rights measures are also to his credit. I don’t adore him, but I respect that record. He achieved it by working across party lines, remarkable in a day when bitter partisanship seems to trump most issues in Washington."

Friday, August 21, 2009

A View of Mt. Twitter


Doc Searls is a most interesting fellow and he has a wonderful sense of metaphor. The other day he wrote that tweets have "the impact of snow on water" while "blogging is geology."

Tweets, as you know unless you've been comatose for a while, are those usually trivial and often incomprehensible mini-messages that some folks like to send out into cyberspace from their phones or computers in hopes of relieving their feelings of inadequacy and/or irrelevance.

Some folks in Iran did a lot of tweeting after the recent elections there, you may recall, and there were important social and political reasons to demonstrate their relevance. Their tweeting served the common good, to be sure, which shows that twittering offers real and potential benefits. Still, so much drivel, so much snow on water.

Not that there's anything wrong with drivel. It serves a purpose. It's part of the glue that holds people together, and there's value in that.

Twittering is sort of like the "Active SETI" project that attempts to send messages to intelligent aliens (should there be any) elsewhere in the universe. Both twitterers and the Active SETI people assume somebody may be out there listening for signs of intelligent life, though tweet-makers sometimes seem to be less concerned about the intelligent part. In the case of SETI, the subtext behind the messages is "You are not alone," while for tweeters it's often "I am here."

Doc's point, if I may be so bold to hazard an interpretation, is that tweets are ephemeral – part of the babble of the human brook flowing by. Blogs, on the other hand, become part of the record of human experience, just as sediments become a record of biological and geophysical events.

No doubt part of the appeal of Twitter is that it's so darned hip. But another part is that it IS ephemeral, which makes it a low-risk form of communication. Tweets aren't as likely as blog or Facebook postings to come back and haunt us someday. They go away pretty quickly, almost as fast as the remarks we make in conversation, so we can be spontaneous and frivolous and not fear that others may use our words to our detriment in the future – to make us seem supercilious or trivial or careless or worse.

I think Twitter may be changing blogging, making its recording function more significant and its reporting function less so. People use Twitter now to point others to things they find interesting or provocative and to publish trifles – things they might have formerly done with weblogs. Blogs are, I think and hope, becoming a medium for more carefully considered and painstakingly prepared messages. Blogs may become more worthy of the preservation that is part of their nature. They may become more interesting. They may even remain interesting to the cultural archaeologists who will dig around in them in the future to find out what people were like back in the early part of the 21st century.

It's a commonplace that each new medium adopts some of the characteristics of those that preceded it. Television, before it found itself, was a lot like radio – but with pictures. It's also true, though, that new media change those that are already in use. Radio became something different when TV came along. Twitter is a new medium that has taken to itself some of what was once the purview of the blogosphere, and I expect that blogging will change, now that the "frivolous" stuff we can't stop ourselves from producing finally has another place to go.

Of course, all this presumes that Twitter will persist long enough to make an impact beyond the few million digitally devout souls who use it now. Or that something else will take over its niche. It seems important enough to survive, but I wonder if its importance might be an illusion.

Seems like Twitter is important mostly because people talk about it. When people talk a lot about something, marketers perk their ears up, wonder if they can use it to sell stuff, and start sniffing around like dogs around a sandwich bush. When people with money in their pockets start sniffing like that, the cadre of consultants sees an opportunity to transfer some of that cash into their own pockets. Those consultants join the crowd of talkers. And pretty soon you've got a phenomenon on your hands, and pretty soon after that it becomes a mania.

There is a marketing principle that says the best way to success is to stake a claim on top of some mountain, where the mountain is an idea or a proposition or a gizmo or what marketers call a "category." Stake a claim at the top where you can be most visible. Many companies and would-be gurus are battling for control of and visibility atop Mt. Twitter – which seems to be about the highest peak on the horizon these days. I wonder, though, if Mt. Twitter is a real mountain or just another hill piled so high with curious marketers and hungry consultants that it has the look of an actual mountain without the granitic core to hold its own against the forces of erosion.

Time will tell. It always does.

Thursday, July 23, 2009

The Future of the Newspaper


Certainly technology will bring profound changes to newspapers and to the ways in which people experience news and information. But whether newspapers survive in their present physical form or some other, I expect them to evolve in significant ways if current trends continue. Their evolution, I believe, will take inspiration from the media with which they compete.

People often decry the bias of news sources, yet the most biased commentators are often the most popular. What we publicly decry may be exactly what we privately crave. We are naturally predisposed to accept and agree with interpretations of facts that support our preconceptions and, similarly, to distrust and differ with those that conflict with them. A trend in information media, facilitated by a greatly expanding number of media outlets, is toward increasing segmentation along socio-political lines.

It's a disturbing development, rooted in the profit motive that is essential to the current model of commercial information providers. Communication was once called "the glue that holds society together." It has become, instead, an adhesive that more tightly bonds individuals of particular socio-political leanings to one another, rather than a unifier of human society as a whole. Success in mass media once required providers to appeal to a broad audience but now it is possible to thrive in a niche.

I expect that xenophobia will spread from talk radio and cable punditry to newspapers. A 2004 study of young adult readers by Readership Institute found, not surprisingly, that "people want to read about people like themselves in their local daily newspaper," and "There is less interest (in) coverage of groups to which one does not belong." Perhaps newspapers will become more overtly opinionated in their coverage, cater more to the xenophobic tendencies of their readers, and position themselves more as the voices of specific identity groups. Already we are seeing more opinion, gossip, and biased analysis creeping from the op-ed pages into formerly hard news sections; more column inches throughout local papers topped with the photos and by-lines of their own celebrity pundits.

Another trend, although it has always been prevalent, is information as entertainment. By far the most popular newspaper features are the comics and sports pages. In advertising and news content, readers under 35 prefer information about "things to do," such as recreation and local activities, and "ways to get more out of one's life," such as health and fitness features. Reports of events from around the world are instantly available on the Internet, through Twitter, and on radio and television. Newspapers are unable to compete with the immediacy and pungency of these other media, so we can expect their focus to shift away from event reporting in favor of lifestyle features, amusement, and the narcissistic concerns of their audience.

A third trend I will call "informer-as-celebrity." I think that in a strange way Walter Cronkite is to blame – not personally, but because of the value he brought as an individual to the CBS television network. The other networks competed against Cronkite's highly successful "that's the way it is" reporting not by doing a better job of authoritative, credible coverage, but by emphasizing the personalities of their own anchors. They fought substance with style, and it proved to be a successful strategy.

NBC created "The Huntley-Brinkley Report" to succeed its "Camel News Caravan," tellingly replacing the name of the program sponsor with the names of its anchors. Chet Huntley and David Brinkley were superb newsmen, but their network traded on their personalities rather than their journalistic acumen. National and local news outlets followed suit and polyester-haired anchormen (and later, women), along with clownish weather and sports reporters filled the airwaves with happy-talk news programs. The spokespersons became the medium and largely the message of broadcast information.

The spawn of the ménage à trois of these trends – socio-political segmentation, information-as-entertainment, and informer-as-celebrity – is hostility-as-entertainment. What appeals to a large audience about cable television's motley crew of bloviators is the anger and rage they express and the gleeful pleasure they take in bitterness, insult, derision, and obstinacy. "Yellow" journalism – sensationalism, scandal-mongering, and unprofessional practices – has a long tradition; it's nothing new, and it's always masqueraded as "real" news. In the past, it has been part of a newspaper's overall brand and only occasionally identified with a specific reporter or columnist. We may well see more – and more outrageous – sensationalism as newspapers experiment with ways to emulate the appeal of their broadcast competitors. And I expect that a breed of bullying celebrity journalist "stars" will become more important to each newspaper's brand.

One can expect other trends as newspapers cater to their perceptions of audience demands.

Young people think newspapers are too big; they prefer concise, bite-size news. According to the 2004 Readership Institute survey, this group tends to agree that: “I wish this newspaper had fewer pages,” “It has too many special sections,” “It tries to cover too much,” “Too many of the articles are too long.” The same organization's study of a broader reader group similarly concludes that people who feel overwhelmed by news tend to read newspapers less.

Motivated to expand – or maintain – their readership, newspapers seem to believe that their regular, devoted readers can be counted on to continue their newspaper habit, so they are catering more to "lighter readers" – ironically by providing less: fewer pages, shorter articles, and more limited coverage.

Younger readers also say that they highly value "dynamic visual treatment," and newspapers are certainly trying to cater to this with their colorful eye-candy designs – just as cable news relies heavily on high-tech graphics and the endless repetition of dramatic imagery.

Whatever physical form the newspaper takes in the future, we can expect news delivery media to: target segmented audiences; appeal to narcissism, xenophobia, and the thrill of sensationalism; rely on celebrity pundits; deliver less news more concisely; and do it all with dazzling graphics.

Sorry, news fans, but "that's the way it is."

Tuesday, April 21, 2009

Bloggers and Barkeeps


Someone left a copy of the Wall Street Journal on a stool down at the Bar and Grille yesterday. As I had arrived before the regulars, and Manny the bartender was busy with something or other in the back, I scanned through the paper in hopes of gaining some insight into where my money had gone.

There were articles about this company lowering its expectations for the next quarter and that one doing a little better in the last one than it had any reason to. There was a story about how all that money the government gave to the banks was either still there, at the banks, or had vanished without a trace into whatever void money goes when you take your eyes off it. And there was an analysis of the previous day's stock exchange decline, which attributed the loss to an announcement that a big company was making more money than people had expected it would. Somehow, that was bad for stocks in general. The day before I'd heard that stock values had gone up for a similar excuse.

Economics and finance are too complicated. That's what got us into the pickle we're in right now, and I said so to Manny when he came back to the bar to pour my drink. "I don't get this when they say one day some piece of good news made the market go up and the next day they say the same kind of news made it go down. What do you make of that, Manny?"

"Well, now, Bob," he said, "Maybe there's subtleties to it that us normal folks just don't comprehend. More likely, it seems to me, these newspaper writers don't have a clue themselves so they just latch onto some bit of news to blame for whatever happened in the market. The less sense it makes to you and me, the smarter they look for figuring it out. Me, I can't relate to any of it."

I turned to the next page of the paper and saw an article I thought Manny might relate to pretty well. Seems somebody figured out that almost a whole percent of Americans are getting paid as bloggers and their number now exceeds the total of professional bartenders.

Now I thought bloggers were mostly people who have the writing bug but, being unable to think up anything worthwhile to write about, tell their few readers what somebody else wrote about somewhere, adding a little, "this is cool," or "so-and-so had an interesting remark about such-and-such." And the rest of them are just self-absorbed people who think somebody else might be interested in what they had for breakfast, and none of them is paid a dime for their contributions to the American conversation. Seems I was wrong once again.

According to "The Journal" (which is how people who want you to know they read The Wall Street Journal refer to that periodical), more people make their actual living sticking their opinions on the Internet than do so by programming computers, or fighting fires, or practicing law.

"So, Manny," I said, "says here you bartenders are out-numbered by professional bloggers."

Manny observed that spouting off opinions is a growth industry while reporting actual news is on the way out, and he wondered what the world is going to be like when there isn't any news to complain about. "I guess those bloggers will be talking about themselves and sniping at each other even more than they are already. But you know, Bob, that's how it's going anyway. Why, even the regular news these days is mostly all about the news business itself and how it's going to hell in a hand basket."

I tapped my glass and Manny reached down to the well for the scotch bottle. As he poured, I suggested maybe he ought to think about taking up blogging himself. "Why, you are one of the most opinionated people I know, Manny. Seems like you could do pretty well at that. Don't take more than a few bucks to get started; eighty dollars, it says here, and you could make a hundred thou' or more if the breaks go your way."

"Well, Bob, there's something to be said for working at home, unshaved and in your jammies, but I kind of like to put a tie on and come down here to the bar. I get to talk to people. I hear things. Some of the things I hear are even true. Sitting by myself in front of a darned computer all day? Trying to stir up some hullabaloo to entertain other people who are doing the same thing? That doesn't appeal to me, and there's something almost unethical about it."

Manny took a load of glassware out of the dishwasher and stood back as steam rose into the air. "I don't mean any offense, Bob, because I know you write one of those blogs yourself. I read it once, and it was ... entertaining."

I thanked him for the compliment and said I'd often wondered who it was that read my blog that one time. "I guess it's a good thing you don't want to be a blogger, Manny. I'd rather come down here and trade insults with you in person than read your opinions on a computer screen."

"Aw, Bob. You know you just come down here because I pour you one on the house now and then."

"Well, there's some truth to that, Manny."

"Not today," he said, "but now and then."

"Better be good to me, Manny," I said, "This article says that 'If journalists were the Fourth Estate, bloggers are becoming the Fifth Estate.'" I showed him my empty glass. "So don't be so stingy with a jigger of that cheap booze, or the full weight of the Fifth Estate might bring you down. We bloggers are getting to be a powerful force in American culture. It says so right here in The Wall Street Journal."

"And would that be the same Wall Street Journal that says the stock market went down because some company made a lot of money?"

Manny isn't cut out to be a blogger; too much common sense.

Monday, April 6, 2009

To Be: A Noun


Over on LinkedIn the other day, leadership trainer and founder of Leaders and Thinkers, Benjamin Anyacho, asked, "What do you want to be remembered for as a leader?" He referred to Methuselah, Noah's granddaddy, who lived for nearly a thousand years, yet his legacy was written in two sentences. "In fact," Benjamin noted, "there was nothing to be remembered about Methuselah except that he was the oldest person that ever lived, and he had sons and daughters," and he added, "it's not how long we lived but how well."

I replied that I would not be so quick to disparage Methuselah. His achievement was so profound, so unique, and so well known, that the old fellow has become a noun.

There is something to be said for becoming a noun. James Watt became a noun, representing power even to this day. Adolf Hitler became a noun, it is true, but his name is a pejorative. We honor Napoleon with a couple of nouns, one a pejorative, the other a pastry.

Few people in history are sufficiently notable or notorious to even reach the lesser status of adjective. A candy retailer named Morris Michtom honored Teddy Roosevelt by naming a stuffed animal after him. Michtom founded the Ideal Toy Company on the strength of public response to the Teddy Bear, but the toy's association with Roosevelt's name was so tenuous that it is now all but forgotten; few writers these days even bother to capitalize the "teddy" part.

The adjective taken from Charles Ponzi's family name is much in the news these days, but his unfortunate survivors may have difficulty passing checks imprinted with their names. Franz Kafka became the root of an adjective – although his name requires an added "-esque" to serve that purpose. Almost anybody can be an "-esque." Even the pop bubblegum music supergroup ABBA, whose name is an acronym of its members' names, has lent its moniker to an adjective of the "-esque" form – though not one that is entirely complimentary.

One's legacy may also become a verb. Folks caution White House interns these days not to Lewinsky. Good advice, but in another generation it won't be understood – and probably won't be followed anyway.

Victor Hugo said, "The word is the Verb, and the Verb is God." Buckminster Fuller expressed that line as "God, to me, it seems, is a verb not a noun, proper or improper." Some say that Fuller declared that he, himself, was a verb – which with some logical manipulation might be taken to equate himself with God. I'm not so sure he actually ever claimed to be a verb and I'm pretty sure he never claimed divinity. I am fairly certain, though, that Ulysses S. Grant, shortly before he died, believed himself to be a verb instead of a personal pronoun. Possibly just wishful thinking on the General's part.

I could accept a legacy as a verb, so long as it is an energetic one.

I would also be satisfied were my legacy an adjective, but more delighted to survive as a noun. What, exactly, would a Kalsey be? That remains to be seen. Something admired, or respected, or striven for, I hope. Any good thing will do.

One thing I do not look forward to being is a past participle, mostly because few people know what those are.

Friday, February 27, 2009

Of Mice and Heirs


I read bloggers, and hear conservative yak-show bloviators, proclaiming that they are sick and tired of hearing President Obama say, "I inherited this, and I inherited that," as though Obama takes every opportunity to deflect responsibility.

The criticism of Obama's language is unwarranted. It is a fact that his administration inherited an enormous debt and the most serious economic crisis of our lives. The very vocal minority of people who still hold George W. Bush in high regard – or who oppose Obama for whatever reasons – bristle to hear the new President remind the nation of that fact. But they seem to have propagated the "I inherited" phrase in their own minds and (dis)credited the President for saying it more than he actually has done.

Near as I can tell, Obama has used the phrase "I inherited" on only one public occasion: his press conference of February 9th. In his prepared remarks, he said, "My administration inherited a deficit of over $1 trillion, but because we also inherited the most profound economic emergency since the Great Depression...." Note, please, that he did not say "I," but "My administration" and "we."

He said "I inherited" only once at the press conference, responding to a question. He replied in part that some opponents of his economic stimulus package complained about wasteful spending but had presided over a doubling of the national debt themselves. He asked that those who would engage in some revisionist history remember that, "I inherited the deficit that we have right now, and the economic crisis that we have right now."

In his speech to Congress, Obama used "inherited" three times:

1) ...not because I believe in bigger government -- I don't -- not because I'm not mindful of the massive debt we've inherited -- I am.

2) It reflects the stark reality of what we've inherited: a trillion-dollar deficit, a financial crisis, and a costly recession.

3) With the deficit we inherited, the cost...the cost of the crisis we face...

Note that once again, Obama reminded us that "we" inherited the debt, the deficit, the crisis, and the recession. Not him, not his administration, but the current government: executive and legislative branches included.

In his inaugural speech, Obama spoke about the crisis but never uttered "inherited." On other occasions when he has used the word he has employed the collective pronoun "we." In a speech in Elkhart, Indiana, for example, he used the same language as he did the same day at the press conference: "We inherited a deficit of over $1 trillion, but because we also inherited the most profound economic emergency since the Great Depression..."

It seems to me that Obama is trying to avoid blaming the current Congress for our woes and focus instead on the fact that the collective "we" now have the responsibility to do something about the crisis. He is trying to shift the discussion from who's to blame to who's responsible for getting it fixed – and how to go about it.

We are paying today for the errors and apathy of the past, but Obama does not lay blame; he does not proclaim that the failed policies of the Bush administration – and the misguided ideologies behind them – have brought us to our economic knees, though they surely have. He only says that the government, as now constituted, has been stuck with this mess and needs to deal with it. Perhaps absolving the current Congress and the new executive branch of blame will help all of government to think less about history and more about the future and to work together more constructively. (I'm not holding my breath.)

Some opponents of the stimulus package repeat ad nauseam the claim that the thing includes $33 million to save the salt marsh harvest mouse in San Francisco. Well, that's simply not true. First off, the mouse in question does not reside in the City by the Bay, there being no salt marshes in the county. Calling it "Pelosi's San Francisco mouse," though, presses at least three conservative hot-buttons, so the truth be damned.

Second, the package contains no earmarks for mouse habitat protection in San Francisco, in California, or any place else. It simply provides funds to Federal agencies to restore wetlands – anywhere they decide to undertake that activity. Now it happens that the California Coastal Conservancy has requested 30 million bucks to pay for a 4,000 acre restoration project in the Bay Area, which would benefit salmon, steelhead, trout, ducks, egrets and any other thing that lives in the marshes here.

It will also improve flood protection of homes and businesses in the area, and provide about a hundred jobs, so count humans among the beneficiaries. It might be one of the many projects that ultimately receives Federal funds. But there's nothing about it in the bill.

The Frisco Rodent story is a complete fabrication, designed only to stir up opposition to the stimulus package and throw some mud at the Democratic Speaker of the House. Yet it has been repeated on Fox "news," in the Washington Times, and in blog-after-conservative-blog as though it were a true and horrifying example of political maneuvering and government waste.

Once these stories of mice and men-who-inherit-stuff get started, there's no stopping them. Believers believe what they want to believe.

We ought to get over our partisan bickering. It surely doesn't help matters to pick at -- and disingenuously misquote and misinterpret -- the President's words and intent. To misconstrue the good works that are included in the stimulus package is downright dishonest. We ought to stop looking for faults in others and making them up if they don't exist. (I'm not holding my breath about that, either.)

***
By the way, if you can find a transcript of President Obama saying "I inherited..." any other time than during his February 9th press conference, do let me know.

Wednesday, February 25, 2009

This Just In...

A headline today, from the Associated Press:

"Study of fossils shows prehistoric fish had sex"

No, I did not participate in the study.

Wednesday, January 28, 2009

Rhymes and Reasons

The poet Elizabeth Alexander met Barack Obama when both taught at the University of Chicago. Her family has a political history, her father having been Chairman of the Equal Employment Opportunity Commission and her brother, Mark, an Obama advisor during the presidential campaign and transition. Though not widely known (what poets are, these days?), Alexander is highly respected in poetry circles and has received numerous awards for her work. Not surprising, then, that Obama invited her to write and deliver a poem at his inauguration.

Alexander is a scholar of African American culture and literature, currently a professor of African American Studies at Yale University. Her inauguration poem -- which can be found here -- takes its title "Praise Song for the Day" from an ancient African tradition, the praise song -- a lively form by which the lives of individuals are celebrated. She chose in this instance, though, to celebrate not Mr. Obama but the everyday American.

There's been much talk about "Praise Song" and its delivery, with The Chicago Tribune, The Los Angeles Times, and most critics panning the work as too prose-like and the delivery not up to snuff.

Writing for The Guardian, Carol Rumens – a poet herself – declares "Even when writing for a public occasion and a vast audience, the poet should be able to renew language by being precise, surprising, unhackneyed. Otherwise, what is the point of such a commission? Alexander is a true people's poet, but she has written better poems for the people than this one."

A little kinder was Eli Lehrer, a Senior Fellow at the Competitive Enterprise Institute, who wrote for The Weekly Standard that it "doesn't qualify as a great poem, but it might emerge as an important one. As a celebration of the commonplace and an exaltation of the personal over the political, the poem offers a distinctly American take on the concept of occasional poetry." He decides that, "Yes, it's self-centered. Yes, the poem doesn't really have much logic. But it works."

"Praise Song" was not helped by Ms. Alexander's recitation of it at the inauguration, but it seems to me her words themselves were, while clumsy in part, appropriate for the day.

It was an occasion of plain speaking and common language. Mr. Obama's widely anticipated speech was itself not one of rhetorical delights and poetic flourishes; no lines he spoke are destined to be carved in granite on a monument or cast in bronze for the ages. But if they were to be, they would be set in bland Helvetica, the font chosen by those of whom it has been said, "they want to fit in and look normal. They use Helvetica because they want to be a member of the efficiency club."

What Obama said, beyond the words he spoke, was "See? I'm no elitist after all." That was something that needed saying to move the conversation from personality and ideological rhetoric to the hard work that needs doing and the hard choices we collectively face.

Also straight-talking at the ceremony was preacher and civil rights leader, Joseph Lowery, who brought some of his customary plain and common touch to the benediction. Dr. Lowery closed with his own bit of poetry derived from a refrain used by African American performers including the Almanac Singers of the 1940s and bluesman Big Bill Broonzy. One version of the much-borrowed rhyme goes like this:

If you're white, you're right.
If you're yellow, you're mellow.
If you're brown, stick around
But if you're black, stay back.

Dr. Lowery's take was a lot more hopeful: "help us work for that day when black will not be asked to get in back, when brown can stick around, when yellow will be mellow, when the red man can get ahead, man; and when white will embrace what is right." After all the solemn talk of hard times behind us and ahead, Lowery's gentle and effervescent humor was much appreciated. He made the occasion no less serious, but a lot more human.

Perhaps by prearrangement, Alexander's poem seemed designed to keep with the tone of the moment. Its most telling phrase was "Say it plain." And that she did. And that may be part of the reason for disappointment among those of us who found "Praise Song" wanting as a work of poetry -- why Carol Rumens felt it failed to "renew language."

More at issue for me, though, was her delivery which, owing to its pomposity and self-importance, undermined her message of respect, esteem, and appreciation for the everyday experiences of common folk. She placed an artificial emphasis on words and phrases, making cumbersome what might have been elegant. She imposed white space around those words, seemingly to give them exaggerated weight. She made precious the little things she meant to declare only noteworthy. Perhaps she felt too much the historic significance of the day or worried that her words might seem, were they left unadorned by affectation, trivial.

I don't think she listened well to what her words had to say. Her expressions hadn't the brawn and sinew of Sandburg, yet she tried to stretch them tight and bulk them up with muscle they were far too frail to carry. They were as simple, though not as effortless, as the American colloquialism of Frost, but her plodding reading gave their realism a resonance of insincerity.

Poets ought never read aloud their own work – they've too much invested in it.

I nearly fell out of my chair on hearing Alexander orate so solemnly: "Some live by love thy neighbor as thyself, / Others by first, do no harm or take no more / than you need." All I could think of was the cheesy sign at the King's Table Smorgasbord all-you-can-eat joint I frequented in college: "Take all you want, but eat all you take." That level, the ham-fisted inelegance of a cheap eatery's admonition against wasting its money, was unfortunately the low plane of much of "Praise Song for the Day."

I recall Robert Frost's reading at JFK's 1961 inauguration. Blinded by the glare of the sun, and with no TelePrompTer available, he could not read the poem ("Dedication") that he had written for the event, but recited his "The Gift Outright" from memory instead. It is a short poem, less than a third the length of Alexander's. It speaks about surrendering ourselves to the country, "Such as she was, such as she would become." It was a moving moment: an elderly, world-renowned and well-loved literary figure honoring a young man of "a new generation" who offered the nation new hope and vigor. Frost honored the nation, too, with humility and humanness and honesty.

Frost's "Dedication" has been called "dreadful" as poetry. But nonetheless it, or something like it, might have been a good choice for Obama's inauguration. In it, Frost speaks of "A turning point in modern history," and concludes by declaring the start of "A golden age of poetry and power/Of which this noonday's the beginning hour."

Yeah; it even rhymed.