Tuesday, April 22, 2014

Volatile Vocabulary

Technology now demonstrates how word usage has changed over the last 500 years.

A continuing challenge for historical fiction is dialogue, i.e. keeping it consistent with the period and not lapsing into present usage. Needless to say, a Gilded Age setting would not be suitable for terminology like “epic fail,” “chill out,” and “ass-kicking.” Instead, you would more likely hear “catastrophic loss,” “calm down,” and “thorough thrashing.” Some of these changes in usage are faddish, but others reflect dynamic cultural shifts. This month’s issue of the Harvard Business Review reveals a few examples of words and phrases that ebbed and flowed in frequency over the 20th century:

Language is a reflection of culture, so the terms we use most tell us a lot about our shifting priorities. To get a sense of how the world of management has evolved, we turned to Google’s Ngram Viewer. This tool charts the frequency of words and phrases in more than 5 million books published from 1500 to 2008. We narrowed our focus to the period beginning in the 20th century.
The most dramatic rise to dominance is that of the word “management.” No doubt the hundreds of thousands of textbooks and trade books printed since the Progressive Era have had their effect. Set against the mild ascendancy and decline of “leadership,” though, this trend may have broader implications. Let’s see how Merriam-Webster defines the root of each word:

Manage: to have control of (something, such as a business, department, sports team, etc.)
Lead: to guide on a way especially by going in advance
I see two major differences here. First, management is something that can be left at the office, whereas leadership is a trait to be borne 24/7. Second, management derives authority from outside oneself, while leadership rests in a person’s own temperament. Granted, today’s managers are often on call well beyond regular business hours, but the principle remains unchanged: their mandate is restricted to the enterprise’s formal operations. It is harder to put down the mantle of leadership because, if exercised well, it sinks into the very marrow of your bones.

This is the 21st century. We want more time off. Leadership demands all your waking hours. We have a right to privacy. If one such right actually exists, leaders forgo it, recognizing that they set an example in every area of life. Successful management requires cleverness, quick-wittedness and thinking outside the box. Optimal leadership requires integrity, courage and thinking beyond the bottom line. In the 21st century, principled leadership is extolled, flattered and revered. Management, however, and its effective execution seem to get most of the ink.

It may well be that leadership carries moral connotations with which we are uncomfortable. Management, on the other hand, is well-suited to technocrats and efficiency experts who solve knotty problems and make things run more smoothly. Our political rhetoric reflects the shift from exemplifying and extolling virtues to expanding and improving national output. The president of the United States, although invested with “executive” authority per the Constitution, was known for the first century of America’s life as the “chief magistrate.” Only later, as the size and scope of government grew, was the term “chief executive” adopted. Again, from Merriam-Webster:

Magistrate: an official entrusted with administration of the laws; for example, a principal official exercising governmental powers over a major political unit (as a nation)
Executive: a person who manages or directs other people in a company or organization

We go from a limited mandate to a much broader one as the terms evolve and replace one another. Are there ominous implications here? Perhaps born-again libertarians like me are too quick to see tyranny lurking around every clause, phrase and sentence. Still, history does reveal a certain political enthalpy that accompanies ethical entropy. As common notions of morality give way to individual ones, government will grow stronger to prevent anarchy, e.g. the rise of Napoleon from the wreckage of the French Revolution. Danger must be minimized; order, imposed; resources, distributed according to a fixed regime.

This morphing from republican virtue to the procedural republic (hat tip to Michael Sandel of Harvard University) is mirrored by our changing word choices. Is it possible that by consciously amending the words we use we can thereby stave off tyranny? I would like to think so. 

Thursday, April 17, 2014

I Cite Authority. Authority Always Wins.

In the week leading up to Easter, we recall an arrest, a sham trial, beatings, floggings, crucifixion, death, burial and resurrection. As glorious as the ending is—and as divine as it is in purpose—I often think about the political roots of the goings-on back in the day. After Jesus delivered the Sermon on the Mount, a masterful presentation in substance and rhetoric, Matthew tells us what went on in the aftermath:

When Jesus had finished saying these things, the crowds were amazed at his teaching, because he taught as one who had authority, and not as their teachers of the law. (Matthew 7:28-29, NIV)
As it turned out, the people left drawing comparisons unflattering to the rabbis and religious leaders of the time. Whereas Jesus spoke with authority, their teachers were speculating and surmising, at best. Trying to make scripture relevant, they could only throw the kugel against the wall and see what stuck. Along comes the Nazarene to show them up. He would pay for that…and so much more.

The point is that moral authority, as it did two millennia back, always wins out in the long term. It may be temporarily eclipsed by rage, passion, pride or fear, but its power always re-emerges after these motivations fizzle. What is moral authority, anyway? How do you get it? What do you do with it? Since I am asking, let’s see what ask.com has to say about it.

Moral authority is the quality or characteristic of being respected for having good character or knowledge, especially as a source of guidance or an exemplar of proper conduct. Moral authority is practised by people in the society who have the rank of either being elders or people with a lot of experience in numerous things.
This makes sense. People who have fought and were wounded in combat quite naturally carry greater moral ballast in discussions of war and peace, all other things being equal. When you struggle with your bills, do you seek advice from a friend with an 800 FICO score or one who scores 590? Having a cogent argument is all well and good, but a proven track record and life of integrity carry the argument to victory. When Jesus spoke of the will of God, his audience sensed that he knew it intimately, inside and out. He may have been repeating the other rabbis’ exhortations verbatim, but the words meant more coming from him.

Moral authority always involves loss, sacrifice or risk of sacrifice. John F. Kennedy was born with a silver spoon in his mouth. Until he became a congressman, he never really had a full-time civilian job, nor did he need one. Friends recalled that he was always scrounging money from them because he never carried any. He often got by on charm and good looks. More notoriously, he had little use for the sacred vows he made to his wife at the altar of God. And yet, his picture hung (and still hangs) in many poor hovels in South America and Asia. His words are quoted even today in countless public addresses. His very name carries connotations of idealism, compassion and justice. Childhood illnesses that carried—nearly fatally—into adulthood; the tragic loss of two siblings; heroic war service; and an assassin’s bullet earned him a credibility that his father’s money could never buy. It explains, at least in part, why the flaws of our recent baby boomer presidents are rarely overlooked: the pervasive feeling that they never really earned their stripes; that their constitutional authority is unaccompanied by moral authority.

This relates in a significant way to the current flap in Nevada. The last remaining rancher in Clark County, Cliven Bundy, has not paid federal grazing fees since 1993. Thirty percent of the land in the United States is owned by the federal government. In Nevada, it lays claim to a whopping 81 percent. Under regimes imposed by the U.S. Forest Service and the Bureau of Land Management, fees start from a base forage value (presently $1.23) that is then adjusted according to the prior year’s lease rates, beef cattle prices and production costs. The fee is charged per animal unit month (AUM), i.e. the amount of forage needed to sustain a cow and her calf for one month. The minimum fee per AUM by law is $1.35. Multiply that by thousands of head of cattle month after month and it gets up there.
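
For those curious about the arithmetic, the fee computation described above can be sketched in a few lines. This is a simplification under stated assumptions: the three index inputs and their sample values are illustrative, not actual agency figures; only the $1.23 base and the $1.35 statutory floor come from the paragraph.

```python
# Simplified sketch of the federal grazing-fee computation described
# above. Only BASE ($1.23) and MINIMUM ($1.35) come from the text; the
# index inputs and sample values below are illustrative assumptions.
BASE = 1.23      # base forage value, dollars per AUM
MINIMUM = 1.35   # statutory floor, dollars per AUM

def grazing_fee(forage_index, cattle_price_index, cost_index):
    """Fee per animal unit month (AUM): the base value adjusted by
    prior-year lease rates, beef prices and production costs."""
    fee = BASE * (forage_index + cattle_price_index - cost_index) / 100
    return max(fee, MINIMUM)

# A herd of 1,000 head grazing for 12 months at the floor rate:
fee = grazing_fee(100, 100, 100)   # adjustments net out, floor applies
annual_bill = round(fee * 1000 * 12, 2)
print(fee, annual_bill)  # 1.35 16200.0
```

Even at the floor, a thousand head for one year runs $16,200, which makes a two-decade arrears climbing past a million dollars easy to believe.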

Mr. Bundy is now over a million dollars in arrears, prompting government agents to round up and confiscate his cattle last week, returning them thereafter in the face of public protest. He asserts that the claim the U.S. makes on the pasture is illegitimate. If statehood means anything, he contends, it means a state’s sovereignty over the territory within its own lines. He further states that he will pay his bill in full…if he can remit payment to the state of Nevada. (Interestingly, western states tend to charge higher fees on their own rangeland, closer to those of private landlords.) He did not always feel this way. Only after the U.S. Interior Department listed the desert tortoise as “protected”—with all the restrictions and regulations that entails—did the rancher begin to question federal legitimacy.

The fact is that conflicts between the feds and ranchers have been brewing for many decades. Frankly, I myself wonder why the United States needs to own so much rangeland, even while bearing in mind concerns about conservation and accessibility. Whether it owns or subsidizes, its natural bent—regardless of party—is to push people around. The surprising thing is that there are not more of these confrontations. Wholesale review of federal land policy is definitely in order.

That said, I think Mr. Bundy should pay up, not because I sympathize with the government’s position, nor because I want to see him penalized. He should pay because he cannot establish moral authority on this issue with a giant bill like that. Any resistance he puts up will be viewed by all but his fellow ranchers as self-interested tax evasion, as opposed to principled civil disobedience. The very fact that he has a grazing permit means that—at least at one time—he recognized federal authority on the land in question. His supporters might argue that conforming to these rules will simply sweep the issue back under the sagebrush. I think, however, that concerted political effort can better educate people as to the problems of and benefits from the ranching community in this country. Without the albatross of a disputed invoice, I am sure it can. Of course, it takes a long time to earn those stripes. Philosopher Kevin Vallier talks about the importance of taking the time to make the case to the larger general public:

Some individualist libertarians may think people should just toughen up, but I don’t think anyone seriously believes this in their hearts. Rousseau and Hume were right to point out that the sentiments of others shape our emotional and mental health, along with our very identities. The benefits of moral authority, then, are massive, as they enable the achievement of the greatest of all social goods – being on good terms with one’s family, friends, colleagues and fellow citizens.
I agree. I sympathize with many libertarian causes, but we do not live in a vacuum. We must establish moral authority if real liberty is to survive. For almost all of us, this requires some level of pain and discomfort. For Cliven Bundy, it requires a certified check. Not because the government deserves it, but because it lends heft to his protest. He will no longer fight authority but, instead, walk in authority and speak with it. In the “Authority Song,” John Cougar Mellencamp sang:

They like to get you in a compromising position
They like to get you there and smile in your face
They think they're so cute when they got you in that condition
Well I think it's a total disgrace

The Nevada ranchers may have these sentiments toward the national government. They should, nevertheless, remember that we can place ourselves in compromised positions in the zealous pursuit of liberty. Let this post serve as a cautionary statement. The Savior wanted to make a point about liberty, too. In so doing, he gave up everything.

May your Easter celebrations remind you of him who suffered the ultimate pain and sacrifice.  

Monday, April 14, 2014

Tails of Suffering

For those who observe Holy Week, the theme of suffering is often heard nowadays. The humiliation and injustice of Christ’s arrest and “trial,” the brutality of his scourging and the agony of his murder are recounted annually. So, too, are exhortations to identify with the Savior through suffering. The apostle Paul wrote to the church at Philippi:

I want to know Christ--yes, to know the power of his resurrection and participation in his sufferings, becoming like him in his death… (Philippians 3:10, NIV)
Bracing words for a natural-born coward like me, but Paul had a habit of confronting his fears to an admirable (and uncomfortable) degree. One of the things that I marvel at is his seeming indifference to imprisonment, especially in those times. More than the lash or the club or the fist or the spear, prison held a special kind of torment. Mind you, Paul and his compatriots were not treated to the 12-by-16-foot cell that we might imagine, where one inmate sleeps on the lower bunk and the occupant of the upper plays a harmonica. Nor were their surroundings as spacious and well-lit as much artwork later represented. Fetid, putrid, dark and dank, these holes were not designed for long-term sentences (in fact, prison was rarely imposed as an exclusive penalty in the Roman world of the first century). Shackled to the wall, prisoners could see nothing, smell everything—from rotting feces to abscessed wounds—and slowly lose perspective and sanity. One commentary puts it this way:

Often, prisons of this kind were dug out of solid rock and were underground. Prisoners, their guards, and their provisions were lowered through an opening the size of a manhole. This manhole was the only means of entrance and exit. In the Roman colony of Alba, the prison was under the marketplace. Rain and debris from the market and from animals easily dropped into this "house of darkness." Neither animal waste nor that of the prisoners found its way out easily. (Bob Fraser, A Year Through the Bible)
And then there were the rats. The rats. Rattus rattus, the black rat, carried the fleas that carried the plague. Gnawing at the prisoners’ digits, spreading disease and raiding their meager food provisions, rats put the sting in disgusting. Evil, hideous rodents with beady eyes and long, dreadful tails. We shudder at the idea of sharing space with these bottom-dwellers. Yet Paul and friends were at their mercy, as if chained to the tracks of the Broadway-7th Avenue Line. Today we freak out if we see one from a distance. In Paul’s day—and for most of human history—they were a simple fact of life to be coped with if not welcomed.

No, today’s post is not about Easter or Passion Week or the apostles. It’s about rats.

In writing and reading about the late 19th century, I do not often come across this subject. Recently, however, I read an excerpt from The Hidden White House by Robert Klara, who argues that Rattus rattus pretty much owned the executive mansion until the Truman renovation. Klara recounts when the departing and widowed Eleanor Roosevelt visited the Trumans at the Blair House across Pennsylvania Avenue just before leaving Washington:

The ostensible purpose for her stopping by was to wish the Trumans well, but it soon became clear that Mrs. Roosevelt had a second reason: “[She] was somewhat apologetic about the [building’s] condition,” Margaret (Truman) recalled. “The war and her heavy travel schedule had never given her time to do much decorating or housekeeping.”
That sounded reasonable enough. Who wouldn’t cut Eleanor Roosevelt, the busiest First Lady in history, some slack over a little dusting and vacuuming? Then, just before she left, she let drop another detail. The whole house, Mrs. Roosevelt said, “was infested with rats.”
Indeed it was, as it had been since John Adams moved there in 1800. What can be expected in a city built atop a swamp? Yet we choose to believe otherwise about America’s most stately residence. While the Truman renovation of the late 1940s went a long way toward controlling access for varmints and pests, keeping them out remains a full-time job even today. Theodore Roosevelt’s rambunctious children were said to make a sport of chasing them down, and Barbara Bush was once confronted by a large rat at the White House swimming pool. I made no reference to White House rats in The Schombürgk Line. I will do so in future books, though, because nothing says atmosphere like beady eyes and dreadful tails.

Saints and martyrs, presidents and potentates, firefighters, farmers and public health officers have all had to deal with rats. Seemingly invincible as a species, rats remind us that our dominion over the earth comes at a cost. We are still paying it.

Thursday, April 10, 2014

Cristina Fernandez Decorates the Nursery

My interest in agriculture over the last few years has led me to appreciate the role soybean farmers played in achieving an admittedly temporary economic boom for Argentina during the last decade. The late president Nestor Kirchner and his successor-wife Cristina Fernández de Kirchner exploited the global demand for soybeans—of which Argentina satisfied nearly 25 percent—to invest heavily in infrastructure and social programs, directed primarily toward cities. In turn, farmers resented the increasing appetite of the Kirchner government for the fruits of their labor, particularly since they were seeing precious little of the largesse. Corn and Soybean Digest suggests that American growers chafing under increasing USDA and EPA regulations need only look to Argentina for perspective:

With a finger in everything agricultural, the Argentine government stifles the business by intervening in exports markets, particularly for corn, farmers here say. The typical Argentine farmer pays 68% of his income in taxes, says Jorge Romagnoli. The government taxes soybean exports at 35%, corn at 20% and wheat at 23%. Despite that, the nation’s soybean acreage continues to increase and so do exports.
For a time, this was enough to finance the renaissance, but no longer. To be fair, the Kirchners inherited a massive debt when Nestor assumed command in 2003. Argentina was already excluded from the international debt market and its economy was reeling. His solution, while politically powerful and popular, did little good for the long-term economic prospects of Argentina: he told the country’s creditors to go scratch; he would pay them when he was good and ready. For good measure, he revoked many amnesties already granted and drove several high court judges from office. His bold moves cowed his opponents and the international community, which agreed to restructure Argentina’s debt service.

The late Argentine President Nestor Kirchner with wife and successor Cristina Fernandez de Kirchner

The Kirchners were a power couple who received plaudits from left-leaning politicians, journalists and neighbors for their assertive policies. There is a fine line, however, between assertive and heavy-handed. Succeeding her husband as president, Cristina imposed burdensome export tariffs to pay for increased spending, a fact not lost on farmers. When foodstuff demand—particularly from China—was higher, the duties were ugly but bearable. When Chinese imports were reduced, the Kirchner social policies likewise ebbed as rural opposition soared. Poverty is stubbornly high in Argentina. In addition, many observers and consumers believe the government is disingenuous in under-reporting the high rate of inflation. All of this occurs on the heels of a major currency devaluation and a legal push by bondholders to collect long-outstanding debt.

I have posted about Argentina before, but a recent statement by President Fernández de Kirchner made me think of some dead white guy wisdom. Speaking to her countrymen in a televised address, Ms. Fernández said that she considered herself “the mother” of her people. She said this at a time when her administration is forced to dial back some of its progressive initiatives for the sake of economic survival. Yet the parental role she seeks to fill is not new to contemporary politicians. In decades and centuries past, countless leaders have sought to solidify their holds on power by making children of their constituents. Once properly conditioned, they will be completely dependent on government for security, peace and comfort.

U.S. President Grover Cleveland spoke to this issue in his second inaugural address in 1893. Looking to government as father or mother undermines the civic qualities essential to ordered liberty:

The verdict of our voters which condemned the injustice of maintaining protection for protection's sake enjoins upon the people's servants the duty of exposing and destroying the brood of kindred evils which are the unwholesome progeny of paternalism. This is the bane of republican institutions and the constant peril of our government by the people. It degrades to the purposes of wily craft the plan of rule our fathers established and bequeathed to us as an object of our love and veneration. It perverts the patriotic sentiments of our countrymen and tempts them to pitiful calculation of the sordid gain to be derived from their Government's maintenance. It undermines the self-reliance of our people and substitutes in its place dependence upon governmental favoritism. It stifles the spirit of true Americanism and stupefies every ennobling trait of American citizenship.
The lessons of paternalism ought to be unlearned and the better lesson taught that while the people should patriotically and cheerfully support their Government, its functions do not include the support of the people.

It may be that Argentina’s citizens are no longer equipped to rule themselves. If so, Cristina (and, perhaps, Nestor’s ghost) has done the political work well. There will always be strong support for a government large and in charge. Yet this is a double-edged sword: a needy and demanding populace can throw quite a tantrum when hungry, tired or bored. President Fernández might want to re-consider her new role and suppress her maternal instincts.

Monday, April 7, 2014

A Dark Noah's Ark

The latest cultural flap centers on the biblical story of Noah, the ancient patriarch who was spared the world’s watery chastisement and then went about re-populating the earth. More accurately, opinions are flying relative to the recently released feature film Noah, directed by Darren Aronofsky and starring Russell Crowe. I first heard the movie being advertised over the radio and the commercial ended with this codicil:

The film is inspired by the story of Noah. While artistic license has been taken, we believe that this film is true to the essence, values and integrity of a story that is a cornerstone of faith for millions of people worldwide. The biblical story of Noah can be found in the book of Genesis.
All that the producers needed to say was five words buried in the explanation: artistic license has been taken. They felt, however, that the public would be well advised to know that the meaning and message of the movie were in accord with scripture, regardless of liberties taken. I have not seen the film, though I do intend to. Yet, as always, it is the debate and commentary that speak to the changes and fissures in our culture more than the movie itself.

"Entry of the Animals in the Ark" by Francesco Bassano

As an American, I can attest to Mr. Aronofsky’s absolute right to make a movie based on the Bible and to tweak the story to his liking. The First Amendment guarantees him this protection whether he recounts history (as he sees it), makes up a fantasy from whole cloth or creates some sort of hybrid. It would not be the first time somebody mixed the words of scripture with speculation. Jenny Diski, a novelist and author of travel gazetteers, observes in the Manchester Guardian:

For many hundreds of years, rabbis have been discussing their interpretations of the most minute clues in the text. Most of all they love to elaborate on what is not there, and, like all humans, try to make sense of contradictions, implausibility, reticence, and to uncover and make meaningful, as if it were Twitter, more puns than you can shake an olive branch at. The bellowing and infighting on paper that had been going on since the Hebrews returned from their exile in Babylon, was collected and edited in the early Middle Ages into various books of midrash: interjections, extrapolations, interpretation, each devoted to the books of the Bible.
Ms. Diski is an unbeliever, and her opinions carry an acerbic and mocking tone. I think, though, that she is spot on about the rabbinical tendency to try to fill the spaces where the Bible is silent. Ironically, they have done so to satisfy the doubts of the Jenny Diskis of the world, many of whom are resolved never to believe. Ever. I believe the Bible is the word of God. Logic does not compel me to believe, but neither does it prevent me. Divinity operates within and without my understanding, and I require no midrash, caveat or proof to believe as I do. Maybe I should. But there it is.

Darren Aronofsky has, nevertheless, created in Noah the ultimate midrash for skeptics and seekers; for those put off by the restrictions imposed by organized faiths; and for—wait for it—the spiritual but not religious. From a few spare chapters of Genesis he fashioned a 139-minute spectacular replete with special effects, dramatic music and first-tier star power. The rap on this movie from biblically conservative quarters is three-fold: 1) the sin for which God judges the world worthy of annihilation appears to be ecological, i.e. the world is full of global-warming deniers; 2) Noah’s episode of drunkenness in Genesis 9:21 is extrapolated into a lifetime of alcoholism; and 3) God is said to order Noah to slay his grandchildren, only to be defied by the noble sailor.

Again, I have yet to see it myself but I will make a few preliminary observations. Hollywood artists are often uncomfortable with the idea of personal sin because it implies that God owns us—body, mind and spirit—and possesses full moral authority over each. To acknowledge such a thing would undermine cherished preferences from pro-choice to anti-censorship. Their shibboleths are all defended on the grounds of individual sovereignty so sin must be collectivized, thereby absolving their moviegoers of guilt and remorse. Environmental neglect is the culprit Aronofsky chose this time, but you can bet it will always be a national or global sin.

As to Noah’s taste for the grape, we know of only one occasion of inebriation. That the writers wanted to plague Noah with that particular demon speaks to their general boredom with the narrative as it stands in scripture. Sir Arthur Conan Doyle, for example, was dissatisfied with Sherlock Holmes until he threw illicit drug use into the mix. God and his scribes have an agenda wholly different from that of 21st-century screenwriters, or anybody else for that matter. The prophet Isaiah spoke for God concerning this:

For my thoughts are not your thoughts, neither are your ways my ways, declares the Lord. As the heavens are higher than the earth, so are my ways higher than your ways and my thoughts than your thoughts. (Isaiah 55:8-9, NIV)
Darren Aronofsky and company are not the Intervarsity Christian Fellowship. Their ways and thoughts lean toward blockbuster entertainment, not the salvation of souls. A dark and troubled psyche sells more tickets than a righteous man who made a mistake. Ultimately, my sisters and brothers who object to this film on the grounds that it is disrespecting Noah put the cart before the horse. The producers of Noah see this as a good story worthy of enhancing. Since the Bible is in the public domain, they can do so without permission. To that end, the director referred to his creation as "the least biblical biblical film ever made". They are not bound by fidelity to the biblical text. Why all the shock and outrage?

The answer might be found in the cultural context. “Church-goers Now a Minority in America,” the Huffington Post declared triumphantly in 2012. As society grows ever more secular, the entertainment industry has actively sought to fill the void of moral authority. The problem is, it possesses none but does have a huge audience, so its interpretations of ancient scripture can be carried far and wide. This movie might be the only Bible many people ever experience. Perhaps it is understandable that believers get exercised about its fabrications. And yet…

The very fact that people are flocking to see it speaks of more than just a desire for distraction. There is soul-hunger out there and, maybe, Noah can serve as a point of contact between believers and unbelievers. Conservative journalist Cal Thomas takes a surprisingly benign view of the movie:

After decades in which Hollywood mostly ignored or stereotyped faith, Christians should be happy they have gotten the film industry's attention. Successful films like "The Passion of the Christ," "The Bible" and "Son of God" prove that such stories "sell." Instead of nitpicking over "Noah," the Christian community should not only be cheering, but buying tickets to encourage more such movies. Hollywood may not always get it right, but that's not the point. They are getting something and that sure beats not getting anything, or getting it completely wrong as in Martin Scorsese's blasphemous, "The Last Temptation of Christ."
Besides, after some see "Noah," they might want to visit the "original cast." The next time a rainbow appears might be the right occasion to begin a discussion.

That is pretty much where my thinking is. Truth will out in the end but the dialogue has to begin somewhere, even if the terms are unfavorable. And so, I go to see the murderous drunk, the warming earth, the rock people and the cruel god. Let the conversation commence.

Wednesday, April 2, 2014

The Self-Employed and the Self-Absorbed

Ever since I started freelancing, I have felt obliged to read Entrepreneur magazine. I have all the business sense of Michael Jackson, so I need all the help I can get. What I find curious about this publication is that it frequently publishes articles about the personal traits necessary to succeed when running an enterprise. One week it will be “The Five Qualities of a Successful Entrepreneur”; another will feature “Ten Indispensable Personal Traits for Small Business Leaders”; a more expansive list was found in “25 Characteristics Necessary for Small Business Success.” However many personal properties are actually necessary, I likely possess only a small fraction.

In spite of the numerical differences, these pieces are very helpful. I particularly liked last January’s “The 7 Traits of Successful Entrepreneurs” by productivity expert Joe Robinson. In and of itself, it reads like many other business motivational articles. His seven temperamental necessities are:

  • Tenacity
  • Passion
  • Tolerance of Ambiguity
  • Vision
  • Self-belief
  • Flexibility
  • Rule-breaking

Again, these are not surprising to people who regularly read this type of material. In the wake of a recent political episode, however, they have illuminated a fault line in the public square.

I have never followed Matt Drudge, at least not much. Tabloid websites from either right or left are just not interesting to me. Yet Drudge himself became the subject of controversy last week when he reported via Twitter on his latest interface with the federal government:

Just paid the Obamacare penalty for not 'getting covered'... I'M CALLING IT A LIBERTY TAX!
Pandemonium ensued. Jesse Lee, Director of Progressive Media (???) at the White House, fired the opening salvo, also via Twitter:

Flat lie, no fee for previous year. Scary how much influence he once had. RT @DRUDGE: Just paid Obamacare penalty for not 'getting covered'.
Many press outlets pounced on the apparent ignorance of Drudge, since it was well known that the penalty would not come due until taxpayers filed their 2014 returns. From the Huffington Post:

Liberty tax! The whole thing is weird, considering that the tax penalty that adds bite to the "individual mandate" -- the Affordable Care Act's diktat that most Americans have some form of health coverage -- isn't even due until more than a year from now, when people file their 2014 federal income-tax returns.
The Los Angeles Times also shook its journalistic head:

Again, weird. Businesses with fewer than 50 employees--and that appears to cover Drudge--are exempt from the ACA. No penalties due. If he mistweeted and meant to say that he paid his quarterly taxes as an individual or sole business proprietor, which is possible, then he might owe the penalty--theoretically.

If it is “theoretically” possible, there is nothing weird about it. As a self-employed individual, Drudge must pay estimated taxes on a quarterly basis, i.e., pay on income he anticipates rather than on what he has already earned. After he files his annual return, the IRS decides whether he has overpaid or underpaid. Mistweeted? Does anybody read tweets expecting full legal and financial disclosure?

This little brouhaha underscores the diminishing respect for, and even knowledge of, the challenges of self-employment. The situation is bad enough among the general public, but it is all the more disturbing when found at the highest levels of authority, personified by the White House Director of Progressive Media (???). For better or for worse, White House credentials carry a presumption of authority, whether or not those bearing them are endowed with a commensurate measure of knowledge and experience. Many self-employed people work in excess of 60 hours per week, with little extra time for boning up on tax law. Likewise, new businesses on a shoestring do not always have the benefit of a competent accountant.

The point is that if a pronouncement comes from the president’s staff—even if it’s simply a tweet from a staffer with an ambiguous title—it carries weight with the public. The staffer should take a deep breath and consider his words before counter-punching, no matter how snarky the opponent. I have often quoted Henry Adams in this space. He said, “Politics, as a practice, whatever its professions, has always been the systematic organization of hatreds.” Vladimir Lenin took this description as a compliment, standing on its foundation:

My words were calculated to evoke hatred, aversion and contempt . . . not to convince but to break up the ranks of the opponent, not to correct an opponent’s mistake, but to destroy him.
Sad to say, there are plenty of American political consultants who adhere to this credo, and very successfully, too. But history demonstrates the disasters that follow when political vindictiveness trumps sober and informed understanding of issues. One example is the nationwide vaccination campaign initiated by President Gerald Ford in 1976. Preliminary study of a swine flu outbreak at Fort Dix convinced government leaders that a full-blown pandemic was waiting in the wings. Epidemiologist Rebecca Kreston points out in Discover magazine, however, that the World Health Organization and other bodies were counseling restraint until more data could be gathered and analyzed. Here is what followed:

With President Ford’s reelection campaign looming on the horizon, the campaign increasingly appeared politically motivated. The rationale for mass vaccination seemed to stem from only the barest of biological reasoning – it turned out that the flu wasn’t even related to the virus that caused the grisly 1918 epidemic and, indeed, those who were infected with the flu only suffered from a mild illness while the vaccine, for the reasons stated above [the government used an attenuated “live virus” for the vaccine instead of an inactivated or “killed” form], resulted in over four-hundred and fifty people developing the paralyzing Guillain-Barré syndrome. Meanwhile, outside the United States’ borders, the flu never mushroomed into the anticipated public health disaster. It was the pandemic that never was. The New York Times went so far as to dub the whole affair a “fiasco,” damning one of the largest and probably one of the most well-intentioned public health initiatives by the US government.

President Gerald Ford receives flu shot from White House physician

Getting the upper hand in the political wars will always be a part of government in a free society. Yet in so doing, I have to believe, officials should never get ahead of established facts in order to zing an opponent. Such practice is indicative of the self-absorption that alienates so many from the public conversation. Self-absorption is also often at odds with better traits like tenacity, vision, flexibility and the other entrepreneurial virtues. Granted, Drudge could well benefit from a smidgen more humility. For its part, the White House should temper its knee-jerk responses with context and accuracy.

Monday, March 31, 2014

Left Idle By an Idol

The year 1968 is considered a watershed in the political and cultural life of the United States. Perhaps no other event of that year captured the changes as much as the decision of President Lyndon Baines Johnson to forgo a re-election campaign and retire. The Tet Offensive of the North Vietnamese took American military leaders by surprise, and the grinding pace and soaring casualty rates of the long war in Vietnam had eroded Johnson’s once-stratospheric popularity. Though he was constitutionally eligible to run for a second full term, he opted out on this very day, March 31, 1968, 46 years ago today.

With America's sons in the fields far away, with America's future under challenge right here at home, with our hopes and the world's hopes for peace in the balance every day, I do not believe that I should devote an hour or a day of my time to any personal partisan causes or to any duties other than the awesome duties of this office--the Presidency of your country.
Accordingly, I shall not seek, and I will not accept, the nomination of my party for another term as your President.
America’s future was indeed under challenge. Beyond the war, political shifts had sparked a wave of inner-city riots and a long period of unrest on college campuses. With the government’s attention concentrated elsewhere, the Soviets invaded Czechoslovakia and the North Koreans seized a U.S. Navy vessel. It doubtless felt to many like the United States was coming apart at the seams. All the while, the Democratic Party was growing weary of its leader and considering the available alternatives.

It was this last reality that moved Johnson to retire. A man of his temperament was not the sort to go voluntarily. In fact, he lived only a few years into retirement, passing away in January 1973. During the intervening years he was said to have been miserable: over-eating, drinking too much and just feeling sorry for himself. The key to his decision to call it quits was rooted in an event that transpired only a couple of weeks before. An anti-war candidate, Senator Eugene McCarthy of Minnesota, came within a hair of beating the president in the New Hampshire primary vote. This feat led Senator Robert Kennedy of New York to jump into the race, though he had already committed to supporting LBJ. Smelling Johnson’s blood, Kennedy saw a chance to fulfill his family’s legacy while getting even with the president, his longtime rival.

To be fair, several biographers acknowledge that Johnson may have chosen retirement anyway. He had suffered a heart attack in the 1950s and had undergone a serious gallbladder operation since then. His family was frequently urging him to take it easy. Yet it was his subconscious, perhaps, that betrayed his true feelings. Confiding to Doris Kearns Goodwin, then a young White House aide and later an assistant with his memoirs, he recounted a recurring nightmare:

I felt [in 1968] that I was being chased on all sides by a giant stampede coming at me from all directions. I was being forced over the edge by rioting blacks, demonstrating students, marching welfare mothers, squawking professors, and hysterical reporters. And then the final straw. The thing I feared from the first day of my presidency was actually coming true. Robert Kennedy . . . openly announced his intention to reclaim the throne in the memory of his brother. And the American people, swayed by the magic of the name, were dancing in the streets. The whole situation was unbearable for me. (Evan Thomas, Robert Kennedy: His Life, 2002, p. 365)

Odd. Lyndon Johnson had an adoring wife who stuck with him through adultery and ill health. He had two loyal and loving daughters who were giving him grandchildren, and two sons-in-law then serving honorably in Vietnam. Originally a schoolteacher to dirt-poor Mexican American children, he went on to a long and significant congressional career, topped off by many years as the Senate Majority Leader. He had been John F. Kennedy’s vice president, and assumed the presidency with consummate skill after JFK’s assassination in 1963, winning an overwhelming electoral victory in his own right the following year. He shepherded through Congress historic civil rights legislation that John Kennedy likely could not have passed. What could little Bobby Kennedy really take from him?

Everything, apparently, in terms of what mattered most to him. As Max Holland, author of The Kennedy Assassination Tapes, wrote in the Los Angeles Times:

The flaw in Johnson was that he was not content to lead and be respected. Rather, he demanded almost slavish support and public adoration. Such deep emotion is reserved for very few presidents, and usually they have to die in office to achieve it. Johnson's curse was that he inherited the presidency from such a man. And now RFK was repudiating everything Johnson represented or hoped to accomplish.
I would put it somewhat differently. In challenging him for the highest office in the land, RFK was stealing the favor of Johnson’s god: public adoration. More than votes, LBJ wanted the love of his constituents, their affirmation and their unwavering admiration. It had been his raison d’être for decades; without it, life held no meaning. This was Lyndon Johnson’s idol. The homage he had paid for years and years was becoming lukewarm, and the idol spewed the 36th president of the United States out of its mouth, leaving him idle back at his Texas ranch.

The triumphs and tragedies of our political leaders all have spiritual components. Sometimes within the triumphs are the seeds of tragedy, a fact of which Johnson was probably unaware as he basked in the approbation of the 1964 election results. Making a choice between the lesser of two evils 46 years ago, he chose the idleness of retirement over his idol’s most horrific instrument of vengeance: public humiliation.