Friday, October 28, 2011
Sarah Beckwith's op-ed piece in today's News & Observer, "Shakespeare was Hardly Anonymous," would have you believe there has never been a controversy over Shakespeare's authorship. But the authorship controversy is nearly as old as the plays themselves. Even some contemporaries of the actor from Stratford-upon-Avon, and some who came soon after, raised doubts about who wrote the plays attributed to him.
I'm no expert on the matter, as Beckwith purports to be, but I've been reading about the controversy and have been intrigued by it for more than three decades. (I blogged about this matter more than two years ago: "Will the true Shakespeare please stand up?") A Washington Post article in the mid-1970s first introduced me to serious doubts about the authorship of the greatest set of writings in the English language. The Atlantic magazine devoted a cover story to the authorship controversy in 1991 and has looked back at the point-counterpoint arguments over the Bard in this post. (I think I still have that issue of the Atlantic squirreled away somewhere.) Neither the Washington Post nor The Atlantic can be accused of lacking seriousness.
Amazon.com lists 166 results in a search for books on "Edward de Vere Shakespeare." Admittedly, that doesn't mean there are 166 different books available on this subject, but the controversy has generated dozens of serious books. One of the more recent is "Shakespeare by Another Name: Edward de Vere, Earl of Oxford, The Man Who Was Shakespeare." I checked that book out of the public library and found it quite persuasive in its argument that Edward de Vere should rightfully be credited with the authorship of the Shakespearean plays. Hank Whittemore has written a book titled "The Great Shakespeare Hoax," and there are others. To claim there is no controversy, or that the argument is akin to flat-earth claims, is ridiculous.
The contention of the Shakespeare debunkers boils down to doubts about the ability of the man William Shakespeare to write the plays attributed to him. He had little or no formal education — some even doubt that he could read and write. But the plays attributed to him expanded the English language with new words and metaphors that have become familiar parts of our conversations — "the milk of human kindness," "a pound of flesh" and so on. His vocabulary exceeded that of the translators of the King James Bible. He was intimately familiar with the royal court, with foreign literature and classic tales and, perhaps most tellingly, with Italy, where several of his plays are set. But William Shakespeare, the actor, never traveled to Italy. Edward de Vere lived there for some time, basing himself in Venice. There were good reasons in Elizabethan society for an aristocrat to deny authorship of plays, which were considered debauched common entertainment unfit for the nobility. Edward de Vere, the Earl of Oxford, is the logical answer to the question, "Who had the education, skills and life experiences to have written these plays?"
I know English professors who are steadfast in their faith that William Shakespeare, despite his lack of formal education, travel and worldly knowledge, wrote the plays attributed to him, and I respect their position. (I am an English major who took only one Shakespeare course.) I don't think the question is a closed book, but I do lean strongly toward the belief that the Earl of Oxford wrote the plays and an actor named William Shakespeare took credit for them.
We may never solve this 400-year-old mystery, but it is a mystery, one of the greatest in literary history. If the mystery is not why de Vere denied his authorship, then it is how Shakespeare managed to write so magnificently about things he never experienced.
Monday, October 24, 2011
Saddam could have been another Qaddafi
The capture/killing/execution of Muammar Qaddafi by Libyan rebels occurred with no loss of American lives. That makes you wonder how different the world might have been if the NATO strategy in Libya had been applied to Iraq in 1991.
In 1991, the United States, having amassed a huge coalition army, attacked Iraqi-occupied Kuwait and stormed into the Iraqi desert, unimpeded by Saddam Hussein's once-feared military. American air strikes had obliterated the Iraqi air force, and American ground forces with air support had demolished the Iraqi army, sending remnants of it scurrying toward Baghdad in one of the most complete military victories in modern history. After 100 hours of one-sided ground combat, President George H.W. Bush declared the war over. Riding ridiculously high approval ratings, Bush seemed assured of re-election.
Bush and his advisers chose not to follow the panicked Iraqi troops into Baghdad and directly overthrow Saddam Hussein. They were convinced that the Iraqi people would rise up against the now-weakened tyrant and quickly be rid of him. His army was decimated, his air force was crippled, and the allies promised to keep his remaining fighter jets out of the sky. The oppressed Shiite majority, the thinking went, would easily overthrow the ruthless dictator.
That was the theory, but it didn't work out that way. The allies allowed Iraqi helicopters to fly, and those aircraft gave Saddam the advantage he needed to crush the rebellion. For 12 years, Iraq played cat-and-mouse with allied fighter jets patrolling the skies over Iraq while, on the ground, Saddam built an even more oppressive regime. In 2003, the first President Bush's son saw an opportunity to finish the job his father had begun. Using erroneous or fraudulent intelligence, George W. Bush declared that the world had to depose Saddam Hussein before he could use his weapons of mass destruction. It would be another cakewalk, just like 1991, the advisers said. Once again they were wrong, and 4,400 American troops would die in the invasion and the subsequent insurgency.
If the first President Bush had followed NATO's Libyan protocol of 20 years later and provided air support for Iraqi dissidents, Saddam Hussein might have been deposed 12 years earlier. The Iraq war would not have happened, the United States would not have spent trillions of dollars on an ill-conceived invasion and occupation, 4,400 Americans would still be alive, the federal budget would be far healthier, and America would enjoy far more support in Arab countries.
Wednesday, October 19, 2011
Helping a stranger while dodging traffic
I was stopped at a busy intersection of two four-lane roads one recent afternoon, about five vehicles back from the stoplight. More cars were behind me. I noticed a box truck pulled over to the curb of the intersecting street, its cargo bay open. A man was walking away from the truck, watching approaching traffic, as he stooped to collect something from the roadway.
It took a moment to realize that what he was retrieving was several folding chairs that had fallen from the open rear of the cargo truck. I couldn't tell how many chairs were in the roadway, but peering around the cars ahead of me, I could see that it might take several signal cycles for him to clear the hazard. There were enough chairs to stall traffic for a while, and already, drivers were ignoring the traffic signal and looking instead for a path between the chairs lying in the road.
Then something surprising happened. First one driver, then another and another, got out of the cars waiting in front of me and, watching the oncoming traffic carefully, walked into the roadway to help the man whose chairs were in the road. With this additional help, the road was quickly cleared, the chairs were secured in the truck, drivers returned to their vehicles, and the stoplight began to matter again.
I don't know how many places something like this might happen. I don't know how many people would put their lives at risk on a busy street to help a stranger who either failed to secure his cargo or was the victim of an equipment failure. But I saw some brave and considerate souls do the right thing. In the process, they helped everyone waiting at that intersection get where they were going.
It happened in Wilson, N.C.
Tuesday, October 18, 2011
First impression of Cain plays out
Back in June, I wrote that after the first Republican presidential debate, Herman Cain came across as a serious, even viable candidate. It's amusing now to see that Cain, the unlikely politician, has risen to the top of the GOP field, at least in some polls. With another debate scheduled tonight, Cain will have another opportunity to live up to his newfound ranking.
Cain is still an unlikely candidate. With no campaigning or governing experience, he is handicapped against his more experienced opponents. Still, he has struck a chord with some voters. He comes across as folksy, jovial and avuncular. His 9-9-9 plan (a 9 percent sales tax, personal income tax and corporate tax) has taken off, despite the fact that it would shift the tax burden onto the poor and low-income earners and, analysts say, probably would not produce enough revenue to cover the federal government's expenses. Cain continues to pitch the plan with the fervor of a snake-handling evangelist and is not about to back down, even to questions about whether 9-9-9 might one day turn into 12-12-12.
What Cain's rise in the polls indicates is that voters are not particularly interested in a candidate's race or color. The GOP electorate is conservative and predominantly white, but polls say a plurality of GOP voters like Cain and support him. If Cain's race has handicapped him in any way, it's not apparent.
Cain remains highly unlikely to win the nomination, and as his platform comes under greater scrutiny and his foreign policy inexperience becomes more apparent, he is likely to slide in the polls. Still, his accomplishment as a businessman taking on the political establishment is impressive. He has livened up the GOP debate without venturing over the cliff with Ron Paul.
Monday, October 17, 2011
Returning to campus after so many years
My wife had this wild idea: Why not sit in the Carolina Inn bar and watch the football game on the big-screen TV instead of sitting in Kenan Stadium on a too-warm afternoon to watch the game from afar (assuming we could find tickets)? We tried her idea on Saturday, when Carolina was playing Miami and the streets and sidewalks of Chapel Hill were packed with thousands of extra cars and people. On a gloriously beautiful autumn day, we sat in the bar, ordered $11 hamburgers and delicious draft beer, and watched as the Tar Heels looked disappointingly inept on their way to a 17-0 halftime deficit.
By then, having consumed a hamburger and a couple of beers each, we'd seen enough of disappointment and decided to stroll down to Franklin Street, where my wife wanted to cash in a coupon at a store. First, we took a detour through campus, along the brick walks of the oldest portions of the now-sprawling campus — Memorial Hall, South Building, the Old Well, Person Hall, BVP, the Davie Poplar and the rest. Although portions of the campus are barely recognizable because so many new structures have been squeezed into formerly wooded or pastoral slices of ground, this part of campus is little changed from its 1790s origins and is nearly identical to its mid-1960s appearance. The old buildings are freshly washed or painted; the creeping ivy is missing from several buildings that now glow in the sunlight. All four of us, all Carolina alumni, strolled slowly through our memories of this place, picking out favorite spots and keen recollections, opening doors left unlocked and standing back to admire improvements.
You can say it about any university campus, I suppose, that it is a "special place," where teenagers tasted freedom, tested independence and learned maturity. It's a place of ideas and concepts formerly unknown and of interests explored. It is as well a place of romance — each of us had met our spouse at this place — and a foundation for later life.
On football weekends when I was a student, I would see the returning alumni and feel more pity than envy for them. They turned out in university-emblazoned finery and drove big cars at a time when ragged jeans and sweatshirts were the standard student attire. I interpreted their presence as an attempt to relive their youth. My youthful analysis was flawed. Now I realize that returning to this place is not an effort to relive one's youth; it is a way of saying thanks for the glories of youth and, also, for the blessings of maturing years.
Wednesday, October 12, 2011
Why should we 'Occupy Wall Street'?
Although I came of age in the era of protests — the 1960s — and participated in a few myself, I am having trouble figuring out the Occupy Wall Street protests. These protests, which are proudly leaderless and disorganized, take aim at the big banks and other financial institutions on Wall Street and have spread to demonstrations in other cities, including Chapel Hill, N.C. (which has a long history of protests).
But it seems to me that the protesters are blaming the beneficiaries of policies when they should be targeting the creators of policies — Congress and the executive branch of government. The rich have gotten richer, the income gap has widened and economic wealth has become more concentrated not because banks and investors have taken advantage of opportunities but because the government has paved their paths to these new economic realities. Tax cuts for the wealthiest Americans, deregulation of banking and forceful pushing of questionable home loans are all federal policies. Banks and investors have taken advantage of these policies, but who could blame them? The last thing this economy needs is a prohibition on seizing opportunities.
Wall Street — as shorthand for American corporate thinking — can be blamed for some policies that are not good for the nation as a whole and are particularly bad for certain segments of the population. Corporate policy has frequently been criticized as too short-sighted, looking no further than the next quarterly report when wise business practice would look ahead to the next year or the next decade. Satisfying investors and the corporate board is not as important in the long term as satisfying customers as a whole. The short-sighted focus on the next earnings report also ignores the larger economic good, such as environmental benefits and domestic employment.
So, yes, I see the anger at Wall Street and the frustration of young people with large student loans but no jobs, but the first policy change needs to come out of Washington, not out of Wall Street. Or so it seems to me.
Monday, October 10, 2011
Presidential primaries can't get much earlier
SOMEDAY SOON — The Florida Republican Party announced today that it will hold its presidential primary tomorrow. In response, New Hampshire announced it was moving its presidential primary, traditionally the first in the nation, to yesterday.
These announcements follow a string of escalations in the Great Presidential Primary Wars that began when Florida announced it would hold its presidential primary in January. Refusing to be relegated to second place in the presidential sweepstakes, New Hampshire promptly moved its primary to December.
But the Christmas Primary was short-lived: Florida proclaimed its primary day to be the Day After Thanksgiving. Then New Hampshire retaliated by declaring Halloween its primary day, despite a chorus of "how appropriate!" from cynical Democrats. Ultimately, that led to today's twin announcements of the Tomorrow Primary and the Yesterday Primary.
Asked how voters were supposed to arrange to vote tomorrow, much less yesterday, one New Hampshire party spokesman admitted, on condition of anonymity, "It's not the vote we're after. We just want the over-the-top publicity for an event that involves only a minuscule portion of the national electorate."
Thursday, October 6, 2011
Steve Jobs: One shock after another
Steve Jobs' death came as a shock Wednesday. He was a modern-day genius who changed technology and, by doing so, changed popular culture and society. (I wrote about this recently in this blog post.)
As we look back on Jobs' legacy, we get another shock: The iPod is only 10 years old. It's hard to remember a world without it. Not so long ago, my children were listening to music on a Sony Walkman, which seemed so compact and convenient, capable of playing 40 minutes or so of consecutive music on a cassette tape. The CD version of the Walkman seemed like the ultimate in personal listening. Now, I have hours of music on an iPod shuffle that is barely larger than a postage stamp. It was Jobs' visionary creativity that gave us these devices and others. The iPod led to the podcast, which makes all sorts of radio programs and other information or entertainment available at our convenience. The iTunes store revolutionized how we buy music, as well as movies and television shows.
I've been using Apple products for more than 25 years and have always found them preferable to the alternatives because Jobs demanded products that were not only technically competent but also easy to use, intuitive, practical and elegant. In the process, he created one of the most successful companies in American history. The company had its first office and manufacturing facility in the Jobs family garage.
Here's another shock: Jobs' genius was so little appreciated that he was actually forced out of the company he had co-founded. During his exile, which began in 1985, he went on to other imaginative ventures, including NeXT, the computer company Apple later bought. When he returned in 1996, Apple was losing money, and consumers were leery of Apple products, fearful that the company might soon be bankrupt. But Jobs directed the development of the odd-looking iMac and then a whole range of new products, including the hugely successful iPhone. Jobs took big risks — the iMac was ridiculed at first because it had no floppy disk drive. Critics said consumers wouldn't buy a computer without a floppy drive (Apple's original Mac had led the change from 5.25-inch floppies to the less floppy 3.5-inch disks), but the critics were wrong.
It seems doubtful that Apple will be able to maintain its creative boldness without Jobs, but I hope his inventiveness, perfectionism and devotion to user-friendliness have so pervaded Apple that the company will remain an innovation leader.
Monday, October 3, 2011
A chill in the air, and altered sunlight
I feel the chill in the air and welcome it after so many spring and summer afternoons working in the hot sun. Saturday's chill came unexpectedly, simply because I had not looked at the forecast, somehow assuming that summer's warmth would continue through the weekend. I borrowed a jacket from my son for a chilly walk through his Greensboro neighborhood, reveling in the brisk breeze and the changing leaves. We needed only the aroma of a wood fire to think that winter had fully arrived.
The sunlight is different. After last weekend's dreary clouds and misty rain, the sunlight is back but no longer the same. Its oblique angle gives a different feel to the daylight. It is the difference between a spotlight set too low and a broad-spectrum floodlight shining high overhead. Driving west in the late afternoon on Friday, I shifted my gaze and my whole body to escape the glare and to avoid being blinded by sunlight shining level with my face.
Light is quickly receding. Darkness slips in before dinner is done, and the night stretches out to cover more of every 24-hour span. The artificial creation that is Daylight Saving Time will soon be suspended until next year, and the darkness will push out the light before the end of every workday.
Last week, walking out to the driveway to retrieve the morning newspaper, I looked up at the cloudless sky and saw Orion stalking prey across the black sky, the surest sign that winter is on its way. Nature's time cycle continues its revolution, and it is good.