Sunday, May 30, 2010
The greatest loss in being laid off by the newspaper, besides the loss of income, was the lost opportunity to review local theater productions. With a little academic background, a love of theater and, most importantly, a willingness to do it, I became the default reviewer for local productions by The Playhouse and Barton College. Over nearly 30 years, I saw a lot of great performances and only a handful of miscues, and I spent many a late night laboring over a review that would appear in the next day's paper. I can honestly say that only one or two of the dozens of productions I saw were not very good.
I don't know that my reviews did a lot for theater in Wilson, but they did give each production an extra boost of attention from the newspaper and, I think, occasionally at least, might have drawn a few more people to the theater. Paul Crouch, who directed the drama program at Barton, told me after he retired that he had always told his students and cast to pay no attention to the reviews in the newspaper because "None of them know what they're talking about." Fair enough.
Saturday night, I went to see "The Member of the Wedding," one of two plays on this year's Theater of the American South playbill. I went without a notepad and pen and without a complimentary ticket. If I were reviewing this production, I'd say it was an outstanding staging of Carson McCullers' powerful story, with an honest depiction of interracial relationships in the Jim Crow South and of the deep anxieties of a 12-year-old girl.
Most impressive were the performances of the two leads: 13-year-old Maddie Taylor of Chapel Hill as Frankie Addams and Yolanda Rabun as Berenice, the family cook, who is also a surrogate mother for Frankie. Taylor, who has numerous film and television credits despite her young age, lived up to her acclaim. Her stage presence was gripping and endearing. Rabun captured her role beautifully, with all the nuance of a life that is central to white society but outside of it. Her combination of loving care and demanding discipline was perfectly on target, and her singing voice in three brief musical interludes was spine-tingling. Adam Twiss of Barton College directs a moving production with a more than adequate supporting cast. But it is Taylor and Rabun, who are onstage nearly the entire play, who carry the story and make the drama real.
How far has Theater of the American South, Gary Cole's bold planting of a major theater festival in sleepy Wilson, come? "The Member of the Wedding" competes well with "Cat on a Hot Tin Roof" from the festival's first year. It was good to see some local performers in the cast, albeit in minor roles, in a festival that has relied primarily on professional actors. The Wilson community remains enthusiastic about the festival, which also attracts patrons from many miles away for its plays, cooking demonstrations, lectures and other events.
With "Member of the Wedding," Theater of the American South is alive and well once again this year as it wraps up its 2010 season tonight.
Thursday, May 27, 2010
Unions ally with public employees
Labor unions have seen their future, and it is in the public sector. While union membership has plummeted, along with manufacturing jobs, union visionaries have realized the opportunity that lies before them: organizing government workers. Public sector unionizing is the only growth industry organized labor has seen in the past quarter-century.
Now, the Service Employees International Union, a political powerhouse, wants Congress to require state and local governments to negotiate with unions representing government workers. Such a law would overturn statutes in several states, including North Carolina's, which specifically prohibits bargaining with, or strikes by, public sector employees. If passed, the federal legislation would be a payoff for the SEIU's support of a number of powerful Democrats, including Senate Majority Leader Harry Reid.
The problem with collective bargaining with public sector employees is the imbalance of power at the negotiating table. The SEIU and its North Carolina affiliate, the State Employees Association of North Carolina, have shown their willingness to punish any legislator who opposes their agenda and to threaten all others. "Management," in this case, is at a disadvantage because members of the board of directors (i.e., legislators and other elected officials) are in the pocket of the union that helped get them elected. There can be no balanced negotiation when labor can influence management behind the scenes through campaign contributions and vote delivery.
Unlike private-sector negotiators, government officials have little incentive to oppose union demands. A corporation has to watch its bottom line and faces market forces that prevent it from raising prices to cover too-generous labor costs. A city manager, however, has fewer limits in a labor negotiation. City services are almost always monopolies, so raising prices is no problem. And if more money is needed for higher salaries and better benefits, the city can always raise taxes. It is down this road that states such as California and countries such as Greece discovered the limits of their generosity. If North Carolina, which has prohibited bargaining with public employees for half a century, is forced by federal law to recognize and negotiate with labor unions, taxes will inevitably rise.
There should be no right to strike against the public interest. Imagine the chaos — and the anger — if firefighters, police, emergency medical personnel and others went on strike. Even if strikes are prohibited (at first), union members could make their point by, for example, having Highway Patrol troopers ticket every driver doing 56 in a 55 mph zone. Or suppose the entire fire department got sick on the same day. Even with some moderate restrictions, collective bargaining with public employees could lead to chaotic consequences.
Just look at what is happening now, when public sector employees are prohibited from collective bargaining. Public employee "associations" (they're not officially unions in this state) directly lobby legislators for raises and better benefits. They protest loudly against any reductions in force or furloughs. In the past two years, as tens of thousands of private-sector employees have been laid off, the loudest outcry you heard was from state employees over wage freezes, furloughs and very limited layoffs. Public employees in this state enjoy extraordinary benefits, including a defined-benefit pension, which is extremely rare in the private sector now. Public sector pay, which has been safeguarded while recessions reduced private-sector compensation, is at least as good as in the private sector.
Giving public employees the heavy artillery of collective bargaining to go with their small arms of political pressure would put government firmly in the hands of public employees, whose primary goal — as has been shown many times — is to look out for themselves. Instead of good government, we'd have good-for-me government, and the "me" would be union members, not taxpayers.
Monday, May 24, 2010
When an 'act of war' doesn't end in war
A hostile foreign power sinks a warship, killing scores of crew members. What will be the result?
This is not a backgrounder on the USS Maine, which sank in Havana Harbor in 1898. In that case, the result, despite a lack of evidence that Spain had anything to do with the U.S. battleship's sinking, was "a splendid little war," which the United States won handily. No, this case involves two nations with the same last name — Korea — and two very different economic and political systems. On March 26, the South Korean Navy ship Cheonan was struck by an underwater explosion that killed 46 sailors and sent the ship to the bottom. Two months later, investigators announced that the ship was sunk by a torpedo that originated in North Korea. Despite the carefully reconstructed evidence, North Korea denies any responsibility for the sinking, which occurred in waters near the two countries' maritime boundary.
South Korean President Lee Myung-bak has officially blamed North Korea and has announced a series of economic measures and restrictions that fall far short of a declaration of war, which would seem to be warranted after this classic "act of war." For its part, North Korea warns that any sanctions against the secretive, closed Stalinist regime would mean "all-out war."
The Korean peninsula, divided since the end of World War II into a repressive, communist North and a Western-leaning, economically vibrant South, embodies the Cold War maxim of "mutually assured destruction." That doctrine held that neither the United States nor the Soviet Union could risk war because each held sufficient numbers of nuclear weapons to obliterate the other, even in a retaliatory strike after an initial attack. In Korea, nuclear weapons are not the key to the MADness, although North Korea possesses a few nuclear weapons, and U.S. forces stationed in and near South Korea have nuclear capabilities. Even a conventional war between the Koreas would be madness, as the world learned in 1950-53. All-out war would devastate both countries, and although South Korea might ultimately prevail, its economic advances of the past half-century would be wiped away. U.S. war game exercises conclude that North Korea's defeat might unleash chaos that would envelop the whole region and be felt around the globe.
Maybe that's why South Korea, recognizing that it has been attacked by a foreign power whose sinking of a sovereign warship was, indeed, an "act of war," is willing to settle for something less than a retaliatory military strike. President Lee wants an apology and punishment for those responsible. What remains mysterious is why Kim Jong Il's North Korean regime would so foolhardily commit an act of war and thereby risk the obliteration of two nations.
Downtown whirligigs will identify Wilson
Last week's announcement that 32 of Vollis Simpson's giant whirligigs will be moved off his farm near Lucama to downtown Wilson is terrific news for Wilson. For decades, Simpson's iconic whirligigs have attracted visitors who marvel at the giant wind sculptures and, unfortunately, make up urban myths about their origins. Simpson has been recognized around the world for his whirligigs, which are considered a high form of folk art.
Some details remain to be worked out about moving the whirligigs. I understand the city does not actually own the two-acre site selected for the whirligigs, for example, and funding has not been announced. But Simpson's whirligigs are so well known and admired (examples stand at the N.C. Museum of Art, the Abby Aldrich Rockefeller Folk Art Collection in Williamsburg, Va., Olympic Park in Atlanta, the Inner Harbor in Baltimore and numerous other locations) that raising money for this venture shouldn't be too hard. With Simpson in his 90s, it's important to nail down this deal now and plan for the repair, refurbishing and maintenance of these whirligigs.
Last week's announcement points to an important fact: Simpson's whirligigs give Wilson a unique identity. Forget about tobacco festivals, barbecue festivals, "bloomin'" festivals, fall festivals, coastal plain festivals, human domino falls, car shows, rubber duck races and all the rest. Scores of cities and towns can replicate those themes, but few, if any, can claim a whirligig as a symbol of local culture. Wilson's annual Whirligig Festival is growing in popularity and guarantees a unique identity for this town. This is the card Wilson has been dealt; it should play it.
Friday, May 21, 2010
This debate is four decades old
The Tea Party folks had only begun celebrating the lopsided triumph of Rand Paul in Kentucky's Republican Senate primary when the victorious candidate stuck both feet in his mouth. Who would have thought that a 46-year-old congressional debate and 46 years of settled law would become a topic in a 2010 Senate campaign? But that's where Paul, the son of 2008 presidential candidate Ron Paul, has taken us.
Paul follows his father's libertarian philosophy, and you have to respect the elder Paul's principles. As a member of Congress, he declined overseas trips and largely paid his own way on almost everything. He believes in limited government, with a big emphasis on limits. But as his son's conundrum shows, that path leads into some mushy swamps. Soon after his electoral victory, Rand Paul began trying to explain his belief that every business has the right to decide whom it will serve: in his view, it's unfair for the federal government to force a business to adopt practices it opposes.
If that argument sounds familiar, you're old enough to remember the debate over the 1964 Civil Rights Act. That law banned discrimination based on race, creed, religion, color, national origin or sex, and opponents of the law made the same argument Paul is making now: Businesses should be allowed to decide with whom they will do business. The argument has a certain logic; Barry Goldwater made it in voting against the Civil Rights Act. But before you sign onto the notion that businesses should be free to set their own trade policies, recall what society was like in 1964 and what the Civil Rights Act sought to correct. Statutory racism allowed business owners to refuse to serve African-Americans. It barred African-Americans (and other racial, ethnic or religious groups) from many jobs. It prohibited minorities from using public facilities such as bathrooms, water fountains and parks. It prevented African-Americans from eating out in most restaurants. It maintained a morally and economically indefensible dual society in which minorities were barred from contributing fully to the dominant social and economic structure.
Perhaps Paul can make an argument in favor of shopkeepers' freedom to choose their own customers, but he cannot claim that the old society, which he is too young to remember (he is just one year older than the Civil Rights Act), was a better, fairer or more economically efficient place. Sometimes, foggy principles have to step aside for the moral — and economic — good of society.
Wednesday, May 19, 2010
I'll never sit here again
After nearly a full month of online advertising plus a week of newspaper classified advertising (all for one price), my 16-year-old Honda del Sol was sold today. Considering I received 95 percent of the somewhat aggressive asking price, I should be overjoyed. But as the skies cleared this afternoon after two days of steady rain, I felt myself missing that little car that had given me the most fun I ever had on four wheels. I was ready to take the top off and let the remnants of my hair whip in the breeze. I was ready to stomp the accelerator as I shifted to second and feel the four-cylinder engine oomph the lightweight two-seater up to speed. I was ready to take a corner using 100 degrees of arc instead of 90 and let the rear end glide into place. I wanted to feel all those things, and more, again, but I wouldn't and couldn't.
I sold the car to a young man who was probably in preschool when the car was built. He was young enough to enjoy it and to put up with its shortcomings — a chronically leaky trunk and an air conditioner that labored to keep up with the heat generated by the black sheet metal and black interior. And he won't mind that newer cars have nicer appointments, such as keyless entry, cup holders and ambient lighting. If he takes care of the car, it should give him several more years of driving fun.
Each day I drive my "less old" car (it's not "new") I bought six weeks ago, I like it better. It doesn't do the same things the del Sol did, but it does other things the del Sol never would. After shopping for the car of my dreams, one that would combine the del Sol's fun with a larger vehicle's security and more luxurious appointments, I concluded that such a car didn't exist, at least not in my price range. So I settled on a 2003 Honda Accord coupe with a six-speed transmission and 120,000 miles on the odometer. The six-speed lacks some of the "fun" element of the older car's five-speed, but it is fun to shift and has amazingly economical higher gears. A nice stereo, keyless entry, leather seating, a handsome body shape, more than 100 horsepower beyond what I was accustomed to — those are all things I could get used to. And while no car feels very safe to me in interstate highway traffic, the Accord is far heavier and less likely to be whipped about by passing trucks. The wind and road noise I tolerated in the small car are almost entirely absent in the newer one.
Before I sold the del Sol, I received inquiries from as far away as Georgia and New York. People who are familiar with these cars love them, despite all their shortcomings. Fourteen years ago, when I debated and worried over whether I should plunk down more than I had planned to spend on a two-year-old pop-top two-seater, my wife persuaded me by saying, "If you don't buy it now, next time [you trade cars], you'll be too old." As usual, she was right. I got 14 years of fun (and a few headaches) while I was still young enough to enjoy it.
Tuesday, May 18, 2010
Palin makes it up as she goes along
Now that she's no longer an elected official but just another rich celebrity, Sarah Palin must think she can say anything and get away with it. The former Alaska governor and Republican vice presidential nominee doesn't mind just making things up, apparently.
In a speech in Charlotte before the National Rifle Association, Palin accused President Obama of planning to eliminate private ownership of guns and ammunition. Huh? "If they [Obama and his minions] thought they could get away with it," Palin told the crowd, "they would ban guns and ban ammunition and gut the Second Amendment."
Where does she get this drivel? To my knowledge, neither Obama nor any other politician on the national scene has proposed an outright ban on guns or ammunition in recent years. Gun politics has taken a turn in the past few years toward looser laws and a broader interpretation of the Second Amendment. Even left-leaning politicians now acknowledge the right of law-abiding Americans to own guns. Whatever battle there is over the Second Amendment is over the nuances of the law — just what kinds of "arms" are protected, for example, and what reasonable restrictions might government place on this right?
But Palin was never one to deal in nuances. She might leaven her audacious claims and invented scenarios with folksy charm, but she much prefers the outrageous lie to the complicated fact. It sells, and she's going to keep dishing the red meat to her baying hounds.
Friday, May 14, 2010
We all feel the death of a child
We were among scores, maybe hundreds, even thousands, who shed tears for a child we had never met, had never even heard of until just days before. Such is the impact of the loss of a child.
We learned from our son of the illness of a classmate's young daughter. She was hospitalized with bacterial meningitis, and the prognosis was dire. By the time we heard of it, she was already in what he called "miracle territory." That's what doctors said it would take for her to survive. Despite fervent prayers from friends, relatives and strangers, the little patient did not survive. We received the heart-wrenching news this morning.
Before that curtain fell, we had read a forwarded e-mail from the mother, filled with gratitude for all the prayers, concerns and best wishes for "our precious girl." I read it, utterly astounded that this young mother could put together those words so eloquently, despite the burden on her heart.
Every parent's greatest fear is the death of a child. Fortunately, few parents have to endure such a tragedy. When it happens to another family, all parents feel the horror, and the gratitude of a dagger to the heart that barely misses. There but for the grace of God ...
The "sweet girl" whose parents now face a "world forever changed" was just weeks older than our son's younger child, who had himself faced a frightening bout with pneumonia when only a month old that left him hospitalized for a week. Modern medicine allowed him to come home and grow into a healthy toddler. This new tragedy slaps us with the reality of how close we had come to a similar fate. Nearly 30 years ago, a colleague's firstborn son was hospitalized with meningitis, and the young parents stared blankly at the shattering of their hopes and dreams. That baby survived and has grown to adulthood, but fate could have twisted another way.
Years ago, my brother and I sought out and found an old family cemetery, which had been untended for decades and was overgrown with trees and weeds. Walking among the stones, looking for names we knew from family histories, I happened upon a small grave marked by a stone larger than the grave space. On the top of the moss-laden stone were carved two words: "OUR BABY." At a time, in the early 1900s, when couples produced broods of eight, 10, 12 children, this ancestor of mine grieved for the one who was lost. Having a half-dozen other children to love could not fill the vacuum left by the one who died far too early. Even in a house filled with children, the loss of one imposes a world "forever changed."
The death of a child is so powerful, you don't even have to know the child or the parents to be knocked askew.
Wednesday, May 12, 2010
Judicial experience is no criterion
The inevitable battle over Elena Kagan's nomination to the Supreme Court is just beginning, but one aspect of the criticism of her nomination is clearly irrelevant. Some critics have complained that she has no judicial experience, which is true. She has spent her entire professional career in academia and government service. But to claim that never having worn a judge's robe disqualifies her for the bench ignores history. Some of the court's most honored and respected justices had no judicial experience before being nominated to the nation's highest court. Among them: Chief Justice William Rehnquist, Justice Lewis Powell, Justice William O. Douglas, Chief Justice Earl Warren, Justice Charles Evans Hughes, Justice Louis Brandeis, Justice Byron White, Justice Felix Frankfurter, and Chief Justice John Marshall, considered by many historians to be the most important member of the court ever. So lack of judicial experience should not be any impediment to nomination to the Supreme Court.
Kagan's nomination will likely turn on her work at Harvard Law School, where her record gives ammunition to her critics from the left and the right. At Harvard, she denied military recruiters access to the campus because of the armed forces' prohibition against homosexual conduct. Harvard lost a court battle over its policy. Conservatives are still unhappy with her defense of the Harvard policy against military recruitment. But as Harvard Law's dean, she also revitalized the school by bringing in new, conservative faculty so that the school boasted of the highest academic credentials, regardless of political leanings, and liberals are still unhappy with her cozying up to conservatives. Because Kagan's Harvard tenure gives both sides something to complain about, those criticisms will likely even out, paving the way to her confirmation.
But if critics harp on her lack of judicial experience, they should be laughed out of the Senate hearing room.
Monday, May 10, 2010
Cotton mill villages, whence we come
For the past few weeks, I've been reading a book I had given my father many years ago. "Like A Family" from UNC Press is a study of cotton mill villages in the South, particularly North Carolina. I had given the book to Daddy because it sounded like a nostalgic piece about the good ol' days of growing up on a mill village, which my father (and my mother) did. The book is light on nostalgia and a bit heavy on socio-political slant. The authors (six of them) view much of village life in terms of labor relations and economic exploitation of poor workers. Needless to say, my father, his brothers and sisters, his father, father-in-law and brothers- and sisters-in-law (all of my relatives worked in cotton mills) didn't see their lives that way. In their conversations at family gatherings, the talk was nostalgic, sharing stories of tending gardens, playing baseball with a taped-together ball, and courtin' in the parlor of mill houses. Though they never discounted the harshness of the work or the paucity of the wages, they fondly recalled the camaraderie and the community spirit of a place where everyone knew everyone else because all the families worked in the mill, attended the mill school, joined mill churches and lived in close quarters on narrow mill lots.
The story of cotton mills in the South is both a social and an economic history. Although the authors of "Like A Family" are critical of the post-Civil War investors who brought cotton manufacturing to the South, cotton mills were in some sense the salvation of the destitute former Confederacy. This defeated land, its currency worthless, had relied on agriculture operated by a landed aristocracy dependent upon slave labor. Without slavery and without the capital necessary to buy seed and fertilizer and hire labor, southern landowners developed the sharecrop system, which has been compared to slavery and which entrapped both former slaves and poor whites, including some of my ancestors. Cotton manufacturing took advantage of the nearby raw product produced throughout the South and of the excess of farm labor. Both of my grandfathers were among those who made the transition from farming to "public work" (as their generation called it) in cotton mills. The work was hard, hot and dirty, and it failed to provide the personal satisfaction and sense of accomplishment of farming, but it paid a steady wage, regardless of weather or insect infestation. Few of those who tried public work ever went back to farming.
Some elements of the cotton mill economy make us cringe today — child labor, whites-only hiring policies, and blatantly anti-union laws and policies. But there was also an egalitarian element to mill villages. Everyone, at least in the early years, lived in basic, nearly identical houses built by the mill. Workers paid rent or received free rent in lieu of wages. In the mill village where my parents grew up, houses were rented by the room. If a couple needed only one bedroom instead of the three available, the extra rooms would be locked, and they'd pay rent for only the rooms they used. The superintendent's house would likely be larger than the workers' houses, but it was not ostentatious in any way, just another wood frame structure with few, if any, embellishments. Although there was some wage differentiation, most workers earned close to the same wages, and most of the work was equally hard.
It has been reported that W.J. Cash, author of the seminal and still readable "Mind of the South," had intended to write "the great American novel" set in a mill village. Instead, he killed himself. Few novels I can think of are set in textile mills or mill villages. Novelists, it seems to me, have ignored one of the great stages of American life — the southern cotton mill village. Doris Betts' "The Scarlet Thread" is one of the few novels I'm aware of that is set in a mill village, and it is out of print.
In literature and in history, cotton mill villages deserve more recognition. For generations of Americans in the South, mill villages constituted the shared experience of life and work.
Friday, May 7, 2010
A national 'free exercise' of religion
It's a strange thing we've come to: people who don't pray holding protests and filing lawsuits against those who pray publicly. A federal judge has ruled that the National Day of Prayer, held Thursday, is unconstitutional. People who don't want to attend National Day of Prayer events, which are held in towns and cities across the country, are not required to. They can ignore the Prayer Day participants and go on about their business, just as people who don't like "Dancing With The Stars" can ignore it. It's hard to imagine how those who don't want to take part in the National Day of Prayer are harmed by the actions of those who do take part.
Imagine if people who oppose college athletics because these sporting events detract from the academic mission of colleges and universities — an entirely rational assertion — were to hold protests and try to prevent college sports fans from attending games or watching them on television.
I'll admit that I have only attended a handful of these National Days of Prayer over the past 30 years. These public displays of religious zeal often seemed shallow and insincere to me, but other people enjoyed and took inspiration from these events, so God bless 'em.
The plaintiff in the federal lawsuit that now threatens a national prayer observance that has been around since the Truman administration is the Freedom From Religion Foundation. The gist of its argument, as I understand it, is that a National Day of Prayer endorsed by Congress and the president violates the First Amendment. That amendment reads: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof ... ." (emphasis added) To claim that a non-sectarian, generic National Day of Prayer constitutes "an establishment of religion" is a stretch. It seems to me that a stronger argument can be made that prohibiting a National Day of Prayer would violate the "free exercise" clause of the First Amendment. If the National Day of Prayer is struck down, those who wish to take part in a nationwide observance of prayer would be denied that right.
Considering that the First Congress, which proposed the First Amendment in 1789, opened its sessions with prayer, it would seem obvious that the Founding Fathers never intended to prohibit public prayer or a national prayer observance.
Wednesday, May 5, 2010
Bring on the barbarians
"I decided years ago, I really don't care what some folks think of me. They are either gonna like me or not and there is little that can change that. I know one thing that won't change, I refuse to be what someone else thinks I should be, I am me, take or leave it. We came in this life without most of them and we can leave without most of them."
How many centuries did it take for civilization to develop? How long was it between the time Neanderthals grabbed at fresh meat with their dirty fingernails and stuffed all they could into their mouths, battling all competitors, and the development of etiquette? What did it take — religion, deference or wisdom — to shift human personalities from caring only for one's self to a character of sympathy and empathy? And have we now reverted, through the centuries and millennia, to the levels of compassion displayed by our prehistoric forebears?
The manifesto above is taken from a social media post. It's the words of one person, but it's not an anomalous quote. That attitude seems to be gaining popularity. Whether it comes from public schools wringing hands over self-esteem or from young parents' worshipful attitude toward their offspring, I don't know. But the philosophy, if it can be called that, is frightening. Taken to its extreme, it says, "I'm self-centered, disrespectful, unconcerned about others, arrogant and violent when I want to be, but that's just the way I am. That's me. I'm not changing. You don't like it? That's your problem. I don't have any problems. Now get out of my way!"
All the world's great religions teach us to defer to others, to care for others, to respect others, to "love one another." "Do unto others as you would have them do unto you." These principles underlie modern etiquette, diplomacy and social interaction. Help others. Find common ground. Express concern. If these principles are being displaced by a new generation of people who care only about themselves and will reject any criticism or correction, we may be headed back through human history to a barbaric time.
Sheriff's primary results are surprising
The three automated telephone calls I received on my home phone in the final 24 hours of primary voting from Wilson County Sheriff Wayne Gay should have given me a clue, but Gay had always been a meticulous campaigner, so I didn't think much of it. Over his 28 years in office, Gay had always "run scared" at election time, behaving as if the weakest candidate were a serious threat. He had also been a consummate fundraiser, scaring off many potential challengers who could not match his campaign treasury.
But when the results were in from the Democratic primary last night, Gay had tasted defeat for the first time, losing to challenger Calvin Woodard by 2,500 votes. I'm sure I'm not the only person who was shocked. Gay had run his usual high-profile campaign, in fact, doing more than in the past to keep his name before voters. Round re-election signs were planted in yards and at intersections around town. A campaign headquarters was rented. Mailings went out. And then there were all the automated calls pleading for votes.
Woodard has to be commended for his victory. He had begun campaigning early, appearing, for example, in the 2009 Wilson Christmas Parade as a candidate for sheriff. Although he must have spent far less than Gay (I haven't seen campaign finance figures), he got more votes for his buck than Gay did. It's hard to unseat a well-established sheriff, who, as legendary Richmond County Sheriff Raymond Goodman (now deceased) liked to point out, is a "constitutional officer" who holds more power than any other single local official. Many N.C. sheriffs have used that power to build strong political organizations. A few have used that power for their own benefit. Gay, as his campaign literature pointed out, presided over an expansion of the Sheriff's Office and an overall reduction in county crime. He also turned the hot potato of animal enforcement into an asset, satisfying the animal lovers who had complained for years about poor enforcement of the animal ordinance and poor treatment of animals. After some clashes in the past, he had built a strong relationship with the Wilson Police Department.
Gay's demise might be, in part, a reflection of the public's impatience with incumbents in general. Gay has spent nearly three decades running a highly visible law enforcement operation. During that time, he has, no doubt, made some enemies, but he has avoided any major scandals and has demanded that his employees avoid even the appearance of misconduct. Putting the office in a negative light was a sure way to lose a job at the Sheriff's Office. Along the way, he has expanded the office with motorcycle patrols, traffic enforcement, a horse patrol and even an airplane.
Woodard will take over at the end of this year, barring any unforeseen reversals, and will, no doubt, reorganize the department to suit his philosophy and style. Some familiar faces in the department will lose their jobs because of their loyalty to Gay. That's the way the system works. The real test will come later as he and his new hires enforce laws, investigate crimes and attempt to build confidence among the public.
But when the results were in from the Democratic primary last night, Gay had tasted defeat for the first time, losing to challenger Calvin Woodard by 2,500 votes. I'm sure I'm not the only person who was shocked. Gay had run his usual high-profile campaign, in fact, doing more than in the past to keep his name before voters. Round re-election signs were planted in yards and at intersections around town. A campaign headquarters was rented. Mailings went out. And then there were all the automated calls pleading for votes.
Woodard has to be commended for his victory. He had begun campaigning early, appearing, for example in the 2009 Wilson Christmas Parade as a candidate for sheriff. Although he must have spent far less than Gay (I haven't seen campaign finance figures), he got more votes for his buck than Gay did. It's hard to unseat a well-established sheriff, who as legendary Richmond County Sheriff Raymond Goodman (now deceased) liked to point out, is a "constitutional officer" who holds more power than any other single local official. Many N.C. sheriffs have used that power to build strong political organizations. A few have used that power for their own benefit. Gay, as his campaign literature pointed out, presided over an expansion of the Sheriff's Office and an overall reduction in county crime. He also turned the hot potato of animal enforcement into an asset and satisfied the animal lovers who had been complaining for years of poor enforcement of the animal ordinance and poor treatment of animals. After some clashes in the past, he had built a strong relationship with the Wilson Police Department.
Gay's demise might be, in part, a reflection of the public's impatience with incumbents in general. Gay has spent nearly three decades running a highly visible law enforcement operation. During that time he has, no doubt, made some enemies, but he has avoided any major scandals and has insisted that his employees avoid even the appearance of misconduct; putting the office in a negative light was a sure way to lose a job at the Sheriff's Office. Along the way, he has expanded the office with motorcycle patrols, traffic enforcement, a horse patrol and even an airplane.
Woodard will take over at the end of this year, barring any unforeseen reversals, and will, no doubt, reorganize the department to suit his philosophy and style. Some familiar faces in the department will lose their jobs because of their loyalty to Gay. That's the way the system works. The real test will come later as he and his new hires enforce laws, investigate crimes and attempt to build confidence among the public.
Tuesday, May 4, 2010
An old taxation idea, modified
When I was younger, in college, and more confident in the ability of government to do good, I suggested that a simple way to level earnings and provide government more revenue would be to impose a confiscatory tax on earnings higher than the salary paid the president of the United States, which was then, I believe, $200,000. Now Sen. Jim Webb of Virginia has proposed a slightly wimpier version of what I suggested 40 years ago. Webb's bill would impose a 50 percent tax on bank bonuses that exceed the president's salary (now $400,000).
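How would Webb's surtax work in practice? Here is a minimal sketch in Python, assuming, as one plausible reading of the proposal, that the 50 percent rate applies only to the portion of a bonus above the $400,000 threshold rather than to the whole bonus (the function name and that detail are my assumptions, not language from the bill):

    # Hypothetical calculation of the proposed bonus surtax.
    # Assumption: the 50% rate hits only the excess above the
    # president's $400,000 salary, not the entire bonus.
    def bonus_surtax(bonus, threshold=400_000, rate=0.50):
        return rate * max(0, bonus - threshold)

    # A $1 million bonus: 0.50 * (1,000,000 - 400,000) = $300,000
    print(bonus_surtax(1_000_000))  # 300000.0

Under that reading, the first $400,000 of a banker's bonus is untouched, and every dollar beyond it is split evenly with the Treasury.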
I never heard a cogent argument against my long-ago proposal. Sure, it would be darn near impossible to achieve politically, and it might have some impact on wealthier people's willingness to work harder and invest more. But, overall, it would affect a bare minimum of people, and the macroeconomic impact would be negligible. People can live a pretty luxurious life on the salary of the president of the United States, and it's hard for anyone to claim that their job is more difficult, more stressful or more important than the president's. So the downside is that sales of multi-million-dollar homes, private jets and big yachts would suffer. The economy would survive.
Webb's ploy would affect only bankers and would apply only to bonuses, not to salaries. Voters are angry about the outlandish bonuses being paid at some big banks. Some of those bonuses make the president's salary look paltry. The fact that Webb ignores the outlandish pay of other corporate titans shows that his bill is aimed more at disaffected voters than at bankers or at correcting any economic faults. I doubt that the bill will get any further than my wild idea, circa 1969 or so.
Monday, May 3, 2010
Casino would desecrate sacred ground
It should be self-evident, as Thomas Jefferson might have said, that a casino should not be built next to the most hallowed battleground on the American continent. But unless concerned preservationists and Civil War buffs can turn back the power of the almighty dollar, a casino might become Gettysburg's newest neighbor. The Civil War Preservation Trust (to which I have given small donations for several years) and other groups thought they had defeated the casino forces when a proposed casino at another site near the Gettysburg battlefield failed to obtain a Pennsylvania license. But the profiteers are still eager to tie their gaming tables and slot machines to a Gettysburg address, and another proposal is before the people of Pennsylvania.
A gambling casino would desecrate the site of the largest and bloodiest battle of the Civil War, a key turning point in the conflict. On July 1-3, 1863, Confederate forces under Robert E. Lee tried mightily but failed to dislodge Union forces under George Meade, who had captured the high ground and never relinquished it, despite concerted attacks that culminated in the legendary "Pickett's Charge," a suicidal advance across nearly a mile of open ground. Those three days cost the two armies roughly 50,000 casualties, with thousands of Americans killed and many thousands more wounded. The outpouring of grief that followed resulted in scores of monuments to the regiments and companies that fought bravely there on both sides. The rolling Pennsylvania landscape, a comfortable drive from Washington, D.C., is embellished with those marble, bronze and granite markers, hailing heroes nearly 150 years gone. Hundreds of thousands of tourists visit the battlefield, which is maintained by the National Park Service, every year to soak up some of the solemn history of this hallowed ground. They stand at the step-off point for Pickett's Charge, stare at the distant line of cannon and the copse of trees Lee set as his men's goal, and wonder what could have inspired such suicidal courage. They stand at the spot where the 20th Maine, out of ammunition after repeated Confederate attacks, held Little Round Top in hand-to-hand combat. They look at the spot where Abraham Lincoln delivered the greatest two-minute speech in American history. No one can go to Gettysburg without being moved.
Anyone who would place a casino on the main road into the heart of the Gettysburg battlefield has no sense of history or of honor. As Lincoln said, "It is for us the living ... to be here dedicated to the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion ... ." The dead of Gettysburg did not give their lives for gambling, and no casino should taint their sacrifice.
Saturday, May 1, 2010
May has arrived; go barefoot
May Day. In many socialist countries and former communist regimes, it's a day for labor parades and singing "The Internationale," but for me, it's the day to go barefoot outside. It was my mother's rule: no going barefoot until May 1. It didn't matter how hot it got in April; May 1 was the day to go barefoot. I can still remember being caught at age 3 or 4 breaking the rule and being sharply admonished and brought back inside.
Today, I wore sandals as I worked in the yard, but the sandals came off at the end of the day, and I was outside, barefoot, just like the old days. I'm sure Mother would approve.