Thursday, December 29, 2011

The Best Songs from 2011

by Conroy

With only days left in 2011, it's time for my annual Top 10 Songs list. As with the previous versions (see my 2010 list here), this list will count down what I consider to be the ten best songs from this year. As a reminder, there are a couple of ground rules for songs to be eligible for the list:
  • If at all possible, I try to include only one song per artist. For instance, I could have included multiple songs from the excellent album The King Is Dead by The Decemberists, but adhering to my rule, and because I wasn't completely blown away by numerous album tracks, I've included just one song from the album. However, if we turned back time and I was writing about my Top 10 songs from say 1997, I would have included multiple songs from Radiohead's spectacular album OK Computer (after all, four songs from that album are included in my Top 250 Songs).
  • All songs must be released in this calendar year (i.e. 2011). For instance, Adele released her massive hit album, 21, this year. The lead single "Rolling in the Deep" was a global hit, and it has made (often topped) critics’ Top 10 lists for 2011, but it was actually released and gained immediate airplay in 2010. Since it was released separately from the rest of the album and gained widespread attention, I must disqualify it from consideration for songs from 2011. Is this unnecessarily hair-splitting? Maybe, but I dislike recognizing songs more than a year after they become hits (or in the case of many great songs that don’t become hits, after their initial release).
Also, as you read through this list I offer my familiar caution. As with all annual lists, my Top 10 Songs is based on impressions from this year with only a limited amount of time to have heard and internalized each of these songs. Often it takes years for the significance of a song, album, or any other artistic creation to become clear. I publish these lists because I think it's fun, generates discussion, and identifies some of the outstanding songs from the past year. However, it could be that I look back years from now with different opinions of what really was best from 2011 (consider my 2010 list to judge my judgment).

Okay, that's enough background. The list:

Conroy's Top 10 Songs of 2011

10. “Change the Sheets” by Kathleen Edwards. It’s been a few years since Kathleen Edwards’ previous release, the outstanding Asking for Flowers, but her follow-up Voyageur will be released early next year. If the whole album matches the quality of the lead single “Change the Sheets,” her fans will be well pleased. I especially like the multi-layered backing vocals that add a depth I haven’t heard from her before. The overall atmospherics of the track mix well with the personality that makes Kathleen Edwards’ music so appealing.

9. “Pill” by Edie Brickell. Edie Brickell has been around for a long time. She first hit it big with “What I Am" way back in 1988. So it was both surprising and exciting to hear her back this year with a self-titled album. “Pill” is a great example of “happy” music – light, peppy, propulsive – that is about a dark subject, here depression. Some lyrics: “You can’t pay attention / It’s getting pretty rough / You feel a little down now / And you can’t get it up / They got a pill for that…” I like this type of juxtaposition of music and theme, but most of all, “Pill” is the type of song that can be listened to on a loop. I hope we don’t have to wait another eight years for the next Edie Brickell release.

8. “My Body” by Young the Giant. My sister turned me onto Young the Giant, and I’ve heard their debut album described as a mix of Fleet Foxes and Kings of Leon. Maybe, but I always find those types of descriptions unhelpful. What “My Body” features is a fantastic guitar-powered chorus, as good as anything released this year. Rock critics (and to a lesser extent fans) seem to have a longstanding angst that the genre is one day going to run out of steam, that a point will be reached when nothing new or interesting will come along. Given that rock and roll has been popular and artistically inventive for over half a century, I find this point of view silly; no worry needs to be given to the fate of rock. Young the Giant is evidence that rock continues to be home to new, vibrant, and interesting music.

Monday, December 26, 2011

The Villages: A New Way to Retire

by Conroy

I looked out through the large bus windows at the rolling central Florida hills. The late morning sunlight was muted by the tinted glass, and the countryside appeared as a contrast of straw-colored grass and subtropical evergreen. It had been a drought-stricken winter, the end of the dry season, and even the dark murky lakes and ponds that dotted the landscape looked drained and shrunken. The bus was heading north from Orlando on the Florida Turnpike. In years past this ground was cultivated with orange groves, but too frequent winter frosts and freezes drove the orange growers south. Now ranching dominates and open, empty fields carry on along the rolling ground as far as the eye can follow. This part of the Sunshine State seems far removed from the palm-laden glitz and built-up densities of Miami and “South Florida,” or certainly the sprawling fantasy production of Disney World and the rest of Orlando’s tourist-entertainment parks. But it’s in this quiet corner of Florida where you will find one of the fastest growing communities in the United States, The Villages. That’s where my charter bus, one of many on the daily round-trip shuttle service to and from Orlando, was taking me.

The Villages is the brainchild of developer H. Gary Morse, who in the mid-1980s saw an opportunity to transform a struggling trailer park into a vast new development. Florida has long been famous as a Mecca for retirees, the “snowbirds” looking to escape the frigid winters of the Northeast and Midwest, and Morse carried this reality to a new conclusion: He wouldn’t build a retirement center, or neighborhood, but an entire town. Large tracts of northeast Sumter County (and smaller portions of adjacent Marion and Lake Counties) were bought, and over the last two plus decades, transformed into a new, wildly successful community. The population of The Villages, including part-time residents, is over 80,000 with a planned ultimate population of more than 100,000. With vast numbers of baby-boomers starting to retire, the U.S. might see a proliferation in these types of master-planned retirement towns, and I wanted to get a firsthand look at this potential future.

Entering The Villages
My bus, which I was glad to see was less than half full, exited the Turnpike onto US 301. There followed a slow ten minute ride, which went through the sad-looking town of Wildwood. We passed tiny dilapidated bungalows, tired storefronts, and that general haphazard, disorderly look of neglected landscaping and forgotten maintenance. The impression is unmistakably one of the languishing rural South. What a contrast when, a few minutes later and a couple of miles down the road, we turned onto Buena Vista Boulevard and entered The Villages. We passed thousands of trim bright new houses, their white and cream facades accented with a mix of shingled gray, brown, and burnt orange roofs. Many houses feature patios and pools (all enclosed by lanais). We drove past several bright green golf courses; there are nearly 40 in the town all told, including nine 18- or 27-hole country club courses. There was also the exquisitely manicured landscaping: Large oak trees adorned with Spanish moss; palmettos lining golf courses; bushes shielding roadside houses; blooming flowers of pink, red, yellow, violet, and indigo brightening the roadside; and the thick, uniformly trimmed and edged carpet of grass. It was all wonderfully arranged and effectively used to delineate neighborhoods and break up the views. Intersections occurred at roundabouts, not a traffic signal to be seen. Buena Vista Boulevard, like all major roads in The Villages, is flanked by paved cart paths. Perhaps the most distinctive feature of the town is the ubiquitous use of golf carts to get around – everywhere a resident would want to go – and today, Saturday, the carts were out in force.

Lake Sumter Landing
The bus stopped in Lake Sumter Landing, one of two town centers, which features several streets of pastel colored shops and restaurants radiating from a central square. I stepped off the bus into the sunny warm April day and was greeted by my hosts – my parents, part-time residents. The architecture and layout are all set up following that Florida specialty, the theme, this one being the Old South (e.g. a wooden bridge over a small canal, complete with a fake lock and waterwheel). The adjacent square (surrounded by buildings with a southern town façade) was busy as residents and their guests (like me) surveyed the offerings of local vendors; art and personal accessories mostly, and a regular daytime activity. The north side of the square is open and leads to Lake Sumter, right in the center of The Villages. The lake is fronted by houses on either side of the town center. This is apparently some of the choicest and priciest real estate in the whole town.

We walked to a close-by restaurant for lunch and the sidewalks and stores were no less busy. I was somewhat surprised by the composition of the crowds. While there was a surfeit of older people, there were also a lot of families. I had in mind a scene of senescent retirees, slow moving, quiescent. But the hustle I encountered was just like your local outdoor mall, not the exhausted idleness of a retirement home. I was skeptical when my parents told me they were buying a house in The Villages. I feared they were adopting a mentality and lifestyle older than their years. But on first exposure, my fears were somewhat allayed. There was vibrancy here, certainly not for a young person, but at least I could sense the appeal.

Saturday, December 17, 2011

Whose Money?

by Conroy

1 World Trade Center under construction
Drivers approaching lower Manhattan in 2011 have seen the steady rise of a new skyscraper, 1 World Trade Center. By December its steel superstructure already stood over 1,100 feet above street level, dominating the famous skyline around it. When the building is “topped out” next year it will be the tallest building in the United States (in fact the tallest in the entire western hemisphere).  1 World Trade Center is the most spectacular part of the massive World Trade Center redevelopment effort, which includes several other towers, the National September 11 Memorial & Museum (the memorial is open but the museum is still under construction), and a soon-to-be-completed transportation hub (road/rail/bus terminal). These are just some of the tangible signs that after a decade New York City has largely (physically) recovered from the 9/11 attacks.

Drivers Sue the World Trade Center?
Not long after 1 World Trade Center peeked above the surrounding buildings, the American Automobile Association (AAA), a service and advocacy group for over 50 million drivers, sued the Port Authority of New York and New Jersey (PANYNJ). Why? Well, this fall PANYNJ announced significant toll increases, up to 50%, on its many New York area bridges and tunnels, which include the heavily trafficked Lincoln and Holland tunnels and George Washington Bridge. The higher tolls were needed, according to PANYNJ, to pay for ongoing facility maintenance and planned improvements and to cover some of the cost of the World Trade Center construction. You see, in addition to bridges, tunnels, airports, seaports, and transit, PANYNJ also owns several commercial properties, the most visible being the World Trade Center site. The AAA lawsuit claimed that the PANYNJ toll increases were an unfair burden to drivers and that toll revenues would be diverted from the bridges and tunnels that toll payers use and funneled into unrelated commercial enterprises. In other words, and rather nefariously in AAA’s eyes, 1 World Trade Center was rising high above Manhattan on the dime of drivers who would never benefit from the development.

PANYNJ has countered that in fact they misspoke and all of the revenue raised from increased tolls will be used on their transportation facilities and not a nickel to pay for the World Trade Center construction. Both sides argued their case in court last week and a ruling on the issue might be made by the end of the year. In light of PANYNJ’s modified account of how the toll revenues will be used, I foresee the new toll rates being upheld and no refunds for any AAA drivers, but I’m not a lawyer and won’t wager on any particular outcome.

This whole case may seem like one advocacy group attacking one issue from one public agency, but I think it’s a microcosm of a larger debate being held in many forms nationwide. Namely, when it comes to public money, whether it’s tolls or taxes, whose money is it and who gets to decide how it will be spent? In the 2010 U.S. midterm elections, Republicans across the country benefited greatly from a broad grassroots effort, the Tea Party movement, which essentially argued that elected officials and public agencies were incapable of responsibly using tax dollars. And therefore, all tax increases were unacceptable and major spending programs dubious. On the national level we’ve seen the results of last year’s elections: A tooth-and-nail struggle to get any spending programs passed through Congress, and none of those that eventually were passed included any tax increases. Clearly one’s sympathy or antipathy to the Tea Party movement’s agenda rests on one’s political and fiscal perspective, but this is one of the fundamental arguments in America today.

Whose Money is It?
So going back to the core question, whose money is it and who has the right to decide how it is spent?

Saturday, December 10, 2011

Dwindling Ranks

by Conroy

American ground crew like GMGF Jim
Just this past week America marked the 70th anniversary of the Japanese attack on Pearl Harbor; seventy years, a lifetime ago. This weekend my girlfriend’s maternal grandfather, Jim (hereafter referred to as GMGF Jim), will celebrate his 92nd birthday. These two occasions, the first a date of great historical significance, are separated by only a handful of days, and connected by the passing years.

The attack on Pearl Harbor plunged the United States into World War II. Americans responded by throwing themselves into the war effort, including millions of servicemen destined for all parts of the globe, in what would be the most impressive display of military force in history [1]. One of those men was a young GMGF Jim, in his early 20s, who enlisted in the U.S. Army, and was assigned to the new United States Army Air Force (USAAF), the predecessor of the modern USAF. When I met GMGF Jim in the middle of last year I was curious to hear more about his service, having learned only snippets from my girlfriend. Unfortunately, his advancing age has robbed him of most of his hearing, which means that conversations with him proceed in bursts of short shouted questions, often repeated, and followed by equally short shouted replies. All of this leads to a rather halting dialogue frequently waylaid by misunderstandings, incomplete details, and the basic ineffectiveness that a disjointed rhythm imposes on communication. I would liken it to a phone call where there’s a long delay and you constantly have false starts, long pauses, and overlapping talking. It just doesn’t work well.

Two Experiences
I tell you this as a preface to what I’ve gathered of GMGF Jim’s war service; I think I have the broad elements correct, but the details may be iffy. He set sail from wherever basic training was, maybe his home of Baltimore, destined for Panama City, Florida. But due to some sort of mix-up, the ship instead went 1,500 miles too far south, to Panama…the country…in Central America. I know what you’re thinking: That can’t be right? Which was pretty much my reaction, but I pressed the point with GMGF Jim a couple of times and received that same answer (all I could do was shake my head and say “okay”). So, once in Panama, the local American military commanders had to do something with a ship full of recruits. Fortunately, there was a rather important strategic asset in the area, the Panama Canal. The Canal needed guarding and part of that duty fell to the USAAF. After a short stay on land, GMGF Jim and his fellow shipmates were sent through the Canal and sailed southwest to the Galapagos Islands [2]. There the USAAF established a base at Baltra Island. Planes from this base patrolled for enemy submarines and protected Allied (and neutral) shipping on the western approaches to the Canal. GMGF Jim spent the rest of the war as a member of the ground crew servicing planes.

Given the general danger of being in the military during World War II, I consider getting stationed in Darwin’s quiet balmy islands to have been a pretty plum assignment. Not bad considering it was a mistake. But it was work and it was far from home. Most of war is extended boredom and I’m sure GMGF Jim had moments of the tropical doldrums. Still, it’s an interesting story, and raises a few questions: He really ended up in Panama instead of Florida? The vast American military-logistic system was capable of misplacing an entire ship? There were on-site ad hoc solutions that ended up lasting the entire war? Whatever happened in Panama City when his ship didn’t arrive? What about the men originally slotted for Baltra Island, were they there as well? It all strikes me as a semi-comical combination of Catch-22 and Guard of Honor [3]. It’s also something I was never likely to learn without hearing it from the mouth of the man who lived it.

My maternal grandfather, Jack, also served in World War II as an artilleryman in Europe. He died fifteen years ago at the age of 79. In the last six years of his life his mind was ravaged by Alzheimer’s, but even late in his life, long after most of his recent memories were lost, he was still able to tell me of his time in Europe. His service was more typical of what we think of as the American experience, war in Europe, fighting Germans through France, etc., though I gathered from him that he didn’t have any close calls on the battlefield.

Monday, December 5, 2011

The Shopping Season

by Conroy

Tis the -- shopping -- season
We’re in early December and for Americans that means we’re smack in the middle of the holiday season. It also means we’re in the full register-ka-ching-ing focus of the shopping season. Indeed if you were to list the most obvious manifestations of approaching Christmas [1] you would probably include the tiny many-colored glowing lights adorning neighborhood houses, or the Christmas trees and other seasonally related decorations inside the homes of you, your family, and friends [2], but also the inescapable, omnipresent, overwhelming seasonal advertising.

From the radio spots you suffer through when driving to work in the morning, the bright, glossy ads in newspapers and magazines, the loud television commercials, annoying internet popups, and in-your-face billboards…it’s everywhere. It’s Christmastime, and it’s time to shop. Best Buy, Walmart, and Target have, as usual, put out their heavy dose of ads, but all major retailers are in on the game. I’m especially confounded by those ridiculous "December to Remember" Lexus commercials aimed at whatever infinitesimal fraction of the population chooses to buy luxury cars for Christmas [3]. And I guess it’s working, given the record sales on the super-hyped Black Friday and Cyber Monday shopping days. And surely many people, like my mom and aunts, really enjoy shopping; zeroing in on sales, mingling with the crowds, and wrapping gifts [4]. Why else would hordes of people stay up through the night of Thanksgiving jostling with other midnight shoppers? It can’t be just for the money-saving sales.

This year I’ve been struck by the tone of the advertisements, almost as if it’s your duty to shop, something along the lines of voting or obeying traffic laws. If you don’t participate, then it’s somehow antisocial and un-American. If you’re like me, you probably find this commercialization of Christmas unsettling in a somewhat-hard-to-define way. I’m not religious, but I was raised Catholic and went to Catholic school, and so, of course, it’s still worth noting that Christmas is a religious holiday, deeply special to Christians because it marks the birth of Jesus. In its evolved modern context, the holiday has wider significance than its religious foundations. It’s a time of celebration of the year completed, a time to spend with family and friends, and yes, even a time of giving. It’s this aspect of Christmas, the time spent with family and friends, that makes the Christmas season meaningful to me, and I’m guessing (or hoping), it is the largest reason Christmas is “the most wonderful time of the year” for most of you as well.

A part of what unsettles me about the commercial side of Christmas is the often cited fact that the holiday retail season is an essential feature of the consumer-driven American economy. You’ve probably heard the misleading statistic that two-thirds of the U.S. gross domestic product is derived from consumption – people buying things – but regardless of the real value, shopping is a major part of the modern economy. And about 20% of all retail sales come during the Christmas shopping season (November to December). This explains why businesses are so forceful with Christmas ads – they must make their money before the end of the year, or risk red ink and failure. (I may be off base, but it seems a little alarming that a major prop of the world’s largest economy is the buying of Christmas gifts.) That’s also why we’re subjected annually to the phenomenon that Gregg Easterbrook has termed “Christmas Creep”: Seasonal advertisements appearing ever earlier in the year. Now it’s common to come across Christmas-themed store displays and print advertisements before Halloween, sometimes well before. How long before Christmas advertisements start in the summer? It sucks a lot of the specialness out of the season when Christmas is exploited throughout the non-Christmas-time of the year.

Gifts under the Christmas tree
But that’s an external issue. What really bothers me the most is the expectations that come along with gift-giving. I’m an embodiment of the cliché that it is better to give than receive. When I was a child I loved to get gifts, and I was fortunate that my parents and other relatives lavished me and my brother and sister with a lot of them. But now with maturity, I much prefer giving gifts to my family (girlfriend included), especially when I know they can use or want what I’ve gotten for them. It’s a rewarding feeling and genuinely selfless, I give for the enjoyment of others and not for my own satisfaction of giving (although I guess I can’t deny the presence of some selfish gratification I get from being perceived as generous and thoughtful). But unfortunately, too often gift giving is not a bonus of Christmas but a requirement, a burden.

Sunday, November 27, 2011

End of the Season

by Conroy

Roger Federer with his 70th winner's trophy
The 2011 ATP season is over [1], and it ended just like last year. Roger Federer dominated the late fall, winning his final seventeen matches, the last seven of which were against Top 10 ranked opponents (which might be a record), and capturing his record sixth ATP World Tour Finals [2] title. In the process, he demolished Rafael Nadal in his most lopsided victory against the Spaniard and further cemented his claim as the greatest player of all time.

But the end wasn’t at all what the 2011 tennis season was about. Looking back from the finish, it’s clear that the climax occurred two-and-a-half months ago in New York.

US Open Final
Arthur Ashe Stadium
Late on a Monday afternoon [3] in early September Novak Djokovic and Rafael Nadal, the world’s two best tennis players, took the court at Arthur Ashe Stadium to contest the U.S. Open Final. It was a rematch of the 2010 Final, but this year Djokovic was the top seed, winning his way match-by-match through a near-perfect season. Nadal was a definite underdog. As they began their warm-up, the sinking sun already cast long shadows across the blue court and its expansive green penumbra. The capacity New York crowd, the largest audience in tennis, buzzed in anticipation.

Djokovic had dominated the first eight months of the year. Heading into the U.S. Open Final he had remarkably won 63 of his 65 matches and nine titles, including the Australian Open and Wimbledon, and dominated Nadal, beating him in all five of their meetings (all in finals). By what seemed a minor miracle, and confirmation that he was in the midst of a historic year, he escaped near-certain defeat in the semi-finals against Roger Federer (click here to see one of the gutsiest and fearless shots you’re ever likely to witness). He looked destined to win the U.S. Open and consolidate the top ranking he had taken from Nadal at Wimbledon. For his part, Nadal was looking to reclaim some of his palpably diminished aura and end his losing streak to Djokovic, in the process claiming his second grand slam title of the year [4], and perhaps stealing the number 1 ranking back by the end of the season.

Speaking as a tennis fan, the match was mesmerizing; an apotheosis of power-baseline tennis. Both men played at their peak, as good, I think, as they could have played on that surface and that day in front of a huge expectant crowd and millions of television viewers. They are the two best defenders in tennis and probably the two most consistent baseline ball-strikers. Those skills resulted in point after point of long rallies, of sustained sprinting to all corners of the court, and of pure power hitting. Both men repeatedly retrieved what appeared to be sure winners from the other; points ending with each man’s legs and lungs burning. From a purely physical perspective it may have been the most brutal tennis match ever played (see some of the highlights here). By the end Djokovic was suffering from a back injury and the normally indefatigable Nadal appeared totally enervated, his body (and maybe his mind) unable to compete any longer. What was also obvious is that point-by-point Djokovic was better. It seemed that he returned every Nadal serve back at the Spaniard’s feet. He won the majority of the long rallies. He won the important points.

An exhausted Djokovic after his U.S. Open win
Djokovic took the first two sets despite early leads in each by Nadal. With day fading into night and the stadium lights taking over, Nadal came back and won the long, intense, captivating third set. After the first game of the fourth set, Djokovic, who just a few minutes earlier was serving for the championship, took a medical timeout for a lower back injury. For a moment it appeared that a monumental comeback was in store. But it didn’t happen, Djokovic regrouped and Nadal was spent. After a little more than four hours [5] and four unforgiving sets the Serb stood as champion. To borrow a boxing analogy, he took the best punches that Nadal could throw and returned his own with interest. He fulfilled the promise of his season-long success. He was the best.

Post U.S. Open
A tired Djokovic at the World Tour Finals
And then, after that scintillating victory, the air was let out of the balloon. Djokovic played just ten more matches, going a very average 6-4 over the remainder of the season. At least two, but maybe all four of those losses were at least partially the result of a bad back or aching right shoulder. And after his mild exit from the World Tour Finals, Djokovic admitted that he was physically and mentally drained. The season, and all his success, had taken a toll. He needed time to recuperate.  Nadal went just 8-4 after the U.S. Open (and just 2-4 over his last six matches), including very lopsided losses to Federer and Andy Murray [6].

The post-U.S. Open fall season often plays like a quiet denouement to the rest of the year. And given how meekly Djokovic and Nadal, the two bright lights of the season through the U.S. Open, played down the stretch, it seemed especially so this year. But there is more to be gleaned from the late season results.

Monday, November 21, 2011

Political Failure

by Conroy

I’ve never used this blog as a place to write about political issues or rail against political developments. As far as possible, I like to maintain an apolitical tone, out of respect to readers and because my deep-seated pragmatism guides me to a more centrist perspective. But the events of today require a response.

The much hyped Congressional “Super Committee” (officially named the United States Congress Joint Select Committee on Deficit Reduction) of six Democrats and six Republicans has failed in its one and only task of identifying at least $1.2 trillion in federal budget cuts (spread out over ten years).

As you may recall, back in the summer there was a debt-ceiling crisis – entirely of the political class’ doing – when the United States nearly reached its debt (borrowing) limit. The crisis was precipitated when Congressional Republicans refused to extend the government’s borrowing limits without a guarantee of commensurate reductions in spending (anathema to Democrats). For their side, Democrats refused to consider spending cuts without some form of tax increases (anathema to Republicans). Neither side budged and the prospect of a limited government default was at hand. At the eleventh hour a compromise deal, endorsed by President Obama and House Speaker John Boehner, was passed by Congress. It raised the federal borrowing limit but required substantial cuts in government spending. Those cuts were to be identified by the Super Committee. There were to be no new taxes.

Following all this haggling, Standard & Poor's actually downgraded the United States credit rating for the first time in the nation’s history, and the stock market plummeted by several hundred points (several percent of its total). It was a stark indictment of the men and women controlling America’s federal finances.

The national debt has been an issue for a long time, but it has really soared over the last decade as increased defense spending, rapidly increasing Social Security and health care costs, two large tax cuts, and a major recession combined to throw the federal budget out of whack. Despite non-stop talk in Washington about corralling the ballooning debt, nothing was done. That was supposed to end with the latest budget bill.

Politics without Leadership
Now, nearly four months later, despite all of the promises and hoopla surrounding the Super Committee, and after ten weeks of in camera meetings, no schedule of spending cuts has been identified. Congress (and the President) has failed to lead; to make tough choices. Instead, as this summer’s budget bill stipulates, automatic cuts will be enforced starting in 2013. These include 8-9% reductions in both defense and non-defense programs. However, third rail programs like Social Security will not be touched.

No doubt in the immediate term – like tomorrow – we will see another drastic decline in the stock market as investors react to yet more government ineptitude.

This abdication of responsibility allows Republicans to go back to their constituencies and boast about avoiding tax increases and Democrats to go back to their constituencies and brag about maintaining vital social programs. And here’s the kicker: Between now and 2013, Congress has the power to exempt programs from the automatic cuts. It’s very possible that by the time 2013 comes around special interests will have lobbied Congress out of all serious cuts. In other words, nothing will be cut. Business as usual; and the national debt will continue to pile up at record levels. The President held a short press conference this afternoon where he promised to veto any bill that exempts programs from the automatic cuts. Sound words, but we’ll see.

Tuesday, November 15, 2011

Getting Old

by Conroy

Jeanne Calment - the oldest person on record (122 yrs.)
What if we never had to get old? A couple of days ago I kept encountering this theme. First, there was a minor headline on the cover of this past week's edition of The Economist that hinted at an answer to this most captivating of questions. I was immediately intrigued and excitedly flipped through the magazine to find the article. My mind was alive with thoughts of staying forever young, of immortality. Alas, and not surprisingly given my silly reaction, the article described emerging science that promised considerably less.

Read the article for the specific details, but experiments in mice have shown that counteracting certain genes in cells that have reached their biological age limit can check some of the deterioration associated with aging. Someday, it is hoped, these types of approaches could be used to ease the effects of senescence in humans. I hope, for not entirely selfish reasons, someday comes sooner rather than later.

Then, by sheer coincidence, I was re-reading a portion of James Joyce’s Ulysses – searching for a remembered (different) passage – when I rediscovered the following paragraphs (it’s a longish excerpt, but worth it [1]):

What spectacle confronted them when they, from the host, then the guest, emerged silently, doubly dark, from obscurity by passage from the rere of the house into the penumbra of the garden?

The heaventree of stars hung with humid nightblue fruit. [2]

With what meditations did Bloom accompany his demonstration to his companion of various constellations?
Meditations of evolution increasingly vaster: of the moon invisible in incipient lunation, approaching perigee: of the infinite lattiginous scintillating uncondensed milky way, discernible by daylight by an observer placed at the lower end of a cylindrical vertical shaft 5000 ft deep sunk from the surface towards the centre of the earth: of Sirius (alpha in Canis Major) 10 lightyears (57,000,000,000,000 miles) distant and in volume 900 times the dimension of our planet: of Arcturus: of the precession of equinoxes: of Orion with belt and sextuple sun theta and nebula in which 100 of our solar systems could be contained: of moribund and of nascent new stars such as Nova in 1901: of our system plunging towards the constellation of Hercules: of the parallax or parallactic drift of so called fixed stars, in reality evermoving from immeasurably remote eons to infinitely remote futures in comparison with which the years, threescore and ten, of allotted human life formed a parenthesis of infinitesimal brevity.

And then by further coincidence, I flipped on the television and caught the very end of Rocky III, where in a more succinct and less inflated way than Joyce, Apollo Creed [3] says to Rocky, in a fleeting reflective aside about the waning of his once overwhelming physical skills:

"You know Stallion?  It's too bad we gotta get old."
And so my mind settled on the sad fact of getting old and mortality. Happy thoughts, I know, but ones we’re all apt to brood over from time to time.

I'm 31 years old, and I still throb with the vigor of youth [4]. But in the coming years the signs of aging are going to appear. We all know these signs: decreasing speed, dexterity, and reaction time; graying hair; balding (for men); wrinkles; weight gain; loss of muscle mass and bone density; lower sex drive; flagging energy; arthritis; deteriorating vision and hearing; loss of collagen that makes your skin all droopy and inelastic; age spots; age-related illnesses like heart disease; memory loss; and maybe dementia. Just think of your grandparents. Nothing you look forward to; all things you want to delay as long as possible. It's not a pretty picture. Just compare the following two photographs.

 Robert Redford in his early 30s

 Robert Redford in his mid-70s

(Of course I’m planning (read: hoping) to age well, like Paul Newman or Cary Grant. No one has actually laughed when I’ve suggested this, which is nice of them.)

And then somewhere along the arc of aging, we die. Aging and death are cruel facts, especially for intelligent life. It's cruel that humans can witness our senescence and contemplate our physical demise. But contemplation is the first step to understanding, which is where we stand; there are several theories about why we age:

Tuesday, November 8, 2011

Big Budget Bland

by Conroy

The epitome of the mega-budget movie
We live in the era of the mega-budget movie: films that cost hundreds of millions of dollars to make and tens of millions more to market. By my count, in the last seven years there have been eighteen movies with a production budget of at least $200 million (see table below). These movies share one characteristic – they're dominated by digital effects.

Back in 1998 David Foster Wallace skewered Hollywood for enthusiastically embracing what he termed the “F/X Porn” genre: Mega-budget movies that feature massive doses of highly effective, sensuous special effects, but very little character or plot. Prominent examples from the 1990s include Terminator 2: Judgment Day, Jurassic Park [1], and Twister. He likened these movies to pornography – instead of porn’s prurient carnality, there are a few elaborate, terrifically convincing effects sequences separated by long segments of vapid, formulaic storytelling.

Wallace cited the Inverse Cost and Quality Law, which may sound like a concept plucked from a microeconomics textbook, but was actually his invention. Stated simply, the "ICQL" says, "the larger a movie's budget is, the shittier that movie is going to be." For a movie snob like me, this idea seems so obvious that I'm embarrassed I didn't articulate something similar. When Wallace developed the ICQL, it hadn't been that long since "T2" and Jurassic Park revolutionized the scope and role of special effects. They were no longer a feature of many big budget movies; effects became their very reason for being. The ICQL posits that mega-budget effects movies need a bankable star, a simple plot relying on proven formulas and easy sentiment, and lots of distracting digital effects. In addition, corollaries to the ICQL state that (a) the more lavish the effects, the worse the non-effects parts of the movie will be, and (b) the necessities of a mega-budget-effects movie will subsume the creativity and originality of even the most talented director [2].

Maybe the first "F/X Porn" movie
The upshot of the ICQL is that mega-budget movies are stripped of what attracts people to storytelling in the first place, namely characters and plot. These elements have been the foundation of storytelling since the oral tradition of our distant ancestors. I love movies because, to paraphrase Martin Scorsese, they are our dreams brought to life. Movies offer a different (and in many ways better) experience than novels or plays or poems, or any of the other media we have to express our innate drive to tell stories – to communicate our human experience. Movies bring together sight and sound and humanity in ways unmatchable by other forms. That's what's so disappointing about mega-budget-effects movies: as the ICQL says, these movies abandon the core tenets of good storytelling and focus on distracting our minds by overwhelming our senses.

Two rhetorical questions: How many lines of dialogue can you remember from your favorite movie? What do those words, coming from those characters, mean to you? Two follow-up rhetorical questions: How many of the dazzling special effects sequences can you describe, in detail, from one of the recent mega-budget-effects movies? What do those effects sequences mean to you?

Tuesday, November 1, 2011

Ghost Stories

by Conroy

A ghost on the stairs?
This past Sunday, in anticipation of Halloween, CBS’ Sunday Morning featured a segment on ghosts and haunted places. I, incredulous as always, was stunned to hear the following statistic: 40% of Americans believe in ghosts and fully half of them (20% of Americans) believe they have actually seen or experienced a ghost (!). Needless to say, I’m unsettled (if not entirely stupefied) by the fact that 60 million of my compatriots seem to, well, either have suffered some sort of delusion or actually believe in the ridiculous. Still, ghosts, or the idea of ghosts, has too long a history and is too engrained in human culture not to intrigue.

Let me write upfront (if it isn't already clear) that "ghosts" don't exist, at least in the appear-as-a-phantasmal-presence-in-a-dark-corridor-out-of-the-corner-of-my-eye type of way. Believers would label me a skeptic, but my disbelieving position is the majority view (thankfully), so let's set that as the perspective of the rest of this post. "Ghosts" is an interesting and enduring cultural-religious conceit, not an actual phenomenon.

Ghost Stories
Currently, the most popular movie in American theaters is Paranormal Activity 3, a prequel to the very effective original, a word-of-mouth hit from a couple of years ago. This movie is just the latest in a never-ending procession of ghost stories. Maybe it shouldn't be surprising that it's a hit: ghosts sell. Paranormal Activity (the original) cost almost nothing to make yet grossed almost $200 million in theaters worldwide. The same feat had previously been pulled off by The Blair Witch Project, a ghost-horror movie from 1999 that cost well under one million dollars (maybe a lot less) and grossed $250 million worldwide. And ghost stories are the subject of big-budget blockbusters as well (some serious, some not): the Pirates of the Caribbean trilogy, The Sixth Sense, Ghost, and Ghostbusters, for example. The lasting popularity of ghost movies means something.

Perhaps the most famous ghost in literature
Ghosts have appeared in our folktales and myths and finest literature, from the Bible and the Egyptian Book of the Dead to Homer's Iliad and Odyssey; by Dante in the Divine Comedy, Shakespeare in Hamlet and Macbeth, Dickens in A Christmas Carol, Henry James in The Turn of the Screw, and Oscar Wilde in The Canterville Ghost. Modern examples include Joyce's Ulysses and T.S. Eliot's The Waste Land (even if the ghosts in Ulysses are just, to paraphrase Hamlet, in the mind's eye).

I think you can also see how elemental the idea of ghosts is by the many terms we have for them. In addition to ghost there is: spirit, phantom, spook, specter, banshee, demon, soul, shade, wraith, haunt, apparition, haint, poltergeist, and revenant. And that’s probably an incomplete list.

Every Halloween images of ghosts are used as decorations (or lame costumes). Everyone knows of ghosts, whether through Casper the Friendly Ghost, the surprisingly effective Haunted Mansion ride at Disney World, the over-sweet Boo Berry cereal, the sometimes heroic and sometimes funny (sometimes neither) Space Ghost, the popular (sadly) television shows following "ghost hunters," or the advertised lists of what must be hundreds if not thousands of reputedly "haunted" places in the U.S. alone. The bottom line is that for most Americans the idea and symbology of ghosts is familiar, in a casual, almost unmindful way.

So why the ubiquity of ghosts in our culture, especially since most people don’t believe they exist? I’ll attempt a few explanations:

Thursday, October 27, 2011

Visiting History: The Battleship North Carolina

by Conroy

USS North Carolina
My dad and I, ever the enthusiasts of the historically interesting, were driving from Baltimore to Wilmington, North Carolina, on a quick sojourn to see the USS North Carolina [1], a World War II era battleship. It was early March, and our destination was more than a little motivated by my desire to take a short trip somewhere south (we were only just emerging from an unusually cold Maryland winter). Visiting the ship would allow us to interact with an actual historical artifact, not a battlefield or a monument, but a real object, something that had been "there," something we could see, and touch, and walk through.

In late afternoon, and after nine hours on the road, we arrived in Wilmington, which is located in the far southeast [2] of the state, along the Cape Fear River and just a few miles from the Atlantic Ocean. The city's population has nearly doubled in the last two decades, but even by a generous assessment it remains mid-size. On first view it makes little impression: there's no distinctive skyline -- in fact no skyline -- on the approach to town. There's little sense that you've actually arrived anywhere, just the flat coastal greenery broken up by the standard low-rise, low-density development characteristic of American suburbia. However, we found our way by twists and turns to the center of the city, which included a quaint downtown with an abundance of antebellum architecture situated on the east bank of the river. And from that vantage point we spied the impressive silhouette of the North Carolina, cozily berthed in a narrow cove on the river's west bank. It was getting late, and the ship was dark against the fading sun, but at least we knew where we were going the next day.

Central Wilmington
Wilmington is an interesting location for the ship, now a National Historic Landmark. As North Carolina's only real seaport, Wilmington is a sensible place for the ship to be docked, but it is a provincial city, out of the way -- you have to want to go to Wilmington to end up there. In a way that's a fitting place for the North Carolina, a fair analogue to the role it played in history.


USS North Carolina
The next day, Saturday, was exquisite. The sun was what meteorologists like to term brilliant; the sky a clear cerulean. The air held the first hints of vernal warmth. In late morning, we crossed the river just downstream of the ship, but we didn't get a good view until we pulled into the parking lot. We were among the first there. Looking from up close, it's hard not to be impressed with the North Carolina. Our first perspective was of the ship's profile. The hull sweeps in a majestic arc, parallel to the water at the stern and rising to its exaggerated wave-splitting bow. The busy but balanced superstructure rises from the center of the hull and is guarded closely by three massive gun turrets, two fore and one aft of the superstructure. The ship is painted with a fresh dazzle scheme of alternating gunmetal gray and bluish white, which was meant as a form of camouflage...though I doubt the North Carolina could be missed. The ship is more than two football fields long, and its radar tower rises 200 feet above the waterline. It's large enough that you have to turn your head to see it all even from a hundred paces away.

One must pass through a visitor's center [3] (and gift shop) to board the ship. There's a gangway from the visitor's center to the stern-half port-side deck (the ship faces west). The cove where the North Carolina is moored looks like it was made to fit the ship (and it probably was dredged for this purpose). Indeed, the water is so still, and has likely received so much sedimentation since the North Carolina was towed into position fifty years ago, that the massive hull probably rests on the bottom. My dad labeled it "a mud hole" [4], and it does seem incongruously narrow and shallow and confining for the mass of the ship. The deck surface isn't steel, but teak [5], a wood durable enough to stand up to the corrosive effects of seawater and that doesn't get slippery when wet. It's a pleasant sandy brown color, which complements the gray and white of the rest of the ship. Once on board, everything looks even bigger than from the shore. It looks like what it was designed to be, an engine of war.

Tuesday, October 18, 2011

A Unilingual World - A Counterpoint

I have personally struggled with the issue that Conroy discusses in his most recent post, trying to decide whether to make the tremendous (for me) effort to learn Spanish as a second language. Most of the people in my office speak Spanish (I work in South Florida); for some of them, it is their first language. We have had clients who speak only Spanish, and I've had to have a translator help me communicate with them. Clearly, I would be better off bilingual. But I'm not bilingual, and learning a second language at my age (I'm in my 30s) is no easy task. Plus, the time I spend learning basic Spanish skills is time I could spend learning more about the law, which includes a rich specialized vocabulary all its own. I have also found that most of my bilingual colleagues come to me for advice about the English language, and they don't care that I don't speak Spanish (or even attempt to). They are much more eager to learn my language than to teach me theirs.

So even though I agree with Conroy that it would be great to know a second language (or a third, or fourth), I've decided to maintain focus on my primary occupation for now. Of course, things might change. My wife speaks some Spanish and is thinking about learning more after she finishes her MBA. If she decides to study Spanish, that may tip the scale for me. It would be considerably more fun and less stressful to learn another language together with a partner. That's how I've addressed the question for myself. But I would also encourage others to focus on improving their English skills as opposed to studying a foreign language. For one thing, I believe we should strive for a universal language, and learning multiple languages does not promote that end. Don't get me wrong. I am in favor of diversity in language. But as we can see from English, or any other particular language examined closely, one language alone can be incredibly rich and diverse.

In fact, one of the main reasons English is so rich and diverse is that it has assimilated so many other languages. I'm strongly in favor of that. So spice up your writing with a foreign phrase now and then. Take what is best from other languages—the most vivid, the most useful words and expressions. There is plenty to learn. This past weekend, for instance, I learned that Kurt Gödel was known as Herr Warum ("Mr. Why") because of his unquenchable curiosity. Today, I learned that Justice Brandeis used to refer to FDR's policies as Kunststücke ("clever tricks"). I also learned that the term "shyster," often falsely attributed to the character Shylock from Shakespeare, actually has its origins in the German word scheisse (look it up), a word worth knowing. (Read Michael Lewis's recent article, "It's the Economy, Dummkopf!" for a strange, scatological analysis of this and other German words.)

These are all great foreign words, and I'm happy to have learned them. But learning individual words or phrases is far different from learning an entire language. It is much easier, and more fun, and, I would argue, more beneficial to the English language. And it gives you a broader sampling of all the interesting languages out there. So go out there and take the most interesting words from German, take them from Spanish, from French, and Yiddish, and Russian, and even Mandarin (if that's possible). Don't spend your time trying to gain fluency. Instead, mine these languages for their most vivid words and phrases—and steal them!—enrich the English language with them! Just as you, Conroy, have enriched this blog by teaching us the French phrase mot juste.

As for teaching children foreign languages, I think there's an argument for that. It's debatable, though. Being bilingual has certain advantages. But it may also have disadvantages; you can't be an expert in everything. This problem is even more pronounced with adults. Why spend precious time grappling with a foreign language, only to gain a halting grasp of it, when you could be using that time improving your English? The fact is most adults simply will not be able to gain fluency in a foreign language no matter how hard they try. However, they probably can make small but important improvements to their English, which will be more likely to help them with their careers—these days it pays to know English as well as you can know it—and in doing so they will help keep the English language strong.

That said, Conroy, if you still want to study a foreign language, I'm supportive of that. If anyone has the willpower to become a polyglot as an adult, it's you. An interesting and neglected book on the subject that you might find interesting is called "Language Made Plain" by Anthony Burgess. It's chock full of interesting ideas, including techniques on how to learn a foreign language. He also wrote a follow-up book called "A Mouthful of Air," which I haven't read and can't vouch for, but I'm sure it's worth looking into. And if any of our readers have other suggestions for learning a language, particularly books on the subject, please let us know.

A Multilingual World

by Conroy

Language map of the world
There are nearly 7,000 languages spoken in our world. Eighty-five languages have more than 10 million native speakers. At least ten, but probably twelve languages are spoken as a first language by more than 100 million people (see list below). What remarkable linguistic diversity. What a shame that I can communicate in only one.

English as a Global Language
Of course if I'm limited to one language, I guess it's good that it's English. Over the last few hundred years English has grown from a provincial language spoken by a few million people in England, Wales, and lowland Scotland to a de facto global tongue. Today English is the most widely spoken language in the world; it is estimated that over one-quarter of the world's population can communicate in the language to at least a rudimentary level (>1.5 billion people).

English is the official language for aviation and seafaring, an official language of the United Nations, European Union, and the International Olympic Committee, and predominant in diplomacy and international communications, science, computing and the internet, business, and entertainment. The rise of English can be traced to the preeminent international role in economic, cultural, diplomatic, and military affairs played by English speaking nations--Great Britain from the late eighteenth century and the United States since World War II. Review a list of the most popular movies of all time, or the most popular musicians; all were produced or performed in English. How many of The 2011 Time 100, Time Magazine's list of most influential people in the world (an imperfect measure for sure), are native English speakers or fluent in the language? All but a handful.

The most popular band of all time wrote (almost) entirely in English
In fact, English has been adopted in so many parts of the world for so many purposes that the nature of the language might be changing. There are arguments that English could be in the process of being co-opted from the anglophone world, morphing into something considerably different, World English. The end result of this process would leave Modern English as nothing more than a dialect of a larger global language. We'll see; Latin was once thought to be a global language and now it's all but dead. And in an increasingly connected world, will English really bifurcate and evolve as substantially as theorized?

Today, the language is the Mother Tongue of a rather short list of nations: the United Kingdom, the United States, Canada, Australia, Ireland, New Zealand, Belize, Guyana, and several Caribbean countries. However, it's an official language spoken widely in many nations that used to be part of the British Empire (or former American colonies), e.g., South Africa, Nigeria, and India. An English speaker could travel to most places in the world confident of finding locals with a passable understanding of his or her language. Many English speakers may be tempted to think all other languages secondary, and a working fluency in other languages unnecessary. Perhaps Americans, like the present writer, are most guilty of this linguistic chauvinism.

Friday, October 14, 2011

Poor English

by Conroy

I was just reading an old essay from the late David Foster Wallace about tennis and Roger Federer (what else). In part of the essay he recounts some commentary regarding racket technology and its effect on the modern game, which was written on plaques hung in the hallowed halls of Wimbledon's Millennium Building. There was one sentence in particular that drew his attention:
"Nowadays it is the powerful hitters who dominate with heavy topspin."
He goes on to masterfully analyze and correct the underlying premise, but what made me laugh was this footnote:
"(...assuming, that is, that the sign's 'with heavy topspin' is modifying 'dominate' rather than 'powerful hitters,' which actually it might or might not--British grammar is a bit dodgy.)"
Wallace was an acclaimed novelist, critic, essayist, and expert grammarian (not to mention a tennis fan). I like his cheekiness in challenging "British" grammar, sure to rankle our cousins across the pond. Of course my smile was quickly tempered when I thought about what Wallace's critical eye would have made of this blog, and my myriad grammatical errors, word missteps, and sometimes careless proofreading.

The truth is that writing is hard. I think I write alright, but English (American or British or any other variation) is so complex, with so many grammar rules -- many of them debated -- and so many words that it can be hard, and might be impossible, to write without mistakes. Consider a few of these challenges:

Writing the Right Word

The Man and I have both written about using the exact right word, the mot juste, to convey the meaning and context of a subject (see here and here). Well, our language's seemingly endless vocabulary makes that both possible and daunting, but there is another challenge, more mundane and maybe more important. That's avoiding (accidentally) using the wrong word, a pitfall that even good writers stumble over. English is full of homonyms and words with similar structures and related, but not synonymous, meanings. This can lead to trouble. A (very) small sampling:

[and I'm sure a close reading of this blog will reveal the occasional incorrect usage (hopefully just occasional) along these lines]

  • accept versus except - I'll gladly accept any explanation you offer, except those that are obvious lies.
  • accurate versus precise - It's accurate to write that Pi is about 3.14, but listing just two decimals isn't very precise.
  • affect versus effect - The extreme cold weather had affected the crops; the net effect was a shortage of grain.
  • alternately versus alternatively - The signal flashed alternately red and green. Alternatively, I could have taken the train when I was in New York.
  • altogether versus all together - Altogether, I'm glad the riot is over. The crowd was all together when things turned violent.

Thursday, October 6, 2011

The Music of 1991

by Conroy

Loveless - among the great albums of 1991
There's a new documentary out, Pearl Jam Twenty, that chronicles the eponymous band's two-decade history, a history that started with a bang with the release of the hugely acclaimed, popular, and influential album Ten in late 1991. Ten became a central document of the Seattle music scene, on the leading edge of the wave of grunge, a genre that seemed to change the course of rock music. The legacy of Ten is secure, but what is remarkable is that, taking a broader look back at the music of 1991, Ten appears to be just one - and not the best one - of many landmark albums from what must certainly be one of the great years in music history.

The Music of 1991
Readers of this blog know of my strong feelings about judging art. Only with time can we come to value all art, including music. Well, the passage of twenty years is surely time enough (I would think) to put the music of 1991 in proper perspective. The familiar narrative of that time holds that a band from Seattle changed the course of music history with the release of a monumental, epoch-defining album. This band wasn't Pearl Jam, but Nirvana, and the album was the amazing Nevermind, spearheaded by the exceptional single "Smells Like Teen Spirit." It's been frequently noted that Nevermind became so popular that it knocked Michael Jackson's album Dangerous from the top of the album charts, a symbolic change that seemed to close out the music of the 1980s, including the hair metal of bands like Warrant, Motley Crue, and Cinderella. Nirvana was joined by Pearl Jam, Soundgarden, Alice in Chains, and other grunge bands in remaking rock under the broader label of "alternative rock," an indelible influence on music for the rest of the decade.

That's the traditional story anyway, but like all history, it's a simplification that ignores what actually happened in 1991. It was a great year in music, but there was a lot more to it than the grunge scene of the Pacific Northwest.

Thursday, September 29, 2011

Why We're Fans

by Conroy

Rays celebrate their playoff-clinching win
This isn't a sports blog. But this blogger is a sports fan, and I occasionally write sports-centered and sports-related posts. Whether as a witness to tennis excellence, as a critique of college sporting bureaucracy, or, as in the case of this post, as a reflection on a sequence of events from last night's Major League Baseball games that encapsulates why we fans spend our time and energy following the sports we love. Why we're fans in the first place.

A very short synopsis: last night, on the final day of the regular season, the Tampa Bay Rays and the St. Louis Cardinals clinched the American and National League Wild Card playoff slots, edging out respectively, the Boston Red Sox and Atlanta Braves. But how it happened is genuinely remarkable.

Major League Baseball has existed as an organization since 1876; just this past weekend the league's 200,000th game was played. That's vastly more than any other professional league in any other sport can boast. Yet journalists and broadcasters who follow the game, and long-term interested fans like this writer, have never witnessed a night like last night. A night eagerly anticipated, and then fulfilled by games loaded with tension and drama, improbable and sudden twists, and a timing of outcomes that's hard to believe. A baseball fan, a sports fan, could ask for nothing more.

Two Historic September Collapses
Coming into last night's games, the Rays and Red Sox were tied for the American League Wild Card. Ditto the Cardinals and Braves in the National League. The standings were surprising because the Red Sox and Braves had led the Wild Card races by 9 and 8.5 games at the beginning of the month. No team in baseball history had ever failed to make the playoffs with September leads that large. But awful months by both teams (7 wins and 20 losses for the Red Sox; 9 and 18 for the Braves) and strong finishes by the Rays and Cardinals made races out of what should have been early playoff berths. When I write awful, I mean not only were these teams losing, but in many ways their play - so solid all year - almost seemed self-destructive. Both teams were imploding and had one last night to try and salvage the season.

A Phenomenal September 28
All 30 teams played last night, but the baseball world's attention was focused on just four: the Red Sox and last-place Orioles in Baltimore, the playoff-bound Yankees and Rays in St. Petersburg, the Cardinals and woeful Astros in Houston, and the (MLB best) Phillies and Braves in Atlanta.

Cardinals win easy
The Cardinals clobbered the hapless Astros, jumping out to a 5-0 first inning lead en route to an easy 8-0 win that was over before 10:30 PM. They had clinched at least a one-game playoff with the Braves.

The Braves jumped out to a 3-1 lead over Philadelphia. The Red Sox jumped to a 1-0 lead over Baltimore, fell behind 2-1, tied the game, and then took a 3-2 lead in the fifth inning. The Yankees jumped all over the Rays and starter David Price, taking a 5-0 second inning lead thanks to Mark Teixeira's grand slam, and stretching the lead to 7-0 by the fifth inning.

So far, nothing remarkable. In fact the Rays-Yankees game was turning out to be anti-climactic. Around 9:30 PM it started raining in Baltimore, leading to a rain delay with the Red Sox still leading 3-2 in the middle of the seventh inning. The Braves were leading by the same score. The Rays had managed just two hits. The Cardinals were rolling to victory. Anyone who turned away at this point could never have anticipated what was about to happen. Allow me to summarize in bullet form:

Tuesday, September 27, 2011

Day and Night

by Conroy

A day/night map of the world
Just a few days ago, September 23rd to be precise, was the September equinox: the date when the sun passes the equator on its southward journey to the Tropic of Capricorn. Or to put it more accurately, the date (and time) during our planet's revolution around the sun when the Earth's axis of rotation is exactly perpendicular to the imaginary line connecting the centers of the Earth and sun. After this date the southern hemisphere is tilted toward the sun (and the northern hemisphere tilted away). From our terrestrial perspective, for the next few months the sun will head ever more southward until the December solstice (December 22 in 2011), when it will reach its southern limit and begin its journey back north, passing the equator at the March equinox (March 20, 2012), and reaching its northern limit at the June solstice (June 22, 2012). Only to head back south again, repeating the cycle, forever.
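For the curious, the sun's seasonal swing between the tropics can be sketched with a simple cosine approximation. This is a rough back-of-the-envelope model, not a precise astronomical formula; the 23.44° figure is Earth's axial tilt, and the day offsets are approximate:

```python
import math

def solar_declination(day_of_year):
    """Approximate solar declination in degrees: the latitude where
    the sun is directly overhead on the given day of the year.
    Positive = north of the equator, negative = south."""
    # Simple cosine model: declination oscillates between +/-23.44
    # degrees (the tropics), crossing zero near the equinoxes. The
    # +10 shifts the curve so its minimum lands near the December
    # solstice (about day 356).
    return -23.44 * math.cos(2 * math.pi / 365.0 * (day_of_year + 10))

# September equinox (around day 266): sun near the equator
print(round(solar_declination(266), 1))
# December solstice (around day 356): sun near the Tropic of Capricorn
print(round(solar_declination(356), 1))
```

Running this gives a declination near zero for late September and near -23.4° for the December solstice, matching the cycle described above.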

This familiar cycle, corresponding to the seasonal variation that is such a distinctive feature of life, especially in temperate climates, has always fascinated me. The alternation between summer and winter, light and dark, has directly affected where and how humans live.

The Abuse of the NCAA - A Counterpoint

Conroy's post about the NCAA is interesting for the legal and economic issues it raises, in particular the antitrust problem that the NCAA faces. The fact that the NCAA is a cartel—a coalition of colleges agreeing with each other to restrain trade—organized as a monopsony helps explain its egregious behavior. Cartels are generally unstable, so enforcement and credibility are key. If anyone steps out of line, the hammer must fall. That's why the NCAA is so strict in enforcing its rules. And the NCAA is best described as a monopsony—a single buyer—rather than a "monopoly"—a single seller. Its member colleges "buy" the labor of college athletes with scholarships, etc., but they have agreed not to compete with each other over the terms of the sale. A monopsony will drive prices down, which is what the NCAA has done.

So college players are exploited in the sense that they are paid less than what they would be paid in a competitive market, in which colleges would bid against each other for the athletes, driving the price up; and they also have less control over their own labor than they would if they had more bargaining power. And the NCAA is apparently able to get them to sign over their publicity rights, permanently. Contrast that with a typical non-compete agreement, wherein the restraint generally has to be reasonable in time and location. Finally, bear in mind that the money these athletes could have earned, by, for example, selling their autographs, is transferred to other students, in the form of scholarships and cheaper ticket prices. This means that the NCAA effectively transfers money from athletes, who are often from underprivileged backgrounds, to students who are on average wealthier. It's not only inefficient but inequitable.

But often, when firms cannot compete by lowering (or, in a monopsony, raising) prices, they compete in some other dimension. So athletes reap other kinds of benefits. In fact, college athletes derive significant non-pecuniary benefits from being college athletes. They are revered, for one thing, even if not forever. It reminds me of the exchange in the movie Eight Men Out between Rothstein and Attell (the ex-boxer):
Arnold Rothstein: Altogether, I must've made ten times that amount betting on you and I never took a punch.
Abe Attell: Yeah, but I was champ. Featherweight champion of the world!
Arnold Rothstein: Yesterday. That was yesterday.
Abe Attell: No A.R. you're wrong. I was champ, and can't nothin take that away.
Of course, the NCAA creates more rules to limit competition in these dimensions, and generally it is more efficient to compete over prices, so these facts are not relevant to an antitrust analysis. Such an analysis would begin by noting that the NCAA is a cartel in which members are restrained in their ability to freely negotiate contracts—it's a horizontal price fixing arrangement. This makes it a "per se" violation of the Sherman Act. But when the Supreme Court examined the NCAA, it chose not to invalidate the arrangement as a per se violation. Instead, it applied the "rule of reason" and considered pro-competitive benefits that might justify the restraint of trade. The Court did so because it believed that the "product," the college athlete—unpaid (i.e., amateur), attending classes, etc.—simply would not exist without the cartel:
This decision is not based on a lack of judicial experience with this type of arrangement, on the fact that the NCAA is organized as a nonprofit entity, or on our respect for the NCAA's historic role in the preservation and encouragement of intercollegiate amateur athletics. Rather, what is critical is that this case involves an industry in which horizontal restraints on competition are essential if the product is to be available at all.
This argument is not meritless. There are reasons to believe that if the NCAA were eliminated, college sports would suffer. Colleges would face higher costs. Labor costs would obviously rise, but there would also be tax and legal costs. Colleges might have to pay taxes on the income they derive from their teams, since they would be operating a business remote from their educational mission. They would also open themselves up to lawsuits brought under Title IX for discrimination, since female players would be paid less than male players. And converting college teams into professional teams would reduce team loyalty—alumni would feel less connection to the players (and therefore to the teams), be discouraged by trades, and feel less inclined to donate; and players would feel more pressure to take alternate offers and less loyalty to their teams.

The quality of the players might increase, and more resources might be diverted into college sports. But perhaps our society already puts too much emphasis on, and invests too much in, competitive sports. After all, it's a market characterized by superstars and arms races, and for that reason prone to inefficiency. Choosing to participate in college sports means spending less time in the classroom. Eliminating the NCAA would make that trade-off more dramatic. It's a competitive world on and off the field, and we would do better as a society by investing more heavily in education and less heavily in sports.