Friday, December 28, 2012


And so, as we teeter on the edge of the fiscal cliff, Norman Ornstein, an eminent scholar of the history of Congress, comes up with an idea for averting catastrophe.  We'll be reading Ornstein's book, written with Tom Mann of the Brookings Institution, this spring.  Meanwhile, prepare for some choppy waters.



Tuesday, December 18, 2012

Left to right:  Ken Kolson, Joe Sadek, Laura Allen, Mike McCandlish. 

Sunday, December 9, 2012

The Case for Policy Wonks



The late Senator Daniel Patrick Moynihan, one of my personal heroes, famously said that "Everyone is entitled to his own opinion, but not to his own facts."  That's another way of stating the position argued by Dylan Matthews in an op-ed piece in today's Washington Post.  Matthews advocates investing in the kind of non-partisan research associated with the Government Accountability Office, the Congressional Research Service, and the Congressional Budget Office.  Lawmakers will always argue about the ends of public policy, Matthews concedes, but there shouldn't be so much confusion about the facts; we ought to have a better idea, for example, of which means are appropriate for the promotion of particular ends, and of how much it's going to cost.

I'm hoping that the Spring 2013 Glenn Fellows will take a very close look at Matthews's piece as they think about possible policy paper topics for next semester.  And I'm hoping that they come away from the WAIP program with a nuanced idea of the "facts" and the extent to which they can be expected to inform public policy. 




Monday, November 19, 2012

Bellows in NYC, Lichtenstein in DC



Last summer the National Gallery of Art mounted an important exhibition on George Bellows, member of the Ashcan School and one of the leading American painters of the early twentieth century (and also a Columbus boy and an alumnus of The Ohio State University).  A special tour of the exhibition was arranged for the Summer 2012 class of John Glenn Fellows.

This fall the Bellows exhibition moved to the Metropolitan Museum of Art, which means that it is now within the purview of Sanford Schwartz, who writes frequently on the visual arts for The New York Review of Books.  Here's a link to Schwartz's review.

The National Gallery, meanwhile, has mounted an exhibit featuring the work of another Buckeye:  Roy Lichtenstein.  Note to Ohio State alumni:  maybe it's time to begin acknowledging that OSU is not just about football anymore.  (And never was.)

OHHH. . .  ALRIGHT. . .

Monday, November 12, 2012

The Strange Career of Pithole City (reprise)

It's week 13, which means that the Autumn 2012 edition of the Washington Academic Internship Program is starting to wind down. I like to wrap things up by reading several public policy classics, including Garrett Hardin's "The Tragedy of the Commons," which tries to explain why fouling one's own nest is both unnatural and widespread.  This semester I'm asking the fellows to read a case study that I recently published about the environmental degradation accompanying the world's first oil boom, which occurred in the 1860s not far from where I grew up--though it antedated me by a few years--in western Pennsylvania. There is a link to my essay, "Pithole City: Epitaph for a Boom Town," over on the right-hand side of this blog. And here is a link to a 7-minute summary of the astonishingly brief but intense history of Pithole City. The photo above is the view down Second Street today. Obviously, Pithole now exists mainly as an archaeological site; it could scarcely even be called a ghost town.


Thursday, November 8, 2012

The Day After


It's always fun to plan a syllabus, but you can never expect the universe to unfold in a way that conforms to your reading list.  It's all about timing, and every now and then, you get lucky.

This week the Autumn 2012 class of Glenn Fellows is reading Andrew Bacevich's Washington Rules, a searing indictment of military and intelligence spending in support of the "permanent war" that the United States has been waging--against international communism, and more recently against international terror--since the end of World War II.  Several fairly inconspicuous stories in today's Washington Post make me think that for once my syllabus actually got the timing right.

Walter Pincus, ordinarily snooze-inducing, devotes today's column to likely Pentagon budget cuts in the wake of President Obama's re-election.  "With Tuesday's election results," Pincus's column begins, "President Obama and Congress should take steps to end the 'warfare state' instituted by the George W. Bush White House."

Another story, entitled "Boeing Shrinking Its Defense Division," may be even more telling.  It seems that Boeing is disbanding its Missiles and Unmanned Airborne Systems division and cutting back on its number of defense executives by some 30% in order to intensify its concentration on commercial aircraft. 

All of this makes me think that maybe someday we'll have a Pentagon budget that reflects what the military is actually asking for, rather than what Congress thinks it should have.



Saturday, November 3, 2012

The Quiet American--Book Review

I think it’s fair to say that most of us were introduced to the English novelist Graham Greene by way of a film, The Third Man, about which Wikipedia—whatever did we do without it?—has this to say:

The Third Man is a 1949 British film noir, directed by Carol Reed and starring Joseph Cotten, Alida Valli, Orson Welles and Trevor Howard. It is particularly remembered for its atmospheric cinematography, performances, and unique musical score. The screenplay was written by novelist Graham Greene, who subsequently published the novella of the same name (which he had originally written as a preparation for the screenplay). Anton Karas wrote and performed the score, which used only the zither; its title music, “The Third Man Theme,” topped the international music charts in 1950. It is often ranked among the greatest films of all time.
One can quibble with the Wikipedia write-up—I am inclined to think that the Vienna sewers are the real star of The Third Man—but there’s no doubt that it is an unforgettable movie, and it was an important literary event to the extent that it led to wider appreciation of the oeuvre of Mr. Greene.

In my case, The Third Man led directly to The Power and the Glory, which I read in college, and then to Travels with My Aunt, which I read a few years later, and, finally, The Quiet American, which I read last week, as Tropical Storm Sandy was bearing down on the Northeast Corridor of the United States.

The Quiet American is set in Indochina during the early 1950s, when the Vietnamese were trying mightily to throw off the yoke of French imperialism. They succeeded, finally, in 1954, with the victory of the Viet Minh over the French at Dien Bien Phu. The Viet Minh were aligned with international communism, but there were a number of other movements competing with them for the honor of taking Vietnam back from the French. These groups included the Hoa Haos, a Buddhist movement; the Caodaists, an oddball religious grouping; the Binh Xuyen, an independent militia; and various freelancers and gangsters, such as the character whom Greene calls General Thé. In the context of the Cold War and the United Nations’ “police action” in Korea, there seemed to be a great deal at stake in Indochina during the early 1950s. That’s why there was so much covert action there on the part of foreign governments, including the United States.

There are three main characters in The Quiet American. Thomas Fowler is a worldly British journalist who is separated from his English wife, whose Catholicism would seem to render a legal divorce impossible. Fowler, a cynical and perhaps corrupt man who appears to have “gone bush,” manages to console himself with a beautiful young woman named Phuong (whom he can never marry so long as his wife refuses to file for divorce), and a serious opium habit. The third character, Alden Pyle, is a young American—a Harvard man—whose mission in Viet Nam, we eventually are made to understand, involves terrorist bombings undertaken in the name of freedom and democracy. Pyle and his masters, whoever they may be—probably the CIA—believe that it’s in the best interest of the United States to nurture indigenous liberation movements (so long as they are anti-communist) in all parts of what we now call the Third World.

The adjective “quiet” appears many times in many contexts in Greene’s novel, and while the title of the book may be, as the critic Robert Stone puts it, “a joke” (since Alden Pyle is a “prattling fool”), there may be a kind of rough justice in the fact that Pyle’s indiscretion contributes to his own demise--never mind that Fowler earns an assist. The Englishman’s impatience with Pyle looks like pure anti-Americanism alloyed with the perception that innocence of any kind is dangerous in the real world. The lesson of The Quiet American is that idealists have an uncanny knack for wreaking havoc not only on themselves but upon everyone in their general vicinity. Fowler’s problem is that his motives inevitably will be questioned by all who know--and that would include the French provincial police--that Pyle was his rival for the affections of the same woman: Phuong.

This 21st-century reader of The Quiet American was struck by two things. First, the book makes such a strong and persuasive case against intervention in Vietnam that it seems incredible—more so now even than it did at the time—that the U.S. was willing blithely to wade into the same Vietnamese morass--guns, ideals, and naïveté blazing. The second is that American innocence lingered long afterwards, long enough to inspire our more recent adventure in Iraq, where regime change unfolded in just about the way that Greene would have predicted. It’s hard to believe that any policy could have been better calculated to enhance Iran’s geopolitical fortunes in the Persian Gulf.

It also occurs to me that The Quiet American may do a better job of arguing against America’s permanent war on terror than the book I have assigned this semester for that very purpose: Andrew J. Bacevich’s Washington Rules (New York: Henry Holt, 2010). I am thinking hard about that as the Autumn 2012 Glenn Fellows begin to read that book, and as I put the finishing touches on the WAIP syllabus for Spring 2013.

Sunday, October 28, 2012

A Diplomat's Progress (reprise)


Last week the Autumn 2012 class of Glenn Fellows read Samuel Huntington's famous Foreign Affairs article on "The Clash of Civilizations." As an introduction to the not-always-glamorous world of professional diplomacy, I have this week assigned a book called A Diplomat's Progress, written by Henry Precht, a retired foreign service officer. Mr. Precht was born in Savannah, Georgia, and educated at Emory University. He joined the foreign service in 1961 and served in U.S. embassies in Italy, Mauritius, Iran, and Egypt. He was the Department of State’s Desk Officer for Iran during the revolution and hostage crisis when the Shah was overthrown, and he was deputy ambassador in Cairo when Anwar Sadat was assassinated. His nomination by President Jimmy Carter to the post of U.S. ambassador to Mauritania was blocked by Senator Jesse Helms, who blamed him for "losing Iran."

After leaving the foreign service, Mr. Precht served as president of the World Affairs Council in Cleveland, Ohio, where he also taught at Case Western Reserve University. A few years ago, he published A Diplomat’s Progress, a work of fiction consisting of a series of vignettes about a State Department official named Harry Prentice. It is an engaging work that reveals, as one reviewer has put it, the “grittier side of embassy life with a wry sense of humor and a bit of an edge.” To the extent that the work is autobiographical, A Diplomat’s Progress is rather remarkable.

For one thing, the “grittier” aspects of diplomacy are portrayed warts and all. In one of the vignettes, the young Harry Prentice and his wife attend a dinner party at the home of the foreign minister of Mauritius, during which the lecherous host assaults the drunken daughter of the Japanese ambassador. In a vignette set in Egypt, the protagonist must tend to a dead body and a suitcase full of drug money. In “Caviar and Kurds,” set in Iran, Prentice unwittingly leads the Shah’s secret police to an underground freedom fighter named Hassan, whom Prentice finds hanging from a lamppost the next morning. In this account of embassy life, it seems that no good deed goes unpunished.

Most remarkable as an autobiography—and surely it must be regarded as partly that, in spite of the veneer of fiction—is the book’s unflattering portrait of its protagonist. Throughout A Diplomat’s Progress, Harry Prentice’s diplomatic efforts are undone by his unusual combination of naivete and cynicism. Typically, the reader is given a glimpse of a career diplomat preoccupied, not with the national interest, as one might suppose, but rather, with his own career advancement. At one point, for instance, Prentice seems to have been the unwitting accomplice of a Palestinian terrorist. What does he do about it? He gets up in the middle of the night to compose a somewhat Bardachian “balance sheet of possible courses of action.” There appear to be two:

First, the natural inclination of every Foreign Service Officer: Do nothing. Wait on events and react as necessary and as seems prudent at the time. . . . Alternatively, I could report my suspicions to the police. Playing it straight and admitting wrong might be partially redeeming. The key word was “partially.” The embassy surely would be informed and handle my future as if it had no value. The same with the Israeli authorities. I had to face it: Only I really cared about my future, not any American or Israeli career-building bureaucrat.
During his posting to Cairo, Prentice is asked to interview a Sheikh who might have been in a position to influence the extremists holding a number of American hostages in Beirut. Prentice’s efforts fail. “But never mind,” seems to sum up his reaction. “I could only hope that someone—the ambassador or an unknown friend in the department—would make an excellent report of my performance for my file.” The adventure, he concludes, “just might be a turning point—upward—in my career.” On the basis of the evidence provided by the author, the judgment handed down by Prentice’s first wife seems just: He has “a pretty good soul, even though sometime it seems quite lost in the bureaucratic maze.”

May 13, 2012 update: In today's Washington Post, Jonathan Yardley reviews another diplomatic memoir, this one published posthumously, the work of one of the veteran foreign service officers accused during the McCarthy era of having "lost" China. It's worth a read.

August 3, 2012 update: The current issue of The New York Review of Books has a very fine review by Roger Cohen of a book by Christopher de Bellaigue about Muhammad Mossadegh, the prime minister of Iran who was ousted by an Anglo-American coup in 1953. The review features a cameo appearance by Ataturk and a number of references to U.S. diplomatic snafus that will call to mind some of the stories from Henry Precht's A Diplomat's Progress about the role of Big Oil in Middle Eastern politics and of SAVAK in Mohammad Reza Pahlavi's Iran.

November 13, 2012 update:  According to Yahoo News, Mossad tried to assassinate Saddam Hussein with an exploding book.

Autumn 2012 Glenn Fellows at the Embassy of the Netherlands

From left to right, front row:  Maddie Fireman, Paige MacMorland.  Second row:  Katie Colburn, Maggie Murdock, Marissa Cooper.  Third row:  Joe Sadek (program coordinator), Amber Seira, Erin Moeller, Leah Apothaker, Isaac Choi.  Fourth row:  Jade Holmes, Adam Kase, Grace Fry.  Fifth row:  Ken Kolson (WAIP director), Robert Whiteman (European Union), Cameron Griffith (France), Tim Wood (United Kingdom), Julia Koppius (the Netherlands). 

Friday, October 26, 2012

Glenn Fellows Meet with President Gee




From left to right, first row:  Joe Sadek, WAIP program coordinator; Katie Colburn; Paige MacMorland; Dr. Gee; Maddie Fireman.  Second row:  Ken Kolson, WAIP program director; Amber Seira; Marissa Cooper.  Third row:  Maggie Murdock, Isaac Choi, Jade Holmes, Leah Apothaker.  Top row:  Adam Kase, Erin Moeller.  Photo by Christine Kontra.

Wednesday, October 24, 2012

Glenn Fellows at Bryce Harlow Advocacy Forum





From left to right:  Erin Moeller, Gianna Rendano (Duquesne University), Amber Seira, Andrea Peralta (Lynn University), Adam Kase, Maddie Fireman (in front), Ryan Sewicke (Roger Williams University, behind), Yanbeli Gomez (SUNY Plattsburgh), featured speaker Scott Salmon (United States Steel Corporation), Katie Colburn, Jade Holmes, Paige MacMorland (in front), Ken Kolson (Director, Washington Office of the John Glenn School of Public Affairs, behind), Maggie Murdock, Marissa Cooper (in front), Isaac Choi (behind), Leah Apothaker (in front), Mark Dalhouse (president of the Washington Internship Institute, behind), Joe Sadek, WAIP program coordinator.  Photo by Linda Dooley, president of the Bryce Harlow Foundation.

Monday, October 22, 2012

Turkey on the World Stage (reprise)





Anyone who has ever taught for a living will understand that a large part of the appeal, and the challenge, lies in trying to package a wide range of scholarly sources in such a way as to tell a compelling story. Unfortunately, the charms of syllabus development can lead to the folly of imagining that it can ever be a completely finished product; in this way a reading list is akin to public policy. To quote Lord Salisbury: "There is no such thing as a fixed policy, because policy like all organic entities is always in the making."

The result is that a syllabus or a reading list can be the occasion for unanticipated intellectual excursions. Four years ago, when I began leading the WAIP policy seminar that is now PUBAFRS 4020, it never occurred to me that modern Turkey, a remnant of the old Ottoman Empire regarded as "the sick man of Europe" prior to World War I, would prove to be a remarkably useful lens for viewing world affairs.

The seminar has evolved in such a way that Turkey intervenes at three different points in the course of the quarter. First, there is the Cuban Missile Crisis of 1962, a classic case study in crisis management and a staple of all introductory courses in public policy. The standard treatment has President Kennedy staring down Premier Khrushchev, with the Soviets finally blinking and removing their missiles and dismantling their Cuban bases, all in exchange for our promise to leave Castro alone. It turns out that there was more to it than that. Robert F. Kennedy, JFK’s Attorney General, offered discreet assurances to Soviet Ambassador Dobrynin that we would take our Jupiter missiles out of Turkey, which shared a tense border with the U.S.S.R. at the time. We did so less than six months later.

Second, we read Samuel P. Huntington’s famous, or infamous, "clash of civilizations" essay, in which Turkey is treated as the epitome of a “torn” country, having been riven by competing traditions, some of them Muslim (though not particularly Arabic), and some European (though not especially Christian). Turkey—the secular, Western-oriented republic created by Mustafa Kemal Atatürk (pictured above)—rejected Mecca, only to be rejected in turn by Brussels; at the end of the 20th century Huntington saw Turkey as "making strenuous efforts to carve out [a] new identity for itself.”

Turkey, mainly a sidebar in 20th century history, promises to feature much more prominently in the narrative of 21st-century world affairs. In a recent issue of The New York Review of Books, Stephen Kinzer discusses four books that assess the profound policy initiatives being pursued by Prime Minister Recep Tayyip Erdogan and his Justice and Development party. Erdogan’s Turkey is a modernizing republic inclined to put the military in its place and turn its back on secularism--though not on economic growth or autocracy. Tellingly, Kinzer’s piece is entitled “Triumphant Turkey?”

Kinzer raises a number of interesting questions about Turkey's changing place on the world stage, and given the current condition of Europe, it may inspire one to ask why the Turks are so keen to join the European Union. To help bail out the Greeks, perhaps?

Finally, in October, 2012, there is much talk of the civil war in Syria creating instability throughout the Middle East, and Turkey may be the first case in point.

September 16, 2011, update: For the Washington Post, Craig Whitlock reports that the U.S. and Turkey have signed an agreement that will allow the U.S. to install a radar station that will be part of a system designed to fend off missile attacks from either Iran or Russia. Separate negotiations about predator drones continue.

November 12, 2011 update: Soner Cagaptay has a column in the Washington Post on U.S.-Turkish relations.

November 3, 2012 update: Anthony Faiola reports in today's Post that Turkey's economic boom seems to be petering out, and that problems caused by events in neighboring Syria have put new pressures on Turkey's prime minister Recep Tayyip Erdogan.

Tuesday, October 16, 2012

The Guns of August (reprise)


When I was a high-school sophomore, I was assigned on the basis of standardized testing to Advanced Placement social studies. After suffering for a year—I wasn’t mature enough to appreciate primary sources or to contribute to seminar discussions—I bailed out of AP. Unfortunately, that meant that I had missed the standard Plato-to-NATO narrative of Western Civilization that the mainstream kids had taken in tenth grade. As a result, my knowledge of European history remains spotty to this day. What were the Wars of the Roses all about? Who was Alfred Dreyfus, anyway? And when, exactly, was the Italian Risorgimento? I have to look these things up every time.

At about the time I was seceding from Western Civ, Barbara Tuchman was putting the finishing touches on The Guns of August, winner of the Pulitzer Prize for 1962. I have imagined ever since that the book might offer a painless way of addressing some of the deficiencies resulting from my misspent youth. The Guns of August has been on my reading list for a very long time.

Now, a half-century later, I have done my duty. All in good time. The Guns of August turns out to be an extraordinarily good read, as President Kennedy recognized while it was sitting atop the best-seller lists fifty years ago. Kennedy gave copies to members of his cabinet and top military advisors. There are those who say that Tuchman’s analysis of the first month of the Great War influenced Kennedy’s handling of the Cuban Missile Crisis.

It’s hard to know which of the book’s many virtues Kennedy valued the most, but for me it’s Tuchman’s vivid account of how military goals are routinely undermined by the random blundering and miscommunication that inevitably occur in the fog of war.

For example, Tuchman relates the story of the Goeben and Breslau, two of a handful of German warships that happened to be in the Mediterranean in early August, 1914. When Germany attacked France, the Goeben and Breslau got busy shelling French ports in Northern Africa. The British naturally assumed that the German ships would worry about getting trapped in the Mediterranean and so would make a break for the Strait of Gibraltar and the open seas in the event of a British declaration of war against Germany. And so, when Admiral Milne cabled London to report the position of the German ships at 37.44 North, 7.56 East, Winston Churchill, then First Lord of the Admiralty, telegraphed back: “Very good. Hold her. War imminent.” Unfortunately, Tuchman writes, “when reporting their position, Admiral Milne had neglected to say which direction the Goeben and Breslau were steaming. Churchill naturally assumed they were heading west with further evil intent upon the French.”

In fact, the ships were heading east, and so Admiral Milne was halfway between Malta and Greece when he was informed by the Admiralty that Austria had declared war on England. Milne abruptly gave up the chase to avoid an encounter with any Austrian fleet that might emerge from its base in the Adriatic. “Unfortunately the word [i.e., the cable from Admiralty] was an error by a clerk who released the prearranged code telegram for hostilities with Austria by mistake. . . . One more opportunity was lost.” That meant, to make a long story short, that the Goeben and Breslau were now free to proceed to Constantinople, where the Germans negotiated an alliance with Turkey. From there, the German ships moved into the Black Sea, blocking Russian access to the Mediterranean and provoking them into declaring war on Turkey.

Then there were the French, whose military was smitten with the idea that effective warfare consisted of two things: élan, or the will to conquer, and a policy of relentless offense, even to the point of neglecting national defense. Britain’s Lord Kitchener was among those who recognized the absurdity of such a plan of campaign, but “it had to be accepted because there was no time to make another. . . . The momentum of predetermined plans had,” Tuchman concludes, “scored another victory.”

But none of Tuchman’s stories about the futility of master planning is better than the one about the German plan to attack France by sending an enormous army through the heart of Belgium, which was a neutral country whose security was guaranteed by the five Great Powers, including both France and England (not to mention Germany herself!). The great disadvantage of this plan was that it would draw England into the war on the side of Belgium and France. And yet, the Belgian route had been the Germans’ game plan for many years.

And for the Chief of the German General Staff, General Helmuth von Moltke, the predetermined plan was the only thing that mattered. And so, on August 1, 1914, the night before the start of World War I, Kaiser Wilhelm (pictured above), finally recognizing the grave risks inherent in the default plan of attack against France, announced to General Moltke that he wanted him to turn his armies east, initiating a Russo-German war instead. Moltke, we are told by Tuchman, “refused point-blank.”
Moltke was in no mood for any more of the Kaiser’s meddling with serious military matters, or with meddling of any kind with the fixed arrangements. To turn around the deployment of a million men from west to east at the very moment of departure would have taken a more iron nerve than Moltke disposed of. He saw a vision of the deployment crumbling apart in confusion, supplies here, soldiers there, ammunition lost in the middle, companies without officers, divisions without staffs, and those 11,000 trains, each exquisitely scheduled to click over specified tracks at specified intervals of ten minutes, tangled in a grotesque ruin of the most perfectly planned military movement in history.
Tuchman’s book destroys a number of shibboleths along the way, including the idea, prevalent in the early years of the twentieth century, that free trade had made the leading economies so dependent on one another that major, continent-wide wars had become unsustainable, which meant in turn that 20th-century wars were likely to be short and to turn on a small number of decisive battles. No such luck! Finally, The Guns of August excelled at demonstrating that military men stubbornly refused to appreciate the significance of Clausewitz’s dictum that war is the extension of politics by other means; in other words, they underrated the importance of politics.

In addition to influencing actual decision makers in the Kennedy Administration, The Guns of August profoundly affected the academic study of public policy by shaping the thinking of a young scholar named Graham T. Allison, who came up with a model of decision making based on Tuchman’s insights, one that he posited as an alternative to the notion of unitary states basing policy on a perfectly rational calculation of costs and benefits.

Allison’s Organizational Process model of decision making stressed the importance of pre-established routines in limiting policy options to Standard Operating Procedures (SOPs). Organizations, Allison argued, are “blunt instruments,” which is why they cannot be expected to come up with nuanced policies, and why the decisions taken by their leaders are “frequently anticlimactic” and not necessarily rational in any conventional sense; they are about as rational as the curriculum-planning decisions of a fifteen-year-old.

Autumn 2012 Glenn Fellows Visit SCOTUS

Unfortunately, the Supreme Court is covered with scaffolding, so Joe Sadek took a picture with the Capitol dome in the background.  First row, from left to right:  Maddie Fireman, Marissa Cooper, Paige MacMorland, Katie Colburn, Maggie Murdock.  Back row:  Amber Seira, Erin Moeller, Isaac Choi, Grace Fry, Jade Holmes, Adam Kase, Leah Apothaker.

Tuesday, October 9, 2012

Autumn 2012 Glenn Fellows at the White House

Bending over, from left to right: Paige MacMorland, Marissa Cooper, Isaac Choi.

Standing, from left to right: Adam Kase, Amber Seira, Erin Moeller, Leah Apothaker, Jade Holmes, Katie Colburn.

Sunday, October 7, 2012

The Brief against Brandeis (reprise)



There is no denying that the long-lived Louis D. Brandeis (1856-1941) was an American treasure. The son of Eastern European Jewish immigrants, he graduated at age 20 with the highest GPA in the history of Harvard Law School. He made his reputation as a Progressive lawyer and as a leader of the worldwide Zionist movement. In 1916, he was nominated for a seat on the United States Supreme Court by President Woodrow Wilson.

The definitive biography of Justice Brandeis was published by Pantheon in 2009. The work of Melvin I. Urofsky of Virginia Commonwealth University, the 955-page tome has received rave reviews. One, written by Anthony Lewis, appeared in The New York Review of Books. Brandeis, according to Lewis,


was intensely interested in facts. His law clerks did research on facts as much as law. When the Court considered a case on presidential appointment power that involved the 1867 Tenure of Office Act, Brandeis had his law clerk, James M. Landis (who became the dean of Harvard Law School), go over the Senate journals of 1867 to see what the views of the times were. Landis spent months in the Library of Congress reading the journals page by page.

Brandeis even tried to get Justice Holmes, who read philosophy in the original Greek, to take more interest in facts. He urged Holmes to spend the summer break reading up on working conditions and visiting the textile mills in Lawrence, Massachusetts. A year later Holmes wrote Harold Laski that “in consideration of my age and moral infirmities, [Brandeis] absolved me from facts for the vacation and allowed me my customary sport with ideas.”

Brandeis’s obsession with facts continues to reverberate through American law and politics. Consider, for example, what Wikipedia has to say about the term “Brandeis brief,” which refers to


a pioneering legal brief that was the first in United States legal history to rely not on pure legal theory, but also on analysis of factual data. It is named after the litigator Louis Brandeis, who collected empirical data from hundreds of sources in the 1908 case Muller v. Oregon. The Brandeis Brief changed the direction of the Supreme Court and of U.S. law. The Brandeis Brief became the model for future Supreme Court presentations in cases affecting the health or welfare of classes of individuals. This model was later successfully used in Brown v. Board of Education to demonstrate the harmful psychological effects of segregated education on African-American children.

This week members of the Autumn 2012 class of Glenn Fellows are reading essays and court cases organized around the theme of fact-finding and its jurisprudential consequences. As they read these materials, my hope is that they will perform a little thought experiment by asking themselves about the facts that the Court recognized in Muller, Brown, and Roe v. Wade, and whether it would have been wiser for the Court to base its rulings on strictly legal grounds, rather than conducting fact-finding expeditions.

In Brown, for example, the Supreme Court had the option of resurrecting Justice Harlan’s stirring dissent in Plessy v. Ferguson, which would have meant striking down school segregation on the grounds that “our constitution is color-blind,” rather than on the less substantial grounds that segregated schools inflict psychological damage upon African-American children. Likewise, in Roe v. Wade, there were a number of precedents that the Court, rather than wrestling with the question of fetal viability and formulating a national “right of privacy,” might have used to finesse the issue of abortion by declaring that public health is a matter that the Constitution, through the Tenth Amendment, reserves to the states. I hope the Fellows will ask themselves, in short, whether the Brandeis brief, so well intentioned, has been responsible for a great deal of legal and political mischief in the century since Muller v. Oregon.

October 8, 2012 update: It turns out that this could be a big week for affirmative action. Oral arguments are scheduled for Wednesday in Fisher v. University of Texas, a case filed by a white woman who claims to have been a victim of racial discrimination when she was rejected for admission to the university. According to Robert Barnes in the Washington Post, the case has elicited 92 amicus curiae briefs. It also has inspired an op-ed piece in Sunday's Post co-authored by the law school deans at both Harvard and Yale. Ready to hear the case against affirmative action? Tomorrow is the publication date for a book by Richard Sander and Stuart Taylor, Jr., called Mismatch, which is attracting rave reviews mainly, but not exclusively, from the right.

Thursday, October 4, 2012

Garfield: A Book Review (Reprise)


My first full-time teaching job was at Hiram College in northeastern Ohio. When I washed up on the shores of that bucolic campus in the summer of 1970—I was 25 years old—I was vaguely aware that the school was the descendant of something called the Western Reserve Eclectic Institute, and that it had been founded by the Disciples of Christ in 1850. I also was aware that its most famous alumnus was James Abram Garfield, the twentieth president of the United States. Somewhere along the way I had learned that Garfield was assassinated by a "disappointed office seeker" and that he was succeeded by a non-entity named Chester A. Arthur.

That was about it. For me Garfield was merely one of several post-Civil War Ohio Republican presidents who had been officers in the Union Army during the Civil War and wore full beards. I probably could not have picked Garfield out of a lineup if it had included Rutherford B. Hayes and Benjamin Harrison. Over the next decade and a half, I was to learn a lot more, some of it from Allan Peskin’s definitive biography, Garfield (Kent, OH: Kent State University Press, 1978), and some of it from my faculty colleagues, alumni of the college, and local townspeople.

Early on, it was pointed out to me that one of the handsomest houses in Hiram Village, still in use as a private residence, had been Garfield’s home while he served as teacher and principal of the Western Reserve Eclectic Institute. Several alumni of the college were, it was said, on friendly terms with direct descendants. Faculty colleagues supplied some important biographical details. Garfield, I was to learn, was born in a very rude log cabin on the Ohio frontier, endured desperate poverty through much of his childhood, and went to work early on the Ohio and Erie Canal. Garfield’s was a Horatio Alger story—literally, I read the book. He worked his way through the Eclectic as a janitor, proving to be a brilliant and industrious scholar with a gift for friendship and leadership. He wrestled with his students, and he debated itinerant atheists. There were persistent rumors about his having carried on a love affair with Almeda Booth, one of his teachers at the Eclectic. In 1858, he married a local girl, Lucretia Rudolph; their love letters were collected and edited by a colleague in the English department. Another colleague produced a play about Garfield’s assassination.

Garfield was an accomplished scholar in several fields, including Latin and Greek. Though he studied ancient languages, he was enlightened in many ways that we would consider modern. He was a voracious reader; he was one of the few Members of Congress who made good use of his lending privileges at the Library of Congress; he was a confirmed abolitionist before the war and remained committed to full racial equality afterwards. He treated everyone with respect, had a playful sense of humor, and saw both the tragic and comic aspects of the human condition. In an age of rampant political corruption, Garfield was a man of honor, though he was no goody two-shoes.

That Garfield was “not just a tragic figure, but an extraordinary man” is one of the major themes of a new book: Candice Millard’s Destiny of the Republic: A Tale of Madness, Medicine, and the Murder of a President. The book is a careful study of the assassination based on extensive research in what appear to be the most relevant sources. The madman at the center of the tale is, of course, the assassin, Charles Guiteau. The practice of medicine was very much in flux at the time, with older physicians in the United States being strongly inclined to resist the revolutionary ideas of England’s Dr. Joseph Lister, who called for antisepsis in the operating room based on his understanding of the role of germs in the spread of disease. As for murder, Millard endorses the testimony that Guiteau provided at his trial: Guiteau might have done the shooting, but Garfield’s attending physicians murdered him with two months of wrong-headed, agonizing treatment. The chief physician, the ironically named Dr. Bliss, introduced infection when he and many others repeatedly stuck their fingers in Garfield’s wound searching for the bullet. Later, they were unable to recognize the infection that had set in, let alone stop its spread. Millard is unable to resist the temptation to assert that this was a case in which ignorance, literally, was Bliss. The other major character in this sad tale is Alexander Graham Bell, who invented a metal detector called the Induction Balance that he hoped would aid Garfield’s physicians in their search for the bullet. Unfortunately, the perfection of the device came too late to save the intended beneficiary.

This is a wonderful book, though in a recent Washington Post review, Del Quentin Wilber makes a legitimate point when he complains that the story of Bell’s Induction Balance is somewhat tangential to the Garfield drama. I am inclined to concede the point, but for me it doesn’t begin to ruin what is an informative and moving story. I do, however, have two reservations of my own.

The first has to do with Guiteau and his motives. Invariably, Guiteau is described as a “disappointed office seeker,” and Millard shows that he lobbied shamelessly to be appointed to a consulship to Paris. There can be no question about his having been a disappointed office seeker. But, as Millard makes clear, he was also a lunatic, a religious fanatic who was convinced that his deed had been divinely inspired. It suited the enemies of the spoils system and the advocates of civil service reform to play down his derangement while stressing the role that the patronage system played in causing a disappointment keen enough to inspire assassination.

The second has to do with the book’s title, which asserts that the destiny of the republic was at stake during the many weeks that Garfield’s physicians attended so incompetently to their patient. This is a little overwrought. For one thing, it doesn't consider the extent to which the powers of the presidency were circumscribed in the late 19th century, despite Lincoln’s aggrandizement of the office during the Civil War. And in any case it isn't clear what public policies were at stake as the honest and enlightened Garfield lay on his deathbed and the hapless Chet Arthur, the creature of a political machine, cowered in a Manhattan townhouse. Garfield may have been the one politician of the Gilded Age who had it in him to put an end to the spoils system, introduce the principle of merit into public service, and put a hammerlock on Jim Crow—had he not been thwarted by an assassin’s bullet. But, as it happened—and Millard tells this story very well indeed—mediocre Chet rose to the occasion to an extent that no one had imagined possible, which is further cause for wondering whether Guiteau's heinous deed altered the course of American political history.

If it seemed to some people at the time that the destiny of the republic truly was at stake, it may be because the president of the United States, in addition to being chief legislator, chief diplomat, and leader of his party, serves as head of state—part of what Walter Bagehot called the “dignified” aspect of government, in contradistinction to the “efficient” exercise of political power. The American people will mourn a president—even one who is practically unknown to them, like William Henry Harrison, or one who was unloved because he was unlovable, like William McKinley—because the president is, among other things, the embodiment of the state. In Garfield’s case, the mourning was profound, because his many virtues, which included his gregarious and passionate nature, were so conspicuous. He must have been an easy man to love. Careful readers of Millard’s admirable book will mourn his loss still.

Monday, September 24, 2012

The Empty Chambers (reprise)



I've always been a little ambivalent about the "broken branch" thesis. On the one hand, Thomas Mann and Norman Ornstein make a good case that things have gone downhill in both houses of Congress since the glory days of Lyndon Johnson and Sam Rayburn. George Packer has made the same argument, specifically about the Senate, in The New Yorker. Actually, no one has issued the indictment more eloquently than former Senator John Glenn. Looking back on his long career, he writes:

In my twenty-two years in the Senate, I had watched the legislative process change. There was always partisanship--that was the nature of the system. Although it produced disagreement and debate, it ultimately forged budgets and laws on which reasonable people could differ but that worked for most. In general, lawmakers performed their duties in an atmosphere of mutual respect.

This was no longer the case. By the 1994 election, we had single-issue candidates, the demonization of government, the sneering dismissal of opposing points of view, a willingness to indulge the few at the expense of the many, and the smug rejection of the claims of entire segments of society to any portion of the government's resources. Respectful disagreement had vanished. Poisonous distrust, accusation, and attack had replaced it.

On the other hand, sometimes it seems to me that maybe the good old days weren't all they're cracked up to be--maybe, as a wag once suggested, the good old days aren't what they used to be--and never were! Certainly, the vicious caning of Senator Charles Sumner of Massachusetts by South Carolina's Preston Brooks in 1856 (pictured above) hardly qualifies as "respectful disagreement."

On the third hand, you can make the case that what's wrong with Congress is that its powers have been usurped by an all-consuming executive branch whose mandate comes from what James Madison referred to as "the superior force of an interested and overbearing majority." Or you could argue that Congress has simply abdicated while the executive--and the judiciary--have been flexing their muscles. Either way, the explanation for Congressional irresponsibility starts to sound like the old saw about academic politics: it's vicious precisely because "the stakes are so low."

January 25, 2012 update: There's an interesting story in today's Politico on the GOP legislative agenda--interesting in part because it features Rep. Steve Stivers.

September 24, 2012 update: Two books about Congress, one of them Mann and Ornstein's It's Even Worse Than It Looks, are reviewed by Ezra Klein in a recent issue of The New York Review of Books. It's premium content, so here's hoping this link will work.

Saturday, September 22, 2012

National Book Festival 2012

It's now officially fall, which means it's time to check out the Library of Congress's National Book Festival on the Mall. Here's a list of featured authors.

Saturday, September 15, 2012

The Skaters

The Autumn 2012 class of Glenn Fellows at the National Gallery of Art, in front of Gilbert Stuart's portrait of Sir William Grant on ice skates. From left to right: Joe Sadek, Program Coordinator; Jade Holmes, Maggie Murdock; Erin Moeller; Ken Kolson, Director; Paige MacMorland; Marissa Cooper; Katie Colburn; Grace Fry; Adam Kase; Maddie Fireman; Amber Seira; Leah Apothaker; Isaac Choi.

Wednesday, September 5, 2012

Why Is There No Socialism in America? (reprise in honor of the Democratic National Convention)


In a lecture reprinted by The New York Review of Books, the late Tony Judt of New York University tells us that this query—why is there no socialism in America?—was posed a century ago by a German sociologist, Werner Sombart. The question remains pertinent, for reasons that I try to explain below, despite the enactment of a great deal of “social democratic” legislation in the course of the twentieth century.

Judt’s lecture explores some of the many answers that have been formulated in response to Sombart’s question. I was surprised, however, that Judt never mentions Louis Hartz (1916-1986), a political philosopher with an original take on American political history that he published during the McCarthy Era as The Liberal Tradition in America (New York: Harcourt, 1955).

Somewhat surprisingly, Hartz’s answer boils down to this: there can be no genuine socialism in America because there was never any genuine conservatism here. And we have no conservatives because in the New World there was no Old Order to conserve. Early settlers came to the British colonies in North America in an effort to get away from vestiges of feudalism (primogeniture and the divine right of kings, for example) that retained their oppressive potency in Europe. We Americans are the descendants of religious dissenters and others who voted with their feet against the Old Order. The deal was sealed when our few remaining Tories, aristocrats, and monarchists escaped, or were chased, to Canada after the American Revolution.

Canada, in fact, proves Hartz’s point. Even today there are a few honest-to-God Tories in Canada, and roughly the same number of authentic socialists, and neither feels obliged to offer apologies for itself. The result, to take just one example, is that the Canadians were able to create something akin to socialized medicine; it couldn’t be rejected, as it has been in the U.S., as part of a wholly alien tradition.

In the United States, by contrast, liberalism (think John Locke, for whom society is atomistic, i.e., the sum of its individual parts) is the only tradition we have. Some American liberals may be inclined to promote equality, even at the expense of personal liberty; Hartz calls them “liberal democrats.” Others may favor liberty over equality; Hartz calls them “liberal whigs.” We have neither a Far Right reminiscing about an organic, corporate order dominated by a benign and paternalistic gentry, nor a Far Left intent on overthrowing bourgeois capitalism and replacing it with a collectivist Social Welfare state (i.e., a Workers’ Paradise). The good news is that there is nothing in our tradition for fascism to feed on. Never mind all the dire warnings over the years about indigenous fascism that have been issued by the Far Left; the closest we’ve ever come was Father Coughlin in the 1930s, and that wasn’t very close. BTW, that's Ben Shahn's image of Father Coughlin and his Hitlerian fist pump up top.

The result, according to Hartz, is that American politics oscillates between the two “extremes” of liberal democracy and liberal whiggery, which aren’t extreme at all, but variations on the same theme. Thus, it is very much in the Hartzian tradition for Judt to pose the following musical question about American politics: “Why is it that here in the United States we have such difficulty even imagining a different sort of society from the one whose dysfunctions and inequalities trouble us so?” It’s because our liberal tradition is so capacious it makes everything else seem beyond the pale.

In the United States, the liberal democrats (i.e., people like Judt) have traditionally had the upper hand. This is because they (unlike, say, the author of Federalist No. 10) have no real reservations about majority rule, and they know how to appeal to majoritarian instincts, some of which are not very honorable (e.g., the abolition of debts). Liberal whigs (e.g., today’s Republicans) have a harder time of it, because if they articulate their principles clearly they run the risk of offending the many who stand to profit from majority tyranny. Still, the liberal whigs are able to compete by planting seeds of fear and doubt in the American democrat. Conjuring up the rags to riches fantasy (e.g., Andrew Carnegie’s “gospel of wealth”) allows the American right, such as it is, to enjoy what Hartz called the Great Law of Whig Compensation, by which he meant that for the death of Hamilton (and genuine Toryism) they are rewarded with the perpetual triumph of McKinley (an Ohioan, of course). You take what you can get. Come to think of it, Hartz himself was born in Youngstown, the son of Russian Jewish immigrants.

Still with me? Hang on, there’s just a bit more. Implicit in Hartz’s description of a consensual and monotonous liberal order is the idea that the parameters of American political discourse are unusually narrow. Tony Judt is on exactly the same page when he says, apologizing for the academic jargon, that the great shortcoming of American politics is discursive. One of the effects of that is that the stakes of American politics are fairly low, though politicians do everything they can to try to make them seem much higher, especially during an election year.

Some will note that the U.S. has had its collectivist moments: the Progressive movement at the turn of the twentieth century; the New Deal during the Great Depression; Lyndon Johnson’s Great Society. And that is true, though each was more of an improvisation than part of a Grand Design, which explains why American institutions differ so markedly from their European counterparts. During our spasms of Social Democracy (to use Judt’s term) in the 1900s, the ‘30s, and the ‘60s, we were trying to solve practical problems; we harbored no wish to create a Brave New World. From the days of Benjamin Franklin at least Americans have been practical-minded empiricists (the Branch method, rather than the Root), not theoreticians.

What Judt has to say at the very end of his lecture is extremely interesting. He is clearly disgusted with the American left for not recognizing that it “has something to conserve,” i.e., the collectivist, social democratic heritage of the twentieth century. He notes that the left often seems intent on apologizing for its own legacy. Judt also criticizes the left for not recognizing that the right (thanks largely to George W. Bush, though he doesn’t say that in so many words) has put itself in the awkward position of advocating utopian ideas such as not worrying about budget deficits (“Deficits don’t matter,” according to Dick Cheney) and making the world safe for democracy. The right, according to Judt, “has inherited the ambitious modernist urge to destroy and innovate in the name of a universal project.” They ought to feel more uncomfortable in that position than they seem to be.

This, in my view, is astounding, especially when one considers that (quoting Judt again, but now with a bow in the direction of Charles Lindblom) “If we learned nothing else from the twentieth century, we should at least have grasped that the more perfect the answer, the more terrifying its consequences.” (Consider, for example, Hitler’s answer to “the Jewish question,” or Stalin’s answer to the challenge posed by the kulaks--that is, prosperous peasants--whose very existence as a class was an affront to Marxist ideology.) Yes, what we have here is another argument for muddling through.

Saturday, September 1, 2012

Autumn 2012 Glenn Fellows Visit VOA

From left to right: Jade Holmes, Amber Seira, Adam Kase, Marissa Cooper, Maggie Murdock, Erin Moeller, Katie Colburn, Paige MacMorland, Leah Apothaker, Maddie Fireman, Grace Fry, Isaac Choi.

Tuesday, August 28, 2012

Mr. Justice Scalia and the Moritz College of Law (reprise)



In one of my first posts on this blog I observed that easterners are inclined to dismiss midwesterners as rubes and that Glenn Fellows, who tend to be professionally ambitious and have every reason to be, forget or ignore this at their peril.

There could be no more dramatic example than that provided a few years ago by Antonin Scalia, Associate Justice of the United States Supreme Court. As Adam Liptak reported in May, 2009, in The New York Times, Justice Scalia, speaking at American University in Washington, D.C., explained to an audience of law students that their chances of landing a clerkship with a Supreme Court justice were slim or none because those plums are reserved for students from America’s most prestigious law schools. According to Liptak, the “hard truth” is that “Over the last six years, the justices have hired about 220 law clerks. Almost half went to Harvard or Yale. Chicago, Stanford, Virginia and Columbia collectively accounted for 50 others.” Liptak reports that “Justice Scalia said he could think of one sort-of exception to this rule favoring the elite schools.” To wit:


"One of my former clerks whom I am the most proud of now sits on the Sixth Circuit Court of Appeals” in Cincinnati, the justice said, referring to Jeffrey S. Sutton. But Justice Scalia explained that Mr. Sutton had been hired by Justice Lewis F. Powell Jr. after his retirement and then helped out in Justice Scalia’s chambers. “I wouldn’t have hired Jeff Sutton,” Justice Scalia said. “For God’s sake, he went to Ohio State! And he’s one of the very best clerks I ever had.”

As one can readily imagine, Justice Scalia’s remarks inspired a kerfuffle in Buckeyeland. The Columbus Dispatch reported that Scalia was “not a big fan of OSU law graduates,” and the Ohio State Bar Association objected to the “insult” and issued a sharp rejoinder, arguing that “Intellect, skill and fundamental integrity are not measured by the school someone attends. Birthright, money, LSAT scores and magazine rankings of law schools are not the standards by which this profession judges itself.” My reading of this story is that Justice Scalia was conveying brute facts that are not really in dispute, and that his enthusiastic endorsement of Judge Sutton indicates that he understands that the prejudice in favor of elite law schools ultimately is not entirely rational. True, he would seem disinclined to buck the system from which he has profited, yet I think it’s pretty clear that his “For God’s sake” remark was intended as irony. They learn that sort of thing at the elite law schools, such as Harvard, where Scalia earned his law degree.

September 14, 2009, update: Further evidence that Harvard law graduates tend to be lovers of irony comes from an AP story that Lawrence Hurley cites in his Supreme Court blog, Washington Briefs. Elitist joke alert: Asked if too many of the justices came from elite law schools, Chief Justice John Roberts says no—some went to Yale (AP).

Monday, August 27, 2012

John Glenn Honored at Progressive Field

John Glenn threw out the first ball prior to the Indians' game with the Yankees yesterday. He also talked to reporters about Neil Armstrong, space travel, and what it's like to have Ted Williams as your wingman.

Friday, August 24, 2012

Rationality and Public Policy Making (reprise)


It's early in the semester, which means that we'll soon be taking a close look at Eugene Bardach's A Practical Guide for Policy Analysis. Bardach's book has always struck me as a kind of Rorschach test. While Bardach recognizes that policy analysis is "more art than science," he is, ultimately, an optimist. He thinks that public policy is improved when it is informed by rigorous empirical research. As a dyed-in-the-wool futilitarian, the Washington Buckeye is less sanguine about the prospects of rationality in the policy-making process, but he tries to suspend disbelief.


The October 8, 2009, issue of the New York Review of Books had a remarkable article that bears on the issue: "The Anarchy of Success," by William Easterly, an economics professor at NYU. The article is a review of two books, Leonard Mlodinow's The Drunkard's Walk: How Randomness Rules Our Lives, and Ha-Joon Chang's Bad Samaritans: The Myth of Free Trade and the Secret History of Capitalism.

Here's the nub of the argument. Easterly says that the phenomenal rates of economic growth enjoyed by Hong Kong, South Korea, Singapore (see skyline photo above), and Taiwan in the period between 1960 and 2007 inspired a tsunami of research by economists eager "to find in the empirical data which factors reliably lead to growth. Yet hundreds of research articles later, we wound up at a surprising end point: we don't know."


Think of it. After the investment of billions and billions of dollars and Euros in the righteous cause of economic development, we actually don't know the causes of growth. According to Easterly, summarizing Mlodinow, economists have identified 145 factors associated with growth, but "most of the patterns were spurious, because they failed to hold up when other researchers tried to replicate them." As for Bad Samaritans, Easterly says that Chang criticizes "those who have made overly strong claims for free trade and orthodox capitalism, but then he turns around and makes equally strong claims for protectionism and what he calls 'heterodox' capitalism, which includes such features as government promotion of favored industries, state-owned enterprises, and heavy regulation of foreign direct investment."
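Easterly's point about spurious patterns is easy to illustrate with a toy simulation of my own (a minimal sketch, not anything drawn from Easterly or Mlodinow; the sample size, the 5% significance cutoff, and the use of Python with numpy are all assumptions made purely for illustration). Generate random "growth" numbers for fifty imaginary countries along with 145 candidate "factors" that, by construction, have nothing to do with growth; test each factor for a statistically significant correlation; then try to replicate the discoveries in a fresh sample.

# Purely illustrative sketch (assumed tooling: Python with numpy), not code from the books under review.
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_factors = 50, 145      # 145 echoes the count of candidate factors cited above

def significant_factors(rng):
    growth = rng.normal(size=n_countries)                 # "growth" here is pure noise
    factors = rng.normal(size=(n_factors, n_countries))   # and so are the candidate factors
    hits = set()
    for i in range(n_factors):
        r = np.corrcoef(factors[i], growth)[0, 1]
        t = r * np.sqrt((n_countries - 2) / (1 - r ** 2))  # t-statistic for a correlation
        if abs(t) > 2.01:                                   # roughly the 5% two-tailed cutoff at 48 df
            hits.add(i)
    return hits

first = significant_factors(rng)      # "discoveries" in the original sample
second = significant_factors(rng)     # attempted replication with fresh data
print(len(first), "factors look significant at first;", len(first & second), "replicate")

Run this way, a handful of factors (roughly seven, on average) clear the significance bar in the first sample purely by chance, and essentially none of them survive the second pass. It is a small-scale caricature of the replication failures Easterly describes.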

Could it be that "the science of muddling through" is the best we can do?

Monday, August 20, 2012

Autumn 2012 Glenn Fellows at National Building Museum

Back row, from left to right: Adam Kase, Grace Fry, Erin Moeller, Jun-yong Choi, Amber Seira, Jade Holmes, Paige MacMorland, Katie Colburn. Front row, from left to right: Margaret Murdock, Leah Apothaker, Maddie Fireman, Marissa Cooper.

The Federal City


After the Constitution of the United States went into effect in 1789, the government proceeded to make a number of momentous decisions, some of which had to do with the finances of the precarious new republic. Congress had been granted the power to levy taxes, to regulate interstate commerce, and to print money—all of which had been denied the Congress under the Articles of Confederation. But the challenges were many, including the issue of who would be responsible for repaying debts incurred during the American Revolution. Some of the states had made an effort to retire their loans, but others had not. Our creditors included both individual Americans and foreigners, and it wasn’t clear whether the states respectively or the national government under the new Constitution should bear the burden of repayment.

The first Secretary of the Treasury, Alexander Hamilton, who harbored a vision of a “strong, well-mounted government” and a bustling commercial republic, viewed the national debt as a national blessing—up to a point, at least. Hamilton proposed that all of the nation’s public debt be assumed by the new national government and funded at par, a policy that enriched the many speculators who had bought up depreciated war bonds during the hard economic times of the 1780s. In addition to making some people rich (and in effect buying their loyalty to the new republic), Hamilton also proposed the creation of a national bank and investment in infrastructure, that is, “internal improvements” such as roads and canals. To win Congressional approval of this highly controversial plan, Hamilton had to negotiate a deal with those harboring a more modest, agrarian vision of America’s future, particularly the two Virginians, Thomas Jefferson and James Madison. A deal was cut over dinner at a New York townhouse: Hamilton’s financial measures would be approved by the Congress, but in return states that had paid off their debts would be reimbursed by the federal government ($1.5 million in the case of Virginia), and the national capital would be moved away from the northeast, where the commercial classes were prominent, to a location more convenient for and receptive to the rural and slave-holding south.

The issue of the national capital was addressed by Congress with the Residence Act of 1790, which authorized President George Washington to select a location somewhere along the Potomac. Unsurprisingly, Washington favored a spot that was below the fall line and not too far from Mount Vernon; to implement the plan, Washington recruited aides, including Hamilton, whom he had learned to trust during the Revolution.

Enter the shadowy figure of Pierre Charles L’Enfant, the man whose name is synonymous with the design of the city of Washington, DC. L’Enfant had come to the New World to help General Washington win the Revolutionary War. He made himself useful at Valley Forge, and he did some networking among the officer class through the terrible winter of 1777-78. Afterwards, he employed his talents—many of them artistic—to further the creation of the Society of the Cincinnati, which some people regarded as an American version of the English House of Lords. It was L’Enfant who designed Federal Hall in New York, the building where Washington was sworn in as president of the United States on April 30, 1789, and he earned something of a reputation for what we would call “event planning.” After passage of the Residence Act, L’Enfant offered his services as designer of the city that would arise in the new Federal District straddling the Potomac.

Though L’Enfant was enamored of life in the New World—he wanted to be called “Peter,” for example—it was natural for him to look to his home town, Paris, for inspiration. That suggested the standard baroque playbook: geometric plans with radiating boulevards, public squares with their neoclassical palazzos, obelisks, and equestrian statues, and long axial vistas—elements suitable for military parades and revues and for exploiting the local topography. The whole composition was an implicit rejection of the humble Jeffersonian gridiron that was to become ubiquitous throughout the rest of urban America.

The result is that among cities in the United States, Washington is unique, and has always been so. L’Enfant thought that the several states would take responsibility for developing “their” grand avenues and piazzas, and that the city as a whole would issue from these nodes like a puppy growing into its paws. That happened in the end, but it took the better part of a century. During that time Washington was ridiculed as an “embryo capital,” featuring “squares in morasses,” and “obelisks in trees,” a city of “magnificent distances,” with tree stumps in the boulevards and a swamp dividing the President’s House from Jenkins’ Hill (i.e., Capitol Hill). For many decades, L’Enfant’s plan seemed a hopelessly grandiose exercise in futility. Benjamin Latrobe called it a “gigantic abortion.”

L’Enfant himself, unfortunately, was a prideful and somewhat prickly character who rubbed DC’s commissioners the wrong way, alienated the most powerful local landowner, and finally wore out his welcome with President Washington. L’Enfant was dismissed in February of 1792, and an imperfect version of his plan (see image above) was executed by the surveyor Andrew Ellicott. Rather quickly, L’Enfant drifted into obscurity, along with, after 1800, most of the leaders of the Federalist party that had been led by his patrons.

Washington, DC, began to look like a proper national capital only with the growth of government that accompanied the Civil War, with soldiers, bureaucrats, construction crews, office-seekers, and prostitutes descending upon the capital city. But the growth that ensued was higgledy-piggledy, unguided by the L’Enfant plan, which was neglected along with memory of the man himself. The elderly L’Enfant lived as the “permanent houseguest” of kindly friends at Warburton Manor, where he spent his time petitioning Congress for proper recognition of his service to his adopted country. He died and was buried in an inconspicuous grave in 1825.

Recovery of L’Enfant’s original vision was spurred by the professionalization of landscape architecture and the popularity of Beaux-Arts classicism during the Gilded Age. The watershed event was the Chicago Fair of 1893—formally, the World’s Columbian Exposition celebrating the “discovery” of America. Through the Senate Park Commission, also called the McMillan Commission, Progressive politicians called for recommitment to the basic principles of L’Enfant’s plan; their wooden models are on permanent display at the National Building Museum. As for the long-neglected Major L’Enfant, his mortal remains were exhumed in 1909; his grave now occupies a place of honor near the front of the Custis-Lee Mansion in Arlington National Cemetery.

L’Enfant’s original plan is easily discerned in the modern city. The Victorian train station on the National Mall was eventually removed, part of a deal struck to build Union Station, Washington’s most eloquent tribute to the Chicago Fair. Tiber Creek, which L’Enfant turned into a canal, was covered over, finally giving way to Constitution Avenue. Until fairly recently, Washington still had many of the features of a somewhat sleepy Southern city, racial segregation being only the most lamentable of these. As late as the early 1960s, it was still possible for President Kennedy to joke about the city’s unique combination of “southern efficiency” and “northern charm.” Before long, the Capital Beltway and the Metro had transformed the black-and-white city that had dazzled Senator Jefferson Smith when he came to Washington in the person of Jimmy Stewart. Architectural controls and building height limitations have preserved much of the spirit of the L’Enfant plan.

And now, with publication of Scott W. Berg’s Grand Avenues: The Story of Pierre Charles L’Enfant, the French Visionary Who Designed Washington, D.C. (New York: Vintage, 2008), we have a biography worthy of the city that took shape so gradually over a long span of time. Berg shows us that the distinctiveness of Washington, D.C.—its beauty, most would be willing to say—is due entirely to its designer’s recognition that this city, unlike all others, “would not happen; it would be made.”

June 20, 2012 update:  See this piece by Amanda Hurley on the recent flurry of adventurous architectural activity in DC.   

Thursday, August 9, 2012

Perfect Practice Makes Perfect (reprise)



Because the Washington Academic Internship Program emphasizes the importance of public service, and because our students—Ohio State juniors and seniors all—will soon be venturing out on the job market, we devote a fair amount of attention to career planning. We have found that our alumni are a valuable resource on this front, both as mentors and as guest speakers or presenters. And we are very proud that a fair number of former Glenn Fellows find their way into public service jobs in the nation’s capital. I have heard Senator Glenn estimate that about 20% of our students end up in D.C. I would guess that the percentage these days—perhaps because the Washington-area job market is not as distressed as that of Ohio—is actually closer to 25%. Placement is an important enough part of our mission that it is one of the metrics by which we would want to be judged.

That is why we schedule a presentation early each quarter by Julie Saad, a former Glenn Fellow who works as an analyst at the Office of Personnel Management. It’s also the reason we like to introduce the Glenn Fellows to Presidential Management Fellows and OSU alumni who work in Congressional offices. We invite civil servants with hiring authority to critique the fellows’ résumés, and we pay attention to employment patterns, hiring practices, and training opportunities.

That’s why I recently picked up a book that a former Glenn School colleague, Ryan Meadows, had on his reading list for M.P.A. students a few years ago. The book, written by Geoff Colvin, a senior editor at Fortune, is called Talent Is Overrated: What Really Separates World-Class Performers from Everybody Else (New York: Vintage, 2008). A central tenet is that nurture is more important than nature, which is why Colvin’s book would be more accurately entitled Innate Talent is Overrated. But never mind….

Colvin’s is a positive message, in that being a great performer does not in any serious sense depend on having a special “gift” for one’s chosen profession. People aren’t born with or without the innate ability to hit a three-iron like Tiger Woods, plot chessboard moves like Garry Kasparov, or belt out a tune like Luciano Pavarotti. And being a first-rate scholar is not all about IQ. The skills required to excel in any line of work have to be acquired—through practice. But Colvin—and this is the “bad news”—argues that people in general and business corporations in particular have very little understanding of what one has to do to acquire the skills necessary to work at world-class levels. That means that while some people might be willing to put in long hours of arduous effort, they may not know how to practice the right way, and so their efforts will be futile.

Colvin develops his thesis with great care, and he relies on a number of case studies that are fairly compelling. Colvin’s portrait of Tiger Woods, which was written prior to Woods’s mortification, focuses on Earl Woods’s fanatical devotion to his son’s training; they were on the course together by the time Tiger was two years old. Judging from Colvin’s account, one wonders whether Earl Woods was more obsessed with nurturing genius than any man since Leopold Mozart.

Or consider the case of the Polgar sisters of Budapest. Their father was a psychologist committed to the proposition that geniuses are made, not born, and he purposefully set out to prove it by turning his children into chess prodigies. Neither he nor his wife was an accomplished chess player, so no innate talent was involved. His efforts at home-schooling proved to be completely successful, largely because he devised the right kinds of drills to structure his daughters’ practice.

Being a genius, in other words, is all about being willing to endure the regimen of what Colvin calls “deliberate practice,” which is not just going through the motions over and over again, but an entirely self-conscious process of constantly pushing the envelope of one’s competence. In order to become an Olympic champion ice skater, for example, Shizuka Arakawa had to endure at least twenty thousand episodes of failure, because that’s what deliberate practice is all about: “Landing on your butt twenty thousand times is where great performance comes from.”

I’m betting that Geoff Colvin is not a baseball fan, for if he were, he would have known to invoke Cal Ripken, Jr., as the quintessential product of the training regimen of deliberate practice, a regimen devised by his father, Cal Ripken, Sr. (see photo above). Much like Colvin, Ripken père rejected the idea that “practice makes perfect”; in fact, he insisted that “It’s not practice that makes perfect, but perfect practice that makes perfect.” For Ripken fils this meant self-consciously repeating drills designed to address whatever his inadequacies were at a given point in his development as a shortstop and hitter—the baseball equivalent of falling on his butt twenty thousand times. It made the legendary “iron man” a first-ballot Hall of Famer.

There is another world-class innovator missing from Talent Is Overrated, and his story is dramatically conveyed by Dava Sobel in her Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time (New York: Penguin, 1996). His name is John Harrison, an eighteenth-century clockmaker whose innovations resulted in the perfection of a timekeeping device that was accurate and reliable enough to determine longitude at sea. Harrison’s is an unforgettable story of sheer, mind-boggling tenacity over four decades during which the British parliament kept raising the bar, sending Harrison back to his workshop over and over again to improve his marine chronometer. It’s a case study that Colvin should have cited because it demonstrates—conclusively, to my mind—that innovation is based on knowledge and the mastery of sharply focused technique (deliberate practice), and that it is foolish to think, as do some admirers of the cult of amateurism, that “too much knowledge of the domain or familiarity with its problems might be a hindrance in creative achievement.”

There is another lesson in Talent Is Overrated to which Glenn Fellows ought to pay heed: career planning isn’t just about landing a desirable entry-level job in one’s chosen profession. It’s about maintaining and adding to the skills associated with high performance on the job. Finally, one should be encouraged by what Colvin has to say about the inexorable effects of aging. It turns out that outstanding performers “suffer the same age-related declines in speed and general cognitive abilities as everyone else—except in their field of expertise” [emphasis added]. In short, ongoing professional development and career planning are lifelong enterprises, to be sustained up to and even into retirement.

Saturday, August 4, 2012

The Strange Career of Pithole City (reprise)

This is week nine for the Summer 2012 edition of the Washington Academic Internship Program, which in this unique new semester spells The Beginning of the End. By way of conclusion, I like to consider several public policy classics, including Garrett Hardin's "The Tragedy of the Commons." The essence of the tragedy of the commons is fouling one's own nest, and this quarter I'm asking the fellows to read a case study that I recently published. It's about the environmental degradation accompanying the world's first oil boom, which occurred in the 1860s not far from where I grew up--though it antedated me by a few years--in western Pennsylvania. There is a link to my essay, "Pithole City: Epitaph for a Boom Town," over on the right-hand side of this blog. And here is a link to a 7-minute summary of the astonishingly brief but intense history of Pithole City. The photo above is the view down Second Street today. Obviously, Pithole exists today mainly as an archaeological site; it could scarcely even be called a ghost town.