Wednesday, February 27, 2013
The following review has been posted on the Books page of the History News Network.
In at least some respects, it's hard to imagine a better author for a primer on the historical profession than James Banner. A man who went to college (Yale) in the fifties, attended graduate school (Columbia) in the sixties, taught at a major university (Princeton) in the seventies, and has played a series of roles outside of academe ever since -- including co-founder of the History News Service, a direct forerunner of HNN (where he currently sits on the advisory board) -- Banner has pretty much seen it all. He is thus able to write with authority on a wide variety of topics of interest to aspiring or working historians, far beyond what he describes as the "academic trinity" of research, teaching, and institutional service.
Though he has enjoyed a series of elite perches in a career that has circled around major research universities, the core argument of Being a Historian is a strongly ecumenical one: there are many ways to unselfconsciously claim that identity. Yes: being a historian always involves writing -- but not necessarily monographs, much less books, in an age of blogs, video and other new media. Yes, historians are always teachers -- but not necessarily at colleges, much less classrooms, at a time when "students" of history come in all shapes, sizes, and walks of life. Unfortunately, he notes, far too many people absorb the message that a meaningful career turns around the circumscribed compass of the tenured professor. The reasons for that, he explains, are largely historical (and he duly traces the emergence of the profession in the late nineteenth century and its solidification in the first half of the twentieth). But the last few decades in particular have witnessed a vastly expanded scope and reach for Clio in ways that its practitioners have failed to recognize, much less prepare for.
Perhaps not surprisingly, given his own professional journey, Banner pays a good deal of attention to public history in its manifold forms. The roles of national organizations like the American Council of Learned Societies and the National Endowment for the Humanities are described here, as are museums, historical sites, and other institutions with more of a generalist orientation. Banner includes two useful digressions on public intellectuals and expert testimony in his chapter on history outside the academy, the latter in particular a topic that has received little attention inside or outside the profession.
Among his more provocative observations are those that note the diminished role for style and voice in contemporary historical writing. "When have we read, as we do in literary and art criticism, of the literary influences on a historian, of the other historians from whom a scholar seems to take inspiration for argument, evidence, and style in addition to ideology and research approach, or of the shaping of traditions of historical writing and form rather than method or subject?" he asks in a footnote. "Historical knowledge is impoverished to the degree that question must be answered with silence." To put the matter more succinctly: "One often summons great respect for works of history without taking much pleasure in reading them."
All this said, there is an air of the unreal surrounding Being a Historian, because Banner seems to understand the principal problems with the profession in terms of misplaced perceptions and overlooked training opportunities, when its real crisis is grounded not in the preparation of scholars or in longstanding arguments about epistemology, but rather in its economic foundations. He does take note of longstanding controversies like unionization efforts among graduate students or the growing -- even dominant -- role of adjuncts in most history departments. At one point, though, he describes the reduction of institutional reliance on part-time faculty as "an effort broadly under way," which seems out of touch, because there is little discernible success in this regard among historians or anyone else in higher education.
Banner seems to conceive of becoming a historian in terms of talented and ambitious people assessing it as a possible career the way they might assess finance or health care. (Until recently, one might have said medicine or law, but these two fields are undergoing the kind of upheaval aspiring professionals might have hoped to avoid -- upheaval of the kind workers in other sectors of the global economy have taken for granted for a generation.) Instead, a decision to embark on a career as a historian today is a little closer to deciding to become an actor, musician or professional athlete, an ambition likely to instill concern if not terror in a parent's heart. As even the most insulated senior scholar now says as a virtual incantation: "There are no jobs."
Of course, this is not literally true. Baby Boomers are leaving the profession all the time, and someone will get that tenure-track job -- for a while longer, anyway. And yes, there's probably more demand than ever for certain kinds of public history professionals, albeit on a much shakier pecuniary basis. Amid the growing power and appeal of foreign universities (which will increasingly retain their own students), the proliferation of alternative online learning opportunities (whose existential threat to smaller liberal arts colleges is quickening), and the shrinking government role in financing research and subsidizing tuition (which is forcing schools to refocus their priorities, a.k.a. reduce their programs), change is coming to the historical profession -- academic or civic, public or private -- whether scholars want to acknowledge it or not.
To invoke a distinction Banner is careful to cultivate at the outset of this book: history will survive as a discipline, as it has for thousands of years. But as a profession? That's a lot less clear. A more likely scenario would seem to involve history as a disposition or focus for an individual teacher in a larger humanities field, one that has shrunk as a locus of student interest amid well-founded anxieties about finding a well-paying job (never history's strong suit in any iteration). I happen to chair the history department of a private high school where roughly half of the teachers I supervise hold doctorates, most, though not all, refugees from the university world. But if I have my way, our department will be evolving in a significantly interdisciplinary direction in which history will become one color on a palette from which students will practice the arts of reading, writing and thinking. I'll always consider myself a historian, even though my own doctorate is in American Studies. But that's a matter of personal loyalty, not livelihood.
James Banner is an ideal figure to tell us what it has meant to become a historian. His remarks surely have relevance for at least some current graduate students. But for what it will mean to become a historian: that volume has yet to be written. When it is, I suspect the book -- or, perhaps more accurately, "book" -- will look, sound, and feel substantially different than this one does.
Sunday, February 24, 2013
On the eve of the Academy Awards, CNN.com has run my profile of Daniel Day-Lewis's career, with a focus on his recent role as Abraham Lincoln. The piece draws on my analysis of his work in my new book Sensing the Past. Read all about it here.
Saturday, February 23, 2013
The main themes of my book Sensing the Past: Hollywood Stars and Historical Visions are discussed in a piece that has appeared online and will be part of the Sunday "Outlook" opinion section of the Washington Post this weekend. I feel very fortunate to have my work appear in this august publication.
Thursday, February 21, 2013
The following comments were prepared for my school's first Darwin Day symposium, February 21, 2013. I was asked to revise them, but declined to do so. I run the piece here largely as written.
As far as I can tell, all human beings have to grapple with two internal struggles. The first is a struggle to understand the world as it is. The second is a struggle to make sense of the gap between the world as it is and the world as we would like it to be. People in different times and places have dealt with that second struggle in varying ways. Sometimes, it’s been a matter of trying to accommodate ourselves to the world as it is – which often involves a belief that if we actually understand the world better, we will be able to come to terms with it more effectively. Other times, it’s been a matter of trying to change the world and bring it closer to the ideals we imagine. This notion of changing the world is the one we pledge allegiance to here at Fieldston. It is literally our mission.
But 21st century Fieldstonites aren’t the only people who have had this mission. Actually, the world in which Charles Darwin came of age – the world of 19th century Great Britain – was also a time when people believed they could change the world, a time when people spoke confidently of “progress” with a capital P. The reasons for this confidence were obvious: the railroad and telegraph were conquering time and space; factories were mass-producing goods in ways that promised to abolish scarcity (and, with it, slavery, something Britain did decades before the United States). The source of this technological mastery was modern science, which unlocked the secrets of steam, and electricity, and iron, allowing human beings to manipulate them toward desired ends.
This was the context in which The Origin of Species was published in 1859, a world in which Darwin was a product of knowledge as much as he was a producer of knowledge. In constructing a fact-based story about rocks and birds with great explanatory power, Darwin has helped a great many people understand the world as it is.
But in offering an explanation of the world as it is, Darwin created manifold problems for those trying to understand the gap between this world and the one we wish it to be. As many of us know, Darwin himself was deeply troubled by the religious implications of his work, so much so that he sat on his findings for many years until it became clear the world was going to learn what he had discovered anyway. But I am here to tell you that Darwin’s work remains a problem for a great many more people than those who happen to believe the world was created by God in seven 24-hour days. In the late 19th century, some people took Darwin’s ideas – over Darwin’s own objections – and used them to explain the fate of the poor and weak as a function of their own inferiority, a concept that has come to be known as Social Darwinism. Social Darwinists argued that phenomena like poverty were explained by maladaptive genes that made some people unable to function, much less compete, in modern society. Some of the people who are today regarded as heroes of the modern contraception movement sincerely believed that we’d be better off if people of some races and ethnicities were never born.
We know better. Or, I should say, we “know” better. Today the tales we tell based on Darwin’s facts are harnessed for liberal rather than conservative ends. Instead of focusing on the grim determinism of genetic inheritance, we prefer to dwell on the sunny side of environmental adaptation. We speak of “plasticity” and “learned behavior.” We like to think there’s an affinity between the sustainability language of ecological diversity and the progressive education language of multicultural diversity. We tend not to dwell on the random components of evolution, because we want to believe that we are agents, if not masters, of our own futures. So we speak of our identities as chosen, and refuse to accept the proposition that biology is destiny.
I’m not saying any of this is wrong. I am saying that the very logic of Darwinian evolution specifically, and the scientific enterprise generally, rests on the interpretation of evidence that is always contingent. As I understand it, the paradigmatic scientific proposition goes like this: “We used to think x; now we know y.” The facts don’t necessarily change – new ones may appear – but what we interpret those facts to mean is subject to ongoing revision. Newtonian laws of gravity were fixed, until Einstein came along and they weren’t. Evolution tells us that things change gradually – until, as the fossil record shows, something cataclysmic happens and they don’t. What you know is really a matter of what you believe at any given time.
Let me tell you some of the things I believe at this time:
- I believe, based on an ample historical record of sea and air travel, that the earth is, for all practical purposes, round.
- I believe, based on written authority I trust and what Paul Church has told me, that this earth is billions and billions of years old.
- I believe, based on first-hand evidence, that there is no direct correlation between gender and intellectual capacity.
- I believe, largely because I want to and because I can’t keep the studies straight, that coffee and chocolate won’t kill me.
- I believe, at least in part because I’m her father, that my daughter is adorable.
- I believe, because I’m not aware of any conclusive evidence to the contrary, that there really is an intelligent design that can explain evolution and all the rest, and that Jesus Christ has something to do with it.
I myself have not tested any of these beliefs scientifically. I happen to spend my time chasing other truths. But I’m not aware that any of these assertions are demonstrably scientifically false. (Of course they’re not scientifically true, either, because they don’t rest on positivistic, falsifiable propositions.) I invite you to share my beliefs. But I don’t take for granted that you will embrace all of them. Actually, I’m a little awed by how much I don’t know.
So that’s what I believe. What do you believe? And how strong is your faith?
Monday, February 18, 2013
He was born into a small-time aristocracy—and let’s put more emphasis on “small-time” than “aristocracy,” if indeed a self-proclaimed elite on the edge of the world can be called aristocratic. Because he lost his father as a child, the family’s status was insecure, and the boy was forced to rely on his older brother for his education. But there is little doubt he was ambitious. He picked up the useful skill of surveying, handy for the real estate speculation that would so decisively shape his fortunes. He also showed a penchant for making useful political connections, managing to get himself named a non-commissioned officer in the army, only to endure a series of military blunders in the French and Indian War. He’d never be a real gentleman or a true officer in the British empire. Still, a provincial American Dream – marry into money, get some land, get some slaves – was more than within reach. Had he been run over by the proverbial wagon in 1772, when the portrait to the left was painted by Charles Willson Peale, you’d have to say: This guy did all right for himself.
In the end, as we know, he did more than all right, and for more than just himself. There are three reasons why, which I’d like to point out as we shop, vacation, nap, or do all those other good things a Presidents’ Day holiday is for.
Courage. He was brave in the most literal sense—multiple accounts testify to his bravery under fire. But it took other forms, too. It’s easy enough to understand that leading an insurrection against the greatest empire in the world is not for the faint of heart, whether or not you happen to wield a gun. But I’m struck by some of the more subtle manifestations later in his life. For example, he pretty much knew that Alexander Hamilton and Thomas Jefferson, both of whom served in his cabinet and hated each other, were smarter than he was, and he knew that they knew it, too. That was all right with him, as long as they did good work. Which, as was true of many of the people who served under him, they did. There’s no doubt he was vain, and worried—you might say obsessed—about his reputation. But again and again over the course of his career, whether as a young soldier or as an old lion who reluctantly lent his name to a new Constitution, he risked that reputation. He wasn’t quite flawless in this regard (for example, he kept his mouth shut about slavery for a very long time), but even these limits—and the limits of those limits—are in their own way impressive.
Patience. You can see it over and over again: In working his way out of debt and making Mount Vernon economically self-sufficient; in waiting out the British in defeat after defeat in 1776 before finally pouncing at Trenton that Christmas (or at Yorktown five years later); in enduring severe privation at places like Valley Forge along with his men; in simply enduring a Congress that acted in ways he regarded as beneath contempt, both in keeping him supplied in war and in carping about his diplomacy in peace. There is no doubt the man had a temper, and it was not one you’d particularly care to have focused on you. (God help a runaway slave.) But that iron discipline could also bend toward justice, as in the will he wrote, implacably crafted to keep grasping relatives from denying his slaves their freedom.
Passion. This might seem like a joke. Him? That grim, seemingly unknowable visage in all those portraits, like the one we use to buy candy bars and cans of soda? But there’s finally no other way to understand a man who made the choices he did in transforming an abstraction into a nation. It wasn’t only courage and patience. You can see it in that blazing address he made to the officers who contemplated mutiny in the Newburgh Conspiracy – are you out of your minds, he asked in a white-hot fury, reducing them to tears – and in his desperation to leave the presidency before he died (no, I will not be King George, he said explicitly at Newburgh and implicitly and for the rest of his life). A red-blooded love coursed through the man’s arteries.
He wasn’t a saint. He was a man. A great man. If we lack the imagination to see that, to feel that, then we no longer deserve the freedom he made possible. This is a recurring fear in the tributes. And, alas, a justified one.
Happy Birthday, Mr. Washington. And thank you.
Thursday, February 14, 2013
Some thoughts on homework, and the existential angst it reflects
It’s a basic tension I’ve seen over my entire career as a high school teacher, but one that seems to have become more pronounced lately: the desire of my students to perform well, combined with complaints about an academic workload that peaks at predictable junctures in the school year (the week leading up to the holidays, the one leading up to spring break, and the looming end of the school year, among others). My school does more than most to provide circuit breakers to reduce the tension, among them homework bans during vacations, prohibitions against due dates for major assignments shortly after returning from vacations, and a test calendar that limits how many assessments students can have on any given day. Despite such strategies (or perhaps because of them – pushing down pressure at one point may just displace it to others), the anxiety never seems to go away, and indeed seems to become a topic of conversation not just among students, but also their parents, school administrators, and faculty.
I tend to react to such conversations with impatience. Sometimes this is a matter of disdain for my students’ lack of stamina; they seem to regard more than about 15 pages of homework reading, for example, as excessively onerous. (I think of one’s appetite and pace for reading as akin to being in good physical shape: the more you do, the easier it gets.) But since I often hear a subtext of blame in such expressions of anxiety – you teachers just assign too much work, seeming to forget students have other classes and real lives beyond what goes on in any given course – I also tend to mentally push back. You think I like assigning homework? I silently ask. Nothing would make me happier than not having to grade that stack of papers I’m obliged to assign before vacation and then read during that vacation. (I chuckle grimly when a student writes “Enjoy!” in an email when handing one in.) Everybody seems to want me to challenge students and coax them into a meaningful form of excellence, but nobody seems to like the cost excellence imposes. Indeed, some think they can wish it away, believing progressive educators like me can get students to work hard doing what they want without even knowing they’re working hard. Sorry: it doesn’t work like that. There are lots of different ways to work hard, but hard work is . . . hard. If it’s easy, it doesn’t mean much. That’s not to say that hard work is always meaningful, either. Indeed, one of my toughest intellectual challenges as a teacher is coming up with assignments that have some sense of larger value.
Of course, I’m bound up in a system I deplore, too. The main reason I assign essays and tests is that I’m paid to assign, read and grade them. If I didn’t assign them, I’d fail to afford students opportunities to learn, much of which happens as a result of the ineluctably solitary effort to express themselves, notwithstanding the legitimate coaching of teachers, parents, or (less legitimately) professional tutors. If I didn’t read them, I’d have little information on where my students actually are, which is the most obvious basis on which I’m assessed by parents and supervisors. And if I didn’t grade them, I wouldn’t be doing my part in performing the larger work of sorting students into tiers of perceived quality, a process nobody much professes to like, but which identifies those most likely to compete effectively in the college process.
The college process: that, finally, is what this is all about. We fight it, we deny it, but we can’t escape it at my school and thousands of others, public or private, urban or rural. School is typically experienced one student at a time, in one classroom at a time, in one school at a time. But education always happens in a much wider context, consciousness of which gradually encroaches on a family the closer it gets to a student’s graduation. Once upon a time, that context was statewide or national. Increasingly, it’s global. That’s true even at cash-starved second-tier schools, hungry for foreign students willing to pay full tuition. There are few better ways to exhaust oneself mentally than to think about how small one’s achievements are in a world that grows ever vaster the more it shrinks.
For a long time, I ascribed the pervasive anxiety surrounding college admissions to the vicissitudes of coming of age in a declining empire. Like British youth a century ago scrambling for places in an ever-shrinking Colonial Office, today’s students lack the luxury of their parents or grandparents, who could afford to regard their schooling with an air of detachment, confident that a robust national economy would find a place for them. A major reason that economy is less robust, of course, is foreign competition, which takes the form of everything from cheap products to plentiful engineers pouring out of foreign universities (and domestic ones). Under such circumstances an ambitious high school student can scarcely afford to take it easy, no matter how desperate that student may be to do so.
I still think there’s some truth to this analysis. But it now seems insufficient. A thirst for distinction seems to be universal. I don’t mean to say that everyone has it (many a parent has flailed in the face of this reality, not certain whether to accept it as final – some of those lollygagging students of previous generations really did get their act together). But at any given time, in any given population, there will always be people who want to excel, and such people will tend to congregate at schools like mine.
The rub, of course, is what excel means. Is it academic excellence? An intellectual bent? Superior social skills? Of course it’s any of these, and more. We’re all endowed with different levels of such indicators, which blend in different ways to constitute a standard of success in the world beyond the school. The world is big in this regard: there are lots of ways to be successful. But it’s also true that for any given form of success, there are always more people who desire it than there are places, and even those who occupy such places are typically restless about where they are and want something more.
This is our blessing; this is our curse. Having a goal sustains us, gives us a sense of purpose, allows us to believe the pains we endure may yet be part of a successfully realized larger design. But it breeds continuous discontent and persistent fears that we’re simply not good enough. What’s even worse is the suspicion – and eventually knowledge – that those fears are justified. Sometimes they stalk us even when we have attained our goals (“attainment” proving to be surprisingly difficult to define unless it’s unambiguously out of reach). So it is that we remain ever restless, boats against the current.
Tuesday, February 12, 2013
He's right there when I enter the classroom first thing in the morning, his gentle smile directly in my line of sight. That's just the way I wanted it. The photograph is in the public domain, and so I could have gotten it for free, but I was glad to pay an online poster company for an image that's about 3 feet tall and 2 feet wide. It came shortly before his 199th birthday. Now I celebrate every day.
It's a pretty famous picture. One of about a half-dozen we have engraved in our collective memory, trotted out by retailers for Presidents’ Day sales. It was taken by Alexander Gardner, a former assistant of the famed Mathew Brady, who got tired of Brady getting credit for his pictures and struck out on his own. Gardner had been out in the field taking pictures at the front, but came back to Washington, where he secured an appointment with the president. Though there's some dispute about the dating, the consensus is that it was taken on April 10, 1865, about four days before he died. This was just after the fall of Richmond, one of the few truly happy days of his presidency. Earlier that week, he'd gone to the Confederate capital itself and swiveled in Jefferson Davis’s desk chair (he had a rebel five dollar bill in his pocket that night at Ford’s Theater). He had the good grace to be embarrassed when a group of former slaves threw themselves at his feet on the street, thanking him for their freedom. It was God, not I, who freed you, he said. Only one day earlier, Lee had surrendered to Grant; for all practical purposes, the war was over.
One of the things I love so much about the picture is that smile on his face, slight but unmistakable. That's very rare. People tend not to smile in 19th-century photographs because exposure times were relatively prolonged, and such expressions seem fake if you have to sustain them for more than a moment. Of course, there was also the matter that he didn't have a whole lot to smile about in those terrible days. The fact that he was doing so here, just after his gargantuan task was accomplished and just before he became another casualty in the struggle, seems almost unbearably moving.
Indeed, the smile, real as it is, does not hide the deep sense of sorrow etched into his face. He fingers his glasses with a kind of absent-minded gentleness. His bow tie is slightly off-center; to the last he never lost his rumpled quality. He managed to retain a full head of jet black hair and beard, only slightly touched with gray. Yet there's something almost steely about them. Though his face seems about as soft as the bark on a tree, I find myself wishing I could run my hand across it. Walt Whitman had it right -- he's so ugly that he's beautiful.
But it's the eyes that haunt me. His right eye is a socket; he looks like he's half dead already. His left eye is cast downward slightly. It does not seem focused on anything in the room, but seems instead to be gazing within, saturated with a sadness that nothing will ever take away. They say he had a great sense of humor and loved cracking jokes to the very end, and I believe it. Surely there was no man on the face of the earth who could have savored a good laugh more. A look into those eyes could leave no doubt.
But the strongest impression conveyed by the photograph is one of compassion. Kindness as a form of wisdom. That's my aspiration. On Monday morning, this room will be filled with hungry, well-fed adolescents. Some will be laughing, some will be content. But surely it will do someone some good to have him there. He'll be gazing out for the discussion of Little Big Horn, the Pullman Strike, the New Deal, the request for an extension on the research essay, and lunch. Long after I'm gone, he will remain.
Happy 204th, Mr. Lincoln.
Friday, February 8, 2013
The long road from D.W. Griffith to Steven Spielberg
The following post went up today on the Oxford University Press blog.
Today represents a red-letter day – and a black mark – for U.S. cultural history. Exactly 98 years ago, D.W. Griffith’s Birth of a Nation premiered in Los Angeles. American cinema has been decisively shaped, and shadowed, by the massive legacy of this film.
D.W. Griffith (1875-1948) was one of the more contradictory artists the United States has produced. Deeply Victorian in his social outlook, he was nevertheless on the leading edge of modernity in his aesthetics. A committed moralist in his cinematic ideology, he was also a shameless huckster in promoting his movies. And a self-avowed pacifist, he produced a piece of work that incited violence and celebrated the most damaging insurrection in American history.
The source material for Birth of a Nation came from two novels, The Leopard’s Spots: A Romance of the White Man’s Burden (1902) and The Clansman: An Historical Romance of the Ku Klux Klan (1905), both written by Thomas Dixon. Dixon drew on the common-sense version of history he imbibed from his unreconstructed Confederate forebears. According to this master narrative, the Civil War was a gallant but failed bid for independence, followed by vindictive Yankee occupation and eventual redemption secured with the help of organizations like the Klan.
But Dixon’s fiction, and the subsequent screenplay (by Griffith and Frank E. Woods), was literally and figuratively a romance of reconciliation. The movie dramatizes the relationships between two (related) families, the Camerons of South Carolina and the Stonemans of Pennsylvania. The evil patriarch of the latter is Austin Stoneman, a Congressman with a limp very obviously patterned on the real-life Thaddeus Stevens. In the aftermath of the Civil War, Stoneman comes south, Carpetbagger-style, and deploys a brutish black minion, Silas Lynch(!), whose horrifying sexual machinations – focused, ironically and naturally, on Stoneman’s own daughter – are arrested only at the last minute, thanks to the arrival of the Klan in a dramatic finale that has lost none of its excitement even in an age of computer-generated imagery.
Historians agree that Griffith, a former actor who directed hundreds of short films in the years preceding Birth of a Nation, was not a cinematic pioneer along the lines of Edwin S. Porter, whose 1903 proto-Western The Great Train Robbery virtually invented modern visual grammar. Instead, Griffith’s genius was three-fold. First, he absorbed and codified a series of techniques, among them close-ups, fadeouts, and long shots, into a distinctive visual signature. Second, he boldly made Birth of a Nation on an unprecedented scale in terms of length, the size of the production, and his ambition to re-create past events (“history with lightning,” in the words of Dixon’s Johns Hopkins classmate, Woodrow Wilson, who screened the film at the White House). Finally, in the way the movie was financed, released and promoted, Griffith transformed what had been a disreputable working-class medium and staked its claim as a source of genuine artistic achievement. Even now, it’s hard not to be awed by the intensity of Griffith’s recreation of Civil War battles or his re-enactments of events like the assassination of Abraham Lincoln.
But Birth of a Nation was a source of instant controversy. Griffith may have thought he was simply projecting common sense, but a broad national audience, some of which had lived through the Civil War, did not necessarily agree. The film’s release also coincided with the beginnings of African American political mobilization. As Melvyn Stokes shows in his elegant 2009 OUP book D.W. Griffith’s Birth of a Nation, the film’s promoters and its critics alike found the controversy surrounding it curiously symbiotic, as moviegoers flocked to see what the fuss was about and the fledgling National Association for the Advancement of Colored People used the film’s notoriety to build its membership ranks.
Birth of a Nation never escaped from the original shadows that clouded its reception. Later films like Gone with the Wind (1939), which shared much of its political outlook, nevertheless went to great lengths to sidestep controversy (the Klan is only alluded to as “a political meeting” rather than depicted the way it was in Margaret Mitchell’s 1936 novel). Today Birth is largely an academic curio, typically viewed in settings where its racism looms over any aesthetic or other assessment.
In a number of respects, Steven Spielberg’s new film Lincoln is a repudiation of Griffith. In Birth, Lincoln is a martyr whose gentle approach to his adversaries is tragically cut short by his death. But in Lincoln he’s the determined champion of emancipation, willing to prosecute the war fully until freedom is secure. The Stevens of Lincoln, played by Tommy Lee Jones, is not quite the hero. But his radical abolitionism is at least respected, and the very thing that tarred him in Birth – having a secret black mistress – here becomes a badge of honor. Rarely do the rhythms of history oscillate so sharply. Griffith would no doubt be bemused. But he could take some satisfaction in the way his work has reverberated across time.
Tuesday, February 5, 2013
The following review has been posted on the Books page of the History News Network.
Here's a book that has its title right -- a statement worth making because so many stretch or bend them for marketing purposes. And that's only the beginning of the elegant distillation George Washington University political scientist David Shambaugh provides in this useful volume, which offers a detailed yet concise portrait of a nation widely perceived as on the cusp of what the Chinese government often ascribes to its American rival: hegemony.
But not that fast, Shambaugh says. While it's clear that China's rise has been wide, deep and rapid, it has a long way to go before it's truly a global rival for the United States. An effective response to that rise, he says, requires one to understand its contours, which are surprisingly jagged.
Shambaugh surveys China's place in the world by a series of metrics: diplomacy, economics, culture, and military prowess, among others. In every case he notes that the nation has made tremendous strides since Deng Xiaoping's transformative reforms beginning in 1978, reforms whose impact appears to be accelerating. And yet for a variety of reasons China falls far short of global dominance or influence. So, for example, its goods are flooding the world -- but not in elite, high-tech products. Its navy has been growing by leaps and bounds -- but its impact is largely limited to the western Pacific. It has an increasingly visible profile in international institutions, but its role tends to be passive, if not contradictory.
A big part of the reason for this, as Shambaugh explains, is a deep-seated sense of national ambivalence. Nursing a lingering sense of grievance for its century and a half of humiliation at the hands of Japan and the West from the early 19th to the mid 20th centuries, the Chinese government and its people view the prevailing international order with the skepticism, even hostility, of a Third World nation, even though such a label hardly describes it. At the same time, China's millennial understanding of itself as the Middle Kingdom makes it reluctant to push far beyond its territorial frontiers -- or to interact with other nations on a basis of genuine reciprocity.
These tendencies are only intensified by a Communist regime fretful of its grip on power at home, and inclined to weigh the domestic implications of any given foreign policy decision. Such a stance interferes with its stated goal of pursuing cultural influence along the lines famously described by Joseph Nye in his now-classic 2004 book Soft Power: The Means to Success in World Politics. Since the essence of soft power is extra-governmental, and the Chinese government tends to filter as much as it practically can through the Communist Party's institutional apparatus, this goal is ever at cross-purposes with reality. The result, as Shambaugh explains, is a belief that "not to agree with Chinese official policy or to be critical is seen as misunderstanding China." As in so many other domains, insecurity breeds truculence -- a stance that has always roiled China's relations with its neighbors, and which sometimes roils its relations with the rest of the world, as indeed it has since the global Great Recession, which the Chinese government viewed as more of a turning point in geopolitics than it has turned out to be, at least in the short term.
Under such circumstances, Shambaugh believes that the most practical approach to dealing with China is a constructivist one. Since "China is only shallowly integrated into the norms of global order, and it possesses little consciousness of global public goods or social 'responsibility,'" it's foolish to think it will accept long-prevailing Western protocols. It's even more foolish to believe that China can be contained, given its undeniable and growing economic and military might. Instead, Shambaugh believes, China must be pragmatically conditioned toward integration as a form of self-interest, with compromise as its most realistic option in achieving its goals. Actually, this sounds a little like a (soft power) version of containment as George Kennan originally imagined it -- a flexible instrument of adaptation rather than a knee-jerk reaction of opposition.
What of course no one can know is how long China will remain a partial power. Actually, a portrait of the United States in 1913 would in many ways be similar: an economically powerful, culturally marginal, and diplomatically ambivalent power that the rest of the world was waiting to see grow up -- and dreading at the same time. Shambaugh has sifted through a lot of data to draw a complex but discernible trajectory. History suggests, however, that unexpected events have a way of introducing fresh angles into the destinies of nations.
Friday, February 1, 2013
Springsteen Makes a Western
Among the many virtues of Bruce Springsteen’s music is a rich sense of history. And like many of those virtues, that sense of history has emerged organically over the course of his career. Springsteen’s first albums, Greetings from Asbury Park and The Wild, the Innocent and the E Street Shuffle, were marked by a powerful sense of immediacy; to a great extent, they’re records of the present tense. Beginning with the release of Born to Run, a consciousness of history – principally in the form of a growing awareness of past failure, and a desperate desire to avoid similar mistakes – begins to suffuse the minds of his characters. This consciousness is deeply personal, typically expressed, for example, in generational tensions between fathers and sons. That’s what I mean by “organic.”
By about 1980, Springsteen’s sense of history begins to get broader. It emerges in a series of forms, ranging from his decision to perform songs like Woody Guthrie’s “This Land is Your Land” (reading Joe Klein’s 1980 biography of Guthrie at the suggestion of his manager, Jon Landau, seems to have been a watershed experience) to recording original songs like “Wreck on the Highway,” avowedly patterned on the style of country & western singer Roy Acuff. His 1982 album Nebraska is saturated with a sense of the 1930s (his 1995 album The Ghost of Tom Joad even more so), and even deeply personal songs like “Born in the U.S.A.” connect the private struggles of their protagonist to much larger historical ones. This trajectory is a striking and impressive testament to an artist’s power to grow and integrate everyday life into a broader human drama.
One of the less remarked upon aspects of Springsteen’s body of work is his fascination with the West. This is, of course, counterintuitive – Springsteen is nothing if not the voice of New Jersey, an embodiment of urban, ethnic, working-class values and culture typically associated with the Northeast Corridor. But the western signposts are there, as early as “Rosalita,” which climaxes with a vision of triumphant lovers savoring their victory over paternal repression in a café near San Diego. That’s a fleeting reference. But beginning with Darkness on the Edge of Town – think of the “rattlesnake speedway in the Utah desert” of “The Promised Land” – the West becomes a vivid and indispensable setting for a number of songs. Springsteen being Springsteen, he’s not always content simply to invoke or use such settings in conventional ways. So, for example, the gorgeous yearning that marks his 1995 song “Across the Border,” redolent with the music, instrumentation, and language of the Southwest, is purposely ambiguous about which side of the border its protagonists long to reach. Springsteen’s mythic tendencies are often marked by creative friction with the concrete details and ironic realities of everyday life.
“Outlaw Pete,” the leadoff track on Springsteen’s latest album, Working on a Dream, represents the next turn of the wheel in a way that’s somehow predictable, surprising, and inevitable all at once. Superficially, the song, like the album as a whole, is something of a throwback, a return to the dense, lush, melodic pop songs that were once Springsteen’s stock-in-trade. At eight minutes long, it’s also the first time in decades that he’s recorded a mini-epic on the scale of “Incident on 57th St.” or “Jungleland.” For thirty years now, the overall trend in Springsteen’s work has been toward sparser, even minimalist songs that approach spoken-word records, though the approach here was first broached on Magic in 2007.
It’s almost jarring to hear his eager embrace of melodic hooks and multi-track harmonies. It’s also almost jarring in that “Outlaw Pete” so willfully introduces us to a protagonist who seems like a cartoon figure from an imitation John Ford movie, who “at six months old” had “done three months in jail” and “robbed a bank in his diapers and little baby feet.” Pete’s signature question, “Can you hear me?” seems like a childish insistence on attention. Some might be amused by such a description; others might be dismayed, even irritated, by its triviality. One could be forgiven for perceiving that Springsteen is slipping into superficiality in his advancing age, perhaps trying to recapture the sense of popular appeal that once seemed so effortlessly his.
But appearances are deceiving. More specifically, our perception of Outlaw Pete is deceiving. After hearing the seemingly requisite description of a horse-stealing, heart-breaking scoundrel – rendered in an amused voice that suggests the narrator views him as a figure closer to a rakishly charming Jesse James than a hard, frightening Liberty Valance – the story turns on a dime (the music, which shifts to a descending phrase of repeating notes, signals this) as Pete gets a vision of his own death that prompts him to marry a Navajo woman and settle down with a newborn daughter on a reservation. Yet in some sense the story is only getting started. A vindictive bounty hunter named Dan – another staple of western mythology – is determined to bring Pete down and precipitates a confrontation. “Pete you think you have changed but you have not,” Dan tells him, in so doing posing the existential question at the heart of the song: to what degree do we have agency over our character, and thus our fate? In the showdown that follows Pete is nominally the victor, yet Dan literally gets the last word in observing before his death that “we cannot undo these things that we’ve done.” The question “Can you hear me?” is turned on its head, as Dan speaks to Pete instead of Pete speaking to the world.
Pete, now a fugitive from the law, makes an ambiguous disappearance from the story. Is it to be understood that his encounter with Dan demonstrates the fixed nature of his personality and the impossibility of any lasting mortal redemption? Or is it an act of abnegation that protects his wife and daughter from the wickedness that surrounds him? The final verses of the song depict Pete’s daughter braiding his buckskin chaps into her hair – original sin and grace at once – with the question “Can you hear me?” now completely reversed, as we listeners seek the vanished Pete. Like Alan Ladd in Shane or John Wayne in any number of westerns, Pete catalyzes action that leads to resolution, but that resolution pushes him beyond the frame.
Like a great many works of art, “Outlaw Pete” asks many more questions than it answers. But there are at least two things it does clarify. The first is the ongoing vitality of western mythology (now nicely updated with a multicultural accent) as a vehicle for exploring the complexities of American life. The second is the ongoing vitality of Springsteen himself, who, 37 years into an enormously broad and deep body of work, continues to reinvent himself by reviewing and revising our cultural traditions. He hears us, and we see ourselves.