Category Archives: American Studies

Is it “propaganda” if it advocates for something you want?

Guest post by Jonathan Auerbach

Because I recently coedited a collection of essays on the subject of propaganda, I sometimes get approached by journalists asking me to weigh in on current events. How effective is Putin’s “propaganda” against the West in promoting the separatist movement in Ukraine? How best to counteract gruesome ISIS videos, aimed to entice recruits to jihad, but often described in shorthand as “propaganda”? And lately my inbox has been bombarded with emails urging me to “keep the pressure on” by fighting against the vile “propaganda” of warmongers in Congress who would reject the international deal to curb Iran’s nuclear capabilities.

In all of these cases, “propaganda” is assumed to be a self-evident concept, inherently false and sinister, demanding urgent countermeasures and countermessages (but certainly nothing we would want to call propaganda!). If we step back a minute and try to put this matter in historical perspective, certain insights come into focus.

A century ago, right at the start of World War I, the term was frequently used to refer to any sort of mass advocacy, such as “propaganda” for suffrage or “propaganda” for conservation. In these instances, propaganda in both meaning and practice simply referred to efforts designed to sway public opinions and feelings on a large scale. During and immediately following the war, the meaning and practice of such mass persuasion took on an increasingly negative cast, leading Progressive political commentator Walter Lippmann in 1919 to ominously announce a crisis in democracy triggered by this unregulated “manufacture of consent.”

But what’s the difference between coercion and persuasion, especially in a democracy that relies on a vibrant public sphere and the free flow of information to debate and contest policies and ideas? Who is in charge of such information dissemination? What’s the difference between educating citizens, directing them, and indoctrinating them? How to distinguish among teaching, preaching, and selling, especially when your nation is at war and seeks to boost patriotic morale? Left to their own devices, how can citizens be trusted to sort through such an overwhelming avalanche of factoids and truthiness (as Stephen Colbert put it) to arrive at some rational conclusions about the world we live in? These are the key questions Progressive intellectuals, reformers, and politicians such as Lippmann, John Dewey, Julia Lathrop, and Woodrow Wilson grappled with a century ago, not to mention public relations gurus like Edward Bernays who were intent on engineering and managing the tastes and spending habits of citizen-consumers.

Clearly, these troubling questions remain very much with us today. My new Johns Hopkins University Press book, Weapons of Democracy: Propaganda, Progressivism, and American Public Opinion, seeks to shed light on our current state of affairs by tracing the changing face and fate of American public opinion in the early decades of the twentieth century as it unfolded before, during, and soon after World War I. By closely looking at Progressive era propaganda in thought and practice, including the inevitable entanglements between social reform and social control that emerged during this period, we put ourselves in a better position to understand how the United States continues to deploy its current weapons of democracy at home and around the globe.

Jonathan Auerbach is a professor of English at the University of Maryland–College Park. He is the author of Weapons of Democracy: Propaganda, Progressivism, and American Public Opinion and the coeditor of The Oxford Handbook of Propaganda Studies.

Leave a comment

Filed under American History, American Studies, Politics, Popular Culture, Washington

Meet Us in Toronto: American Studies Association

If you are in Toronto for the ASA annual meeting, be sure to stop by Booth #208 to meet our staff, browse our latest publications, and take advantage of special meeting discounts. Throughout the meeting and after, JHUP books will be available at a 30% discount when you use the discount code HEAI. Check out what’s new and recent from JHUP in American Studies and related fields!

The Best War Ever
by Michael C. C. Adams

Was World War II really such a “good war”? Popular memory insists that it was, in fact, “the best war ever.” After all, we knew who the enemy was, and we understood what we were fighting for. The war was good for the economy. It was liberating for women. A battle of tanks and airplanes, it was a “cleaner” war than World War I. Although we did not seek the conflict—or so we believed—Americans nevertheless rallied in support of the war effort, and the nation’s soldiers, all twelve million of them, were proud to fight. But according to historian Michael C. C. Adams, our memory of the war era as a golden age is distorted. It has left us with a misleading—even dangerous—legacy, one enhanced by the nostalgia-tinged retrospectives of Stephen E. Ambrose and Tom Brokaw. Disputing many of our common assumptions about the period, Adams argues in The Best War Ever that our celebratory experience of World War II is marred by darker and more sordid realities.

In the book, originally published in 1994, Adams challenges stereotypes to present a view of World War II that avoids the simplistic extremes of both glorification and vilification. The Best War Ever charts the complex diplomatic problems of the 1930s and reveals the realities of ground combat: no moral triumph, it was in truth a brutal slog across a blasted landscape. Adams also exposes the myth that the home front was fully united behind the war effort, demonstrating how class, race, gender, and age divisions split Americans. Meanwhile, in Europe and Asia, shell-shocked soldiers grappled with emotional and physical trauma, rigorously enforced segregation, and rampant venereal disease.

In preparing this must-read new edition, Adams has consulted some seventy additional sources on topics as varied as the origins of Social Security and a national health system, the Allied strategic bombing campaign, and the relationship of traumatic brain injuries to the adjustment problems of veterans. The revised book also incorporates substantial developments that have occurred in our understanding of the course and character of the war, particularly in terms of the human consequences of fighting. In a new chapter, “The Life Cycle of a Myth,” Adams charts image-making about the war from its inception to the present. He contrasts it with modern-day rhetoric surrounding the War on Terror, while analyzing the real-world consequences that result from distorting the past, including the dangerous idea that only through (perpetual) military conflict can we achieve lasting peace.

Plutocracy in America
By Ronald P. Formisano

The growing gap between the most affluent Americans and the rest of society is changing the country into one defined—more than almost any other developed nation—by exceptional inequality of income, wealth, and opportunity. This book reveals that an infrastructure of inequality, both open and hidden, obstructs the great majority in pursuing happiness, living healthy lives, and exercising basic rights. A government dominated by finance, corporate interests, and the wealthy has undermined democracy, stunted social mobility, and changed the character of the nation. In this tough-minded dissection of the gulf between the super-rich and the working and middle classes, Ronald P. Formisano explores how the dramatic rise of income inequality over the past four decades has transformed America from a land of democratic promise into one of diminished opportunity. Since the 1970s, government policies have contributed to the flow of wealth to the top income strata. The United States now is more a plutocracy than a democracy.

Formisano surveys the widening circle of inequality’s effects, the exploitation of the poor and the middle class, and the new ways that predators take money out of Americans’ pockets while passive federal and state governments stand by. This data-driven book offers insight into the fallacy of widespread opportunity, the fate of the middle class, and the mechanisms that perpetuate income disparity.

View the book trailer

Narrating 9/11: Fantasies of State, Security, and Terrorism
edited by John N. Duvall and Robert P. Marzec

Winner, 2014 Dale Brown Book Award, Young Center for Anabaptist and Pietist Studies

Narrating 9/11 challenges the notion that Americans have overcome the national trauma of the terrorist attacks on the World Trade Center and the Pentagon. The volume responds to issues of war, surveillance, and the expanding security state, including the Bush Administration’s policies on preemptive war, extraordinary rendition, torture abroad, and the suspension of privacy rights and civil liberties at home.

Building on the work of Giorgio Agamben, Slavoj Žižek, and Donald Pease, the contributors focus on the ways in which post-9/11 narratives help make visible the fantasies that attempt to justify the ongoing state of exception and American exceptionalism. Narrating 9/11 examines a variety of contemporary narratives as they relate to the cultural construction of the neoliberal nation-state, a role that mediates the possibilities of ethnic and religious identity as well as the ability to imagine terrorism.

Touching on some of the mainstays of 9/11 fiction, including Jonathan Safran Foer’s Extremely Loud & Incredibly Close and John Updike’s Terrorist, the book expands this particular canon by considering the work of such writers as Jess Walter, William Gibson, Lauren Groff, Ken Kalfus, Ian McEwan, Philip Roth, John le Carré, Laila Halaby, Michael Chabon, and Jarett Kobek. Narrating 9/11 pushes beyond a critical focus on domestic realism, offering chapters that examine speculative and genre fiction, postmodernism, climate change, and the evolving security state, as well as the television series Lost and the film Paradise Now.

The Higher Learning in America: The Annotated Edition
by Thorstein Veblen, edited with an introduction and notes by Richard F. Teichgraeber III

Since its publication in 1918, Thorstein Veblen’s The Higher Learning in America has remained a text that every serious student of the American university must confront. Intellectual historian Richard Teichgraeber brings us the first scholarly edition of Veblen’s classic, thoroughly edited, annotated, and indexed. An extensive introduction discusses the book’s composition and publishing history, Veblen’s debts to earlier critics of the American university, and the place of The Higher Learning in America in current debates about the American university.

Veblen’s insights into the American university system at the outset of the twentieth century are as provocative today as they were when first published. Insisting that institutions of higher learning should be dedicated solely to the disinterested pursuit of knowledge, he urged American universities to abandon commitments to extraneous pursuits such as athletics, community service, and vocational education. He also believed that the corporate model of governance—with university boards of trustees dominated by well-to-do businessmen and university presidents who functioned essentially as businessmen in academic dress—mandated unsavory techniques of salesmanship and self-promotion that threatened to reduce institutions of higher learning to the status of competitive business enterprises.

With a detailed chronology, suggested readings, and comprehensive notes identifying events, individuals, and institutions to which Veblen alludes, this volume is sure to become the standard teaching text for Veblen’s classic work and an invaluable resource for students of both the history and the current workings of the American university.

Light It Up: The Marine Eye for Battle in the War for Iraq
by John Pettegrew

American military power in the War on Terror has increasingly depended on the capacity to see the enemy. The act of seeing—enhanced by electronic and digital technologies—has separated shooter from target, eliminating risk of bodily harm to the remote warrior, while YouTube videos eroticize pulling the trigger and video games blur the line between simulated play and fighting.

Light It Up examines the visual culture of the early twenty-first century. Focusing on the Marine Corps, which played a critical part in the invasion and occupation of Iraq, John Pettegrew argues that U.S. military force in the Iraq War was projected through an “optics of combat.” Powerful military technology developed in the Iraq and Afghanistan wars has placed war in a new posthuman era.

Pettegrew’s interviews with Marines, as well as his analysis of first-person shooter videogames and combat footage, lead to startling insights into the militarization of popular digital culture. An essential study for readers interested in modern warfare, policy makers, and historians of technology, war, and visual and military culture.

JHU Press Journals:

American Quarterly
Technology and Culture
Philosophy and Literature

Leave a comment

Filed under American Studies, Conferences, Cultural Studies, Journals

“Most could never forget what they had seen and experienced . . . ” But will we remember?

Guest post by John C. McManus

Recently the Anti-Defamation League conducted a worldwide survey designed to measure the extent of anti-Semitic attitudes and knowledge of the Holocaust. Over 53,000 adults in 102 countries were queried by professional pollsters using a data-based research survey method. The results were not encouraging. According to the poll, some 26 percent of respondents admitted to deeply held anti-Semitic attitudes. Perhaps even more disturbing, from an historical viewpoint, is that 54 percent of those surveyed worldwide had never heard of the Holocaust. Overall, almost two-thirds of those surveyed had either never heard of this most monumental of all history’s many great crimes or, worse, they believed it never actually happened.

Not surprisingly, Anti-Defamation League representatives expressed deep disappointment and alarm at such stark evidence of modern day hatred and ignorance. Abraham Foxman, the League’s national director in the United States, said, “The results confirm a troubling gap between older adults who know their history and younger men and women who, more than seventy years after the events of World War II, are more likely to have never heard of or learned about what happened to the six million Jews who perished.”

Though no less troubled than Mr. Foxman, I was not especially surprised by the results. For several years now, I have witnessed ignorance of the Holocaust in some of my students and especially in popular culture as a whole. On occasion in that same popular culture, I have seen ignorance mutate into outright denial, sometimes out of rebellion against a perceived popular narrative of historical events, sometimes out of misplaced sympathy for anti-Semitic, anti-Western, Middle Eastern Arabs, and sometimes simply out of sheer hatred for Jews.

As a professional historian, it is not really my intent to become enmeshed in today’s geopolitical controversies. Instead my purpose is to document, chronicle, and analyze the events of the past, while perhaps offering some lessons for our future. My particular focus is on military history, with a specialization in World War II and the history of American soldiers in battle. In eleven books published over the course of more than a decade, I have explored the combat experience for those Americans who do the real fighting in time of war. If there is one theme that has stood out to me, it is the grim, visceral nature of combat for soldiers, especially amid the meat grinder of World War II, by far history’s deadliest war. Many of these same soldiers who fought for their lives on the front lines also liberated or witnessed concentration camps in Germany at the end of the war. Very few had any previous knowledge of the existence of these camps. Over the years, I have been struck by how many of these men told me or other historians or wrote in memoirs or letters that this experience was their most traumatic and unforgettable during the war. Indeed many were never the same after seeing a camp (or multiple camps in some cases). And yet, even though the Holocaust is one of the most heavily documented events in human history, the literature includes very little material about the liberation experiences of American soldiers.

So, in hopes of filling this void, as well as finding out what could have been worse for soldiers than battle, and combating what I perceived as persistent ignorance and denial of the Holocaust, I wrote Hell Before Their Very Eyes: American Soldiers Liberate Concentration Camps in Germany, April 1945. The book focuses on the liberation of three camps—Ohrdruf, Buchenwald and Dachau—during that momentous month in 1945. These three places, I felt, represented the larger whole of the Nazi concentration camp system in Germany, and the story of their liberation conveys a narrative of discovery as American soldiers experienced it that spring. Indeed, it is sobering to realize that the Holocaust was not just a crime of genocide; in a larger sense it was a huge slave labor operation targeting a multitude of ethnic groups, not just Jews. The camps liberated by Americans in Germany were designed for enslavement, not industrial killing of human beings in massive numbers like the death camps in Poland (where the majority of Jewish Holocaust victims lost their lives). As such, the majority of the survivors encountered by American soldiers were non-Jewish eastern Europeans.

Thus, Ohrdruf, Buchenwald and Dachau were not even among the worst camps in the Nazi empire. But they were horrible enough. In these three terrible, pestilential places, young American soldiers came face to face with a dark and upsetting world of human degradation, along with its sickening manifestations of terrible sights and smells—emaciated bodies stacked in heaps, ovens full of incinerated human remains, warehouses filled with stolen shoes, clothing, luggage, and even eyeglasses, prison yards littered with implements of torture as well as dead bodies and, perhaps most disturbing of all, the half-dead survivors of these camps. The troops became familiar with the unforgettable stench of these places, a nauseating mixture of dead bodies, feces, dirty clothing, body odor and, at times, burnt flesh. “There’s nothing else I can remember in my lifetime that remains as vivid and horrible as that,” Bob Cleary, a young lieutenant who led a reconnaissance unit into Ohrdruf, later said. William Charboneau, who was a nineteen-year-old infantryman in 1945, opined more than fifty years after the war, “Until you’ve smelled burnt flesh or decayed flesh, you have no idea what the odor is. I can still smell it today.” Not surprisingly, most could never forget what they had seen and experienced. “The scenes were so deeply etched in my memory that it is impossible to cast them aside–or to forget–or to permit time to dull the sharpness of those horrifying images of hell on earth,” Jerry Hontas, a Buchenwald liberator, said. “The only thing that vanished was our innocence.” Some could never talk about these horrors; others felt a sense of mission to tell the world, especially as they grew old and the world’s memory faded. This is their story . . .

John C. McManus is a Curators’ Professor of History at Missouri University of Science and Technology. He is the author of Hell Before Their Very Eyes: American Soldiers Liberate Concentration Camps in Germany, April 1945 which will be published this month by JHU Press. His previous books include The Deadly Brotherhood: The American Combat Soldier in World War II and Grunts: Inside the American Infantry Combat Experience, World War II Through Iraq.

Read the results of the Anti-Defamation League survey here.

Use promo code “HDPD” to receive a 30% discount when you place your pre-publication order for Hell Before Their Very Eyes.



Leave a comment

Filed under American Studies, History, Jewish Studies, Popular Culture, War and Conflict

Casualties of war on September 19 (1756)

Guest post by Len Travers

If Robert Wilson had done today what he did in 1756, he would have been given a medal. That year, September 19 was a Sunday. On that sweltering late-summer afternoon Wilson and nearly fifty other New England soldiers were scouting the rocky, wooded shore of Lake George in New York Colony when they walked into an ambush. Their assailants, Indian and French Canadian raiders three times their number, quickly overwhelmed the trapped colonials. Early in the fighting a bullet found Wilson, punching clean through his shoulder. Unable to fight, he somehow broke through the melee and ran for all he was worth eleven miles through the rugged forest back to Fort William Henry, where his doomed patrol had begun. Exhausted from blood loss and dehydration, he gasped out the news, less than three hours old, of what had befallen his companions. Unless help came to them soon, he feared that his company “could not escape” and that “the whole Scout would be Cut of[f].” For all he knew, he was the only one to escape alive.

He wasn’t, but Wilson’s harrowing story was one of many I found in the course of researching Hodges’ Scout, in which I attempted to recover a long-forgotten incident of the French and Indian War. Part of that story follows the fates of survivors, such as the twenty-two-year-old Wilson. Those who made it home, some after years of captivity, found nothing like the welcome, support, and admiration veterans receive today. Wilson spent the ensuing months slowly recovering from the “Grate pain” of his wound. And there was the expense: he had been forced to pay his own way home to Lexington, Massachusetts, and despite (or because of) medical care he “Remained Lowe with his wound all the winter.” After five months he was finally able to do some work, but “he [had] not the use of his showlder so well as he had nor [feared] he Ever shall.” Like so many other injured veterans of the French and Indian War, Wilson was forced to ask for public assistance from his colony government. It took more than two years from the time he was shot, but Wilson was finally awarded £6—less than half a year’s wages—“in full for his Service and Sufferings.” The once-hardy young man, now a disabled veteran, would never be the same again.

Such token awards from cash-strapped colonial governments were typical, and if wounded veterans felt short-changed, the families of the hurt, killed, and missing fared little better. Widows often received little from their husbands’ estates (most soldiers died too young to have accumulated much), and remarriage was not always in the cards. Children went without, or were put to servitude for their support, breaking up what was left of families. Harder though not impossible to evaluate is the emotional toll on families rent by war. John Lewis had marched that day with Wilson, but was feared dead. His family moved quickly to administer his pitiful estate, but Lewis’ aged mother Hannah refused to give up on him, writing her “Beloved Son John Lewis if he be living,” into the will she made out the year following.

Hannah Titus was also a grieving mother. Her husband had died early in 1756; she then permitted her seventeen-year-old son Benjamin to go for a soldier that year, probably counting on his soldier’s wages to help the family. He enlisted in the same company as his older brother Noah, and together they set off for Lake George. But Benjamin was killed in the same firefight that crippled Robert Wilson, and Noah died from disease soon after. By the end of December Hannah Titus was pleading with a judge to appoint an administrator for the estates of Benjamin, Noah, and their spinster sister Hepzibah, who also had died that year. Robbed of husband and children after a year of cruel loss, guilt-stricken over letting her last son go to war, Hannah understandably felt “not able to do such business myself.”

Recent events remind us that the “social safety net” constructed for modern veterans and their families has often provided too little, and too late, for too many. But in exploring the American past we confront societies for which such things were, comparatively, nonexistent—as the survivors of Hodges’ Scout so tragically learned.

Len Travers is a professor of history at the University of Massachusetts Dartmouth. He is the author of Celebrating the Fourth: Independence Day and the Rites of Nationalism in the Early Republic and Hodges’ Scout: A Lost Patrol of the French and Indian War, which will be published by JHU Press later this year.

Use promo code “HDPD” to receive a 30% discount when you place your pre-publication order for Hodges’ Scout.


Leave a comment

Filed under American History, American Studies, Military, War and Conflict

Controversy, thy name be Smithsonian

Guest post by Robert C. Post


The Smithsonian Institution is currently wrapped in controversy involving an exhibit at its National Museum of African Art, Conversations: African and African American Artworks in Dialogue. Nobody doubts the exhibit’s noble purpose, displaying art with “the power to inspire.” But one-third of the works are from the collection of Bill Cosby and his wife Camille, and the Cosbys donated $716,000 “to assist with the cost.” Moreover, the exhibit is partly about Cosby himself, about his fame, his geniality. Near a display of quilts there is a quote about these quilts telling a story “of life, of memory, of family relationships.” To many people steeped in the 24-hour news cycle, this seems beyond irony.

But we must remember that the Smithsonian Institution was born 170 years ago amid controversy and no little irony. When the bequest of an eccentric Englishman, James Smithson, arrived in Washington with instructions to establish an institution “for the increase and diffusion of knowledge among men,” it was not clear what he meant. And when the Smithsonian’s first secretary, Joseph Henry, steered the institution into scientific research, he provoked controversy. Others envisioned something quite different—a library, a university, most notably a museum. Henry was totally opposed. A museum, he warned, would squander resources, provoke more controversy, and, worst, render the institution “liable to be brought under direct political influence.”

He was right about that. The irony is that the public has long seen the Smithsonian as primarily a museum, or, rather, a museum complex. And there have been controversies aplenty. Some seemed as much personal as political. The Wright brothers were incensed when the Smithsonian assigned credit for the first “sustained free flight,” totally undeserved, to a man who had once been its secretary. Partisans of Alexander Graham Bell were terribly upset by an exhibit that seemed to deprive Bell of full credit for inventing the telephone, and they threatened to take the matter “to the public and to Congress.” Some controversies were wholly political. A few years ago, a Smithsonian secretary accepted a donation from one Ken Behring with the absurd contingency that there be a halt to exhibits that were “multicultural.” The Smithsonian, said Behring, must do “an American history museum.” Politicization materialized most famously in the 1990s when the National Air and Space Museum was forced to abort a planned exhibit of the Enola Gay, the bomber sent to destroy Hiroshima, along with horrific evidence of what happened on the ground. The airplane was displayed, the rest was not.

Perhaps more shameful in the long run have been episodes that librarians would see as akin to book burning. After the National Portrait Gallery staged Hide/Seek, an exhibit about same-sex intimacy, a video was removed when legislators threatened to “zero out” the Smithsonian’s budget, as had also been threatened with the Enola Gay/atomic bomb affair. Both times, there could be a plea of urgent necessity to capitulate; when an official remarked that “we have to be adept at communication,” he might better have said that “the institution must have its federal dollars or close its doors.” (70 percent of the budget is federal.)

But the Conversations controversy is different from others. No zeroing-out threats, but plenty of outrage. When the exhibit opened, an authorized biography of Cosby had just been published. It was being reviewed in the right places (in the Times Book Review as “wonderfully thorough”) just as the rape allegations against Cosby gained currency. Celebrities wanted their dust-jacket kudos deleted and a paperback was nixed, but there has been little pressure to remove the book from library shelves, to subject it to a figurative or perhaps literal burning. It’s been quite a different tale with the exhibit, with demands to “take it down.” Johnnetta Cole, the museum director and a close friend of the Cosbys, is devastated. So far, however, the institutional response has been that the show must go on, that appearing to celebrate a man accused of serial rape is preferable to “pulling” the exhibit as with the Hide/Seek video—and to harming artists with no responsibility for Cosby’s behavior. As a halfhearted response to critics, there is a sign outside the exhibit saying that the Smithsonian “in no way condones” this behavior, whatever it may have been.

This may be enough to carry the exhibit through to its scheduled closing in January, with no book burning, even in a figurative sense, as with the Hide/Seek video. While commending the Smithsonian’s decision “to stand by the exhibit on its artistic merits,” the Washington Post also expresses hope that the institution has “learned some lessons from this painful experience.” Perhaps it has, but looking back over the Smithsonian’s history, and looking to the emergent power of outsiders who claim a “stake” in the content of exhibits, I’d not be too sure.

Bob Post is the author of Who Owns America’s Past? The Smithsonian and the Problem of History, which was published by JHU Press. It details the controversies mentioned here and many others.


1 Comment

Filed under American History, American Studies, Cultural Studies, Current Affairs, D.C., Ethics, Politics

Patriot (Day) games: exploring the fantasies surrounding 9/11

Guest post by John N. Duvall and Robert P. Marzec

What’s happening for the 14th anniversary of 9/11? For one thing, there are a lot of Harley rides. The sixth item in a Google search for “14th anniversary of 9/11” informs you about the 2015 9/11 Memorial Harley Ride starting in Knoxville, Tennessee, in order to “remember those who gave the ultimate sacrifice on September 11th, 2001.” It will kick off with a ceremony that “includes a flyover, ‘Taps,’ and a 21-gun salute” and end with a “concert that night and special priced meal deal at the Shed Smokehouse & Juke Joint.” Hot damn. But you don’t have to travel to Knoxville to ride in memory of 9/11. In Bay Village, Ohio, “on Sunday, September 6th, there will be a ‘Never Forget 9/11’ Ceremony and Processional Ride to honor and remember the families of the loved ones that lost their lives in the four hijacked airplanes, World Trade Center, Pentagon, and Shanksville.” The ceremony “will conclude with a 21-gun salute, ‘Taps,’ and ‘Amazing Grace.’ ” Similar rides will be held that day in Virginia, Pennsylvania, and even Vancouver, Canada, where a “45-minute service is dedicated to all those who perished on September 11th, 2001 and to recognize and thank all who serve each and every day to make the lives of Canadians and Americans safe and free.” BBQ to follow.

It’s not the Fourth of July, for sure, but 9/11 is about as good as Labor Day as an excuse to continue to enjoy summertime activities. Bikers are far from the only ones recreating in remembrance. There are plenty of golf tournaments happening that weekend with the announced goal of helping us remember 9/11. And on Friday, September 11, the Lincoln Center crowd can go hear Mozart’s “Requiem in D Minor.”

Whatever we “do in remembrance,” however, never produces historical thinking. Remembrance is about the creation of local community and non-reflexive national identity. This is why fiction about 9/11 (as well as our new book, Narrating 9/11, which examines this body of literature) matters. Embedded in this transformative historical moment, the best narratives focusing on the terrorist attacks provide nuanced meditations not only on the pain and trauma of the day itself but also on the United States’ Orwellian response (from preemptive war to extraordinary rendition and enhanced interrogation) that turned the American “homeland” into the planet. (If you think that’s rhetorical excess, look at chapter 12 of The 9/11 Commission Report, which declares that “the American homeland is the planet,” implying that the folk song made famous by Woody Guthrie really needs a makeover: “This land is our land / Your land is our land.”) Occasionally, this fiction even anticipates our present reality. Jess Walter’s 2006 novel The Zero, for example, imagines a monster truck rally (“WE’RE TURNING VETERANS ARENA INTO A GIANT MUD PIT TO HONOR OUR DEAD HEROES!”) that exactly captures the dehistoricizing of 9/11 that contemporary instances of commodified recreational remembrance produce.

In all this “remembering,” a fundamental fact is forgotten; namely, that the terrorist attacks were immediately instrumentalized by the Bush Administration, which with the aid of Homeland Security’s orange and red alerts constantly reminded Americans to “be afraid, be very afraid.” Has all that much changed since George Bush signed the USA Patriot Act into law a month after the terrorist attacks? The Obama administration is bracing for the anniversary by ramping up security around the world. A year ago, then-Defense Secretary Chuck Hagel said that US military forces were “operating at a high state of readiness” around the globe. Meanwhile, current Defense Secretary Ashton Carter is in the process of extending this military vision in his new “force of the future” initiative (often referred to as simply “the force”—which may not have much resonance with the next generation of Americans until after the “force awakens” in theaters December 18th of this year). And President Obama’s National Security Strategy, in addition to listing traditional concerns such as “Homeland Security” and “Persistent Threats of Terrorism,” now lists “Climate Change” as a key security issue. As the essays we collected in Narrating 9/11 reveal, militarized holds on everyday existence are expanding. The reduction of the historical complexities surrounding 9/11 to memorialized recreation only further compromises and conceals our depoliticized relations to an event now officially shrouded in the holiday designated “Patriot Day.”

John N. Duvall is the Margaret Church Distinguished Professor of English at Purdue University. The editor of the journal MFS: Modern Fiction Studies, he has published extensively on modernist and contemporary fiction. Robert P. Marzec is an associate professor of English at Purdue University. The associate editor of MFS: Modern Fiction Studies, he is the author of An Ecological and Postcolonial Study of Literature: From Daniel Defoe to Salman Rushdie. Together, they are the editors of Narrating 9/11: Fantasies of State, Security, and Terrorism, published this month by JHU Press.

Use promo code HDPD to receive a 30% discount when you order your copy of Narrating 9/11: Fantasies of State, Security, and Terrorism.


Filed under American History, American Studies, Cultural Studies, Current Affairs, Journals, Politics

Undisciplining knowledge

Guest post by Harvey J. Graff

The ubiquitous appearance of the term “interdisciplinary” in current academic and educational writing suggests that it is rapidly becoming the dominant form of scholarly work. Major newspapers and periodicals create the same impression, especially in discussing research on current issues ranging from health care to the environment and national security. Commentators disagree about whether this trend is positive or negative. They also disagree about what they mean by “interdisciplinary.” There is much more hype, and heat, than light, and there is also loss.

Recognizing that interdisciplinary work demands a greater command of knowledge and methodologies than individual scholars may possess, universities contend that the organization of learning, and of work, depends on and advances collaboration. These statements reveal the particular discourse of interdisciplinarity, which asserts its transformative power and vital importance. They also suggest implicit tensions between applied research and fundamental problems of knowledge or theory, as well as conflicts between existing disciplines and emerging ones. It is true that universities also deal inadequately with problems of organization and career tracks. Interdisciplinarity can be a cover for downsizing faculty numbers and programs.

These complications—but not the ideology—underscore the fact that disciplinary and interdisciplinary work are inextricably linked, regardless of the assumptions of many proponents and opponents of interdisciplinarity. That each usually depends on the other is not often appreciated. In a discourse sharply divided by dichotomies, some commentators see the recent rise of interdisciplinarity as primarily a reaction against overspecialization and fragmentation in the disciplines. They urge integration and synthesis. Others declare that critical problems demand collaboration among specialists from different fields and disciplines. A more complete appreciation of interdisciplinarity’s development needs a longer look backward, at least to the late-nineteenth-century origins of modern disciplines in the developing research university and the relationships among them. Disciplinarity and interdisciplinarity stimulate, shape, and inform each other, as the making of biology, among other foundational fields, shows.

Despite the diversity of interdisciplines, “big science” has become a normative model that shapes expectations for and evaluations of interdisciplinarity in nonscientific research as well. Large-scale, team-driven, expensive experimental science is hegemonic in current thinking. With expectations for costs go judgments of importance. General education curricula, integrated media across the arts, or digital humanities pale in comparison. So does the interdisciplinary work of individual scholars and small groups, more appropriate to other fields and many problems. Efforts to claim the trappings of big science multiply mimetically. Many interdisciplines, including communications, cognitive studies, and operations research, have at one time or another attempted to pass as sciences. Attesting to the power and lure of science as a badge of identity, this effort has confused questions about the wider applicability of different approaches.

How well does this paradigm fit the most important research breakthroughs? Unusual wartime circumstances propelled the Manhattan Project, which invented the atomic bomb. Should credit be assigned to leading scientists or to military and civilian organizers? Watson and Crick’s collaboration in identifying the structure of DNA’s double helix was relatively informal, as their exclusion of coworker Rosalind Franklin indicates. Close coordination among many laboratories in separate institutions contributed to mapping the human genome. How do we assess the crucial roles of external circumstances, nonscientific influences, institutional elements, and leadership, as they interacted with intellectual breakthroughs and the marshaling of resources? Certain factors emerge as especially significant, chief among them the location, relationships, and organization of the interdisciplinary effort and its historical context. Preconditions, particularly research pointing the way to the critical moment and the social and political-economic context, matter enormously.

Lost in the rosy recipes and dire warnings is a different story with very different implications. At different times, in different contexts, interdisciplinarity takes different terms, forms, and locations and faces different chances of success or failure. I explore them in my new book Undisciplining Knowledge: Interdisciplinarity in the Twentieth Century. By far the greatest amount of interdisciplinary research and teaching lies in specialized and advanced studies. In contrast, the emphasis in general or so-called integrative work is on curricular and program development, especially for undergraduates. Both general, nonspecialized work and specialized work can be integrative. But the ways we talk about and praise, or criticize, interdisciplinarity confuse this.

Contrary to most views, Undisciplining Knowledge begins with the understanding that interdisciplinarity is part of the historical making and ongoing reshaping of modern disciplines. It is inseparable from them, not opposed to them. The organization, production, and dissemination of knowledge around universities, disciplinary departments, and research institutes, especially in the United States and Europe, have long given rise to interdisciplinary efforts and movements. Over time, those endeavors have crossed disciplines and disciplinary clusters in different ways and with differing outcomes. This is seen in the histories on which my study builds, ranging from genetic biology and sociology in the late nineteenth and early twentieth centuries to molecular biology, nanotechnology, and cultural studies in the mid to late twentieth century.

In my view, interdisciplinarity is defined and constructed by questions and problems of theory or practice, conditions of knowledge, and the means developed to answer those questions in new and different ways. Interdisciplines are fashioned from elements of different disciplines to form distinct approaches, understandings, or contexts. Interdisciplines are themselves historical constructs. Questions and problems are the focus, not the number of disciplines that are supposedly “mastered,” “integrated,” or “transcended” or the claim that normative disciplinary practices are bypassed. While avoiding dichotomies that interfere with our understanding, I recognize key conflicts and underlying contradictions. In the making of interdisciplinarity, disciplinary elements are interactive, not additive. Similarly, interdisciplinarity derives from the selection of appropriate and relevant ideas, approaches, theories, concepts, methods, and comparisons from different fields or disciplines. Those choices, whether successful or not, influence central questions and problems. In no way does interdisciplinarity depend on knowledge of entire disciplines or on global notions of the unity of knowledge. There is no single path to interdisciplinarity, no single model, no single standard for successful development. The process and results vary across disciplines and clusters. Like disciplines, interdisciplines are diverse in paths, locations, relationships to disciplines, organization, and institutionalization.

The long and complicated history of interdisciplinarity supports a strong argument to limit use of the word and its associated vocabulary. This is necessary in order to advance its provenance and power. Those who pronounce transdisciplinarity or, more recently with respect to bioscience, convergence to be “beyond interdisciplinarity” are seldom aware of the baggage that both those terms carry. Abuse of interdisciplinarity follows from a lack of familiarity with, and knowledge of, the fields supposedly interrelated. This is particularly evident in the humanities and social sciences with respect to “cognitive science” as well as within the sciences themselves. Metaphors too commonly take the place of understanding.

These are very real questions in 2015, just as they were in 1980, 1950, or 1910. What is at stake is nothing less than the framing of efforts to make progress on major intellectual and social problems; issues of public policy; expectations and anticipations; the allocation of resources, including the time and efforts of people and institutions; the articulation of organizations and structures; and professional careers and human lives.

Harvey J. Graff is the Ohio Eminent Scholar in Literacy Studies and a professor of English and history at The Ohio State University. He is the author of Undisciplining Knowledge: Interdisciplinarity in the Twentieth Century, The Literacy Myth: Literacy and Social Structure in the Nineteenth Century City, The Legacies of Literacy: Continuities and Contradictions in Western Society, and other books.

Filed under Academia, American Studies, Cultural Studies, Education, Higher Education