The Top 20 Best-Selling Titles… #20

UPK is thrilled to be celebrating our seventy-fifth anniversary, and we’d like to invite our supporters to celebrate along with us. Since our founding in 1946, we’ve published nearly 1,000 books. To celebrate seventy-five years of publishing excellence, we’re counting down our Top 20 best-selling titles. Stay tuned as we make our way to our top seller…

20. The Philippine War, 1899–1902 by Brian McAllister Linn

Brian McAllister Linn provides a definitive treatment of military operations in the Philippines. From the pitched battles of the early war to the final campaigns against guerrillas, Linn traces the entire course of the conflict. More than an overview of Filipino resistance and US pacification, this is a detailed study of the fighting in the “boondocks.”

In addition to presenting a detailed military history of the war, Linn challenges previous interpretations. Rather than being a clash of armies or societies, the war was a series of regional struggles that differed greatly from island to island. By shifting away from the narrow focus on one or two provinces to encompass the entire archipelago, Linn offers a more thorough understanding of the entire war.

Winner: Society for Military History Distinguished Book Award

“A thoughtful, deeply researched, and well-written work about a war that teaches much about the nature of revolutionary warfare—even today.” —Foreign Affairs

“The definitive study of this often-misunderstood war.” —Parameters

When Democrats and Republicans United to Repair the Earth

David Sarasohn, co-author of The Green Years, 1964–1976: When Democrats and Republicans United to Repair the Earth

Republican Howard Baker of Tennessee was the majority and minority leader of the US Senate, White House chief of staff, and a presidential contender. But what he really hoped to be remembered for, as he said at the end of his career, was his work on the Clean Air Act—not trying to destroy it, but establishing it.

Half a century ago, over the course of a dozen years, the United States adopted the environmental laws and procedures that we still follow today: more than 300 measures, including the Clean Air Acts, the Clean Water Acts, the Endangered Species Act, the creation of the Environmental Protection Agency (EPA), and the designation of dozens of national parks, national forests, wilderness areas, and protected seashores. These advances, unimaginable in today’s poisonous partisan gridlock, were propelled by Democrats and Republicans working together and typically drew overwhelming support.

The story of that time, told in The Green Years, 1964–1976: When Democrats and Republicans United to Repair the Earth, carries some surprising revelations and features some surprising people. Republicans were key to these efforts to a striking degree, reminding us that the Republicans of the 1960s and 1970s were as close to the outdoor activism of Teddy Roosevelt as to the drill-everywhere attitude of Donald Trump. Baker, a national Republican leader, spent a decade working closely on environmental legislation with Democratic presidential hopeful Edmund Muskie. Throughout the 1950s and 1960s, Pennsylvania Republican representative John Saylor was one of the most active and determined conservation advocates in the House.

Democratic legacies from the Green Years look different as well. Washington senator Henry Jackson and Idaho senator Frank Church are etched in history on opposing sides of the Vietnam War: the high-profile hawk and the resolute dove. But throughout this period, the two were close allies on the Senate Interior Committee, repeatedly producing major legislation on pollution and preservation, protecting massive stretches of the country from development, and enacting rules that still govern how we deal with the land, water, and air around us.

The historical reputations of the era’s two looming presidents, Lyndon Johnson and Richard Nixon, have not exactly been shaped by environmental issues. Yet both of their administrations left towering legacies on land and water, achievements unthought of before and unimaginable since.

Nobody remembers Lyndon Johnson for environmental achievements. Yet the original “Great Society” speech at the University of Michigan in 1964 had an entire section on the subject, and Interior Secretary Stewart Udall led the fight for a series of achievements, beginning with the long-sought Wilderness Act of 1964. At the end of his administration, battered and discredited by Vietnam and racial violence, Johnson was still urging Congress and signing measures to create and expand national parks and forests. When the Bureau of the Budget tried to resist spending for protecting more territory, Johnson’s aide Joseph Califano commented, “Budget’s trouble is that it consistently underestimates how much this man loves the land.”

The legislative achievements during the Nixon administration, under a president famous for walking the beach in wingtips, were even greater and included the Clean Water Act, the Clean Air Act, the Endangered Species Act, and the creation of the EPA and environmental impact reports. Some were administration initiatives, some were enacted over administration resistance, but the achievements were driven by figures whose roles are now forgotten. Nixon’s domestic counselor John Ehrlichman, today known only as a convicted Watergate conspirator, had been a land use lawyer in Seattle, and his environmental leanings colored White House policy. The Sierra Club’s David Brower later concluded that the movement would have done better to court Nixon rather than battle him. Long after being driven from office, Nixon told George H. W. Bush’s EPA director, “I founded EPA. I’m an environmentalist, too.”

Beyond the simple list of what was passed is how it was all passed. Legislation was hard-fought, extensively debated, and often took several Congresses to enact. But typically, the final measures passed both houses overwhelmingly and drew support from all quarters: segregationist Southern Democrats, hardline Midwestern conservatives, and urban liberals. The Endangered Species Act of 1973, which could not possibly get through Congress today, passed the Senate unanimously and the House 354-4.

Such harmony is hard to envision today, of course, because of the bitter attitudes pervading Washington. But there has also been a change in the makeup of Congress, which has in turn changed both environmental policies and environmental landscapes. Everyone knows Lyndon Johnson’s accurate prediction that the civil rights struggle would cost the Democrats the South for decades, but environmental legislation also blasted the party in the inland West. The disappearance of those Democrats and the ones from the South, along with Republicans from the Northeast, widened the gap between the parties and removed figures vital to bipartisan environmental efforts.

But the advances of the Green Years were not just a matter of Congress, or of executive orders from the White House. The story of the Wilderness Act didn’t begin in the 88th Congress, or even in the previous sessions when it fell short. It goes back to Abraham Lincoln setting aside the Yosemite Valley, and to Benjamin Harrison, Grover Cleveland, and Theodore Roosevelt beginning the protection of vast stretches of undeveloped land, and to decades of lobbying and organizing work by the Wilderness Society. The drive to protect animals and plants didn’t start with the Endangered Species Act, but goes back to colonial Massachusetts. The Green Years, 1964–1976 traces those roots. The book shows that mandating community input, thought to be a curb on protection, actually stimulated citizen activism. Congress ultimately, and eventually, reflects its constituencies. The rising pressures of global warming and mounting weather disasters and extinction events could yet make themselves felt in Congress. In politics, as on the calendar, a green season can come again.

David Sarasohn is a retired editor and columnist at the Oregonian and the author of The Party of Reform: Democrats in the Progressive Era.

Farina King on Indigenous Peoples’ Day, 2021

Dr. Farina King, author of The Earth Memory Compass

This Indigenous Peoples’ Day, I think of Indigenous childhoods through generations, honoring the children who survived and those whom we must always remember. Remembering is an action.

Shí éí Bilagáanaa nishłį́ dóó Kinyaa’áanii báshíshchíín. Bilagáanaa dashicheii dóó Tsinaajinii dashinálí. I just introduced myself by my clans, acknowledging my ancestors and kin as a woman of white English-American settler descent born for the Towering House and Black-Streaked Woods People of the Diné. I am a citizen of the Navajo Nation and the daughter of a boarding school survivor. I grew up with the stories of Indian boarding schools from my father and paternal relatives. Their stories have drawn me to understand Diné and diverse Indigenous experiences in boarding schools over generations.

I exist, because my father survived boarding school, and his mother before him survived boarding school, and her father before her survived boarding school, and his parents before him survived the Long Walk—the forced removal and concentration of Diné at Hwééłdi, “Land of the Suffering.” Because of my ancestors, my children and I have the opportunity to thrive as Diné. These thoughts really hit me recently, as I ponder how the US government is finally launching a Federal Indian Boarding School Truth Initiative through the leadership of Secretary of the Interior Deb Haaland (Laguna Pueblo).

In my first book, The Earth Memory Compass (University Press of Kansas, 2018), I share the story of how my father ran away from the Ramah Indian Boarding School. I woke up recently crying, rethinking my father’s story of running away because it dawned on me that my father almost did not survive boarding school. He almost froze to death when he ran away with another boy in the winter. I asked him if I could share this story again, and he consented to it. He told me how bullies at the school led him to run away, and he asked friends if they wanted to run away with him. Another boy decided to come with him because he also wanted to go home. On their way they got caught in a canyon in a snowdrift that almost killed them. Fortunately, they were found by a rancher who saved their lives. I thought of all the stories of boarding school runaways and how some children died the same way my father almost did—freezing to death in their attempt to return home. When I asked him why he ran away, he told me that he “did not run away from the education.”

Think of all the daughters, sons, brothers, and sisters who are family and never returned home or passed away soon after getting home. Think of their posterity that could have been. My father should have never had to face such struggles and hardships. This history lives on in him, me, and my children. Diné and many Native American and Indigenous peoples continue to fight every day for basic human rights such as access to clean water, shelter, food, healthcare, and schooling for and by their own people. The Navajo Nation is still fighting to reclaim Diné education.

My father may have survived the boarding school, but he suffered many injuries—and not just physical ones. He will never say these things because he does not live his life as a victim. He is an active agent who has persevered through much but has also lived in joy and peace. Yet my father never taught me and my siblings Diné bizaad, so I fear that the seed of the Navajo language that he has carried may not survive. There is much that we still must do to pursue healing. And it is important to recognize that healing is not a checkbox to be marked off. Healing is a cyclical, ongoing journey through generations and time.

Indigenous kinship, community networks, and protocols are essential to understanding Indian boarding schools and to the ongoing journeys of healing and reconciliation. There are many different tribal nations and Indigenous communities, including some that are intertribal in urban settings. Each specific context, Indigenous community, and kinship network must be engaged hand-in-hand with these initiatives to address the effects of Indian boarding schools. The National Native American Boarding School Healing Coalition and so many others have been paving the way for this truth, healing, and reconciliation. My friends Marsha Small and Preston McBride have been working on finding and accounting for the lost boarding schoolchildren, including those in unmarked mass graves, who did not survive Indian boarding schools. We are collaborating on providing guides to Indigenous protocols based on our experiences and work.

We need to support one another in these efforts to acknowledge and learn of the truths, perspectives, and experiences of Indian boarding schools; to stop the boarding school legacy of genocidal practices and approaches that seek to eradicate Indigeneity; and to embrace and support Indigenous sovereignties, ways of knowing, and education. Value Indigenous stories, histories, and lives. Actions reveal these values. We can return the lost boarding schoolchildren home by finding them, learning about them, and supporting and connecting with their families and Indigenous communities that include boarding school survivors.

My forthcoming book, coauthored with Mike Taylor and James Swensen, is tentatively titled Returning Home because of these interconnections of healing and reconciling Indian boarding school pasts with Indigenous communities today and their futures. Please continue the languages that the children were punished for speaking; be sure the sick, hungry, and homeless of Indigenous communities can receive care and support; teach all about Indigenous histories from Indigenous perspectives and voices; and listen to Indigenous communities, following their directions and guidance toward healing. These are only some beginning steps, but we all need to begin somewhere, step by step. Boarding school history matters because Native American families have paid far too great a price to educate their children, and they continue to pay that price to this day.

Dr. Farina King is assistant professor of history and affiliate of the Cherokee and Indigenous Studies Department at Northeastern State University, Tahlequah, Oklahoma.

Why the Confederacy Lost—and Why We Should Care Today (Part 1)

Christian B. Keller, editor of Southern Strategies: Why the Confederacy Failed

In this era, when things Confederate have fallen out of vogue with both the general public and academia, and rebel monuments seemingly topple to the ground everywhere, it may appear a bit untimely to consider why the history of the short-lived Confederacy matters for modern senior leaders. But it is precisely because it was such a fleeting experiment—confronted with inherent political contradictions, overwhelming survival challenges, and a lethal adversary bent on its destruction—that its value as a basis for thinking about modern strategic problems is especially salient. People may recoil in repugnance today at the thought of pondering the slave-holding republic, but we need to move past any presentist, politically motivated agendas and take a clear-eyed, historically contextual look at why the Confederacy failed and what we can learn from it. In so doing, we may be surprised by what we can glean to improve our thinking about current issues.

In my recent book, Southern Strategies: Why the Confederacy Failed, my contributors and I, all faculty, former faculty, or former students at the U.S. Army War College in Carlisle, PA, wrestle with what I call the “big questions” of rebel defeat that dwell at the strategic (i.e., the “war-winning and war-losing”) level of war. We dip down into operational and even tactical history as necessary, especially when key campaigns, such as Antietam or Gettysburg, represented significant contingency points in the course of the Civil War. We root our analyses in classical and modern strategic theory, incorporating Carl von Clausewitz’s timeless maxims, the ends-ways-means-risk paradigm, and the DIME construct (Diplomatic, Informational, Military, and Economic instruments of national power) as interpretative lenses to better evaluate our chosen topics. They range from the criticality of competent military leaders, to macroeconomic and diplomatic reasons for Southern failure, to abysmal lapses in intelligence. And we give due credit to the better decision-making and more adroit application of power of the secessionist South’s Federal opponents.

Let’s be perfectly clear: the subject of why the Confederacy lost the war is well-plowed intellectual ground. All the great Civil War historians of our time, from Bruce Catton onward, have thought hard and written much about it. As I state in the introduction of our book, we simply want to join the debate and possibly reframe it a bit, not arrive at “final” answers, which of course is almost impossible to do. If we’ve done our job well, our readers ought to be well-equipped to think critically and creatively about their own big problems and use the Confederacy’s downfall as a useful, if inexact, case study.

The first point we make is that leaders matter a lot, both for the historical chances of rebel independence and for the achievement of national, corporate, or civic objectives today. Contrary to popular belief, the Confederacy possessed a very limited bench of gifted strategic-level military leaders and fewer good operational ones than generally perceived. Essentially, the rebellion started out with Robert E. Lee, Albert Sidney Johnston, and Braxton Bragg as potential strategically minded generals, and ended up with only one of them meeting expectations: Lee.

Johnston fell early at Shiloh and may or may not have matured to the high levels of command foretold for him had he lived, whereas Bragg could think strategically but was a weak executor. There were no other generals who could rise to the highest level of war, but Stonewall Jackson, James Longstreet, Joseph E. Johnston, and possibly P. G. T. Beauregard and Kirby Smith were all able operational leaders who could think, if not execute, strategically. Various reasons explain why this was so for each of these individuals, but suffice it to say for our purposes here that they, too, were limited in number and, once incapacitated or killed, were of no further use to their respective armies. This reality was particularly damaging to the efficacy of the Army of Northern Virginia, which boasted the command team of Lee, Jackson, Longstreet, and J. E. B. Stuart from spring 1862 to spring 1863. Their Federal adversary was very hard pressed to match the almost-unbeatable combination of leadership qualities offered by these men, who came close to attaining victory for the South in the Eastern Theater. I argue in my essay that the Lee-Jackson partnership, in particular, was very valuable and nigh irreplaceable for the Confederacy, and that Stonewall’s death after Chancellorsville permanently damaged that potentially war-winning team.

The Union also had a limited bench of strategic-level generals, namely McClellan, Rosecrans, Grant, Sherman, and Thomas, but none of them died (although some were sacked), and the North enjoyed a far greater selection of competent operational leaders who could implement their chiefs’ intent, and, occasionally, both think and execute at the strategic level. Men like Meade, Sheridan, McPherson, Porter, and Schofield helped Grant and Sherman win, whereas Leonidas Polk, William Hardee, Richard Ewell, and A. P. Hill proved hard for Bragg and Lee to manage and couldn’t think beyond their own corps’ purviews.

Most modern organizations, whether they be the U.S. Army, the Internal Revenue Service, the Pennsylvania Department of Transportation, the Hershey Corporation, or the town council of Carlisle, possess limited means in the form of good extant and potential senior leaders. Identifying them early, grooming them for higher-level responsibility, and finding ways to avoid their attrition are critical ways to ensure the future of any organization. Competent leaders—especially at the strategic level—are scarce, and, when expended, are not easily replaced. We must learn to take care of them, educate and mentor them properly, expose them to the necessary professional experiences, and do what it takes to elevate them to the billets they deserve to occupy.

In the forthcoming Part 2 of this essay, we will examine the diplo-economic reasons for Confederate defeat and their modern applications.

Dr. Christian B. Keller is professor of history and director of the military history program at the U.S. Army War College in Carlisle, PA. He is the author, co-author, or editor of six books on the U.S. Civil War, most recently, Southern Strategies: Why the Confederacy Failed (University Press of Kansas, 2021).

Texas, Oklahoma, and the Long Saga of Athletic Conference Realignment

by Brian M. Ingrassia, author of The Rise of Gridiron University: Higher Education’s Uneasy Alliance with Big-Time Football

In July 2021, the Universities of Texas and Oklahoma announced they would leave the Big 12 Conference by 2025 and join the Southeastern Conference (SEC). In doing so, they will push that college-sports juggernaut up to sixteen members, casting aside virtually all doubt that the SEC is the biggest football conference in the land. Certainly, the SEC plays the most profitable brand of the collegiate gridiron game, with instant name recognition and huge venues. Four SEC stadiums currently top 100,000 seats each, while another nine are bigger than every single Major League Baseball stadium. Texas and Oklahoma, with stadiums topping 100,000 and 80,000 seats, respectively, will fit right in. The SEC, it seems, is poised for dominance.

This is clearly a momentous time in college sports, but what does it all mean? Why, in short, do athletic conferences realign?

The athletic conference is an old institution, dating to the turn-of-the-century Progressive Era, and it has shifted over time. Arguably, the first conference was the Intercollegiate Conference of Faculty Representatives, later known as the Big Nine or Big Ten. This is the conference that, as of 2021, has fourteen members and a logo that reads as “B1G”—which, of course, looks like it could easily be “Big 16” or some other indeterminate (yet large and impressive) number.

The recent Texas-Oklahoma-SEC realignment appears to throw American college athletics into turmoil. Historian Andrew McGregor astutely points out that the SEC now seems poised to challenge the power of the National Collegiate Athletic Association (NCAA) in an era when “the amateur model is no longer the status quo.”[2] After all, the US Supreme Court recently ruled that college athletes can receive greater compensation for athletic services.[3] Now, more than ever, is a time suited for organizations primed to make money off college sports, rather than for an organization whose stated raison d’être is to ensure that college athletes remain amateurs.

The NCAA has been around since football’s crisis era of 1905–1906, and ever since World War II it has played an increasingly large role in regulating collegiate sport. In the 1950s, the NCAA even created the term student-athlete as a way of dodging workers’ compensation laws.[4] Now, big-time athletic programs are poised to take advantage of the erosion of NCAA hegemony.

Although the present is a unique moment in the history of college sports, it is not without precedent. Historically, athletic conferences have realigned at times when new media or transportation technologies—and their implications for the profitability of college athletics—intersect with changing ideas about higher education. At such moments of disruption, some colleges and universities start to perceive themselves differently from other institutions, or at least come to believe they can find a better way of regulating and administering sport.

When conferences, including the Big Ten and Missouri Valley, first formed at the turn of the twentieth century, universities were devising rules regarding player eligibility and athletic commercialization. How could a college enrolling only four-year liberal arts students compete against a school admitting a wider range of pupils, including those in technical programs? How could a college prevent its football team from traveling so far that players would miss weeks of classes? Agreeing to schedule games only against like-minded institutions—say, major state universities within a 350-mile radius—was one way to handle this dilemma.

Over time, more concerns arose. By the 1930s, some small, private institutions started seeing themselves as liberal arts colleges. They were not in the business of big lecture courses or graduate education, and they also were not in the business of funding lavish athletic spectacles for tens of thousands of spectators or millions of radio listeners. For such schools, it no longer made sense to play sports against big universities, teachers colleges, or junior colleges.

During the Great Depression, declining incomes made it harder for smaller colleges to survive, while government programs funded expansion of public junior colleges and teachers colleges. Small, private colleges wanted sport, but only as an educational activity for students. As a result, conferences fragmented. In Illinois, for example, a conference with over twenty members split when ten liberal arts colleges decided to form their own circuit. In Iowa, a similar realignment ensued when small colleges sought to distance themselves from bigger, state-funded institutions.

Seen in this light, what has prompted the most recent wave of realignment, especially the Texas-Oklahoma exodus? The reasons are more complex than can be fully grasped in the moment, but they certainly have something to do with the massive amounts of money generated by electronic media.

In the early 1980s, at a time when the NCAA only allowed one televised football game per week, the Universities of Georgia and Oklahoma sued the NCAA. The Supreme Court ruled in NCAA v. Board of Regents of the University of Oklahoma (1984) that the NCAA had violated antitrust law.[5] The limit on televised games was now gone. It should come as no surprise that around the same time, coaches’ salaries started going through the roof. Although student-athletes were still bound by NCAA regulations, nothing stopped coaches from becoming free agents.

By the early twenty-first century, conferences created their own media empires. The Big Ten Network, for instance, debuted in 2006, and the University of Texas Longhorn Network started five years later. Any conference that could gobble up the most successful and storied programs—or at least those in the biggest media markets—could make a mint. It was an athletic director’s dream come true, and clear evidence that the media tail was wagging the athletic-conference dog.

The so-called power conferences—namely, the SEC, Big Ten, Big 12, Pac-12, and ACC—are made up primarily of big universities that have, historically, seen public outreach as part of their mission. For them, sport is not merely an educational activity, the way it might be for a liberal arts college. I argue in The Rise of Gridiron University: Higher Education’s Uneasy Alliance with Big-Time Football that in the early 1900s some universities so valued intercollegiate athletics that they created athletic departments, led by professional athletic directors and coaches, to administer sports.[6] Subsequently, many larger universities did not retreat from athletic commercialism, but actually embraced it. Arguably, what we now see in 2021 is universities with big-time athletic departments hoping to profit by joining new conferences to form ever-larger sporting economies of scale. Conferences, meanwhile, are happy to oblige.

Where will it stop? No one can predict the future with certainty, but it seems as if the era of athletic-conference octopuses—with tentacles reaching out to gobble up fan bases and media markets all around the nation—has not yet come to a close. Perhaps someday the SEC will no longer be focused on the Southeast, or the Big Ten will reach a point when it can be no bigger. In the meantime, no doubt, plenty of games will be watched on television and streaming services.

Brian M. Ingrassia is Associate Professor of History at West Texas A&M University in Canyon, Texas.

[2] https://www.washingtonpost.com/outlook/2021/08/02/oklahoma-texas-just-dealt-ncaa-major-blow/

[3] https://www.espn.com/college-sports/story/_/id/31679946/supreme-court-sides-former-players-dispute-ncaa-compensation

[4] https://www.oah.org/tah/issues/2016/august/the-job-is-football-the-myth-of-the-student-athlete/

[5] https://www.oyez.org/cases/1983/83-271

[6] https://kansaspress.ku.edu/978-0-7006-2139-2.html


The Day That Shook America – Introduction

J. Samuel Walker’s The Day That Shook America: A Concise History of 9/11 offers a long perspective and draws on recently opened records to provide an in-depth analysis of the approaches taken by the Clinton and Bush administrations toward terrorism in general and Al-Qaeda in particular. The book is a passion project for Sam, and we are honored to offer the Introduction below.

INTRODUCTION

When Charles Falkenberg, his wife Leslie Whittington, and their daughters Zoe, age eight, and Dana, age three, boarded American Airlines Flight 77 on the morning of September 11, 2001, they were embarking on what promised to be a once-in-a-lifetime family adventure. Their flight to Los Angeles from Dulles International Airport, which serves Washington, DC, was the first leg of a trip to Australia, where Whittington had been awarded a two-month fellowship at the Australian National University in Canberra. She was a professor and associate dean at Georgetown University in Washington and had played a key role in building the school’s public policy program. She was an economist whose research interests centered on the impact of tax policies on families. Leslie was a “fun person” with an easy laugh who was exceptionally warm and outgoing. She was a highly regarded teacher who, a former student recalled, “could find humor in economics—which can be rare.”

Charlie Falkenberg was a software engineer who designed programs for analyzing scientific data, especially on environmental issues. One of his projects was collaborating on a study of the long-term effects of the massive Exxon Valdez oil spill that occurred in Alaska in 1989. While in Alaska, he had developed a taste for sockeye salmon, which he enjoyed cooking for friends and neighbors at the family home in University Park, Maryland. Charlie was as outgoing as his wife and was well-known as an organizer of community events, including work parties that periodically cleaned up the creek that ran through the town.

Charlie and Leslie were devoted parents. Zoe’s third-grade teacher, Michele Rowland, remembered her as a “delightful child.” She was an excellent student who loved ballet and participating in school plays. She was highly competitive in the sense that she always wanted to do well. On one occasion, she fell from her scooter and broke her elbow. She had an important standardized test coming up, and as she was being wheeled into surgery, she yelled at her mother, “What am I going to do about taking [the test]?” Dana was a curly haired charmer who customarily wore a smile that filled her face. She liked to dress up in outfits that ranged from a tutu to a feather boa with large sunglasses. She especially enjoyed riding on her father’s shoulders to the nearby elementary school to meet Zoe at the end of the day. Charlie and Leslie stood out among parents as strong supporters of the school. They were active in the PTA, and Leslie often provided crayons, pencils, and other supplies for all the children in Zoe’s class. The entire family was eagerly looking forward to exploring Australia, and the girls were excited about the prospect of seeing kangaroos and koala bears.

Also boarding Flight 77 on the morning of September 11 were five Saudi nationals who were embarking on an adventure of their own that was emphatically sinister. They were operatives of the terrorist network called Al-Qaeda, led by an exiled Saudi living in Afghanistan, Osama bin Laden. They intended to hijack the plane and smash it into the Pentagon building, just outside of Washington, as a way of expressing their hatred of the United States. Between 8:51 and 8:54 a.m., about a half hour after the plane took off, the men moved to carry out their plan. Brandishing knives and box cutters, they herded the flight attendants and passengers to the rear of the plane. They seized the cockpit and disabled—or, more likely, murdered—the two pilots. One of the terrorists, Hani Hanjour, was a trained pilot, and he took over the controls. He turned the plane around from its westward course and headed east toward Washington. As he neared the Pentagon, he gunned the engines. At 9:38, the plane hit the ground floor of the west side of the building at a speed of about five hundred and thirty miles per hour. The impact of the crash instantly killed everyone on board, along with one hundred and twenty-five Pentagon workers.

In the wake of the disaster, a close friend of the Whittington-Falkenberg family, Patrice Pascual, lamented: “They were the kind of people who had no prejudices. That’s part of what makes this so horrible, because they spent their last minutes with people controlled by hatred.” Years later, Judy Feder, a colleague of Whittington at Georgetown, reflected on the fears that must have prevailed on Flight 77 and especially for Zoe and Dana Falkenberg. “I think from time to time—and then try not to think—what it must have been like to be on that plane,” she told a reporter. “I think about those intelligent and inquisitive little girls asking questions and how horrifying it must have been.” Recovery workers at the Pentagon never found Dana’s remains in a condition that could be “individually identified.” They recovered remains that were almost certainly those of Zoe, along with pajamas and a Barbie doll.

On the morning of September 11, I was on a research trip and, at least for a time, oblivious to the tragedies that were taking place at the Pentagon and the World Trade Center in New York City. I was driving from Davidson, North Carolina, to Atlanta, Georgia, by way of Aiken, South Carolina. I planned to have lunch in Aiken with friends and talk about the subject of my research, the Three Mile Island nuclear accident. As I drove through lightly populated areas of South Carolina, I found to my annoyance that there was nothing on the radio I liked. I turned it off and cruised in silence toward my destination. When I got bored enough, I decided to try the radio again in hopes of finding something interesting.

As soon as I turned on the radio, I knew from the tone of the announcer’s voice that something dreadful had happened. She was saying that President George W. Bush had been informed and had left the school he was visiting in Florida. Informed of what? I wondered. I soon found out when the station switched to reporters in New York who were describing the attacks on the World Trade Center. Information was sparse at that point, but it seemed clear that planes had deliberately smashed into the twin towers. As I tried to assimilate this story, the station switched to the news of the strike on the Pentagon. Within a short time, I listened with horror to the live account of the sudden collapse of the south tower. Beset with anxiety and incredulity, I tried to call my wife, who worked in downtown Washington, from a pay phone at a gas station (I had no cell phone). But phone lines were jammed and I could not get through.

There I was in the middle of nowhere, worried and helpless, with nothing to do but drive on. When I reached the home of my friends in Aiken, I was able to reach my wife. She had made it home safely, though her eight-mile trip had taken a very long time. She had talked with our children and other members of my family, and she could assure me that everyone was fine.

After lunch, I drove to Atlanta. I spent the evening watching news reports of the sorrowful events of September 11. The following day, I conducted research at the Jimmy Carter Presidential Library and then retreated to my hotel to catch up on the news. The television networks ran streamers that listed the names and hometowns of the victims of the terrorist attacks, and I was unpleasantly jolted to see that the list included a family from my hometown. I live in University Park, Maryland, and although I did not know Charlie Falkenberg, Leslie Whittington, or their girls personally, it was shocking and saddening to see their names. University Park is a small and close-knit community, and the deaths of neighbors who lived just two blocks away added a personal dimension to the melancholy story of 9/11.

At the time and in later years, I have been troubled by a number of questions about the disaster that occurred on “The Day That Shook America” (as the cover of People magazine labeled it). What were the purposes of the attacks? Why did US intelligence agencies and the Defense Department, with annual budgets in the hundreds of billions of dollars, fail to protect the country from a small band of terrorists who managed to hijack four airliners and take the lives of thousands of American citizens? What did responsible government agencies and officials know about Al-Qaeda and why did they not do more to head off the threat it posed? What were US policies toward terrorism, especially under Presidents Bill Clinton and Bush, and why did they fall so far short of defending against a series of attacks? Was the tragedy of 9/11 preventable? And what was the long-term impact of the strike against America on that terrible day? Those are the most important questions that this book tries to answer.

J. Samuel Walker is a professional historian and the author of, among other titles, Three Mile Island: A Nuclear Crisis in Historical Perspective; Prompt and Utter Destruction: Truman and the Use of Atomic Bombs against Japan; Most of 14th Street Is Gone: The Washington, DC Riots of 1968; and The Road to Yucca Mountain: The Development of Radioactive Waste Policy in the United States. He lives in the Washington, DC, area.

In a New York Minute: The Rise of Kathy Hochul, the First Female Governor of New York

by Kaitlin Sidorsky, author of All Roads Lead to Power: The Appointed and Elected Paths to Public Office for US Women

Governor Kathy Hochul was sworn in as the fifty-seventh governor of New York State, following in the footsteps of Theodore Roosevelt, Franklin Roosevelt, and Nelson Rockefeller. In fact, Governor Hochul follows an unbroken line of men: she is the first female governor in New York State’s history, joining thirty other states that have had a woman in the governor’s mansion. California, Florida, and Pennsylvania are among the nineteen states that have never elected a woman as governor, over one hundred years after women were given the right to vote. Like many female politicians, Governor Hochul has an impressive political résumé. Hochul began her political career on the Hamburg Town Board, then rose from the county clerkship of Erie County to become a member of the US House of Representatives and, most recently, lieutenant governor of New York.

Hochul’s progression to governor following Andrew Cuomo’s resignation is significant because her highly visible position in New York State government provides opportunities for a different governing style and a focus on new issues; she also becomes a more noticeable role model to inspire young girls and women, who may now consider politics a viable career. Governor Hochul has already signaled that she plans on governing differently than her predecessor, particularly by making sure that women feel safe in her administration. In fact, Hochul’s formal ceremony welcoming her as the new governor was full of symbolism accentuating the historic moment of her ascension to the highest office in the state. As the New York Times reported, “In honor of the women’s suffrage movement, Ms. Hochul wore an all-white dress, as did her daughter, Katie, and her daughter-in-law. Judge DiFiore donned robes worn by the first woman to serve as a judge on the state Court of Appeals, Judith Kaye, Ms. Hochul noted. And the governor called on female reporters to ask the first three questions at the news conference.”

Hochul enters the governor’s office under challenging circumstances. A global pandemic rages, polarization runs rampant, and she is the head of a fractious, even combative, state government apparatus. This workplace friction is the antithesis of what we know of female officeholders, many of whom are more cooperative and compassionate than their male counterparts. Currently, all three of the top elected positions of New York State government are held by women (Hochul as governor, Andrea Stewart-Cousins as acting lieutenant governor, and Tish James as attorney general), an incredibly rare circumstance in American government. Below I outline what we might expect from female leadership in the executive branch at the state level.

Although research is limited on the effect of gender vis-à-vis governing styles, we do know of a few differences between male and female legislators that may be relevant for female governors and state executives. Female legislators are typically more liberal than their male counterparts and focus more on “female” issues like education, welfare, and health care (Barrett 1995; Burrell 1997; Diamond 1977; Reingold 2000). One study (Heidbreder and Scheurer 2013) reached similar conclusions for female governors between 2006 and 2008. In some ways this may be unavoidable for Hochul as she addresses the pandemic and its effects on the education system and welfare programs. The question becomes whether Hochul and her female counterparts respond differently than male governors to similar challenges across the fifty states.

When it comes to how gender may influence governors, we know that female governors are more likely to appoint women than male governors, which is important to understanding how women serve beyond elected office (Riccucci and Saidel 2001; Sidorsky 2019). In my 2019 book with Kansas, All Roads Lead to Power, I demonstrate the incredibly important role state-level appointees play in state government. This understudied population provides a wealth of services to a state, with more than a third of appointees having held some kind of public office prior to their current position. Hochul has already committed to appointing more women to high-profile positions, which may be key to helping her change the culture of New York State government.

One of the most positive effects of Hochul’s presence will be to promote politics as a career path for women. A few studies have asked whether a woman in political office becomes a role model for young girls and women. In one of the first studies of the role-model effect, David Campbell and Christina Wolbrecht found that visible and viable female candidates for high-level office result in more young girls wanting to be politically active (2006). Research from 2018 also showed that the presence of a woman as governor increases the number of women who run for the state legislature (Ladam, Harden, and Windett 2018), although this may only be true for the Democratic Party (Manento and Schenk 2021). In All Roads Lead to Power I find a deep-seated distaste for elected office among female appointees across all twenty states surveyed. The presence of a highly visible and successful woman in elected office may be needed to prove to women that government is a place where they can be successful.

New York State could definitely use more women in public office. Ranked sixteenth in the nation for women’s representation in the state legislature, the state is far from gender parity, with only 34.3 percent of legislators being women (Center for American Women and Politics 2021). Representation is even worse at the local level: women hold 28.6 percent of municipal positions in New York. Governor Hochul’s presence will not automatically fix the gender parity issue in New York, nor will it provide the changes needed in the government’s culture. However, her rise to governor is an important step in the fight for gender parity. Only future research on her tenure as governor can tell us whether her gender identity influenced her leadership style and legacy as the first female governor of New York State.

References and Recommendations for Further Reading

Barrett, Edith J. 1995. “The Policy Priorities of African American Women in State Legislatures.” Legislative Studies Quarterly 20, no. 2 (May): 223–247.

Burrell, Barbara. 1997. “The Political Leadership of Women and Public Policymaking.” Policy Studies Journal 25 (4): 565–568.

Campbell, David E., and Christina Wolbrecht. 2006. “See Jane Run: Women Politicians as Role Models for Adolescents.” Journal of Politics 68, no. 2 (May): 233–247.

Diamond, Irene. 1977. Sex Roles in the State House. New Haven: Yale University Press.

Dickes, Lori A., and Elizabeth Crouch. 2015. “Policy Effectiveness of U.S. Governors: The Role of Gender and Changing Institutional Powers.” Women’s Studies International Forum 53 (November–December): 90–98.

Heidbreder, Brianne, and Katherine F. Scheurer. 2013. “Gender and the Gubernatorial Agenda.” State and Local Government Review 45, no. 1 (March): 3–13.

Ladam, Christina, Jeffrey J. Harden, and Jason H. Windett. 2018. “Prominent Role Models: High‐Profile Female Politicians and the Emergence of Women as Candidates for Public Office.” American Journal of Political Science 62, no. 2 (April): 369–381.

Manento, Cory, and Marie Schenk. 2021. “Role Models or Partisan Models? The Effect of Prominent Women Officeholders.” State Politics & Policy Quarterly 21, no. 3 (September): 221–242.

Reingold, Beth. 2000. Representing Women: Sex, Gender, and Legislative Behavior in Arizona and California. Chapel Hill: University of North Carolina Press.

Riccucci, Norma M., and Judith R. Saidel. 2001. “The Demographics of Gubernatorial Appointees: Toward an Explanation of Variation.” Policy Studies Journal 29 (1): 11–22.

Sidorsky, Kaitlin. 2019. All Roads Lead to Power: The Appointed and Elected Paths to Public Office for US Women. Lawrence: University Press of Kansas.

Kaitlin Sidorsky is assistant professor of politics at Coastal Carolina University, Conway, South Carolina. Her work has appeared in Political Research Quarterly.

George W. Bush: Public Defender

John Robert Greene, author of The Presidency of George W. Bush

Until recently there has been an unwritten arrangement among the members of “The Club”—that exclusive group of, to this point, men who have served as president of the United States. That agreement, based on a fundamental understanding of the weight and responsibilities of the job, has kept a former president from criticizing any of his successors too openly or too sharply. Having walked a mile in the same shoes, a former president agreed to be rarely seen—except when replenishing his coffers on the speaking circuit—and even more rarely heard in public. The Club’s tacit gag order does not extend to the rough and tumble of a presidential campaign—in the modern period, Harry Truman openly campaigned against Dwight D. Eisenhower in 1956; Gerald R. Ford against Jimmy Carter in 1980; and George H. W. Bush against Bill Clinton in 1996 and Barack Obama in both 2008 and 2012. Rather, it applies to the period when a successor is actually in office and affecting decision-making. The vast majority of our former presidents have honored that compact, allowing their successors to govern without having to endure much sniping from those who had done the job before. There were exceptions, of course, the most glaring being Herbert Hoover, who began his public criticism of the New Deal within days of Franklin D. Roosevelt’s March 1933 inauguration and continued without letup to criticize FDR and his policies for the entirety of his administration. The second exception was George W. Bush, the subject of my new book for the University Press of Kansas, to be released in the fall of this year.

Initially, Bush did not deviate from the expectations of postpresidential decorum. With but a few exceptions, Bush kept his distance from the Obama administration, choosing not to criticize his successor or his administration in public. Instead, Bush took a more solitary road. He moved to a $3 million home in a Dallas suburb and worked on the building of the George W. Bush Presidential Center (which opened to researchers on May 1, 2013). He also stretched his wings as a writer, penning a second volume of memoirs, Decision Points, in 2010 as well as a biography of his father in 2014. Bush then combined his interest in writing with a newfound retirement hobby—portrait painting—and in 2017 he released Portraits of Courage: A Commander in Chief’s Tribute to America’s Warriors, a collection of portraits of veterans of the War on Terror. Bush was also ubiquitous on the speaking circuit—between January 2009 and June 2015, he made at least 200 paid speeches, earning between $100,000 and $175,000 per appearance. But most of these speeches were delivered in private—conventions, meetings of businesses and organizations, and the like. One would be hard-pressed to find a public statement of opposition to the Obama administration made during Obama’s presidency. Indeed, the two men became unexpectedly close, with Bush going out of his way to applaud Obama in the White House on May 31, 2012, when he and his wife, Laura, unveiled their official portraits; for his part, Obama used that occasion to be effusive in his praise of both his predecessor and his wife.

That would all change with Donald Trump, beginning with the 2016 primaries. In an effort to establish himself as the scion of the isolationists, Trump used Bush and his administration as a punching bag, blaming Bush for the attacks of September 11, 2001 (during a February debate: “The World Trade Center came down during the reign of George Bush. He kept us safe? That’s not safe.”). That, combined with many personal attacks on Jeb Bush, the former president’s brother and Trump’s opponent in the primaries (calling him “Low Energy Jeb”), earned Trump the contempt of the forty-third president. Bush appeared with his brother in South Carolina in an attempt to save his campaign, but to no avail. In the fall campaign, as Trump was buffeted with charges of personal malfeasance, Bush chose to sit it out, as he had in 2008, perhaps assuming that Hillary Clinton would, as virtually every pollster predicted, emerge victorious. When she did not (according to one biographer, the senior Bush voted for Clinton, and the younger Bush voted for “None of the Above”), Bush dutifully attended Trump’s 2017 inauguration. But when Trump’s inflammatory inaugural address was concluded, Bush turned to the Clintons and said in a stage whisper, “That was some weird shit.”

It took Bush about a year and a half to speak out publicly against President Trump. But when he did, he became not only the first former president in almost twenty years to publicly criticize both his successor and that successor’s policies (making him the first president in modern memory to break the rules of “The Club”) but also the first former president in our history to speak out against a successor of his own party. There can be no doubt that this was personal. On October 19, 2017, Bush spoke at a conference sponsored jointly by the Penn Biden Center for Diplomacy and Global Engagement, the George W. Bush Institute, and Freedom House. Without mentioning Trump by name, Bush denounced the “casual cruelty” of modern political dialogue and spared few words in showing his contempt for his successor: “Bullying and prejudice in our public life sets a national tone, provides permission for cruelty and bigotry and compromises the moral education of children.” But Bush’s criticism was also highly political, as he once again showed the pains he took throughout his entire career to navigate the ever-widening gap between the moderate wing of his party (as evidenced by the policies of his father) and the GOP’s conservative wing. This led him to single out a Trump administration policy that troubled moderates and conservatives alike—the administration’s intolerant xenophobia. At the same conference, he observed: “We’ve seen nationalism distorted into nativism, forgotten the dynamism that immigration has always brought to America. We see a fading confidence in the value of free markets and international trade, forgetting that conflict, instability, and poverty follow in the wake of protectionism. We’ve seen the return of isolationist sentiments, forgetting that American security is directly threatened by the chaos and despair of distant places.” For his part, Trump gave as good as he got. On April 13, 2018, he pardoned I. Lewis “Scooter” Libby, a key advisor to Vice President Dick Cheney who had been found guilty of lying to federal investigators and obstructing justice in connection with the Plame-Wilson affair. Cheney had lobbied Bush hard for a pardon of Libby, but Bush refused. As a result, Trump’s pardon of Libby was widely seen as just one more slap in Bush’s face.

But Bush was not done. In an April 28, 2020, interview for CBS News, Bush told Norah O’Donnell that “I think it’s undignified to want to see my name in print all the time” and that “to me, humility shows an understanding of self.” Less than a week later, in a recorded message, Bush called for national unity during the COVID-19 surge (“We are not partisan combatants . . . we’re human beings”). An obviously irritated Trump responded, as was his wont, with a tweet: “Oh, bye [sic] the way, I appreciated the message from former President Bush, but where was he during impeachment calling for putting partisanship aside. He was nowhere to be found ir [sic] speaking up against the greatest hoax in American history.” Bush would later tell an interviewer that in November 2020, he went to the polls and wrote in the name of Condoleezza Rice, his former national security advisor and secretary of state, as his choice for president.

The Bush-Trump feud did not end with Trump leaving office. On April 20, 2021, while promoting his new book Out of Many, One: Portraits of America’s Immigrants on NBC’s Today Show, Bush returned to his earlier criticism, describing the condition of his party as left by Trump as “isolationist, protectionist, and to a certain extent, nativist.” While he did not mention Trump by name, Bush said that the January 6, 2021, insurrection against the US Capitol “made me sick” and was a “terrible moment in our history.” When asked specifically if Trump was to blame for the riot, Bush claimed that “I’m not going to cast blame” but then went on to say that “It’s an easy issue to frighten some of the electorate. And I’m trying to have a different voice.” For some in the Republican Party, Bush’s criticism was too little, too late. His description of his party as “nativist” brought a tweeted response from Joe Walsh, a former member of the House from Illinois and frequent Trump critic: “What the f— [sic] George W. Bush? Like Boehner, you come out NOW and speak out against Trumpism? NOW? So many of us former Republicans lost everything publicly opposing Trump these past few years, yet you said and did nothing. And NOW you speak?”

But Trump is not Bush’s only presidential target. Where Bush largely gave Barack Obama a free pass, not so President Joe Biden. Indeed, Biden’s announcement that he was going to withdraw all American troops from Afghanistan, a withdrawal that was set to be completed on September 11, 2021, drew Bush’s immediate ire. On July 14, 2021, in an interview with Germany’s Deutsche Welle News, Bush called the withdrawal a “mistake.” He argued that the withdrawal would endanger countless civilians, noting that “Afghan women and girls are going to suffer unspeakable harm” and that they and others were “just going to be left behind to be slaughtered by these very brutal people, and it breaks my heart.” Bush was immediately criticized by those who believed that the policies of his administration were responsible for the disasters in Afghanistan and Iraq in the first place, and that Bush should not be criticizing any decision that would end what had become America’s longest war in Afghanistan. On Al Jazeera, Andrew Mitrovica pulled no punches: “Bush is a mass murderer. He should be sharing a bunk bed with Ratko Mladić at The Hague, not giving interviews on Afghanistan in Maine.”

All this brings one of my basic conclusions in The Presidency of George W. Bush into high relief: some twelve years after the end of the Bush presidency, we continue to live in a world made by George W. Bush. When it comes to the personality of Donald Trump, as well as the policies of both the Trump administration and the Biden administration, Bush doesn’t seem to like the politics of that world very much. Indeed, his criticism of both Trump and Biden is sharper and more public than any criticism dished out by another former president (save Hoover) against a sitting president. Bush has become his administration’s own public defender. But in so doing, he has run afoul not only of the opposition Democrats but also of another opposition party: the Trump wing of his own Republican Party. Whether Bush’s attempt to defend the policies of his administration from assaults by both these opponents will succeed remains to be seen as the nation gears up for the 2024 presidential election.


John Robert Greene is the Paul J. Schupf Professor of History and Humanities, Cazenovia College, and the author of I Like Ike: The Presidential Election of 1952; The Presidency of George H. W. Bush, Second Edition, Revised and Expanded; Betty Ford: Candor and Courage in the White House; and The Presidency of Gerald R. Ford, all from Kansas.


The All-Too-Predictable Afghanistan Outcome

by Paul Darling, author of Taliban Safari: One Day in the Surkhagan Valley

The meteoric collapse of the Afghan National Security Forces (ANSF) is cause for even the most pessimistic observers of Afghanistan to reflect upon the myriad failures of NATO, and of the United States in particular, that led to the disaster unfolding largely silently today. The media, shameless in their breathless and unending coverage of the Fall of Saigon nearly fifty years ago, are unsurprisingly restrained in their coverage today. Some networks have eschewed any coverage at all. The military leadership, too, sees no reason to discuss the current happenings, as it is just as culpable as, if not more culpable than, the political leadership. Indeed, Afghanistan, while unquestionably a massive failure at every level, has been of great benefit to those responsible for its lack of success.

In the course of two decades, captains have become colonels, colonels have become generals, and generals have become pundits, political players, and board members of very successful defense contractors. One would have to search most diligently, and entirely in vain, to find a single military leader held accountable for failing to win our nation’s longest war against our least capable foe. Yet we must still listen to their musings on how they will defeat China.

As a mere tactical player in yet another chapter of the great game played in the mountains and deserts in and around the Hindu Kush, I can state unequivocally that our tactical supremacy was unquestioned. The Taliban could never stand and fight, and they rarely tried. But, as Sun Tzu so cleverly observed, tactics without strategy are simply the noise before defeat. So the question must be asked: Did we lack a strategy, or did we have the wrong one?

To fast-forward a few thousand years, we must crawl into the depths of Clausewitz to tear apart the answer to that question. War is policy by other means. And strategy is the design by which conflict (in this case, primarily armed conflict) enacts that policy. If our policy was the defeat of the Taliban as a challenge to the Islamic Republic of Afghanistan, then our strategy of building schools for girls, unquestioning support for corrupt governments in Kabul, and forced multiculturalism in the heart of the Pashtun part of Afghanistan was the wrong strategy. It appears our concept of operations was to make Afghanistan an experimental playground for social engineers to create a model Afghanistan based upon lofty ideals hatched in the halls of Western universities rather than the dusty realities to be found in the obviously untraveled expanses outside Kabul.

We clung to Kilcullen’s myopically contrived theories with simplistic slogans like “hearts and minds,” thinking that the key to defeating the Taliban was ignoring them. Like a cargo cult, we created an Afghan army that had all the trappings of an effective fighting force save for the actual fighting. We gave them the planes, guns, helicopters, and armored vehicles that they quickly abandoned to an enemy equipped with seventy-year-old guns and one-dollar plastic shoes from China. And that was with a 5–1 numerical advantage, while supposedly fighting in the defense.

All insurgencies, inherently weak militarily, hinge upon one inviolable requirement: a refuge. Mao spoke of the insurgent “swimming among the people like fish in the water.” For Mao, the population was his refuge. The Taliban had no need for such quaint slogans or ethereal concepts. Pakistan has stood untouched for twenty years as the requisite refuge for the Taliban. And we did nothing.

The idea that Pakistan was ever an ally or even a disinterested party is, in retrospect, an absolutely failed concept. We have fought a twenty-year war against Pakistan and paid them handsomely to do so. The inability or unwillingness of our political and senior military leaders to address this fact is a failure bordering on treason.

At this point, it should be intuitive to even the casual observer that what we are witnessing now is a Pakistan-led invasion of Afghanistan, waged not by proxy forces but by mercenary forces. Pakistan is paying the Taliban to fight. And fight they are. This is why the ANSF is crumbling across Afghanistan, and not only in Pashtun areas. The Taliban is the true multicultural army in Afghanistan: Tajiks and Pashtuns (with Uzbeks and Turkmen along for the ride) united by the understandable desire for money and future control of the various provinces.

But Pakistan was not alone in this venture to humiliate America. China has largely subsidized the substantial costs involved in defeating NATO and ISAF. Pakistan (which gave its geopolitical ally a complete F-16 Fighting Falcon many years ago) and China are united by many factors, primarily their distrust of India. Theirs is a natural partnership. The Cold War dynamic of Pakistan allying with America to counter a then pro-Soviet India has been dead for thirty years. Tragically, the octogenarian “experts” who continue to opine on such subjects (Kissinger being foremost among them) live on, along with their long-expired opinions masquerading as policies.

This may be but one of the whys. I believe it to be the primary one, but another lurks and must be addressed, for it will be the point of our next inevitable policy failure. Pakistan is a nuclear state. While our unwillingness to address Pakistan’s aid in killing Americans may simply have been abject stupidity on the part of our diplomatic and political elites, it may well have been a fear that Pakistan’s nuclear arsenal might slip into the hands of the various Islamic terrorist organizations Pakistan (and its allies across the Persian Gulf) still openly supports. This is not an insignificant fear. However, if the possession of nuclear arms gives any nation so endowed free rein to kill Americans, the current administration’s apparent desire to bequeath this capability to Iran (much as Obama’s administration wished to do) must be viewed as absolutely insane. Even if Pakistan’s carte blanche did not hinge upon its possession of atomic weapons, Iran will most certainly assume that it did. So not only are we apparently giving Iran a nuclear weapons capability, we are also, through our humiliating failure in Afghanistan, giving Iran a green light to make good on its weekly prayer of “Death to America.”

Those who ignore history are doomed to repeat it. Our efforts in Afghanistan are quickly becoming an ignoble historical fact. Will our self-anointed “elites” learn from this oh-so-near history? I fear not.

Paul Darling, Lieutenant Colonel, US Army (retired), lives in Kansas City, Missouri, and is both the father and the son of combat veterans. His writing has been published in various venues, including Defense News, Proceedings, Military Review, Armed Forces Journal, and Air and Space Power Journal.

Still Fighting about Birthright Citizenship

by Carol Nackenoff & Julie Novkov, authors of American by Birth: Wong Kim Ark and the Battle for Citizenship

What is the importance of the Tenth Circuit’s June 15, 2021, decision in Fitisemanu v. United States, which held that American Samoans, residents of an “unincorporated territory” of the US, are not entitled to US citizenship? Is Fitisemanu the opening salvo in a broader attempt to get the federal courts to revisit the issue of birthright citizenship that the Supreme Court appeared to settle in United States v. Wong Kim Ark (1898), and about which we have recently written in American by Birth: Wong Kim Ark and the Battle for Citizenship (University Press of Kansas, June 2021)?

Judge Lucero’s majority opinion in Fitisemanu (20-4017) held that the issue raised by the American Samoan individuals seeking citizenship was not resolved by the citizenship clause of the Fourteenth Amendment, which dictates that “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the state wherein they reside.” The Circuit Court’s majority opinion reversed a 2019 decision by the District Court of Utah that had been stayed pending the appeal. Judge Lucero held that Congress had the authority to determine the status of individuals born in territories acquired by the United States at the turn of the twentieth century, as it had when it granted US citizenship to most Hawaiians (though not to ethnic Chinese residents) and when it granted citizenship (and later, birthright citizenship) to Puerto Ricans. Congress had considered granting American Samoans US citizenship in the 1930s but declined to do so: the measure to grant them citizenship passed the Senate but failed in the House. These Pacific Islanders are considered “non-citizen nationals,” and as such, they can live and work, but not vote, in the United States.

The lower court’s ruling, which granted birthright citizenship to individuals born in US territories, was based upon the Fourteenth Amendment. The District Court opinion identified Wong Kim Ark as the binding precedent and accepted the plaintiffs’ argument that they were born “within” the United States and thus entitled to citizenship; the Tenth Circuit, relying on the Insular Cases (1901), took the position that the Constitution (and the Fourteenth Amendment) did not necessarily “follow the flag.” The University Press of Kansas has published an excellent treatment of the Supreme Court’s decisions surrounding the rights of residents of territories acquired by the US in the Spanish-American War in Bartholomew H. Sparrow’s The Insular Cases and the Emergence of American Empire.

While the case itself deals with a small number of individuals in unusual circumstances, its implications are broader. Recent efforts by the Trump administration and immigration restriction advocates to limit birthright citizenship, especially for children born in the United States to undocumented residents, loom in the background. Through administrative action, the Trump administration tried to restrict “birth tourism,” the organized and lucrative practice of arranging visits that included shopping and trips to Disneyland for pregnant women so that they could give birth in the United States, conferring US citizenship on their newborns. Building a wall served as a symbol of the resolve to deter would-be US entrants from crossing the border, and the Trump administration’s decision to separate children and parents who did manage to cross the border was designed as another deterrent. But the executive branch is limited in what it can do to end birthright citizenship—for now.

Congress, as the Circuit Court majority in Fitisemanu pointed out, also has some authority over citizenship, especially for those born outside the territorial boundaries of the United States to non-citizen parents. Congress may decide who can naturalize, and it exercised this authority to deny that opportunity to the Chinese in America from 1882 until 1943. Some have argued that Congress should thus be able to give or withhold its consent to the incorporation of a group as citizens (see Elk v. Wilkins, an 1884 case involving Native Americans that states “no one can become a citizen of a nation without its consent”; the General Allotment Act of 1887; and the Indian Citizenship Act of 1924). This position has been staked out by Peter Schuck and Rogers Smith in their 1985 book, Citizenship without Consent, and a subsequent article, where they claim that since illegal aliens were not a recognizable category at the time the Fourteenth Amendment was adopted, and since we cannot completely recover the intent of the framers of the Constitution or of the Fourteenth Amendment on this matter, the decision should be left to the people’s elected representatives in Congress. They reason that, in a liberal polity, the people ought to be allowed to give or withhold consent to membership, and Congress is the appropriate institution to make these determinations.

Drawing an analogy between the processes by which Native Americans were made citizens (including the consent-of-the-nation statement in Elk v. Wilkins) and the situation of those born in the United States to non-citizen parents is faulty for a couple of reasons, however. First, Native Americans born into a tribe had been, since the time of Chief Justice John Marshall, considered members of distinct political communities and of domestic dependent nations. In the reasoning of nineteenth-century jurists, they were in a state of pupilage, and it required action by the United States to change that status. The lands on which tribal Indians lived were not simply part of the United States, and moving to an individual homestead away from the tribe did not erase tribal membership (although the General Allotment Act, passed in part to remedy that situation, did envision citizenship for those Native Americans who took up individual land allotments and lived and worked on them). These points surely could not be made with regard to the sons and daughters of English, Swedish, or even Irish, Italian, and other immigrants who came to the United States in the nineteenth century and whose offspring, born on US soil, had long been considered birthright citizens, whether or not the parents naturalized. Congress did not find it necessary to make the American-born sons and daughters of Caucasian or white immigrants citizens by birth (although an act of Congress at one time denationalized white women who married foreign men, rendering some of them stateless). English common law, recognized in early US court cases, and precedent made them birthright citizens.

Second, it is important to note that Justice Gray, who wrote the majority opinion in Elk v. Wilkins granting Congress the power to determine Native American citizenship, also wrote the majority opinion in Wong Kim Ark. As Gray, an acknowledged expert on the law of sovereignty, explained, the fundamental principle of the common law with regard to English nationality was “birth within the allegiance . . . of the King.” He noted that this principle had been followed almost without exception by courts dealing with controversies in the colonies and the United States, even before passage of the Fourteenth Amendment. And while all conceded that the purpose of the Fourteenth Amendment was to extend citizenship to formerly enslaved Black people and their descendants, Justice Gray’s opinion pointed out that the amendment’s language was “general, not to say universal, restricted by place and jurisdiction and not by color or race.” The purpose of the text of the citizenship clause, he argued, was to carve out some very narrow exceptions to the general principle of extending jus soli citizenship (citizenship derived from birthplace), exceptions chiefly relating to children of foreign ambassadors, children of military enemies in hostile occupation, and children born on foreign public ships. Others born here were subject to the jurisdiction of the United States in the ordinary sense of being under obligation to obey US laws. And in the years after the decision, despite increasing concern about the new legal category of illegal immigration, Congress never passed any laws purporting to limit citizenship only to the descendants of legal US residents.

The federal courts have been key players in battles over citizenship. Following United States v. Wong Kim Ark, the Supreme Court upheld the principle of birthright citizenship twice more by the mid-twentieth century. Would a now-conservative Supreme Court overturn this decision? We think it highly unlikely. It is possible, however, that the current Court would entertain an argument, if a case were presented, that Wong Kim Ark, which concerned a man whose parents were living and working in the US with permission, did not raise or cover the case of those whose parents were living here illegally.

In American by Birth, we examine the history of birthright citizenship in the United States. The book explains how the Fourteenth Amendment was read to codify that history, extending the principle of citizenship by birth in a nation’s territory even to the American-born sons and daughters of Chinese immigrants, who were so vilified that the parents were barred from naturalization by the 1882 Chinese Exclusion Act. After being rebuffed in the federal court system, nativists and immigration restrictionists turned to other routes to keep undesirables from entry and subsequent citizenship. The Asiatic Barred Zone Act (1917) and the quota system adopted in the immigration restriction measures of 1921 and 1924 were noteworthy; during the Depression, efforts also included “encouraging” the out-migration of Mexicans (a not insignificant portion of whom were born here) who were seen as drains on public resources. The rigid application of the quota system was also an important factor behind US inaction in the face of the humanitarian refugee crisis during the Holocaust. The desire to keep America white strongly influenced how territorial acquisitions were treated. With a noteworthy increase in immigration from Mexico, Central America, and the Caribbean beginning around the 1970s, interest in policing the southern border and in revisiting birthright citizenship increased. American by Birth brings the examination of birthright citizenship up through efforts during the Trump administration to change current law and practice, and with them the reading of the Fourteenth Amendment embraced in Wong Kim Ark. The United States is not alone among nations with generous citizenship provisions in experiencing backlash, and in the final chapter we consider the different methods by which American opponents of birthright citizenship are trying to effect change now.

Carol Nackenoff is the Richter Professor of Political Science at Swarthmore College.

Julie Novkov is interim dean of the Rockefeller College of Public Affairs & Policy and professor of political science and women’s, gender, and sexuality studies at the University at Albany.