The U.S. Has Become the Cultural Equivalent of a Big Box Store. Localism Can Help Americans Feel More at Home. 

(Photo by Steve Banfield)

Athens, Georgia

There was a fun article in the Süddeutsche Zeitung earlier this month on immigrant nurses in Germany's Swabian mountains.1 Germany has a severe nursing shortage and has been recruiting health care workers from around the world. The small town of Markgröningen, about a half-hour drive north of Stuttgart, has welcomed nurses from countries as far away as Kenya, India, and Mexico.

While foreign recruits have long been obliged to take German language classes to prep for their jobs, regional medical authorities have recently added a three-day course in Swabian. That's because nurses were having a hard time understanding patients who prefer to speak dialect rather than standard German. And not only do the nurses learn the Swabian words for "to prick" or "to annoy," they're also taught local proverbs and sayings that sound ludicrous to the foreign ear.

According to a 2009 study by Mannheim University's Institute for the German Language, about 60% of Germans speak one regional dialect or another. (A dialect, as the old quip goes, is a language that doesn't enjoy the support of an army or a navy.) Of those who speak dialect, almost half (45%) say they do so "always" or "often."2

I love this story, because I'm a big believer in the virtues of localism. I believe expressive freedom is found more in the small places out of sight and reach of national political power. I think the increasing centralization of government in the U.S. has diminished regional distinctiveness and made the country the cultural equivalent of a big box store.

Too many cultural dictates originate from the top of the political pyramid, and that is exactly where a functioning culture should not be born. A healthy culture has little to do with what Americans tend to obsess over—identity, power, or pride. More fundamentally, culture is about belonging, intimacy, meaning, and enjoying a sense of place: your place, your neck of the woods, your little corner of the world.

German offers one powerful example of a different way of envisioning the relationship between province and nation: the word Heimat ("homeland") generally refers to one's home region, not to the entire country. The term implies deep attachment and affection. What this tells me is that federalism is not just a way of structuring layers of government; it can also be a model for how to understand the proper relationship between regional and national cultures.

The best way to encapsulate the meaning of federalism is that it “has to do with the need of people and polities to unite for common purposes yet remain separate to preserve their respective integrities. It is rather like wanting to have one’s cake and eat it too.”3

***

We've all heard of companies that are said to be too big to fail. Well, I think the U.S. is too big to love. More and more Americans are giving away their little pieces of cake: they now think American culture is either entirely about abstract ideals or resides in the District of Columbia. In either case, their idea of Americanness is increasingly divorced from the things and people and places with which they are most familiar, the very things that give them comfort and inspiration.

Sadly, the findings of the recent Gallup poll on patriotism suggest that our affection for our country depends on our feelings about the current occupant of the White House. Is that all America is? Politics? A single political regime? That's a remarkably flimsy foundation for one's feelings about one's homeland.

***

Over the past several years, I've spent a lot of time in the American South, and I'm fascinated by the way Southerners' regional identity serves as a prism through which they see their Americanness. The regional accent, which many see as a sign of their distinctiveness, is not entirely unlike a dialect in Germany. It's a symbol of home, comfort, rootedness, and, if not of class, then of a memory of simpler, premodern times.

I've been struck by how many educated people I've met at and around the University of Georgia who either hide their accents or use them only on certain occasions. Those who hide their accents tend to do so out of fear of drawing the condescension of non-Southerners. Those who use them selectively tend to do so when they're tired, tipsy, sick, being affectionate, or trying to get out of a speeding ticket.

One young woman I know who grew up in a small farming town in central Georgia made a big effort to lose her thick drawl while she was an undergraduate. Not unlike an upwardly mobile child of immigrants, she sees her unaccented English as a sign of her education, of having become a mainstream middle-class American. Still, like so many, she peppers her speech and text messages with "y'all" and "'ppreciate ya." There are times she'll jokingly put on a heavy drawl when saying certain words like "Southern" or "friend." Her ambivalence about her regional accent is telling. A recent study suggests that the Southern accent is fading in Georgia.

The point I'm making is that there is an idea in the land that 340 million Americans should adhere to a standardized mainstream culture. Even efforts to promote "diversity" or "inclusion" only seek to ensure the statistical integration of various groups within a uniform cultural model. They don't encourage a broader form of pluralism, one that allows institutions to emerge from the distinctive history and demographic makeup of a particular region. They promote diversity within organizations, not among them. Frameworks like DEI, for instance, don't properly respect the significance of, say, non-diverse historically black colleges in the South.

Indeed, whenever I hear the terms Diversity, Equity, and Inclusion, I think of that 1960s craze in which college students tried to pile as many people as they could into a Volkswagen.  But what about all the Americans who have no desire to squeeze into that Volkswagen? What about those who’d prefer to ride in a Hyundai, a Honda, or a roomier Lincoln Continental?

Standardization stifles creativity. It breeds placelessness and alienation. Most importantly, it doesn't nurture the kind of community that people need. For a country whose political rhetoric pays constant lip service to the importance of choice, America has become a one-size-fits-all culture.

What I'm advocating is not multiculturalism, in which distinct cultural or linguistic groups are encouraged to stay in their own lanes. To my mind, what America needs is more respect for regional integrity: to allow the people in its many places to nurture and maintain what is distinctive about their homelands, so that when newcomers inevitably resettle there, becoming American will also mean learning the ways of the locals.

  1. Christina Lopinski, “‘Mei Knie tut weh wie d’Sau’; Warum spricht der Patient im Bett von einem Schwein? Viele ausländische Pflegekräfte tun sich schwer mit dem Dialekt Ihrer Patienten. In Markgröningen lernen sie jetzt Schwäbisch,” Süddeutsche Zeitung, July 19, 2025. ↩︎
  2. Ludwig M. Eichinger et al., "Aktuelle Spracheinstellungen in Deutschland: Erste Ergebnisse einer bundesweiten Repräsentativumfrage," Institute for the German Language, University of Mannheim, 2009, 11, 16. ↩︎
  3. Daniel J. Elazar, Exploring Federalism (Tuscaloosa: The University of Alabama Press, 1987), 33. ↩︎

Don’t Let the Constitution’s Universalist Language Fool You.  It’s a Political, Not a Sacred Document. 

The Cloister of the Basílica de San Isidoro, León, Spain. (Photo by Gregory Rodriguez)

Madrid, Spain

You know how learning about ancient cultures can give you insights into the mysterious habits of contemporary humans?  Well, the same goes for learning about complex modern institutions.  Pondering their pre-modern origins can help bring their fundamental mechanics to light. 

It was in this spirit that I boarded a train to visit a medieval sandstone cloister in the northwestern Spanish city of León. Believe it or not, I was hoping my quick trip to the Basílica de San Isidoro would teach me a few things about, well, modern democracy.


Americans tend to sanctify their Constitution in ways that obscure its real value and purpose. I suppose this sanctification is a way of symbolically embracing a country that lacks as firm an ethnic or cultural anchor as, say, any European nation. When it was ratified, America's Constitution was indeed the opposite of an anchor. In 1787, Francis Hopkinson, the Philadelphia lawyer who signed the Declaration of Independence and designed the American flag, called it a "roof" that united "the strength of 13 rafters."1 In other words, the Framers created an overarching legal structure–a roof–long before a unifying cultural foundation developed on the ground. Similarly, historian Daniel Boorstin wrote that Americans view the Constitution as an "exoskeleton," something akin to a lobster's shell that we filled out over time.2 Through the generations, it was the human interaction created by commerce and migration that forged the cultural ties that bound a heterogeneous people together into a nation.

The sanctification of the Constitution has also involved no small amount of ethnic chauvinism.  The blessings of American democracy, we’ve been taught, are inherited exclusively from England, where the very first written constitution, the Magna Carta, was issued in 1215 and the first parliament was called fifty years later.

By the mid-19th century, many white Americans were so convinced that they were the sole inheritors of the love of liberty and the habits of self-governance that they imagined it to be an almost racial trait that was passed down from one generation to the next. In January of 1845, Democratic Representative Alexander Duncan of Ohio told his colleagues that he thought “there seems to be something in our laws and institutions peculiarly adapted to our Anglo-Saxon-American race, under which they will thrive and prosper, but under which all others wilt and die.”3 In the early 20th century, a prominent Oxford historian proudly proclaimed that “parliamentary institutions” were “incomparably the greatest gift of the English people to the civilization of the world.”4

Now, I realize that this type of sanctification of the Constitution has had its purposes. Racial aspects aside, it fosters the reverence sometimes required to abide by its more absurd and outdated elements. But treating America's legal framework as some sort of mystical tablet also obscures our understanding of its critical role as a social contract and the source of our rights. The other problem, of course, is that the myth is based on a false premise.

In 2009, in his book The Life and Death of Democracy, Australian political theorist John Keane "politely questioned"—in his words—"this English prejudice."5 His research had led him to conclude that in 1188, a generation before the Magna Carta, Alfonso IX, the newly crowned seventeen-year-old monarch of the Kingdom of León, had convened Europe's very first parliament, or cortes in Spanish, within the cloisters of León's Romanesque Basílica de San Isidoro.

Of course, it had not been unusual for Europe’s kings to gather with lords and bishops.  But Alfonso did something entirely new for European royalty, which was to invite representatives from the towns. This was the first recorded gathering of all three estates—nobility, the Church, and burghers.  The most basic definition of a parliament is an assembly involving various social groups of the realm, including representatives of towns.

But why would a king whose power was said to be granted directly by God seek to hold discussions with townsmen? For one thing, he needed money to fight back the encroaching Muslim armies, and plenty of Leonese were unhappy with the imposition of new taxes. For another, Alfonso may have feared that an alliance between angry nobles and townsfolk might form against him. In any case, according to Keane, the king was determined "to defend and expand his kingdom, even if that meant making political compromises that might dilute his kingly powers."6

Operating in the spirit of compromise, the king secured the backing of the three estates in exchange for his promise to "not wage war nor make peace or make any agreement without the counsel of bishops, nobles and good men," which referred to leading citizens of the towns.7 In a series of what may have been up to 15 documents collectively called the Decreta, the king agreed that private property and personal domiciles were inviolable, that justice would be upheld in a routine, predictable manner, that any charges against a person must be backed by evidence, and that detainees had the right to be defended by a third party.

What’s even more significant here, to me at least, is that the citizens of León did not secure new liberties out of some abstract reverence for rights. The social gains they made were the byproduct of a monarch’s inability to raise taxes, maintain peace in the realm, and otherwise rule his kingdom without the cooperation of the three estates.

As John Keane has eloquently put it, it was out of self-interest that Alfonso IX invented "a new mechanism for resolving disputes and striking bargains among interested parties who felt they had a common interest in reaching compromise."8 The king's baseline understanding of his kingdom, then, was not of a society requiring indivisible political community, but of one composed of competing and sometimes conflicting interests. To resolve inevitable conflicts, the Decreta contained an agreement that there would be future assemblies convened by the king involving representatives from the three estates. A regular parliament offered "the possibility of turning disagreements about reality into binding agreements in support of a common good."9

Even more amazing is that the representatives of the towns who attended the cortes in the cloister of San Isidoro had been elected by the citizens of their towns. While it isn't known by what method they were elected or exactly how many were present, historian Joseph F. O'Callaghan concluded that their "numbers in attendance must have been quite large."10

The Reconquista placed similar pressures on other monarchs in the Iberian Peninsula. As they sought to wrest people and territory from Muslim control, Christian kings had to "compete with the more advanced Muslim kingdoms in the south for the favours of the merchants and farmers," and thus were "prepared to respect their property rights and grant them … privileges."11 In 1126, sixty-two years before the first cortes in León, King Alfonso I of Aragon granted a charter of liberties to "Christians whom I brought, with the help of God out of the power of the Saracens and led into the lands of the Christians. . . . Because you left your homes and your estates for the name of Christ and out of love for me and came with me to populate my lands, I grant you good customs throughout your realm."12


It would, of course, be naive to draw too straight a line between the medieval origins of the parliament and the spread of modern written constitutions in the late 18th century. As representative assemblies—and the societies from which they emerged—became more complex, so too did the theories of governance and philosophical worldviews that came to animate western politics.

At the same time, however, as Spanish legal historian Aniceto Masferrer has argued, the enormous differences between the two eras notwithstanding, it is important to make connections between medieval documents like the Decreta and the emergence of liberal governance six centuries later. If nothing else, Alfonso IX's savvy political bargaining shows "how medieval Europe started to be aware of the convenience of limiting political power through law."13


By the time the Framers sat down in 1787 to hammer out the U.S. Constitution, the English House of Commons had long since invented the idea of popular sovereignty as a way to challenge monarchical power. The ideology was developed, as historian Edmund Morgan wrote, to “justify a government in which the authority of kings stood below that of the people.”14  Of course, shifting the locus of sovereignty from the king to the people was not actually designed to put power in the hands of the people, but rather in those of the members of parliament.  

For their part, America's Federalists—who could better be described as nationalists—embraced the idea of popular sovereignty as a way to weaken the power of individual states. Why? Because legislative majorities in most states had passed debt relief laws that the Framers felt threatened the property rights of creditors. (The minority the Framers sought to protect, then, was the wealthy, a class to which most belonged.) The idea—put forth by James Madison—was that the authority granted to the new national government would rest on the power of "the people" at large rather than on the collective authority of the states themselves. In short, he had invented "a sovereign American people to overcome the sovereign states."15

Popular sovereignty, of course, was as much a “fiction” as was the divine right of kings.16 Indeed, the way it was “publicly presented” in America, political scientist Stephen Holmes has written, bore “a striking resemblance to proclamations in which absolute monarchs [once declared] their sovereign will.” What this meant was that rather than being perceived as an exchange of promises “between classes or factions or territorial subunits,” the U.S. Constitution was portrayed as a charter that “‘we the people’ [gave] ourselves.”17 

Paradoxically perhaps, it was this fiction that led the Federalists to argue against the addition of a Bill of Rights to the Constitution. When Antifederalists, those who opposed the ratification of the new Constitution for fear that it would give the national government too much power, first demanded that the document include a list of protected rights, Federalists called the request a quaint throwback to the time when kings granted concessions to their subjects.

If the government derived its power directly from the people, they argued, then what sense would it make to have the people make concessions to themselves? Because America’s constitution could not be considered an agreement between or among parties, “neither concession nor contract was possible because people and government were one and the same.”18

Conversely, since it was “We the People” who conveyed specific powers to the national government, Federalists could argue, as one did, that the Constitution itself was “nothing more than a bill of rights—a declaration of the people in what manner they choose to be governed.”19

Evidently offended by the idea that a convention had been convened to hash out a mere compact, North Carolina's James Iredell, a leading Federalist who would become one of the first justices of the U.S. Supreme Court, proclaimed that America's government was "founded on much nobler principles."20

Fortunately for Americans, the Antifederalists were not swayed by those nobler principles and did not give up on the idea that the Constitution was, as one delegate at a state ratifying convention put it, "a compact, agreement, covenant, bargain," that required the government to put concessions in writing.21 James Madison, of course, ultimately acceded to these demands out of political expediency. While he was himself a believer in rights, he nonetheless saw the addition of constitutional guarantees as a political rather than an ideological act. His reason for drafting the Bill of Rights, historian Pauline Maier has concluded, was "less to secure rights" than to subdue opposition to the Constitution.22 If, as the Framers believed, one of the primary goals of the new Constitution was to protect the property rights of the wealthy minority, then adding amendments to safeguard such popular rights as speech, religion, press, and assembly was a worthwhile compromise.

Still, the Federalists' fiction of popular sovereignty—and a unified American people—lived on in the way Americans think about their country. Each school day, American schoolchildren pledge allegiance to their "indivisible" nation. Even when we know that ugly, divisive presidential elections can be won by mere percentage points, we continue to refer reverently to the voice of "the people." When you ask a waiter what's good on the menu, they're likely to tell you what sells best, as if majority opinion were the voice of good taste and wisdom.

If the medieval "Ständestaat"—a state of estates—was thought to be divisible into three separate groups, contemporary Americans tend to see the primary division in our national political community as being between the few and the many, the majority versus the minority, which they sometimes translate as the strong and the weak.23 So even as we herald the wisdom of the majority, we hail the genius of a Constitution that protects the minority. Indeed, Columbia University political scientist Giovanni Sartori once argued that the only reason to believe in constitutions at all is if "we think that somebody needs protection from somebody else."24

This insight helps explain why Americans tend to think of rights in moral terms, as sacred protections that are heroically demanded and/or benevolently bestowed. This weak/strong dynamic injects no small amount of paternalism into a political process we otherwise think of in terms of bargaining, sausage-making, and horse-trading. Rather than being perceived as a political compromise made to maintain social tranquility, the granting of rights is often portrayed as if it were a morality tale. Which brings me back to Alfonso IX, who clearly saw the granting of rights as a necessary element of a mutually beneficial exchange.

In his 2012 essay, "Constitutions and Constitutionalism," NYU's Stephen Holmes urged Americans to start thinking about rights more through the lens of realism than idealism. Claiming that his observations should be interpreted as instructive rather than cynical, he argued that "democratic constitutions emerge and survive" when society's "most powerful social forces find that they can promote their own interests most effectively by simultaneously promoting the interests of, and sharing political influence with, less powerful but not utterly powerless swaths of the population."25

Why? Because it is only "when the powerful discover the advantages they can reap from making their own behavior predictable" that they "voluntarily submit to constitutional constraints." Put even more bluntly, when non-elites bring incentives to the bargaining table, "elites respond opportunistically by granting legal protections and participatory rights in exchange for cooperation indispensable to elite projects."26

Holmes wasn't the first theorist to make this argument. In 1919, Max Weber, one of the founders of modern sociology, argued that much of modern western democracy was itself a product of national elites' need for disciplined soldiers to fight wars. It was military necessity, then, that compelled them "to secure the cooperation of the non-aristocratic masses and hence put arms, and along with arms political power, into their hands."27 While Weber was not referring to the U.S. Constitution, he was nonetheless recognizing the existence of political bargaining as the essence of constitutionalism in general.

Historian Linda Colley concurs with—and expands on—Weber in her remarkable 2021 book, The Gun, the Ship, and the Pen: Warfare, Constitutions, and the Making of the Modern World. The rash of new constitutions in the 18th century was, in part, a product of the rise in the number of bloody and expensive imperial, transcontinental wars. The new countries that emerged from this warfare "progressively elected to experiment with written constitutions as a means to reorder government, mark out and lay claim to contested boundaries" as well as to "legitimize their systems of government anew." These new constitutions—including that of the United States—helped to "rally wider support and justify expanding fiscal and manpower demands." These documents sometimes "functioned in effect and in part as bargains on paper. Male inhabitants of a state might be offered certain rights, including admission to the franchise, as a quid pro quo for accepting higher taxes and/or military conscription."28

This description is not pretty. It's not mystical. Nor does it pretend that all parties to the negotiation are equal. But it does provide a framework with which we can think about rights in terms of compromise and mutual benefit rather than merely in terms of sacred principles and abstractions. This doesn't mean that the Framers didn't infuse the document with a desire for reform or utopian hopes, just that the harsh realities of geopolitics—particularly threats from Britain to the north, Spain on the Mississippi, and Native Americans throughout the inland frontier—were never far from their minds. The Constitution that was drafted in Philadelphia during the summer of 1787, writes Colley, "was often approached at the time less as a 'blueprint of a liberal democratic society,' . . . than as a grimly necessary plan for a more effective and defendable union."29


So what did I learn in León? I learned that, from the very beginning, constitutions have been practical political documents that formalize the results of bargaining between competing sectors of a given political community; that they are amoral rule books that set the boundaries of future debate, establish the obligations each sector owes to the others, and constrain the actions of members of signatory groups long after the signers of the parchment are dead. Most importantly, I learned that if they are to survive, all parties must continue to persuade one another that they can more effectively get what they want if they agree to support each other. And finally, that, while inspiring, America's universalist language of rights can be deceiving; that our civil rights—liberties derived from membership in a particular polity—are a far more powerful source of freedom than human rights—liberties that all persons should theoretically enjoy.

  1. Paul M. Zall, ed., Comical Spirit of Seventy-Six: The Humor of Francis Hopkinson (San Marino: The Huntington Library, 1976), 191. ↩︎
  2. Daniel J. Boorstin, The Genius of American Politics (Chicago: The University of Chicago Press, 1953), 191. ↩︎
  3. Reginald Horsman, Race and Manifest Destiny: The Origins of American Racial Anglo-Saxonism (Cambridge: Harvard University Press, 1981), 227. ↩︎
  4. A.F. Pollard, The Evolution of Parliament (London: Longmans, Green & Company, 1920), 3. ↩︎
  5. John Keane, “The Future of Parliaments,” (keynote address, EU Global Project to Strengthen the Capacity of Parliaments, León, Spain, June 30, 2023). ↩︎
  6. John Keane, The Shortest History of Democracy: 4,000 Years of Self-Government—A Retelling for Our Times (New York: The Experiment, 2022), 79. ↩︎
  7. María Esther Seijas Villadangos, "Origin of Parliamentarism: An Historical Review of its Crisis: León (Spain) as Cradle of Parliamentarism," Revista Acadêmica da Faculdade de Direito do Recife 88, no. 2 (July/December 2016): 22. ↩︎
  8. John Keane, The Life and Death of Democracy (London: Simon & Schuster, 2009), 176. ↩︎
  9. Keane, "The Future of Parliaments." ↩︎
  10. Joseph F. O’Callaghan, “The Beginnings of the Cortes of León-Castile,” The American Historical Review 74, no. 5 (June 1969): 1514. ↩︎
  11. Jan Luiten van Zanden, Eltjo Buringh, and Maarten Bosker, “The Rise and Decline of European Parliaments, 1188-1789,” The Economic History Review 65, no. 3 (August 2012): 839. ↩︎
  12. Joseph F. O'Callaghan, History of Medieval Spain (Ithaca, NY: Cornell University Press, 1975), 285. ↩︎
  13. Aniceto Masferrer, “The Spanish Origins of Limiting Royal Power in the Medieval Western World: The Córtes of León and Their Decreta (1188),” in Golden Bulls and Chartas: European Medieval Documents of Liberties, ed. Elemér Balogh (Budapest: Central European Academic Publishing, 2023), 31. ↩︎
  14. Edmund S. Morgan, Inventing the People: The Rise of Popular Sovereignty in England and America (New York: W.W. Norton, 1989), 56. ↩︎
  15. Morgan, Inventing the People, 267. ↩︎
  16. Morgan, Inventing the People, 13. ↩︎
  17. Stephen Holmes, "Precommitment and the Paradox of Democracy," in Passions and Constraint: On the Theory of Liberal Democracy (Chicago: The University of Chicago Press, 1995), 146. ↩︎
  18. Morgan, Inventing the People, 283. ↩︎
  19. Morgan, Inventing the People, 283. ↩︎
  20. Gordon S. Wood, The Creation of the American Republic, 1776-1787 (Chapel Hill: The University of North Carolina Press, 1969), 541-542. ↩︎
  21. Wood, The Creation of the American Republic, 541. ↩︎
  22. Pauline Maier, Ratification: The People Debate the Constitution, 1787-1788 (New York: Simon & Schuster, 2010), 446. ↩︎
  23. Daniel Chirot, “The Rise of the West,” American Sociological Review 50, no. 2 (April 1985): 185. ↩︎
  24. Giovanni Sartori, “Constitutionalism: A Preliminary Discussion,” The American Political Science Review 56, no. 4 (December 1962): 855. ↩︎
  25. Stephen Holmes, “Constitutions and Constitutionalism,” in The Oxford Handbook of Comparative Constitutional Law, ed. Michel Rosenfeld and András Sajó (Oxford, UK: Oxford University Press, 2012), 215. ↩︎
  26. Holmes, “Constitutions and Constitutionalism,” 214-215. ↩︎
  27. Max Weber, General Economic History, trans. Frank H. Knight (Glencoe, IL: The Free Press, 1950), 325. ↩︎
  28. Linda Colley, The Gun, the Ship, and the Pen: Warfare, Constitutions, and the Making of the Modern World (New York: Liveright Publishing Corporation, 2021), 7. ↩︎
  29. Colley, The Gun, the Ship, and the Pen, 121. ↩︎

The Marriage of Wokeness and Cash

Why America Has Seen an Explosion in Public Recrimination

(Image by Jørgen Carling)

At its best, Wokeness is an awareness of how race and gender bias can produce societal inequalities. At its worst, it’s a racket in which upper-middle-class college graduates wield victimhood status in a bid for financial gain or career advancement. 

When the Civil Rights Act was passed in 1964, victims of workplace discrimination had recourse to only a few remedies, such as back pay, restoration of a promotion or benefits, or job reinstatement. A few years later, newly minted federal Affirmative Action programs held that members of minority groups with a history of facing discrimination could now benefit from preferential hiring schemes. In the first instance, an employee had to prove discrimination had occurred; in the second, a minority job applicant was assumed to need protection from discrimination in any hiring process.

While remedies for workplace discrimination were not controversial because they focused on making the victim whole, preferential hiring—as well as preferential college admissions—was problematic because it meant that a non-minority's application could be thrown out to make room for a minority applicant's. Not surprisingly, the issue reached the Supreme Court, and in 1978 the justices handed down a convoluted split decision that upheld Affirmative Action, made racial quotas illegal, said race could nonetheless be considered in college admissions, and failed to determine how applicants would have to prove past discrimination in order to receive protected or preferential status.

A little more than a decade later, Congress upped the ante by passing the Civil Rights Act of 1991, which for the first time explicitly allowed employees to seek compensatory and punitive damages in both racial and gender discrimination cases. In essence, this updating of the Civil Rights Act transformed the anti-discrimination statute from one whose remedy was equitable relief—restoring the complainant's employment situation to what it had been before the offending act occurred—to something more like a tort—a wrongful or negligent act—which allows for trials in which juries could order parties they found liable to cough up damages.

In short, the CRA of 1991 fundamentally changed the way discrimination was to be remedied. Before 1991, anti-discrimination statutes were based on a traditional labor model that sought conciliation in employer-employee relations. After 1991, employment disputes would more and more be resolved through litigation—or the threat of it—involving compensation for the victim and punishment for the offender. Whereas in the 1960s civil rights advocates had wanted anti-discrimination enforcement to revolve around a New Deal-style government authority, over time they came to embrace the benefits of private enforcement, in which every complainant was a potential plaintiff.

Now that large sums of money were potentially involved, most observers anticipated an increase in discrimination litigation. And, of course, that came to pass. Indeed, the new law incentivized the filing of complaints and changed how Americans approached their interactions with one another in the workplace and beyond. In the same way that the explosion in personal injury lawsuits turned every fall in a shopping center into a potential lawsuit, so too did making discrimination a tort encourage Americans to see every slight and professional setback as a potential source of compensation.

Just as importantly, the new law made discrimination cases, which had once been considered too much work for too little payoff, more attractive to attorneys. To sweeten the deal, the law now authorized plaintiffs to recover attorney's fees if they won their cases. One could say that if Martin Luther King, Jr., was among the inspirations for the passage of the Civil Rights Act of 1964, then the flamboyant, scandal-plagued San Francisco attorney Melvin Belli, the so-called "King of Torts," was the spiritual father of the Civil Rights Act of 1991. More than anyone in America, Belli was responsible for the successful post-war push by trial lawyers to increase damage awards in personal injury cases. Thus, the Civil Rights Act of 1991 married the sacrifices of the 1960s struggle for social justice with the ethos of an era best symbolized by a scene in the 1987 movie Wall Street in which Michael Douglas—playing the fictional corporate raider Gordon Gekko—famously proclaimed that "Greed is good."

The term "woke," which became a watchword among African American activists as early as the first half of the 20th century, was catapulted into the mainstream in 2014 in the wake of the shooting of Michael Brown in Ferguson, Missouri. While initially associated with the Black Lives Matter movement, it was quickly adopted by white Progressives committed to fighting inequality. The term soon came to encompass a constellation of ideas, slogans, and programs that sought to liberate a growing number of marginalized groups from a structurally biased system.

There was nothing particularly revolutionary about these activists’ stated concerns.  What was distinctive was both the tone of their rhetoric and their favored solutions for addressing disparities. Whereas the civil rights movement of the early 1960s inspired some of the nation’s greatest legislative successes through a spirit of forgiveness and reconciliation, the rhetoric of Wokeness was full of resentment, vengeance, and demands for immediate reparation.

While the definition of discrimination had been expanding steadily since 1964, suddenly Americans were being told that there were a whole lot more ways to offend and oppress their fellow Americans than they had ever imagined. Even as most Americans thought that the lives of minorities and women had improved significantly since the 1960s, the bar for what constitutes—and the standard of evidence to prove—discrimination was being lowered. University administrators and company human resources experts now considered American society so fundamentally prejudiced and hostile that they introduced new ways of protecting people—such as trigger warnings and safe spaces—from even the slightest faux pas, which were now inelegantly called microaggressions.

Meantime, those who committed any of a growing list of social infractions risked being "called out," "cancelled," fired, sued, and otherwise painted with a scarlet letter. To a sober observer, the sheer number and frequency of these autos-da-fé seemed more than a little exaggerated. There's no denying that discrimination occurs in America, but why all of a sudden was there an explosion of accusations and public recriminations, particularly in universities and among the educated upper middle classes, people who are among the most privileged humans on the planet? And why did so many people who were not directly involved in the incidents join in the angry choruses while others remained silent in the face of what were objectively disturbing spectacles?

Greg Lukianoff and Jonathan Haidt, the authors of The Coddling of the American Mind, have called the driving impulse of this era of discontent "vindictive protectiveness," the practice of publicly shaming and punishing those accused of having said or done anything to harm a member of a protected group. Because even casual defenders of the accused are not immune to these mob attacks, those uncomfortable with the idea of public stoning tend to keep their objections to themselves. This behavior, Lukianoff and Haidt argue, created "a culture in which everyone must think twice before speaking up, lest they face charges of insensitivity, aggression, or worse."

Vindictive protectiveness, Lukianoff and Haidt argue, arose from a style of fearful and overprotective parenting that educated middle- and upper-class parents began to practice in the 1980s and '90s in an effort to give their children a competitive edge in life. Wanting only the best, these parents sought to cultivate their children's talents while erasing all potential sources of risk and adversity in their environments. Overscheduled, oversupervised, and left with precious little time to play and explore on their own, these children eventually arrived on university campuses believing that the world was dangerous, that bad people should be removed from their presence, and that the institutions around them should protect them the way their parents had.

The universities, of course, obliged. As higher education has become big business, students have been transformed into customers, and we all know that the customer is always right. Furthermore, Lukianoff and Haidt suggest, the rise of vindictive protectiveness may also "be related to recent changes in the interpretation of federal antidiscrimination statutes." Not only do university administrators seek to avoid lawsuits from students, they also want to avert any investigations by the Department of Justice into their civil rights compliance. Ironically, that's why they developed "bias incident reporting" systems that allow students to report anonymously on anyone they feel has caused them or anyone else to experience any type of bias. At Cornell University, for example, a bias incident is something done or said "that one could reasonably and prudently conclude is motivated, in whole or in part, by the alleged offender's bias against an actual or perceived aspect of diversity, including, but not limited to, age, ancestry or ethnicity, color, creed, disability, gender, gender identity or expression, height, immigration or citizenship status, marital status, national origin, race, religion, religious practice, sexual orientation, socioeconomic status, or weight."

In short, the desire to protect students, combined with the need to remain in compliance with civil rights statutes and regulations, turned campuses into places where students are encouraged to report on professors, staff, subcontractors, and, of course, one another. In response to pushback from professors against the anonymous reporting system at Stanford, a university spokesperson insisted that the "process aims to promote a climate of respect." Still, there is a growing realization that such reporting systems can both be easily abused and limit free speech. Nonetheless, as students graduate from college, they take the expectations these systems foster with them into the labor force.

The commentariat is telling us that Trump's victory spells the end of Woke. But too many election post-mortems treat Wokeness as a purely political phenomenon. Yes, it employs left-wing ideologies that treat identity and victimization as sources of resistance and power. But Wokeness would never have become so pernicious outside the university had there not been the temptation (and fear) of financial gain (and loss). In addition to money, cancellation also holds out the promise of professional advancement. Campaigning to remove one's colleague or boss—and even helping others cancel theirs—is a way for young people to clear the field of competition and move up the ranks.

Ask any cynical political hack how to decode the power and drive of any politicized trend and they're likely to tell you to follow the money. Wokeness is no different. Once you find out who has profited from this phenomenon, you'll come face to face with the absurd fact that one of the most diverse countries in the world has chosen to encourage private litigation and the threat of financial damages to curb discrimination and promote a more just and cooperative society. What could possibly go wrong?

Gregory Rodriguez is the author of Whiteness: An American Tragedy and Other Essays and Mongrels, Bastards, Orphans, and Vagabonds: Mexican Immigration and the Future of Race in America. He is working on a book on the rise and fall of rights-based liberalism.

The Minority Voters Who Exposed the Politics of Benevolence

(Photo by Vetustense Photorogue)

It took only a little more than a week for Democrats to go from crying foul over a pro-Trump comedian's insulting remarks about Puerto Rico to hurling insults at the minority voters who had shifted to the right on Election Day.

Other than old-fashioned revenge, what much of the invective had in common was a desire to see how poorly those ungrateful minorities would fare without the protection of newly jilted liberals.

“I guess those Latinos [who voted Republican] will enjoy watching the Trump Presidency from wherever it is he deports them to,” wrote one Air Force veteran who calls herself a Democrat.  “Fu** Latinos and Arabs,” wrote another man who publicly identified as LGBTQ and pro-Black Lives Matter.  “There I said it. Hope you all get deported and banned.” Even a top writer for “Mother Jones,” a leading liberal magazine, joined the fray, writing, “Perhaps massive deportations will affect how they see Trump.”

Never mind the fact that voters are, by definition, U.S. citizens and therefore not subject to deportation. What matters is the source of this bitterness and what it tells us about the state of contemporary American liberalism. 

If asked, a political scientist would tell you that politics in a democracy is simply the struggle among competing interests. And they’d be right. Group A is likely to have different interests than Group B and, since neither group is powerful enough to grab hold of the levers of government by itself, each will weigh whether joining forces with Group C or D or E would likely get them whatever they’re after.  The groups don’t all have the same priorities. Indeed, some of their issues may be in direct conflict. But the hope is that all groups will get at least some of what they want from the party they support.

The currencies any given group can bring to a political coalition are many. Some groups bring a lot of money–cash from large donors helps to get the word out; another group may have privileged access to media sources; another may have a network of political organizations that can help push people to the polls, and some groups have lots of actual voters who can punch cards or tap touch screens at campaign’s end.  Put the resources of the coalition’s groups together and maybe they’ll carry each other over the finish line.

But particularly since the 1960s, when a combination of the civil rights movement and the New Left injected a stronger current of morality into modern politics, another, more subtle cultural currency came into play in political alliances: the politics of benevolence. While politics in America has always involved some level of pretense that high-minded principles trumped material interests, of the two major parties the Democrats became more convinced that politics was, first and foremost, the collective expression of virtue.

Talk to a Democrat from the Westside of Los Angeles and they’re more likely to mention their party’s generous social policies aimed at the less fortunate than its approaches to the economic sector from which they earn a living.  They may even explicitly claim that they vote against their own interests. But that’s not true.

Let's say our hypothetical benevolent voter belongs to Group A; maybe he or she works or invests in the tech sector. Group A's political alliance with the less fortunate members of Group H or J can bring their chosen candidate more voters, because, let's face it, Group A is a smaller demographic group than Groups H and J.

But such an alliance can also supply members of Group A with a sense of moral satisfaction. Because multi-group coalitions are inevitably hierarchical, members of the more privileged group can feel that they've protected the interests of groups who reside much lower than them on the totem pole. And in return, whether they admit it or not, members of Group A often expect gratitude and party loyalty from members of Groups H and J.

But ironically, when people believe their political power is derived from their benevolence, it behooves them to maintain the social hierarchy from which their sense of righteousness arose.  When sufficient numbers of Groups H and J abandon their alliance with Group A and join a new coalition, they not only undermine the privileged group’s belief in its own benevolence, they threaten its political power. In other words, if enough members of Group H and J abandon the coalition, Group A doesn’t get its own interests met. 

Thus, the vitriol being directed at apostate minority voters is a rearguard effort to reassert hierarchy in a faltering coalition, to bully Groups H and J back into their place at the bottom of the totem pole.  In addition to the revenge posts on X, more than a few analysts have suggested–without any evidence–that minorities who voted for Trump did so purely out of animus towards women or African Americans. All that proves is that while terms like racist or misogynist may have come into common use as part of a good faith effort to reform social attitudes, today–with their meanings now stretched beyond recognition–they are just as likely to be used to impose social control over recalcitrant groups and individuals. 

One thing is for sure, however: the nasty aftermath of this ugly election has already proven that there were always interests lurking beneath the Democrats' politics of benevolence.

Hillbilly Elegy as Tragedy

Gnadenhutten Park & Museum, Gnadenhutten, Ohio. (Photo by Gregory Rodriguez)

In 2016, before he was elected to the United States Senate, J.D. Vance, now the Republican vice-presidential nominee, published a memoir exploring what he thought was wrong with working-class white culture. On the one hand, Hillbilly Elegy is a tale of economic instability, addiction, and cultural decline. On the other, it's a classic story of American upward mobility wherein Vance shakes off his people's pathologies in order to climb the social ladder. It's a story that's been told countless times by Americans who've climbed their way out of ghettos and barrios. Some authors blame the system. Others focus more on the self-destructive behaviors of those caught at the bottom. Vance tended toward the latter, so much so that The New York Times called the book "tough love" while other critics accused him of "blaming the victim." Meantime, more than one left-wing reviewer resented the fact that Vance painted a segment of the white population as a pathological minority. They evidently thought that he had crept onto their turf.

I read Hillbilly Elegy as part of my research into white people. While his memoir didn't particularly impress me, I appreciated Vance's choice to look at his Appalachian roots through an ethnic, rather than a racial, lens. His book put a human face on Trump's politics of white grievance. The primary difference was that Vance emphasized the need for cultural renewal, while Trump is always hammering away at who is to blame.

For a generation now, newspapers have been beating the drums of demographic change as if they were looking forward to the day when whites become a minority. And here we are, if not numerically, then culturally. Frankly, I was already tired of the politics of grievance, but now we have a new minority eager to play. But, alas, grievance is the language America speaks, and whoever thinks violence is foreign to any aspect of life in America has never actually been to America.

Three years ago this week, I was on a research trip in Vance's home state of Ohio when I stumbled on a sign marking the birthplace of the first white child in the state. The sign was all the more unsettling given that I was in Gnadenhutten, where, in 1782, 160 Pennsylvania militiamen massacred 96 pacifist Christian Indians.

Vance called his bestseller an elegy. I called my book of essays on white people a tragedy. It's time to take white anger seriously, or it will burn us all up.

You can find Whiteness: An American Tragedy on Amazon. 

When Satire is Too on the Nose

I finally saw American Fiction tonight. It is a mildly amusing, and all too accurate, satire about a black fiction writer who, in a fit of pique, submits to an intellectual marketplace that puts a premium on minstrelsy by writing a novel that trades in racial stereotypes. The movie ridicules the liberal white reading public for thinking “they want the truth” but really want to “feel absolved.” So trapped in their virtuous imaginations, so thoroughly cleansed of sin, these readers look to minorities to provide them with “authentic” and “raw” stories to make them feel alive, or at least a little unsafe. And so the marketplace demands stories that draw on the ancient trope of the savage, both noble and not-so-noble, victim and victimizer.  Plus, these days, it’s just so important to “listen to black voices” and “center diversity,” even if the desired product is mere “trauma porn.”

The movie mocks the publishing world for indulging in racial fads—or "reckonings," as the journalists call them—in an effort to remain current and to not get canceled. For reasons I suspect are calculated, the movie holds its fire when it comes to minority writers who play this demeaning game for accolades and compensation. I mean, they, too, have to get paid.

The bigger message of American Fiction, however, may be that fighting against white expectations can sometimes twist your own identity into knots.  Unfortunately, there’s no resolution in the end, which was disappointing.  Maybe the director saw the problem so clearly that he just figured it was insurmountable.  Or maybe, like his lead character, he was trying too hard not to give us the ending he feared his white audience wanted.  Still, the movie is a worthwhile critique of the freshly inclusive creative marketplace that needs members of every group to play their assigned roles now more than ever. 

Gentlemen, Include Me Out

(Photo by Quinn Dombrowski)

July 4 commemorates the victory of the colonial periphery over the imperial center, a victory won not in an effort to reform or reinvent the empire but to leave it and start a new one.

In these contentious times, when Americans are at loggerheads over control of government and culture, it might be helpful to remember that sometimes—not always— independence is the answer.

Independence means freeing yourself from the clutches of decadent institutions. It means getting a chance to put your heads together to establish new ones, and mustering all your courage and creativity to forge new destinies on your own terms. Oh, and in the spirit of freedom, it also means letting others do the same.

On this Fourth of July, I wish for all my compatriots a renewed belief in independence, in breaking away, in starting anew. 

Today, I will lift a hot dog in all our honor and recall the quintessentially American wisdom of Samuel Goldwyn when he said, “Gentlemen, include me out.” 

Happy Fourth. 

Where Liberals Come From

(Photo by José Antonio Cartelle)

Cádiz, Spain

It might surprise you to learn that the first time the term “liberal” was ever used to describe a political group or agenda was not in France, England, or even the United States.  It might surprise you even more to learn that it was first used here in Andalucía, Spain, in 1810, in the ancient port city of Cádiz. 

Last Friday, I caught an early train to Cádiz, and not simply to escape the heat in Madrid. Sure, the coastal breeze has its charms, but what I was really after was a glimpse of the church that hosted the cortes—the representative assembly—that drafted and approved the Constitution of 1812, the most liberal governing document of its time.

In 1807, the Spanish Crown allowed Napoleon’s troops to pass through Spain on their way to invade Portugal. But that double-crossing Napoleon wound up occupying most of the peninsula, setting up his older brother Joseph as the king of Spain, and placing Spanish King Fernando VII under house arrest in a chateau in the Loire Valley. Despite laying siege to Cádiz, however, he could not bring this dynamic, international city built on imperial trade to its knees.

In the absence of a legitimate monarch, this is where nationalist leaders ultimately hunkered down to form a resistance government during what became an all-out war to push out the invading French.  In what must have been a moment of inspiration, they chose to resurrect the cortes, the medieval precursor to the modern democratic parliament that had not been used for centuries, to create a written constitution to govern the Spanish Empire in a dire situation. 

It may or may not come as another surprise to learn that the modern parliamentary assembly was invented in León, in northern Spain, in the late 12th century. During the Middle Ages, various Spanish kingdoms convened similar assemblies. According to Australian political theorist John Keane, the "modern practice of parliamentary representation" was "born of despondency" during the struggle between Christians and Muslims over the Iberian Peninsula.

King Alfonso IX of León knew he couldn’t continue to impose taxes to pay for battles to push back Muslim armies without making compromises to his realm’s most powerful estates that would inevitably dilute his powers. So, in 1188 he assembled a parliament of representatives made up of nobles, bishops, and wealthy citizens. This assembly in León was “of profound importance,” writes Keane, because visitors to the court–the origin of the term cortes–were no longer expected to simply vow allegiance to their sovereign’s will.  They could now demand that their interests be taken into consideration if the monarch wanted political and financial support for his policies. 

Given the state of the Spanish Crown during the War of Independence, the government council knew that they had to root the legitimacy of the monarchy—they continued to support King Fernando VII in exile—in the people of Spain rather than in God. They were also responding to the incipient independence movements in Latin America. Thus, for the first time in Spanish history, deputies were elected from across the empire—the Iberian Peninsula, Latin America, and the Philippines—to make decisions on behalf of the monarchy and for its future.

Liberales was the name given to the group of political and economic reformers who made up a narrow majority of the Cortes de Cádiz. What did they believe in? Mexican political theorist Roberto Breña argues that "the first Spanish liberalism was a mixture of traditional and revolutionary elements." It placed individual liberty at the center of Spain's political design for the first time in its history. The liberals' handiwork can be found in the most enlightened articles of the document, including one that protects individual rights, another that insists that the purpose of government is to care for the wellbeing of the individuals who make up the nation, and a third that guarantees the right to free expression. The 1812 Constitution also called for the division of powers, freedom of the press, the privacy of the home, universal manhood suffrage, and significant restrictions on the power of the king.

As fate would have it, Fernando VII returned to Spain in 1814, whereupon he rejected the constitutional monarchy established by the Cortes de Cádiz and restored the absolute monarchy he had left behind in 1808. But the Constitution of Cádiz lived on. In 1854, no less a figure than Karl Marx observed that “far from being a servile copy of the French Constitution of 1791, it was a genuine and original offspring of Spanish intellectual life.” And that hints at why the story of the liberals in Cádiz is so important. While they did not carry the day politically in their own time, their document lived on as an extraordinary symbol to reformers in Spain, its former colonies, and beyond for generations to come.

Liberals–and liberalism–have come a long way and taken on many forms since the term was first used in 1810. For instance, the liberals in Cádiz were proud Catholics who supported a constitutional monarchy, while other strains of liberalism have been decidedly republican and anti-religious. Likewise, contemporary American liberalism, which prioritizes individual rights and fairness, is very different from the liberalism of the New Deal era, which focused on reining in the excesses of capitalism.

The story of the Constitution of Cádiz reminds us that liberalism started as a movement both to include people in government and to liberate them from it. Today’s resurgence of populism is a byproduct of the imbalance between the desire to empower people and the desire to free them–between democracy and liberalism.

Over the past few generations, liberalism has forgotten the importance of listening to people. Contemporary liberals have become not only much too dogmatic but also far too comfortable using governmental power to achieve their goals, whether the public wants them to or not. That is the very definition of undemocratic. We’ve even seen the recent emergence of a punitive lock ’em up–or cancellation–liberalism, which is arguably not very liberal at all.

Of all the books and essays I’ve read on the subject recently, perhaps none has done a better job reminding me of liberalism’s potential for renewal than one written for The American Scholar in 1955 by the late U.S. Vice President and Minnesota Senator Hubert H. Humphrey.

Liberalism, he wrote, “lacks the finality of a creed, and thus it is without the allure of those dogmas which attract the minds of men by purporting to embody final truth.” If that weren’t reassuring enough, Humphrey insisted that even as liberalism must “preserve the spirit and fact of dissent in the political community,” it must also “recognize its ultimate loyalty to a majority-rule society and to the protection of all the factors which make such a society possible.”

While liberalism and democracy are always in tension, we sometimes forget that the former should always be in the service of the latter. If today’s populist surge is ever going to be defeated, liberals will have to recapture the founding spirit of liberalism.

The Democrats’ Deplorable “Deplorables” Strategy

For four years, the Democrats’ strategy was to have Trump disqualified, first as an “insurrectionist” and then as a “convicted felon,” while Biden played what has become standard liberal interest-group politics. (A little something for this group. A little something for that.) At the same time, the Dems’ allies in the media played their roles by routinely belittling Trump’s plainly idiotic and unpatriotic supporters. It was a multi-pronged “basket of deplorables” strategy that reeked of elitism, legal gamesmanship, and finger-wagging.

In other words, Biden had four years to connect with middle America–maybe go to a Waffle House before the debate!–but his team was more obsessed with disqualifying their rival and punching down at his supporters. Why does that matter? Because Thursday night’s face-off was only meaningful because of how tight this depressing rematch had become.

That said, there were no surprises last night in Atlanta. Trump was a lying fool and Biden a bumbling one. It was a train wreck four years in the making. But all the slow motion in the world apparently couldn’t awaken the Democrats to the simple fact that “saving democracy”–in their breathless words–might require more from their party than cancellation and condescension.

My Mother’s Independent, American Journey

Emilie Cacho, Mission San Antonio de Pala, April 14, 2012. (Photo by Gregory Rodriguez)

For reasons I will never know, my maternal grandparents chose to live among the Pala Indians in northern San Diego County when they first arrived in the United States in 1928. The liberal part of me likes to think it was because they had some affinity for indigenous people. Born in San Francisco Peribán, Michoacán, a ranchería–now a village that two years ago voted to govern itself as an autonomous indigenous community–my grandmother spoke some Purépecha. But the realist in me says their choice was more a reflection of their status as very poor, recently arrived pickers from Mexico. In any case, Pala, California, is where my mother, Emilie Cacho, was born 90 years ago this week.

In 2012, a decade before her death, my mom asked me to take her to the place of her birth. If you knew my mom, a thoroughly modern woman not given to looking back with anything approximating nostalgia, you’d realize how unusual this request was. In short, Mom was ambitious, intelligent, quick-witted, and prone to fight with anyone who got in her way. She also had the best taste in the world. Her life story was congruent with her worldview. She believed that a person’s past was not her destiny, which means, yes, she was a licensed therapist.

Mom grew up in a Victorian farmhouse on the other side of San Diego County, in Otay, a stone’s throw from the U.S.-Mexico border. Although she had to help her parents on the family farm–my grandparents soon purchased their own land–Mom got herself elected class treasurer in high school, where her grades were good enough to get her into UCLA. In the mid-1950s, she’d often say, Westwood was light years from Chula Vista. Phone calls back then were still long distance and therefore too expensive. In her four years of college, her parents didn’t visit her once. The upshot, however, was that Mom had almost single-handedly launched herself into America’s educated upper middle class. So, again, when Mom asked me to pick her up from her elegantly decorated house in the foothills of the Verdugo Mountains and drive her down to Pala, there was no saying no.

I had no idea how large Pala loomed in Mom’s imagination until we took our road trip there. The 1816 chapel where she was baptized, San Antonio de Pala, is the only church in the California Mission system that still serves a Mission Indian tribe. She had told me that she was born in a shack–seconds before her twin sister–on an avocado plantation less than a mile away. But it was only when we drove to the property and a white Range Rover exited the plantation’s automatic gates that I fully realized she had been born on the plantation owner’s private property. My grandparents worked and lived on their boss’s land. “Mom,” I blurted out, “were you born into slavery?” “Mijo, that’s not funny!” she said, not at all joking.

From family lore, she learned that when it was time for my grandmother to give birth, my grandfather sent word to the Anglo doctor near the Mission, a notorious drunk. He apparently arrived at the plantation in his usual state, and after helping deliver twins for a woman who already had two children, he offered to take one of the newborn girls with him. “Apparently, he thought my mom and dad were too poor to take care of all of us,” she said. “But my mother told him that there was no way she was going to let him take either one of us.”

On our quiet drive back to Los Angeles, I started to realize how much the story of Mom’s infancy in Pala had influenced her life. It was shocking, not just for its depiction of the helplessness of life in deep poverty, but also for its indignant refusal to be at the mercy of a pathetic white authority figure who still had the gall to pass himself off as benevolent. It helps explain why Mom never trusted many people, and why she never took for granted the beautiful things she surrounded herself with all her life. It was those things–her lovely home, the exquisite art on the walls–that kept her at a safe distance from the terrifying insecurity her parents had experienced in their first years in the United States. The house where Mom lived by herself for three decades was a symbol of her independence, of her refusal to live either according to someone else’s rules or off the supposed kindness of others. By the time we got back to her home, I realized that Mom had wanted to visit Pala to remind herself of how astonishingly far she had traveled in her life.
