Does Diversity Require the Policing of Speech? Reflections on the State of America’s Integrationist Nationalism

“Mistrust” (Photo by Christopher Cotrell)

Surabaya, East Java, Indonesia

So pervasive is the integrationist logic of post-civil rights America that it’s sometimes instructive–if not a little jarring–to talk to minorities abroad who could not imagine abiding by its rules.

To that end, I conducted an interview with a thirty-something ethnic Chinese professional to get her perspective on interethnic relations in this thriving multicultural East Javanese port city of 3 million inhabitants. Surabaya is the second largest city in Indonesia, which, with almost 300 ethnicities, is one of the most diverse countries in the world. While the level of ethnic mixing varies across the country, the overall intermarriage rate is low. A 2020 study found that almost 90% of Indonesians marry within their ethnic group. The ethnic Chinese are thought to be the least likely to outmarry.1

Given the sensitivity of the topic and the recent political turmoil, my interviewee asked that I not publish her name. We talked over a poolside table at a luxury hotel, where I had thought she could speak honestly about her experiences as an ethnic Chinese Indonesian.

Although there were few people nearby, my interlocutor still glanced around to make sure that no ethnic Javanese guests or hotel workers could overhear her. She'd occasionally break into a whisper. Once, when asked what her parents had taught her to think of the Javanese, she refused to cite specifics. I then nudged her to tell me just two things they had said. She obliged. Just two. I can only assume that she thought that whatever else her parents had said would either have reflected poorly on them or have been just too insulting to the Javanese.

Ethnic Chinese in Indonesia are widely known to enjoy a higher economic status than other ethnicities in Indonesia. They are what author Amy Chua has called “market-dominant minorities,” ethnic groups “who for widely varying reasons, tend under market conditions to dominate economically, often to a startling extent, the ‘indigenous’ majorities around them.”2

The resentment this economic imbalance inevitably creates has periodically erupted in anti-Chinese scapegoating and even violence. In 1998, during the Asian financial crisis, some indigenous Indonesians blamed the ethnic Chinese minority for the nation’s economic plight. This sparked two days of large-scale rioting in Jakarta in which Chinese-owned businesses were looted and burned and dozens of Chinese Indonesian women were raped.

That level of violence pushed some wealthy Chinese Indonesians to emigrate to Singapore, but it also ultimately led to the collapse of President Suharto’s authoritarian regime, a watershed moment that has since led to growing democratization and acceptance of ethnic pluralism in Indonesia.

***

Under Suharto's 32-year rule, the government's cultural policy was stridently assimilationist. Although "Unity in Diversity" was the national motto, government policy promoted cultural homogenization. In August of 1967, Suharto called for the complete assimilation of Indonesians of foreign heritage. Ethnic Chinese were pressured to assume Indonesian names and abandon Chinese customs. Chinese-language newspapers were banned, with the exception of one published by the government. While private Chinese groups could still establish schools, Chinese-language instruction was prohibited. Enforcement of assimilation laws was uneven and sometimes nonexistent, but the message was clear. Under Suharto, Chinese Indonesians developed multiple strategies to obscure their distinctiveness while preserving their networks.

All that changed when the regime collapsed in 1998. Decentralization and democratization allowed both ethnic and regional identities to reemerge. After more than a generation of restrictions, “Chinese Indonesians were now allowed to publicly display their religions, beliefs and customs and to start civil rights groups to reassess their position in society.”3 Of course, the new openness also enabled greater public expression of simmering ethnic tensions.

If my interlocutor's story suggests anything beyond the anecdotal, it's that the ethnic Chinese residents of Surabaya are intent on remaining separate, even if not necessarily on remaining very traditionally Chinese. I recount our conversation mostly because I think it sheds light on the very different ways Americans perceive and manage ethnic difference.

***

In both the governmental and the cultural spheres, diversity is generally celebrated as a social good in the United States. At the same time, anti-discrimination laws, combined with strong contemporary social dictates on what is acceptable to say or discuss, make many Americans reluctant to speak openly about ethnic and racial differences. So, ironically, what is presented in the abstract as an overall social good is also seen as a minefield best avoided.

The civil rights era push for racial integration also had the ancillary effect of encouraging ethnic assimilation. The 1954 Brown v. Board decision, which, as historian Anders Walker has written, "rested on the unsupportable assumption that black history, black traditions, and black institutions were inferior and should be destroyed, erased in favor of assimilating blacks into mainstream white America," had a profound effect on how Americans viewed the integration of all non-white groups.4 While diversity in America–as in all multiethnic nations–was always challenging, a desire to remain separate from the Anglo-centric mainstream was now viewed by both conservatives and liberals as contrary to the judicially-sanctioned national drive toward a harmonious future.

Ethnicity, which was once understood to be a more or less inevitable, if unfortunate, part of being an immigrant nation, came to be seen as a vestigial remnant of a primordial past, one characterized by ancient grudges, irrational loyalties, and primitive religious rituals and beliefs. The new integrationist nationalism now saw the embrace of ethnic identities as old-fashioned tribalism, which ran counter to the modern goal of national unity.

By the early 1970s, however, evolving anti-discrimination law and the advent of affirmative action–or positive discrimination as the British call it–incentivized the claiming of minority identities. Ethnic pressure groups lobbied the government to be granted protected status on the grounds that they had collectively suffered historic discrimination. Gaining protected status ensured that their co-ethnics benefitted from anti-discrimination laws and strategic advantages in college admissions and job applications.

This new regime did not replace integrationist nationalism. It lived uncomfortably alongside it. While integrationism pushed assimilation, the civil rights regime encouraged continued ethnic identification. These conflicting regimes forged a new dynamic in which, in order to implement the new civil rights policies, government bureaucracies began to sort members of specific national origin groups into larger, administratively convenient aggregate categories. Just as early modern governments in Europe once imposed standardized weights and measures on distant villages that had long since developed their own varied techniques, the U.S. government imposed new groupings in order to better keep track of the nation's population.

What this meant was that growing numbers of immigrants from Asia and Latin America were funneled into a new system in which, upon arrival, they were categorized not as, say, Chinese or Korean, but as "Asian," and then assessed as to whether they could be designated as "protected minorities."

The conflict between the ideologies of integrationism and minority protection forged a new type of assimilation, one that encouraged the children and grandchildren of immigrants to abandon the premodern elements of their heritage while remaining identified with one of the government-created categories. A Mexican Catholic, for instance, would transform into a "Hispanic," with all the warmth, history, texture, and ancient religious customs that this cold bureaucratic term implies. The post-civil rights American ethnic, in other words, was encouraged to drop the specific texture of their heritage while remaining vaguely distinct from the majority. They were expected to strip themselves of ancient roots while integrating into a deracinated, aggregate "ethnic" category. In short, foreign-born parents could remain Korean while their children were transformed into "Asian Americans."

The one exception to this rule was reserved for indigenous peoples, largely because progressive whites saw their primordial identities as a powerful symbolic challenge to the prevailing norms of "western civilization." Whereas Catholicism, for example, was viewed as part and parcel of an oppressive western order, ancient indigenous beliefs were seen as fundamentally innocent, a quality many late 20th century American whites began to crave for themselves.

In any case, by the late 20th century, genuine cultural pluralism–the acceptance, however grudging or even racist, that all groups did not act, believe, or see the world in the same way–had given way to a conflicted ideology of national unity/homogeneity wrapped in faux, officially-sanctioned diversity. At the same time, Americans received constant warnings to watch what they said and did around protected minorities–or at least those with powerful allies–because any misstep might land them in a heap of trouble.

***

My interlocutor, I'll call her Angela, is a third-generation Chinese Indonesian. Her grandparents were born in Fujian Province in southeastern China and mostly spoke Hokkien. The street she grew up on was entirely Chinese. She estimates that her childhood neighborhood was maybe 90% Chinese. She attended all-Chinese schools, but the instruction was in Bahasa Indonesia, a standardized form of Malay, once the language of a tiny minority, which for political and linguistic reasons was chosen as the country's official language upon independence in 1945. (Not entirely unlike Standard High German, Bahasa Indonesia was made the official national language in an effort to unify a nation of many tongues.) Every single one of her friends is Chinese. She dated only Chinese men and eventually married one.

Thanks to a class she took for a summer in China, she speaks a little Mandarin, but not very well. Still, she says, the Chinese business people who come to Surabaya consider her fully Chinese. And, with China’s growing commercial presence in the region—it is, by far, Indonesia’s largest trading partner—that’s a big advantage.

When she was a little girl, Angela's parents made it clear that the Javanese were not like her. Not only were they not at the same "cultural level," but when they got paid they just frittered away their money. "They don't know how to save," she said in a concerned tone. She wouldn't tell me what else her parents told her.

Other than language, the biggest difference Angela saw between herself and her parents and grandparents was religion. A few years ago, she said, she began to question why her elders "worshipped their ancestors rather than God." She converted to Catholicism not long after. While Christian missionaries had been proselytizing the Chinese in Indonesia since at least the 19th century, the number of conversions exploded under the Suharto regime. Some converted as a way of obscuring their ethnicity, their way of complying with the national policy of assimilation. Conversion was "not a matter of what they believed," writes sociologist Andreas Susanto, "but what they perceived was safe." So why not convert to Islam, the majority religion? The choice of Christianity may have been a reflection of their "reluctance to assimilate into the indigenous society," of keeping "their distance" from the Indonesian majority.5 Converting to Christianity also gave Chinese Indonesians a sense of belonging to a global network. Today, almost half of Chinese Indonesians are either Protestant or Catholic.

But Angela isn't living under a coercive assimilationist policy, and other than her apparent disdain for what she called "ancestor worship," I didn't get a full grasp of why she converted. It's quite possible that she did so for religious reasons, that she was looking for some meaning in her life. Her parish is 60 to 70% Chinese, but the rest are mostly migrants from the outlying islands of Nusa Tenggara and Madura. It's the first time she has ever chosen to be in an ethnically mixed private setting. She said she enjoyed it.

Without language and religion, it’s not entirely clear what Chineseness means to Angela other than the networks, attitudes, and behaviors that make for success in a global marketplace.

Before we said our goodbyes, we discussed the ethnic tensions that still exist between the Chinese and Javanese. We talked about the growing upward mobility among the Javanese. I asked her whether that was a good thing, thinking that perhaps Javanese economic success could undercut some of the jealousy and resentment some had felt for the Chinese. She disagreed. She didn’t like the trend. She said, “It just means there’s less for us Chinese.”

***

I have no idea whether Angela’s opinions are reflective of Chinese Indonesians at large. All I can say is that her honesty rattled me. It bothered me. I wasn’t used to it.

Sure, I've heard Americans utter all sorts of raw things on racial and ethnic matters over the years, but rarely so openly, and never to a complete stranger with a notebook in his hand. It made me wonder how much of America's post-civil rights era "unity in diversity" regime has been based on the policing of speech, whether free intellectual–and cultural–expression has been among the costs of America's much celebrated diversity. I wonder whether the price of social peace is that we actually have little idea what Americans are thinking when no one is around. And, if so, can that really be considered peace?

  1. Raka Ibrahim, “Marrying into Chinese-Indonesian Families: Stories of Interethnic Relationships,” The Jakarta Post, January 31, 2022. https://www.thejakartapost.com/culture/2022/01/30/marrying-into-chinese-indonesian-families-stories-of-interethnic-relationships.html.   ↩︎
  2. Amy Chua, World on Fire: How Exporting Free Market Democracy Breeds Ethnic Hatred and Global Instability (New York: Doubleday, 2003), 6. ↩︎
  3. Marleen Dieleman et al., "Chinese Indonesians and the Regime Change: Alternative Perspectives," in Chinese Indonesians and Regime Change, ed. Marleen Dieleman, Juliette Koning, and Peter Post (Leiden/Boston: Brill, 2011), 3. ↩︎
  4. Anders Walker, The Burning House: Jim Crow and the Making of Modern America (New Haven: Yale University Press, 2018), 232-233. ↩︎
  5. Andreas Susanto, "Diversity in Compliance: Yogyakarta Chinese and the New Order Assimilation Policy," in Chinese Indonesians and Regime Change, ed. Marleen Dieleman, Juliette Koning, and Peter Post (Leiden/Boston: Brill, 2011), 80-81. ↩︎

Don’t Let the Constitution’s Universalist Language Fool You.  It’s a Political, Not a Sacred Document. 

The Cloister of the Basílica de San Isidoro, León, Spain. (Photo by Gregory Rodriguez)

Madrid, Spain

You know how learning about ancient cultures can give you insights into the mysterious habits of contemporary humans?  Well, the same goes for learning about complex modern institutions.  Pondering their pre-modern origins can help bring their fundamental mechanics to light. 

It was in this spirit that I boarded a train to visit a medieval sandstone cloister in the northwestern Spanish city of León. Believe it or not, I was hoping my quick trip to the Basílica de San Isidoro would teach me a few things about, well, modern democracy.


Americans tend to sanctify their Constitution in ways that obscure its real value and purpose. I suppose this sanctification is a way of symbolically embracing a country that lacks as firm an ethnic or cultural anchor as, say, any European nation. When it was ratified, America's Constitution was indeed the opposite of an anchor. In 1787, Francis Hopkinson, the Pennsylvania lawyer who had signed the Declaration of Independence and designed the American flag, called it a "roof" that united "the strength of 13 rafters."1 In other words, the Framers created an overarching legal structure–a roof–long before a unifying cultural foundation developed on the ground. Similarly, historian Daniel Boorstin has written that Americans view the Constitution as an "exoskeleton," something akin to a lobster's shell that we filled out over time.2 Through the generations, it was the human interaction created by commerce and migration that forged the cultural ties that bound a heterogeneous people together into a nation.

The sanctification of the Constitution has also involved no small amount of ethnic chauvinism.  The blessings of American democracy, we’ve been taught, are inherited exclusively from England, where the very first written constitution, the Magna Carta, was issued in 1215 and the first parliament was called fifty years later.

By the mid-19th century, many white Americans were so convinced that they were the sole inheritors of the love of liberty and the habits of self-governance that they imagined it to be an almost racial trait that was passed down from one generation to the next. In January of 1845, Democratic Representative Alexander Duncan of Ohio told his colleagues that he thought “there seems to be something in our laws and institutions peculiarly adapted to our Anglo-Saxon-American race, under which they will thrive and prosper, but under which all others wilt and die.”3 In the early 20th century, a prominent Oxford historian proudly proclaimed that “parliamentary institutions” were “incomparably the greatest gift of the English people to the civilization of the world.”4

Now, I realize that this type of sanctification of the Constitution has had its purposes. Racial aspects aside, it fosters the reverence sometimes required to abide by its more absurd and outdated elements. But treating America's legal framework as some sort of mystical tablet also obscures our understanding of its critical role as a social contract and the source of our rights. The other problem, of course, is that the myth is based on a false premise.

In 2009, in his book The Life and Death of Democracy, Australian political theorist John Keane “politely questioned”—in his words—“this English prejudice.”5 His research had led him to conclude that in 1188, a generation before the Magna Carta, Alfonso IX, the newly-crowned seventeen-year-old monarch of the Kingdom of León, had convened Europe’s very first parliament, or cortes in Spanish, within the cloisters of León’s Romanesque Basílica de San Isidoro.

Of course, it had not been unusual for Europe’s kings to gather with lords and bishops.  But Alfonso did something entirely new for European royalty, which was to invite representatives from the towns. This was the first recorded gathering of all three estates—nobility, the Church, and burghers.  The most basic definition of a parliament is an assembly involving various social groups of the realm, including representatives of towns.

But why would a king whose power was said to be granted directly by God seek to hold discussions with townsmen? For one, he needed money to fight back the encroaching Muslim armies, and plenty of Leonese were unhappy with the imposition of new taxes. For another, Alfonso may have feared that an alliance of angry nobles and townsfolk might form against him. In any case, according to Keane, the king was determined "to defend and expand his kingdom, even if that meant making political compromises that might dilute his kingly powers."6

Operating in the spirit of compromise, the king secured the backing of the three estates in exchange for his promise to "not wage war nor make peace or make any agreement without the counsel of bishops, nobles and good men," the last referring to leading citizens of the towns.7 In a series of what may have been up to 15 documents, collectively called the Decreta, the king agreed that private property and personal domiciles were inviolable and that justice would be upheld in a routine, predictable manner, including that any charges against a person must be backed by evidence and that detainees had the right to be defended by a third party.

What’s even more significant here, to me at least, is that the citizens of León did not secure new liberties out of some abstract reverence for rights. The social gains they made were the byproduct of a monarch’s inability to raise taxes, maintain peace in the realm, and otherwise rule his kingdom without the cooperation of the three estates.

As John Keane has eloquently put it, it was out of self-interest that Alfonso IX invented "a new mechanism for resolving disputes and striking bargains among interested parties who felt they had a common interest in reaching compromise."8 The king's baseline understanding of his kingdom, then, was not of a society requiring an indivisible political community, but of one composed of competing and sometimes conflicting interests. To resolve inevitable conflicts, the Decreta contained an agreement that there would be future assemblies of the king with representatives from the three estates. A regular parliament offered "the possibility of turning disagreements about reality into binding agreements in support of a common good."9

Even more amazing is that the representatives of the towns who attended the cortes in the cloister of San Isidoro had been elected by the citizens of their towns.  While it isn’t known by what method they were elected nor exactly how many were present, historian Joseph F. O’Callaghan concluded that their “numbers in attendance must have been quite large.”10 

The Reconquista placed pressures on other monarchs in the Iberian Peninsula.  As they sought to strip people and territory from Muslim control, Christian kings had to “compete with the more advanced Muslim kingdoms in the south for the favours of the merchants and farmers,” and thus were “prepared to respect their property rights and grant them … privileges.”11 In 1126, sixty-two years before the first cortes in León, King Alfonso I of Aragon granted a charter of liberties to “Christians whom I brought, with the help of God out of the power of the Saracens and led into the lands of the Christians. . . . Because you left your homes and your estates for the name of Christ and out of love for me and came with me to populate my lands, I grant you good customs throughout your realm.”12


It would, of course, be naive to draw too straight a line between the medieval origins of the parliament and the spread of modern written constitutions in the mid-18th century. As representative assemblies—and the societies from which they emerged—became more complex, so too did the theories of governance and philosophical worldviews that came to animate western politics.

At the same time, however, as Spanish legal historian Aniceto Masferrer has argued, the enormous differences between the two eras notwithstanding, it is important to make connections between medieval documents like the Decreta and the emergence of liberal governance six centuries later. If nothing else, Alfonso IX's savvy political bargaining shows "how medieval Europe started to be aware of the convenience of limiting political power through law."13


By the time the Framers sat down in 1787 to hammer out the U.S. Constitution, the English House of Commons had long since invented the idea of popular sovereignty as a way to challenge monarchical power. The ideology was developed, as historian Edmund Morgan wrote, to “justify a government in which the authority of kings stood below that of the people.”14  Of course, shifting the locus of sovereignty from the king to the people was not actually designed to put power in the hands of the people, but rather in those of the members of parliament.  

For their part, America’s Federalists—who could better be described as nationalists—embraced the idea of popular sovereignty as a way to weaken the power of individual states.  Why?  Because legislative majorities in most states had passed debt relief laws that the Framers felt threatened the property rights of creditors. (The minority the Framers sought to protect, then, was the wealthy, a class to which most belonged.) The idea–put forth by James Madison—was that the authority granted to the new national government would rest on the power of “the people” at large rather than on the collective authority of the states themselves. In short, he had invented “a sovereign American people to overcome the sovereign states.”15

Popular sovereignty, of course, was as much a “fiction” as was the divine right of kings.16 Indeed, the way it was “publicly presented” in America, political scientist Stephen Holmes has written, bore “a striking resemblance to proclamations in which absolute monarchs [once declared] their sovereign will.” What this meant was that rather than being perceived as an exchange of promises “between classes or factions or territorial subunits,” the U.S. Constitution was portrayed as a charter that “‘we the people’ [gave] ourselves.”17 

Paradoxically perhaps, it was this fiction that led the Federalists to argue against the inclusion of a Bill of Rights in the Constitution. When Antifederalists, those who opposed the ratification of the new Constitution for fear that it would give the national government too much power, first demanded that the document include a list of protected rights, Federalists called the request a quaint throwback to the time when kings granted concessions to their subjects.

If the government derived its power directly from the people, they argued, then what sense would it make to have the people make concessions to themselves? Because America’s constitution could not be considered an agreement between or among parties, “neither concession nor contract was possible because people and government were one and the same.”18

Conversely, since it was “We the People” who conveyed specific powers to the national government, Federalists could argue, as one did, that the Constitution itself was “nothing more than a bill of rights—a declaration of the people in what manner they choose to be governed.”19

Evidently offended by the idea that a convention had been convened to hash out a mere compact, North Carolina’s James Iredell, a leading Federalist who would become one of the first justices of the U.S. Supreme Court, proclaimed that America’s government was  “founded on much nobler principles.” 20

Fortunately for Americans, the Antifederalists were not swayed by those nobler principles and did not give up on the idea that the Constitution was, as one delegate at a state ratifying convention put it, "a compact, agreement, covenant, bargain," that required the government to put concessions in writing.21 James Madison, of course, ultimately yielded to these demands out of political expediency. While he was himself a believer in rights, he nonetheless saw the addition of constitutional guarantees as a political rather than an ideological act. His reason for drafting the Bill of Rights, historian Pauline Maier has concluded, was "less to secure rights" than to subdue opposition to the Constitution.22 If, as the Framers believed, one of the primary goals of the new Constitution was to protect the property rights of the wealthy minority, then adding amendments to safeguard such popular rights as speech, religion, press, and assembly was a worthwhile compromise.

Still, the Federalists' fiction of popular sovereignty—and a unified American people—lived on in the way Americans think about their country. Each school day, American schoolchildren pledge allegiance to their "indivisible" nation. Even when we know that ugly, divisive presidential elections can be won by mere percentage points, we continue to refer reverently to the voice of "the people." When we ask a restaurant waiter what's good on the menu, they're likely to tell us what sells most, as if majority opinion were the voice of good taste and wisdom.

If the medieval "Ständestaat"—a state of estates—was thought to be divisible into three separate groups, contemporary Americans tend to see the primary division in our national political community as being between the few and the many, the majority versus the minority, which they sometimes translate as the strong and the weak.23 So even as we herald the wisdom of the majority, we hail the genius of a Constitution that protects the minority. Indeed, Columbia University political scientist Giovanni Sartori once argued that the only reason to believe in constitutions at all is if "we think that somebody needs protection from somebody else."24

This insight helps explain why Americans tend to think of rights in moral terms, sacred protections that are heroically demanded and/or benevolently bestowed. This weak/strong dynamic injects no small amount of paternalism into a political process we otherwise think of in terms of bargaining, sausage-making, and horse-trading. Rather than being perceived as a political compromise made to maintain social tranquility, the granting of rights is often portrayed as if it were a morality tale.  Which brings me back to Alfonso IX, who clearly saw it as a necessary element of a mutually beneficial exchange.

In his 2012 essay, "Constitutions and Constitutionalism," NYU's Stephen Holmes urged Americans to start thinking about rights more through the lens of realism than idealism. Claiming that his observations should be interpreted as instructive rather than cynical, he argued that "democratic constitutions emerge and survive" when society's "most powerful social forces find that they can promote their own interests most effectively by simultaneously promoting the interests of, and sharing political influence with, less powerful but not utterly powerless swaths of the population."25

Why? Because it is only "when the powerful discover the advantages they can reap from making their own behavior predictable" that they "voluntarily submit to constitutional constraints." Put even more bluntly, when non-elites bring incentives to the bargaining table, "elites respond opportunistically by granting legal protections and participatory rights in exchange for cooperation indispensable to elite projects."26

Holmes wasn't the first theorist to make this argument. In 1919, Max Weber, one of the founders of modern sociology, argued that much of modern western democracy was itself a product of national elites' need for disciplined soldiers to fight wars. It was military necessity, then, that compelled them "to secure the cooperation of the non-aristocratic masses and hence put arms, and along with arms political power, into their hands."27 While Weber was not referring to the U.S. Constitution, he was nonetheless recognizing the existence of political bargaining as the essence of constitutionalism in general.

Historian Linda Colley concurs with—and expands on—Weber in her remarkable 2021 book, The Gun, the Ship, and the Pen: Warfare, Constitutions, and the Making of the Modern World. The rash of new constitutions in the 18th century was, in part, a product of the rise in the number of bloody and expensive imperial, transcontinental wars. The new countries that emerged from this warfare "progressively elected to experiment with written constitutions as a means to reorder government, mark out and lay claim to contested boundaries" as well as to "legitimize their systems of government anew." These new constitutions—including that of the United States—helped to "rally wider support and justify expanding fiscal and manpower demands." These documents sometimes "functioned in effect and in part as bargains on paper. Male inhabitants of a state might be offered certain rights, including admission to the franchise, as a quid pro quo for accepting higher taxes and/or military conscription."28

This description is not pretty. It's not mystical. Nor does it pretend that all parties to the negotiation are equal. But it does provide a framework with which we can think about rights in terms of compromise and mutual benefit rather than merely in terms of sacred principles and abstractions. This doesn't mean that the Framers didn't infuse the document with a desire for reform or utopian hopes, just that the harsh realities of geopolitics—particularly threats from Britain to the north, Spain on the Mississippi, and Native Americans throughout the inland frontier—were never far from their minds. The Constitution that was drafted in Philadelphia during the summer of 1787, writes Colley, "was often approached at the time less as a 'blueprint of a liberal democratic society,' . . . than as a grimly necessary plan for a more effective and defendable union."29


So what did I learn in León? I learned that from the very beginning constitutions have been practical political documents that formalize the results of bargaining between competing sectors of a given political community; that they are amoral rule books that set the boundaries of future debate, establish the obligations each sector owes to the others, and constrain the actions of members of signatory groups long after the signers of the parchment are dead. Most importantly, I learned that if they are to survive, all parties must continue to persuade one another that they can more effectively get what they want if they agree to support each other. And finally, that, while inspiring, America's universalist language of rights can be deceiving; that our civil rights—liberties derived from membership in a particular polity—are a far more powerful source of freedom than human rights—liberties that all persons should theoretically enjoy.

  1. Paul M. Zall, ed., Comical Spirit of Seventy-Six: The Humor of Francis Hopkinson (San Marino: The Huntington Library, 1976), 191. ↩︎
  2. Daniel J. Boorstin, The Genius of American Politics (Chicago: The University of Chicago Press, 1953), 191. ↩︎
  3. Reginald Horsman, Race and Manifest Destiny: The Origins of American Racial Anglo-Saxonism (Cambridge: Harvard University Press, 1981), 227. ↩︎
  4. A.F. Pollard, The Evolution of Parliament (London: Longmans, Green & Company, 1920), 3. ↩︎
  5. John Keane, “The Future of Parliaments,” (keynote address, EU Global Project to Strengthen the Capacity of Parliaments, León, Spain, June 30, 2023). ↩︎
  6. John Keane, The Shortest History of Democracy: 4,000 Years of Self-Government—A Retelling for Our Times (New York: The Experiment, 2022), 79. ↩︎
  7. María Esther Seijas Villadangos, "Origin of Parliamentarism: An Historical Review of its Crisis: León (Spain) as Cradle of Parliamentarism," Revista Acadêmica da Faculdade de Direito do Recife 88, no. 2 (July/December 2016): 22. ↩︎
  8. John Keane, The Life and Death of Democracy (London: Simon & Schuster, 2009), 176. ↩︎
  9. Keane, "The Future of Parliaments." ↩︎
  10. Joseph F. O’Callaghan, “The Beginnings of the Cortes of León-Castile,” The American Historical Review 74, no. 5 (June 1969): 1514. ↩︎
  11. Jan Luiten van Zanden, Eltjo Buringh, and Maarten Bosker, “The Rise and Decline of European Parliaments, 1188-1789,” The Economic History Review 65, no. 3 (August 2012): 839. ↩︎
  12. Joseph F. O'Callaghan, History of Medieval Spain (Ithaca, NY: Cornell University Press, 1975), 285. ↩︎
  13. Aniceto Masferrer, “The Spanish Origins of Limiting Royal Power in the Medieval Western World: The Córtes of León and Their Decreta (1188),” in Golden Bulls and Chartas: European Medieval Documents of Liberties, ed. Elemér Balogh (Budapest: Central European Academic Publishing, 2023), 31. ↩︎
  14. Edmund S. Morgan, Inventing the People: The Rise of Popular Sovereignty in England and America (New York: W.W. Norton, 1989), 56. ↩︎
  15. Morgan, Inventing the People, 267. ↩︎
  16. Morgan, Inventing the People, 13. ↩︎
  17. Stephen Holmes, "Precommitment and the Paradox of Democracy," in Passions and Constraint: On the Theory of Liberal Democracy (Chicago: The University of Chicago Press, 1995), 146. ↩︎
  18. Morgan, Inventing the People, 283. ↩︎
  19. Morgan, Inventing the People, 283. ↩︎
  20. Gordon S. Wood, The Creation of the American Republic, 1776-1787 (Chapel Hill: The University of North Carolina Press, 1969), 541-542. ↩︎
  21. Wood, The Creation of the American Republic, 541. ↩︎
  22. Pauline Maier, Ratification: The People Debate the Constitution, 1787-1788 (New York: Simon & Schuster, 2010), 446. ↩︎
  23. Daniel Chirot, “The Rise of the West,” American Sociological Review 50, no. 2 (April 1985): 185. ↩︎
  24. Giovanni Sartori, “Constitutionalism: A Preliminary Discussion,” The American Political Science Review 56, no. 4 (December 1962): 855. ↩︎
  25. Stephen Holmes, “Constitutions and Constitutionalism,” in The Oxford Handbook of Comparative Constitutional Law, ed. Michel Rosenfeld and András Sajó (Oxford, UK: Oxford University Press, 2012), 215. ↩︎
  26. Holmes, “Constitutions and Constitutionalism,” 214-215. ↩︎
  27. Max Weber, General Economic History, trans. Frank H. Knight (Glencoe, IL: The Free Press, 1950), 325. ↩︎
  28. Linda Colley, The Gun, the Ship, and the Pen: Warfare, Constitutions, and the Making of the Modern World (New York: Liveright Publishing Corporation, 2021), 7. ↩︎
  29. Colley, The Gun, the Ship, and the Pen, 121. ↩︎

The Marriage of Wokeness and Cash

Why America Has Seen an Explosion in Public Recrimination

(Image by Jørgen Carling)

At its best, Wokeness is an awareness of how race and gender bias can produce societal inequalities. At its worst, it’s a racket in which upper-middle-class college graduates wield victimhood status in a bid for financial gain or career advancement. 

When the Civil Rights Act was passed in 1964, victims of workplace discrimination had recourse to only a few remedies, such as back pay, restoration of a promotion or benefits, or job reinstatement. A few years later, newly minted federal Affirmative Action programs held that members of minority groups with a history of facing discrimination could now benefit from preferential hiring schemes. In the first instance, an employee had to prove discrimination had occurred; in the second, a minority job applicant was assumed to need protection from discrimination in any hiring process.

While remedies for workplace discrimination were not controversial because they were focused on making the victim whole, preferential hiring–as well as preferential college admissions–was problematic because it meant that a non-minority's application could be thrown out to make room for a minority applicant. Not surprisingly, the issue reached the Supreme Court, and in 1978 the justices handed down a convoluted split decision that upheld Affirmative Action, made racial quotas illegal, said race could nonetheless be considered in college admissions, and failed to determine how applicants would have to prove past discrimination in order to receive protected or preferential status.

A little more than a decade later, Congress upped the ante by passing the Civil Rights Act of 1991, which for the first time explicitly allowed employees to seek compensatory and punitive damages in both racial and gender discrimination cases.  In essence, this updating of the Civil Rights Act transformed the anti-discrimination statute from one whose remedy was equitable relief—restoring the complainant to his or her employment situation to what it had been before the offending act occurred—to something more like a tort—a wrongful or negligent act—which allows for trials in which juries could order parties they found liable to cough up damages.

In short, the CRA of 1991 fundamentally changed the way discrimination was to be remedied. Before 1991, anti-discrimination statutes were based on a traditional labor model that sought conciliation in employer-employee relations. After 1991, employment disputes would more and more be resolved through litigation–or the threat of it–involving compensation for the victim and punishment for the offender. Whereas in the 1960s civil rights advocates had wanted anti-discrimination enforcement to revolve around a New Deal-style government authority, over time they came to embrace the benefits of private enforcement in which every complainant was a potential plaintiff.

Now that large sums of money were potentially involved, most observers anticipated an increase in discrimination litigation. And, of course, that came to pass. Indeed, the new law incentivized the filing of complaints and changed how Americans approached their interactions with one another in the workplace and beyond. In the same way that the explosion in personal injury lawsuits turned every fall in a shopping center into a potential lawsuit, so too did making discrimination into a tort encourage Americans to see every slight and professional setback as a potential source of compensation.

Just as importantly, the new law made discrimination cases, which had once been considered too much work for too little payoff, more attractive to attorneys. To sweeten the deal, the law now authorized plaintiffs to recover attorney's fees if they won their cases. One could say that if Martin Luther King, Jr., was among the inspirations for the passage of the Civil Rights Act of 1964, then the flamboyant, scandal-plagued San Francisco attorney Melvin Belli, the so-called "King of Torts," was the spiritual father of the Civil Rights Act of 1991. More than anyone in America, Belli was responsible for the successful post-war push by trial lawyers to increase damages awards in personal injury cases. Thus, the Civil Rights Act of 1991 married the sacrifices of the 1960s struggle for social justice with the ethos of an era best symbolized by a scene in the 1987 movie Wall Street in which Michael Douglas–playing the fictional corporate raider Gordon Gekko–famously proclaimed that "Greed is good."

The term "woke," which became a watchword among African American activists as early as the first half of the 20th century, was catapulted into the mainstream in 2014 in the wake of the shooting of Michael Brown in Ferguson, Missouri. While initially associated with the Black Lives Matter movement, it was quickly adopted by white Progressives committed to fighting inequality. The term soon came to encompass a constellation of ideas, slogans, and programs that sought to liberate a growing number of marginalized groups from a structurally biased system.

There was nothing particularly revolutionary about these activists’ stated concerns.  What was distinctive was both the tone of their rhetoric and their favored solutions for addressing disparities. Whereas the civil rights movement of the early 1960s inspired some of the nation’s greatest legislative successes through a spirit of forgiveness and reconciliation, the rhetoric of Wokeness was full of resentment, vengeance, and demands for immediate reparation.

While the definition of discrimination had been expanding steadily since 1964, suddenly Americans were being told that there were a whole lot more ways to offend and oppress their fellow Americans than they had ever imagined. Even as most Americans thought that the lives of minorities and women had improved significantly since the 1960s, the bar for what constituted–and the standard of evidence needed to prove–discrimination was being lowered. University administrators and company human resources experts now considered American society so fundamentally prejudiced and hostile that they introduced new ways of protecting people–such as trigger warnings and safe spaces–from even the slightest faux pas, which were now inelegantly called microaggressions.

Meantime, those who failed to comply with the growing list of social infractions risked being "called out," "cancelled," fired, sued, and otherwise painted with a scarlet letter. To a sober observer, the sheer number and frequency of these autos-da-fé seemed more than a little exaggerated. There's no denying that discrimination occurs in America, but why all of a sudden was there an explosion of accusations and public recriminations, particularly in universities and among the educated upper middle classes, people who are among the most privileged humans on the planet? And why did so many people who were not directly involved in the incidents join in the angry choruses while others remained silent in the face of what were objectively disturbing spectacles?

Greg Lukianoff and Jonathan Haidt, the authors of The Coddling of the American Mind, have called the driving impulse of this era of discontent "vindictive protectiveness," the practice of publicly shaming and punishing those accused of having said or done anything to harm a member of a protected group. Because even casual defenders of the accused are not immune to these mob attacks, those uncomfortable with the idea of public stoning tend to keep their objections to themselves. This behavior, Lukianoff and Haidt argue, created "a culture in which everyone must think twice before speaking up, lest they face charges of insensitivity, aggression, or worse."

Vindictive protectiveness, Lukianoff and Haidt argue, arose from a style of fearful and overprotective parenting that educated middle- and upper-class parents began to practice in the 1980s and '90s in an effort to give their children a competitive edge in life. Wanting only the best, these parents sought to cultivate their children's talents while erasing all potential sources of risk and adversity in their environments. Overscheduled, oversupervised, and left with precious little time to play and explore on their own, these children eventually arrived on university campuses believing that the world was dangerous, that bad people should be removed from their presence, and that the institutions around them should protect them in the way their parents had.

The universities, of course, obliged. As higher education has become big business, students have been transformed into customers, and we all know that the customer is always right. Furthermore, Lukianoff and Haidt suggest, the rise of vindictive protectiveness may also "be related to recent changes in the interpretation of federal antidiscrimination statutes." Not only do university administrators seek to avoid lawsuits from students, they also want to avert any investigations by the Department of Justice into their civil rights compliance. Ironically, that's why they developed "bias incident reporting" systems that allow students to report anonymously on anyone they feel has caused them or anyone else to experience any type of bias. At Cornell University, for example, a bias incident is something done or said "that one could reasonably and prudently conclude is motivated, in whole or in part, by the alleged offender's bias against an actual or perceived aspect of diversity, including, but not limited to, age, ancestry or ethnicity, color, creed, disability, gender, gender identity or expression, height, immigration or citizenship status, marital status, national origin, race, religion, religious practice, sexual orientation, socioeconomic status, or weight."

In short, the desire to protect students, combined with the need to remain in compliance with civil rights statutes and regulations, turned campuses into places where students are encouraged to report on professors, staff, subcontractors, and, of course, one another. In response to faculty pushback against the anonymous reporting system at Stanford, a university spokesperson insisted that the "process aims to promote a climate of respect." Still, there is a growing realization that such reporting systems can both be easily abused and limit free speech. In any case, as students graduate from college, they take the expectations these systems foster with them into the labor force.

The commentariat is telling us that Trump's victory spells the end of Woke. But too many election post-mortems treat Wokeness as a purely political phenomenon. Yes, it employs left-wing ideologies that treat identity and victimization as sources of resistance and power. But Wokeness would never have become so pernicious outside the university had there not been the temptation (and fear) of financial gain (and loss). In addition to money, cancellation also holds out the promise of professional advancement. Campaigning to remove one's colleague or boss–and even helping others cancel theirs–is also a way for young people to clear the field of competition and move up the ranks.

Ask any cynical political hack how to decode the power and drive of any politicized trend and they're likely to tell you to follow the money. Wokeness is no different. Once you find out who has profited from this phenomenon, you'll come face to face with the absurd fact that one of the most diverse countries in the world has chosen to encourage private litigation and the threat of financial damages to curb discrimination and promote a more just and cooperative society. What could possibly go wrong?

Gregory Rodriguez is the author of Whiteness: An American Tragedy and Other Essays and Mongrels, Bastards, Orphans, and Vagabonds: Mexican Immigration and the Future of Race in America. He is working on a book on the rise and fall of rights-based liberalism.
