The Kettle, the Pot, and the Culture of Cancel

(Photo by Chris Christian)

Progressives are right to be up in arms about the right’s efforts to have people fired for speech they deem inappropriate. But they’d have a whole lot more credibility if they had spoken up any time over the past decade when folks on the left kept plenty busy employing the very same tactics.

Center-left columnist Noah Smith has a smart essay up on how progressives “got addicted” to the once “invincible weapon” that had the leadership of “every organization in the country” fearing that “disgruntled or opportunistic subordinates would take their grievances online and summon the dreaded cancel-mob against their superiors.”

Smith contends that the “threat of progressive cancel culture in America has been defused,” but only because the right, which currently holds the balance of political power, has adopted the strategy for itself.

He argues that “cancel culture” has harmed the country by substituting “nastiness for persuasion” and “robbed America of the incisive commentary of which intelligent progressives would otherwise be capable.” But I’d add that it has also hurt the country in additional, more profound ways. In an effort to protect their own hides from the mob, countless numbers of Americans—particularly those in the world of ideas—allowed themselves to be cowed into silent complicity. And there is no intellectual integrity in cowardice.

If liberals don’t come clean about their own complicity in the culture of cancel, then we’re probably looking at a future of alternating cycles of crackdowns, with each side justifying firings and the suppression of speech as a legitimate means to protect their party’s preferred groups and favored minorities.

So, yes, the Trump/Vance administration does pose a threat to free speech in America. And I hope that liberals in the business of thinking and writing can figure out effective means of combatting that threat. But their arguments won’t fix anything if they don’t first acknowledge that they, too, sat by silently while the left-wing fringe ran around punishing apostates like a secular version of the Taliban Morality Police. No matter which side is doing the flogging, cancel culture is bad for America.

Allowing Human Suffering to Break Your Heart

This morning, at my regular coffee place, a middle-aged woman fell to the ground and began writhing and groaning in pain. The cries she emitted for at least the next half hour were heartbreaking, and, at times, terrifying.

To be in the presence of so much pain is overwhelming. If you can ignore the temptation to turn away, it can shake you to your core, and inspire a mix of humility and gratitude. If you allow it to, it can wipe your mind of petty gripes. There but for the grace of God …

I’m pretty sure that nobody present asked what the poor woman had done or said to deserve such pain. No one got up to leave. We all seemed to be waiting with her for the ambulance to arrive. No one was thinking what they had in common with her, what she believed in, what she did for a living, or whether she or whatever group she associated with had harmed anyone else. All we knew was that she was a human in great pain. And her suffering was causing the friends she had been sitting with another type of pain.

I’m pretty sure everyone at the café wished for her to find relief, consolation, peace. Once you’ve decided not to look away, it’s hard to see fellow humans in great pain, unless, of course, you’ve convinced yourself that those humans and their loved ones deserve such pain.

But to do that, you’d first have to embrace a rationale that gives you permission not to feel compassion in the face of human suffering. And to not feel compassion in the face of human suffering is literally the definition of being inhumane.

The U.S. Has Become the Cultural Equivalent of a Big Box Store. Localism Can Help Americans Feel More at Home. 

(Photo by Steve Banfield)

Athens, Georgia

There was a fun article in the Süddeutsche Zeitung earlier this month on immigrant nurses in Germany’s Swabian mountains.1 Germany has a severe nursing shortage and has been recruiting health care workers from around the world. The small town of Markgröningen, about a half-hour car ride north of Stuttgart, has welcomed nurses from countries as far away as Kenya, India, and Mexico.

While foreign recruits have long been obliged to take German language classes to prep for their jobs, regional medical authorities have recently added a three-day course in Swabian. That’s because nurses were having a hard time understanding patients who prefer to speak dialect to standard German. And not only do the nurses learn the Swabian words for “to prick” or “to annoy,” they’re also taught local proverbs and sayings that sound ludicrous to the foreign ear. 

According to a 2009 study by Mannheim University’s Institute for the German Language, about 60% of Germans speak one regional dialect or another. (The definition of a dialect, by the way, is a language that doesn’t enjoy the support of an army or navy.) Of those who speak dialect, almost half (45%) say they do so “always” or “often.”2

I love this story, because I’m a big believer in the virtues of localism. I believe expressive freedom is found more readily in the small places out of sight and reach of national political power. I think the increasing centralization of government in the U.S. has diminished regional distinctiveness and turned the country into the cultural equivalent of a big box store.

Too many cultural dictates originate from the top of the political pyramid, and that is exactly where a functioning culture should not be born. A healthy culture has little to do with what Americans tend to obsess over: identity, power, or pride. More fundamentally, culture is about belonging, intimacy, meaning, and enjoying a sense of place, your place, your neck of the woods, your little corner of the world.

One powerful example of a different way of envisioning the relationship between province and nation is that in German the word Heimat (“homeland”) generally refers to one’s home region and not to the entire country. The term implies deep attachment and affection. What this tells me is that federalism is not just a way of structuring layers of government; it can also be a model for how to understand the proper relationship between regional and national cultures.

The best way to encapsulate the meaning of federalism is that it “has to do with the need of people and polities to unite for common purposes yet remain separate to preserve their respective integrities. It is rather like wanting to have one’s cake and eat it too.”3

***

We’ve all heard of companies that are said to be too big to fail. Well, I think the U.S. is too big to love. As more and more Americans give away their little pieces of the cake, they come to think American culture is either entirely about abstract ideals or resides in the District of Columbia. In either case, their idea of Americanness is increasingly divorced from the things and people and places with which they are most familiar, the very things that give them comfort and inspiration.

Sadly, the findings of the recent Gallup Poll on patriotism suggest that our affection for our country depends on our feelings for the current occupant of the White House. Is that all America is? Politics? A single political regime? That’s a remarkably flimsy foundation upon which to base one’s feelings for one’s homeland.

***

Over the past several years, I’ve spent a lot of time in the American South and am fascinated by the way Southerners’ regional identity serves as a prism through which they see their Americanness. Their regional accent, which many see as a sign of their distinctiveness, is not entirely unlike a dialect in Germany. It’s a symbol of home, comfort, rootedness, and, if not class, then of a memory of simpler, premodern times.

I’ve been struck by how many educated people I’ve met at and around the University of Georgia who either hide their accents or use them only on certain occasions. Those who hide their accents tend to do so out of fear of drawing the condescension of non-Southerners. Those who use them selectively tend to do so when they’re tired, tipsy, sick, being affectionate, or trying to get out of a speeding ticket.

One young woman I know who grew up in a small farming town in central Georgia made a big effort to lose her thick drawl while she was an undergraduate. Not unlike an upwardly mobile child of immigrants, she sees her unaccented English as a sign of her education, of having become a mainstream middle-class American. Still, like so many, she peppers her speech and text messages with “y’all” and “’ppreciate ya.” There are times she’ll jokingly put on a heavy drawl when saying certain words like “Southern” or “friend.” Her ambivalence about her regional accent is telling. A recent study suggests that in Georgia the Southern accent is fading.

The point I’m making is that there is an idea in the land that 340 million Americans should adhere to a standardized mainstream culture. Even efforts to promote “diversity” or “inclusion” only seek to ensure statistical integration of various groups within a uniform cultural model. They don’t encourage a broader form of pluralism, one that allows institutions to emerge from the distinctive history and demographic makeup of a particular region. They promote diversity within organizations, not among them. Frameworks like DEI, for instance, don’t properly respect the significance of, say, non-diverse historically black colleges in the South.

Indeed, whenever I hear the terms Diversity, Equity, and Inclusion, I think of that 1960s craze in which college students tried to pile as many people as they could into a Volkswagen.  But what about all the Americans who have no desire to squeeze into that Volkswagen? What about those who’d prefer to ride in a Hyundai, a Honda, or a roomier Lincoln Continental?

Standardization stifles creativity. It breeds placelessness and alienation. Most importantly, it doesn’t nurture the kind of community that people need. For a country whose political rhetoric pays so much lip service to the importance of choice, America has become a one-size-fits-all culture.

What I’m advocating is not multiculturalism, in which distinct cultural or linguistic groups are encouraged to stay in their own lanes. To my mind, what America needs is more respect for regional integrity, to allow the people in its many places to nurture and maintain that which is distinctive about their homelands, so that when newcomers inevitably resettle there, becoming American will also mean learning the ways of the locals.

  1. Christina Lopinski, “‘Mei Knie tut weh wie d’Sau’; Warum spricht der Patient im Bett von einem Schwein? Viele ausländische Pflegekräfte tun sich schwer mit dem Dialekt Ihrer Patienten. In Markgröningen lernen sie jetzt Schwäbisch,” Süddeutsche Zeitung, July 19, 2025. ↩︎
  2. Ludwig M. Eichinger et al., “Aktuelle Spracheinstellungen in Deutschland: Erste Ergebnisse einer bundesweiten Repräsentativumfrage,” Institute for the German Language, University of Mannheim, 2009, 11, 16. ↩︎
  3. Daniel J. Elazar, Exploring Federalism (Tuscaloosa: The University of Alabama Press, 1987), 33. ↩︎

In Defense of Immigrants, Beware the Politics of Pity

(Photo by Ross Pollack)

Photos and videos of masked ICE agents dragging away undocumented immigrants are chilling on multiple levels. They at once inspire fear of uncontrolled government coercion and sorrow for the hapless victims taken down in a courthouse or doctor’s office.

Outrage over the images is understandable and may even be responsible for a shift in public opinion in favor of immigrants. But the reaction from the liberal political/intellectual class has, at times, been both undisciplined and irresponsible. Their raw rhetoric may be emotionally satisfying, but it just reinforces the racial hierarchy that led to the enactment of this kind of policy in the first place.

I’ve heard liberals compare ICE agents to the Gestapo. That must feel good, but it betrays a stunning ignorance of the horrors of the mass murder of millions of Jews, Sinti, and Roma. I would recommend they read historian Christopher Browning’s account of how a group of Hamburg policemen became desensitized to shooting children in the back of the head.  

Then last month, a venerable voice of America’s labor-left, Harold Meyerson, penned a piece in the Financial Times in which he compared the City of Los Angeles’ resistance to enforcing federal immigration laws to the northern states’ refusal to abide by the Fugitive Slave Act of 1850, which legally compelled northerners to cooperate with slave hunters and federal authorities. “America has been here before,” he intoned.1

So why the moral hyperbole? Why compare the tragic consequences of decades of woeful immigration policy to even greater historical evils? 

Because it is easy and politically expedient. Distribute disturbing images of round ups and compare U.S. immigration enforcement agents to Nazis and slave hunters and the public will become so revolted that they’ll refuse to have these acts done in their name.

But there’s also something else at play here that’s become so characteristic of American progressive politics. It’s the deployment of the self-serving politics of pity.

***

It wasn’t until the French Revolution that the concern for distant others entered the realm of politics in the West.

The Jacobins injected the idea of virtue into the political realm, and virtue to them meant subduing one’s individual political desires to those of the will of the people. The people, however, meant something very different to French revolutionaries than it did to the framers of the American Constitution. 

To James Madison, the term was a nod to a broad imaginary consensus among white people across the states, not unlike the meaning of “Peoria” to a Hollywood producer. To Robespierre, however, the meaning of le peuple was “born out of compassion,” as political theorist Hannah Arendt observed, forged “by those who were exposed to the spectacle of the people’s sufferings, which they themselves did not share.” Le peuple—in her words—referred to “more than those who did not participate in government,” not merely to citizens, but to “the low people.”2 For a member of the revolutionary political class to prove his worthiness to represent the unhappy people—as Robespierre called them—he had to prove his ability to share their pain. Hence, the highest political virtue for more privileged revolutionaries became the ability to show compassion for the most unfortunate.

Arendt’s use of “spectacle” doesn’t suggest there’s anything outrageous involved, but simply that there is distance between the observer and the sufferer; that to the extent that feelings are shared between them, they are based purely on sight rather than on experience. Compassion–shared feeling–occurs when individuals experience pain together, or at least when the observer can imagine feeling the same pain. But the greater the distance between observer and sufferer, the more likely the compassion devolves into pity, a sentiment which implies hierarchy, even superiority. While most people would welcome the compassion of others, few wish to be pitied. To inspire pity makes one pitiful, and the pitiful, those who are deemed incapable of helping themselves, can inspire contempt. 

***

Manipulating outrage and white compassion for oppressed African-Americans worked well for civil rights advocates in the 1950s and the early 1960s. It was images of police brutality, German Shepherds, and firehoses that caught the attention of John F. Kennedy.  A decade earlier, it was testimony that segregation damaged the personalities of black children that led to the decision in Brown v. Board. Yet, however effective politically, damage imagery—as sociologists call it—has its costs. For one, what appeals to white benevolence isn’t always what’s best for the group in question.  

The great civil rights strategist Bayard Rustin insisted that it was “the virtues praised by the black preacher” that “ultimately became the strengths of the civil rights movement.” More than anything else, it was the “perseverance and courage—characteristics extolled from the pulpit each Sunday” that helped give ordinary people the bravery and self-control to face “the clubs, the firehoses, and the dogs.”3 

***

News stories carrying images of ICE agents aggressively nabbing immigrants sometimes come with quotes from psychologists warning of the long-term psychological trauma the children of the deported are sure to face. Sympathetic media invariably paint the very immigrants who had the courage and resourcefulness to uproot their lives and take their chances in a foreign land as timid innocents.

Even the highest-ranking defender of immigrants, U.S. Senator Alex Padilla, shared his trauma in an emotional floor speech. He fought back tears as he recounted being handcuffed and dragged out of a press conference after he challenged the secretary of Homeland Security on ICE’s aggressive tactics.

***

There’s little doubt that victimhood carries persuasive moral authority in American politics. It’s an inevitable part of political strategy. But it cannot be allowed to diminish the dignity of the very people who need support. Sociologist Jerry Gafio Watts argued that victimhood is ultimately a “parasitic status,” entirely dependent on the compassion–virtue–of the distant observer.4

Fifty years ago, historian and sociologist Orlando Patterson warned that gains born of the politics of pity can wind up being Pyrrhic victories. “As long as one is in the position where one has to appeal to the moral sense and mercy of another person,” he wrote, “one remains, almost by definition, his moral inferior.”5 That’s because in a crisis like this, the moral superiority of the virtuous observers is elevated more than that of the victims themselves, which only reinforces the racial hierarchy all Latinos–not just the foreign born, and not just the undocumented–must contend with every day. 

In 1963, novelist Ralph Ellison responded to a white literary critic’s contention that since the “real” black experience was one of unrelenting suffering, the black writer must always be commensurately outraged. Taking offense, Ellison responded by referring to an African American tradition that “teaches to deflect racial provocation and to master and contain pain.” It is a tradition, he wrote, “which abhors as obscene any trading on one’s own anguish for gain or sympathy; which springs not from a desire to deny the harshness of existence but from a will to deal with it as men at their best have always done.”6 I promise you, immigrants of all backgrounds share that same ethos.

When Donald Trump or his allies demonize Latino and other immigrants, they paint them in exaggerated negative caricatures. Surely the defense of immigrants–undocumented or otherwise–should not resort to equally distorted depictions that ultimately serve to feed the moral superiority of the political/intellectual class.

Because in the end, if legislation is ever passed that will overcome America’s dishonest immigration policy–one that allows the U.S. to benefit from the work of people to whom it refuses to grant legal status–it will not be due to compassion.  It’ll be because both sides of the aisle finally transcend moral hyperbole and admit that America needs tough, driven, resilient people to keep its economy growing.

  1. Harold Meyerson, “Trump is Provoking L.A. to Fire Up His Base,” Financial Times, June 9, 2025. ↩︎
  2. Hannah Arendt, On Revolution (New York: Penguin, 1990), 75. ↩︎
  3. Bayard Rustin, Strategies for Freedom: The Changing Patterns of Black Protest (New York: Columbia University Press, 1976), 40. ↩︎
  4. Jerry Gafio Watts, Heroism and the Black Intellectual: Ralph Ellison, Politics, and Afro-American Intellectual Life (Chapel Hill: The University of North Carolina Press, 1994), 10. ↩︎
  5. Orlando Patterson, “The Moral Crisis of the Black American,” The Public Interest 32, (Summer 1973): 52. ↩︎
  6. Ralph Ellison, “The World and the Jug,” in Shadow and Act (New York: Vintage Books, 1972), 111. ↩︎

Don’t Let the Constitution’s Universalist Language Fool You.  It’s a Political, Not a Sacred Document. 

The Cloister of the Basílica de San Isidoro, León, Spain. (Photo by Gregory Rodriguez)

Madrid, Spain

You know how learning about ancient cultures can give you insights into the mysterious habits of contemporary humans?  Well, the same goes for learning about complex modern institutions.  Pondering their pre-modern origins can help bring their fundamental mechanics to light. 

It was in this spirit that I boarded a train to visit a medieval sandstone cloister in the northwestern Spanish city of León.  Believe it or not, I was hoping my quick trip to the Basílica de San Isidoro would teach me a few things, about, well, modern democracy.  


Americans tend to sanctify their Constitution in ways that obscure its real value and purpose. I suppose this sanctification is a way of symbolically embracing a country that lacks as firm an ethnic or cultural anchor as, say, any European nation. When it was ratified, America’s Constitution was indeed the opposite of an anchor. In 1787, Francis Hopkinson, the Pennsylvania lawyer who signed the Declaration of Independence and designed the American flag, called it a “roof” that united “the strength of 13 rafters.”1 In other words, the Framers created an overarching legal structure–a roof–long before a unifying cultural foundation developed on the ground. Similarly, historian Daniel Boorstin has written that Americans view the Constitution as an “exoskeleton,” something akin to a lobster’s shell that we filled out over time.2 Through the generations, it was the human interaction created by commerce and migration that forged the cultural ties that bound a heterogeneous people together into a nation.

The sanctification of the Constitution has also involved no small amount of ethnic chauvinism.  The blessings of American democracy, we’ve been taught, are inherited exclusively from England, where the very first written constitution, the Magna Carta, was issued in 1215 and the first parliament was called fifty years later.

By the mid-19th century, many white Americans were so convinced that they were the sole inheritors of the love of liberty and the habits of self-governance that they imagined it to be an almost racial trait that was passed down from one generation to the next. In January of 1845, Democratic Representative Alexander Duncan of Ohio told his colleagues that he thought “there seems to be something in our laws and institutions peculiarly adapted to our Anglo-Saxon-American race, under which they will thrive and prosper, but under which all others wilt and die.”3 In the early 20th century, a prominent Oxford historian proudly proclaimed that “parliamentary institutions” were “incomparably the greatest gift of the English people to the civilization of the world.”4

Now, I realize that this type of sanctification of the Constitution has had its purposes. Racial aspects aside, it fosters the reverence sometimes required to abide by its more absurd and outdated elements. But treating America’s legal framework as some sort of mystical tablet also obscures our understanding of its critical role as social contract and source of our rights. The other problem, of course, is that the myth is based on a false premise.

In 2009, in his book The Life and Death of Democracy, Australian political theorist John Keane “politely questioned”—in his words—“this English prejudice.”5 His research had led him to conclude that in 1188, a generation before the Magna Carta, Alfonso IX, the newly crowned seventeen-year-old monarch of the Kingdom of León, had convened Europe’s very first parliament, or cortes in Spanish, within the cloisters of León’s Romanesque Basílica de San Isidoro.

Of course, it had not been unusual for Europe’s kings to gather with lords and bishops.  But Alfonso did something entirely new for European royalty, which was to invite representatives from the towns. This was the first recorded gathering of all three estates—nobility, the Church, and burghers.  The most basic definition of a parliament is an assembly involving various social groups of the realm, including representatives of towns.

But why would a king whose power was said to be granted directly by God seek to hold discussions with townsmen? For one thing, he needed money to fight back the encroaching Muslim armies, and plenty of Leonese were unhappy with the imposition of new taxes. For another, Alfonso may have feared that an alliance between angry nobles and townsfolk might form against him. In any case, according to Keane, the king was determined “to defend and expand his kingdom, even if that meant making political compromises that might dilute his kingly powers.”6

Operating in the spirit of compromise, the king secured the backing of the three estates in exchange for his promise to “not wage war nor make peace or make any agreement without the counsel of bishops, nobles and good men,” which referred to leading citizens of the towns.7 In a series of what may have been up to 15 documents collectively called the Decreta, the king agreed that private property and personal domiciles were inviolable, and that justice would be upheld in a routine, predictable manner, including that any charges against a person must be backed by evidence and that detainees had the right to be defended by a third party.

What’s even more significant here, to me at least, is that the citizens of León did not secure new liberties out of some abstract reverence for rights. The social gains they made were the byproduct of a monarch’s inability to raise taxes, maintain peace in the realm, and otherwise rule his kingdom without the cooperation of the three estates.

As John Keane has eloquently put it, it was out of self-interest that Alfonso IX invented “a new mechanism for resolving disputes and striking bargains among interested parties who felt they had a common interest in reaching compromise.”8 The king’s baseline understanding of his kingdom, then, was not of a society requiring indivisible political community, but one comprised of competing and sometimes conflicting interests. To resolve inevitable conflicts, the Decreta contained an agreement that there would be future assemblies of the king involving representatives from the three estates. A regular parliament offered “the possibility of turning disagreements about reality into binding agreements in support of a common good.”9

Even more amazing is that the representatives of the towns who attended the cortes in the cloister of San Isidoro had been elected by the citizens of their towns. While it isn’t known by what method they were elected or exactly how many were present, historian Joseph F. O’Callaghan concluded that their “numbers in attendance must have been quite large.”10

The Reconquista placed pressures on other monarchs in the Iberian Peninsula.  As they sought to strip people and territory from Muslim control, Christian kings had to “compete with the more advanced Muslim kingdoms in the south for the favours of the merchants and farmers,” and thus were “prepared to respect their property rights and grant them … privileges.”11 In 1126, sixty-two years before the first cortes in León, King Alfonso I of Aragon granted a charter of liberties to “Christians whom I brought, with the help of God out of the power of the Saracens and led into the lands of the Christians. . . . Because you left your homes and your estates for the name of Christ and out of love for me and came with me to populate my lands, I grant you good customs throughout your realm.”12


It would, of course, be naive to draw too straight a line between the medieval origins of the parliament and the spread of modern written constitutions in the late 18th century. As representative assemblies—and the societies from which they emerged—became more complex, so too did the theories of governance and philosophical worldviews that came to animate western politics.

At the same time, however, as Spanish legal historian Aniceto Masferrer has argued, the enormous differences between the two eras notwithstanding, it is important to make connections between medieval documents like the Decreta and the emergence of liberal governance six centuries later. If nothing else, Alfonso IX’s savvy political bargaining shows “how medieval Europe started to be aware of the convenience of limiting political power through law.”13


By the time the Framers sat down in 1787 to hammer out the U.S. Constitution, the English House of Commons had long since invented the idea of popular sovereignty as a way to challenge monarchical power. The ideology was developed, as historian Edmund Morgan wrote, to “justify a government in which the authority of kings stood below that of the people.”14  Of course, shifting the locus of sovereignty from the king to the people was not actually designed to put power in the hands of the people, but rather in those of the members of parliament.  

For their part, America’s Federalists—who could better be described as nationalists—embraced the idea of popular sovereignty as a way to weaken the power of individual states. Why? Because legislative majorities in most states had passed debt relief laws that the Framers felt threatened the property rights of creditors. (The minority the Framers sought to protect, then, was the wealthy, a class to which most belonged.) The idea—put forth by James Madison—was that the authority granted to the new national government would rest on the power of “the people” at large rather than on the collective authority of the states themselves. In short, he had invented “a sovereign American people to overcome the sovereign states.”15

Popular sovereignty, of course, was as much a “fiction” as was the divine right of kings.16 Indeed, the way it was “publicly presented” in America, political scientist Stephen Holmes has written, bore “a striking resemblance to proclamations in which absolute monarchs [once declared] their sovereign will.” What this meant was that rather than being perceived as an exchange of promises “between classes or factions or territorial subunits,” the U.S. Constitution was portrayed as a charter that “‘we the people’ [gave] ourselves.”17 

Paradoxically perhaps, it was this fiction that led the Federalists to argue against the inclusion of a Bill of Rights to the Constitution. When Antifederalists, those who opposed the ratification of the new Constitution for fear that it would give the national government too much power, first demanded that the document include a list of protected rights, Federalists called the request a quaint throwback to the time when kings granted concessions to their subjects. 

If the government derived its power directly from the people, they argued, then what sense would it make to have the people make concessions to themselves? Because America’s constitution could not be considered an agreement between or among parties, “neither concession nor contract was possible because people and government were one and the same.”18

Conversely, since it was “We the People” who conveyed specific powers to the national government, Federalists could argue, as one did, that the Constitution itself was “nothing more than a bill of rights—a declaration of the people in what manner they choose to be governed.”19

Evidently offended by the idea that a convention had been convened to hash out a mere compact, North Carolina’s James Iredell, a leading Federalist who would become one of the first justices of the U.S. Supreme Court, proclaimed that America’s government was “founded on much nobler principles.”20

Fortunately for Americans, the Antifederalists were not swayed by those nobler principles and did not give up on the idea that the Constitution was, as one delegate at a state ratifying convention put it, “a compact, agreement, covenant, bargain” that required the government to put concessions in writing.21 James Madison, of course, ultimately acceded to these demands out of political expediency. While he was himself a believer in rights, he nonetheless saw the addition of constitutional guarantees as a political rather than an ideological act. His reason for drafting the Bill of Rights, historian Pauline Maier has concluded, was “less to secure rights” than to subdue opposition to the Constitution.22 If, as the Framers believed, one of the primary goals of the new Constitution was to protect the property rights of the wealthy minority, then adding amendments to safeguard such popular rights as speech, religion, press, and assembly was a worthwhile compromise.

Still, the Federalists’ fiction of popular sovereignty—and a unified American people—lived on in the way Americans think about their country. Each school day, American schoolchildren pledge allegiance to their “indivisible” nation. Even when we know that ugly, divisive presidential elections can be won by mere percentage points, we continue to refer reverently to the voice of “the people.” When we ask a restaurant waiter what’s good on the menu, they’re likely to tell us what sells most, as if majority opinion were the voice of good taste and wisdom.

Where the medieval “Ständestaat”—a state of estates—was thought to be divisible into three separate groups, contemporary Americans tend to see the primary division in our national political community as being between the few and the many, the majority versus the minority, which they sometimes translate as the strong and the weak.23 So even as we herald the wisdom of the majority, we hail the genius of a Constitution that protects the minority. Indeed, Columbia University political scientist Giovanni Sartori once argued that the only reason to believe in constitutions at all is if “we think that somebody needs protection from somebody else.”24

This insight helps explain why Americans tend to think of rights in moral terms, as sacred protections that are heroically demanded and/or benevolently bestowed. This weak/strong dynamic injects no small amount of paternalism into a political process we otherwise think of in terms of bargaining, sausage-making, and horse-trading. Rather than being perceived as a political compromise made to maintain social tranquility, the granting of rights is often portrayed as if it were a morality tale. Which brings me back to Alfonso IX, who clearly saw the granting of rights as a necessary element of a mutually beneficial exchange.

In his 2012 essay, “Constitutions and Constitutionalism,” NYU’s Stephen Holmes urged Americans to start thinking about rights more through the lens of realism than idealism. Claiming that his observations should be interpreted as instructive rather than cynical, he argued that “democratic constitutions emerge and survive” when society’s “most powerful social forces find that they can promote their own interests most effectively by simultaneously promoting the interests of, and sharing political influence with, less powerful but not utterly powerless swaths of the population.”25

Why? Because only “when the powerful discover the advantages they can reap from making their own behavior predictable” do they “voluntarily submit to constitutional constraints.” Put even more bluntly, when non-elites bring incentives to the bargaining table, “elites respond opportunistically by granting legal protections and participatory rights in exchange for cooperation indispensable to elite projects.”26

Holmes wasn’t the first theorist to make this argument. In 1919, Max Weber, one of the founders of modern sociology, argued that much of modern western democracy was itself a product of national elites’ need for disciplined soldiers to fight wars. It was military necessity, then, that compelled them “to secure the cooperation of the non-aristocratic masses and hence put arms, and along with arms political power, into their hands.”27 While Weber was not referring to the U.S. Constitution, he was nonetheless recognizing the existence of political bargaining as the essence of constitutionalism in general.

Historian Linda Colley concurs with—and expands on—Weber in her remarkable 2021 book, The Gun, the Ship, and the Pen: Warfare, Constitutions, and the Making of the Modern World. The rash of new constitutions in the 18th century was, in part, a product of the rise in the number of bloody and expensive imperial, transcontinental wars. The new countries that emerged from this warfare “progressively elected to experiment with written constitutions as a means to reorder government, mark out and lay claim to contested boundaries” as well as to “legitimize their systems of government anew.” These new constitutions—including that of the United States—helped to “rally wider support and justify expanding fiscal and manpower demands.” These documents sometimes “functioned in effect and in part as bargains on paper. Male inhabitants of a state might be offered certain rights, including admission to the franchise, as a quid pro quo for accepting higher taxes and/or military conscription.”28

This description is not pretty. It’s not mystical. Nor does it pretend that all parties to the negotiation are equal. But it does provide a framework with which we can think about rights in terms of compromise and mutual benefit rather than merely in terms of sacred principles and abstractions. This doesn’t mean that the Framers didn’t infuse the document with a desire for reform or utopian hopes, just that the harsh realities of geopolitics—particularly threats from Britain to the north, Spain on the Mississippi, and Native Americans throughout the inland frontier—were never far from their minds. The Constitution that was drafted in Philadelphia during the summer of 1787, writes Colley, “was often approached at the time less as a ‘blueprint of a liberal democratic society,’ . . . than as a grimly necessary plan for a more effective and defendable union.”29


So what did I learn in León? I learned that, from the very beginning, constitutions have been practical political documents that formalize the results of bargaining between competing sectors of a given political community; that they are amoral rule books that set the boundaries of future debate, establish the obligations each sector owes to the other, and constrain the actions of members of signatory groups long after the signers of the parchment are dead. Most importantly, I learned that if they are to survive, all parties must continue to persuade the others that they can more effectively get what they want if they agree to support each other. And finally, that, while inspiring, America’s universalist language of rights can be deceiving; that our civil rights—liberties derived from membership in a particular polity—are a far more powerful source of freedom than human rights—liberties that all persons should theoretically enjoy.

  1. Paul M. Zall, ed., Comical Spirit of Seventy-Six: The Humor of Francis Hopkinson (San Marino: The Huntington Library, 1976), 191. ↩︎
  2. Daniel J. Boorstin, The Genius of American Politics (Chicago: The University of Chicago Press, 1953), 191. ↩︎
  3. Reginald Horsman, Race and Manifest Destiny: The Origins of American Racial Anglo-Saxonism (Cambridge: Harvard University Press, 1981), 227. ↩︎
  4. A.F. Pollard, The Evolution of Parliament (London: Longmans, Green & Company, 1920), 3. ↩︎
  5. John Keane, “The Future of Parliaments,” (keynote address, EU Global Project to Strengthen the Capacity of Parliaments, León, Spain, June 30, 2023). ↩︎
  6. John Keane, The Shortest History of Democracy: 4,000 Years of Self-Government—A Retelling for Our Times (New York: The Experiment, 2022), 79. ↩︎
  7. María Esther Seijas Villadangos, “Origin of Parliamentarism: An Historical Review of its Crisis: León (Spain) as Cradle of Parliamentarism,” Revista Acadêmica da Faculdade de Direito do Recife 88, no. 2 (July/December 2016): 22. ↩︎
  8. John Keane, The Life and Death of Democracy (London: Simon & Schuster, 2009), 176. ↩︎
  9. Keane, “The Future of Parliaments.” ↩︎
  10. Joseph F. O’Callaghan, “The Beginnings of the Cortes of León-Castile,” The American Historical Review 74, no. 5 (June 1969): 1514. ↩︎
  11. Jan Luiten van Zanden, Eltjo Buringh, and Maarten Bosker, “The Rise and Decline of European Parliaments, 1188-1789,” The Economic History Review 65, no. 3 (August 2012): 839. ↩︎
  12. Joseph F. O’Callaghan, History of Medieval Spain (Ithaca, NY: Cornell University Press, 1975), 285. ↩︎
  13. Aniceto Masferrer, “The Spanish Origins of Limiting Royal Power in the Medieval Western World: The Córtes of León and Their Decreta (1188),” in Golden Bulls and Chartas: European Medieval Documents of Liberties, ed. Elemér Balogh (Budapest: Central European Academic Publishing, 2023), 31. ↩︎
  14. Edmund S. Morgan, Inventing the People: The Rise of Popular Sovereignty in England and America (New York: W.W. Norton, 1989), 56. ↩︎
  15. Morgan, Inventing the People, 267. ↩︎
  16. Morgan, Inventing the People, 13. ↩︎
  17. Stephen Holmes, “Precommitment and the Paradox of Democracy,” in Passions and Constraint: On the Theory of Liberal Democracy (Chicago: The University of Chicago Press, 1995), 146. ↩︎
  18. Morgan, Inventing the People, 283. ↩︎
  19. Morgan, Inventing the People, 283. ↩︎
  20. Gordon S. Wood, The Creation of the American Republic, 1776-1787 (Chapel Hill: The University of North Carolina Press, 1969), 541-542. ↩︎
  21. Wood, The Creation of the American Republic, 541. ↩︎
  22. Pauline Maier, Ratification: The People Debate the Constitution, 1787-1788 (New York: Simon & Schuster, 2010), 446. ↩︎
  23. Daniel Chirot, “The Rise of the West,” American Sociological Review 50, no. 2 (April 1985): 185. ↩︎
  24. Giovanni Sartori, “Constitutionalism: A Preliminary Discussion,” The American Political Science Review 56, no. 4 (December 1962): 855. ↩︎
  25. Stephen Holmes, “Constitutions and Constitutionalism,” in The Oxford Handbook of Comparative Constitutional Law, ed. Michel Rosenfeld and András Sajó (Oxford, UK: Oxford University Press, 2012), 215. ↩︎
  26. Holmes, “Constitutions and Constitutionalism,” 214-215. ↩︎
  27. Max Weber, General Economic History, trans. Frank H. Knight (Glencoe, IL: The Free Press, 1950), 325. ↩︎
  28. Linda Colley, The Gun, the Ship, and the Pen: Warfare, Constitutions, and the Making of the Modern World (New York: Liveright Publishing Corporation, 2021), 7. ↩︎
  29. Colley, The Gun, the Ship, and the Pen, 121. ↩︎

The Marriage of Wokeness and Cash

Why America Has Seen an Explosion in Public Recrimination

(Image by Jørgen Carling)

At its best, Wokeness is an awareness of how race and gender bias can produce societal inequalities. At its worst, it’s a racket in which upper-middle-class college graduates wield victimhood status in a bid for financial gain or career advancement. 

When the Civil Rights Act was passed in 1964, victims of workplace discrimination had recourse to only a few remedies, such as back pay, restoration of a promotion or benefits, or job reinstatement. A few years later, newly minted federal Affirmative Action programs held that members of minority groups that had a history of facing discrimination could now benefit from preferential hiring schemes. In the first instance, an employee had to prove discrimination had occurred; in the second, a minority job applicant was assumed to need protection from discrimination in any hiring process.

While remedies for workplace discrimination were not controversial because they were focused on making the victim whole, preferential hiring—as well as preferential college admissions—was problematic because it meant that a non-minority’s application could be thrown out to make room for the minority applicant. Not surprisingly, the issue reached the Supreme Court, and in 1978 the justices reached a convoluted split decision that upheld Affirmative Action, made racial quotas illegal, said race could nonetheless be considered in college admissions, and failed to determine how applicants would have to prove past discrimination in order to receive protected or preferential status.

A little more than a decade later, Congress upped the ante by passing the Civil Rights Act of 1991, which for the first time explicitly allowed employees to seek compensatory and punitive damages in both racial and gender discrimination cases. In essence, this updating of the Civil Rights Act transformed the anti-discrimination statute from one whose remedy was equitable relief—restoring the complainant’s employment situation to what it had been before the offending act occurred—to something more like a tort—a wrongful or negligent act—which allowed for trials in which juries could order parties they found liable to cough up damages.

In short, the CRA of 1991 fundamentally changed the way discrimination was to be remedied. Before 1991, anti-discrimination statutes were based on a traditional labor model that sought conciliation in employer-employee relations. After 1991, employment disputes would more and more be resolved through litigation—or the threat of it—involving compensation for the victim and punishment for the offender. Whereas in the 1960s civil rights advocates had wanted anti-discrimination enforcement to revolve around a New Deal-style government authority, over time they came to embrace the benefits of private enforcement in which every complainant was a potential plaintiff.

Now that large sums of money were potentially involved, most observers anticipated an increase in discrimination litigation. And, of course, that came to pass. Indeed, the new law incentivized the filing of complaints and changed how Americans approached their interactions with one another in the workplace and beyond. In the same way that the explosion in personal injury lawsuits turned every fall in a shopping center into a potential lawsuit, making discrimination into a tort encouraged Americans to see every slight and professional setback as a potential source of compensation.

Just as importantly, the new law made discrimination cases, which had once been considered too much work for too little payoff, more attractive to attorneys. To sweeten the deal, the law now authorized plaintiffs to recover attorney’s fees if they won their cases. One could say that if Martin Luther King, Jr., was among the inspirations for the passage of the Civil Rights Act of 1964, then the flamboyant, scandal-plagued San Francisco attorney Melvin Belli, the so-called “King of Torts,” was the spiritual father of the Civil Rights Act of 1991. More than anyone in America, Belli was responsible for the successful post-war push by trial lawyers to increase damages awards in personal injury cases. Thus, the Civil Rights Act of 1991 married the sacrifices of the 1960s struggle for social justice with the ethos of an era best symbolized by a scene in the 1987 movie Wall Street in which Michael Douglas—playing the fictional corporate raider Gordon Gekko—famously proclaimed that “greed is good.”

The term “woke,” which became a watchword among African American activists as early as the first half of the 20th century, was catapulted into the mainstream in 2014 in the wake of the shooting of Michael Brown in Ferguson, Missouri. While initially associated with the Black Lives Matter movement, it was quickly adopted by white progressives committed to fighting inequality. The term soon came to encompass a constellation of ideas, slogans, and programs that sought to liberate a growing number of marginalized groups from a structurally biased system.

There was nothing particularly revolutionary about these activists’ stated concerns.  What was distinctive was both the tone of their rhetoric and their favored solutions for addressing disparities. Whereas the civil rights movement of the early 1960s inspired some of the nation’s greatest legislative successes through a spirit of forgiveness and reconciliation, the rhetoric of Wokeness was full of resentment, vengeance, and demands for immediate reparation.

While the definition of discrimination had been expanding steadily since 1964, suddenly Americans were being told that there were a whole lot more ways to offend and oppress their fellow Americans than they had ever imagined. Even as most Americans thought that the lives of minorities and women had improved significantly since the 1960s, the bar for what constituted—and the standard of evidence needed to prove—discrimination was being lowered. University administrators and company human resources experts now considered American society so fundamentally prejudiced and hostile that they introduced new ways of protecting people—such as trigger warnings and safe spaces—from even the slightest faux pas, which were now inelegantly called microaggressions.

Meantime, those who failed to comply with the growing list of social infractions risked being “called out,” “cancelled,” fired, sued, and otherwise painted with a scarlet letter. To a sober observer, the sheer number and frequency of these autos-da-fé seemed more than a little exaggerated. There’s no denying that discrimination occurs in America, but why all of a sudden was there an explosion of accusations and public recriminations, particularly in universities and among the educated upper middle classes, people who are among the most privileged humans on the planet? And why did so many people who were not directly involved in the incidents join in the angry choruses while others remained silent in the face of what were objectively disturbing spectacles?

Greg Lukianoff and Jonathan Haidt, the authors of The Coddling of the American Mind, have called the driving impulse of this era of discontent “vindictive protectiveness,” the practice of publicly shaming and punishing those accused of having said or done anything to harm a member of a protected group. Because even casual defenders of the accused are not immune to these mob attacks, those uncomfortable with the idea of public stoning tend to keep their objections to themselves. This behavior, Lukianoff and Haidt argue, created “a culture in which everyone must think twice before speaking up, lest they face charges of insensitivity, aggression, or worse.”

Vindictive protectiveness, Lukianoff and Haidt argue, arose from a style of fearful and overprotective parenting that educated middle- and upper-class parents began to practice in the 1980s and ’90s in an effort to give their children a competitive edge in life. Wanting only the best, these parents sought to cultivate their children’s talents while erasing all potential sources of risk and adversity in their environments. Overscheduled, oversupervised, and left with precious little time to play and explore on their own, these children eventually arrived on university campuses believing that the world was dangerous, that bad people should be removed from their presence, and that the institutions around them should protect them in the way their parents had.

The universities, of course, obliged. As higher education has become big business, students have been transformed into customers, and we all know that the customer is always right. Furthermore, Lukianoff and Haidt suggest, the rise of vindictive protectiveness may also “be related to recent changes in the interpretation of federal antidiscrimination statutes.” Not only do university administrators seek to avoid lawsuits from students, they also want to avert any investigations by the Department of Justice into their civil rights compliance. Ironically, that’s why they developed “bias incident reporting” systems that allow students to report anonymously on anyone they feel has caused them or anyone else to experience any type of bias. At Cornell University, for example, a bias incident is something done or said “that one could reasonably and prudently conclude is motivated, in whole or in part, by the alleged offender’s bias against an actual or perceived aspect of diversity, including, but not limited to, age, ancestry or ethnicity, color, creed, disability, gender, gender identity or expression, height, immigration or citizenship status, marital status, national origin, race, religion, religious practice, sexual orientation, socioeconomic status, or weight.”

In short, the desire to protect students, combined with the need to remain in compliance with civil rights statutes and regulations, turned campuses into places where students are encouraged to report on professors, staff, subcontractors, and, of course, one another. In response to faculty pushback against the anonymous reporting system at Stanford, a university spokesperson insisted that the “process aims to promote a climate of respect.” Still, there is a growing realization that such reporting systems can be easily abused and can chill free speech. Nonetheless, as students graduate from college, they take the expectations these systems foster with them into the labor force.

The commentariat is telling us that Trump’s victory spells the end of Woke. But too many election post-mortems treat Wokeness as a purely political phenomenon. Yes, it employs left-wing ideologies that treat identity and victimization as sources of resistance and power. But Wokeness would never have become so pernicious outside the university had there not been the temptation (and fear) of financial gain (and loss). In addition to money, cancellation also holds out the promise of professional advancement. Campaigning to cancel one’s colleague or boss—and even helping others cancel theirs—is also a way for young people to remove their supervisors, clear the field of competition, and move up the ranks.

Ask any cynical political hack how to decode the power and drive of any politicized trend and they’re likely to tell you to follow the money. Wokeness is no different. After you find out who has profited from this phenomenon, you’ll come face to face with the absurd fact that one of the most diverse countries in the world has chosen to encourage private litigation and the threat of financial damages to curb discrimination and promote a more just and cooperative society. What could possibly go wrong?

Gregory Rodriguez is the author of Whiteness: An American Tragedy and Other Essays and Mongrels, Bastards, Orphans, and Vagabonds: Mexican Immigration and the Future of Race in America. He is working on a book on the rise and fall of rights-based liberalism.

The Minority Voters Who Exposed the Politics of Benevolence

(Photo by Vetustense Photorogue)

It took only a little more than a week for Democrats to go from crying foul over a pro-Trump comedian’s insulting remarks about Puerto Rico to hurling insults at the minority voters who had shifted to the right on Election Day.

Other than old-fashioned revenge, what much of the invective had in common was a desire to see how poorly those ungrateful minorities would fare without the protection of newly jilted liberals.

“I guess those Latinos [who voted Republican] will enjoy watching the Trump Presidency from wherever it is he deports them to,” wrote one Air Force veteran who calls herself a Democrat.  “Fu** Latinos and Arabs,” wrote another man who publicly identified as LGBTQ and pro-Black Lives Matter.  “There I said it. Hope you all get deported and banned.” Even a top writer for “Mother Jones,” a leading liberal magazine, joined the fray, writing, “Perhaps massive deportations will affect how they see Trump.”

Never mind the fact that voters are, by definition, U.S. citizens and therefore not subject to deportation. What matters is the source of this bitterness and what it tells us about the state of contemporary American liberalism. 

If asked, a political scientist would tell you that politics in a democracy is simply the struggle among competing interests. And they’d be right. Group A is likely to have different interests than Group B and, since neither group is powerful enough to grab hold of the levers of government by itself, each will weigh whether joining forces with Group C or D or E would likely get them whatever they’re after.  The groups don’t all have the same priorities. Indeed, some of their issues may be in direct conflict. But the hope is that all groups will get at least some of what they want from the party they support.

The currencies any given group can bring to a political coalition are many. Some groups bring a lot of money—cash from large donors helps to get the word out; another group may have privileged access to media sources; another may have a network of political organizations that can help push people to the polls; and some groups have lots of actual voters who can punch cards or tap touch screens at campaign’s end. Put the resources of the coalition’s groups together and maybe they’ll carry each other over the finish line.

But particularly since the 1960s, when a combination of the civil rights movement and the New Left injected a stronger current of morality into modern politics, another, more subtle cultural currency came into play in political alliances: the politics of benevolence. While politics in America has always involved some level of pretense that high-minded principles trumped material interests, of the two major parties the Democrats became more convinced that politics was, first and foremost, the collective expression of virtue.

Talk to a Democrat from the Westside of Los Angeles and they’re more likely to mention their party’s generous social policies aimed at the less fortunate than its approaches to the economic sector from which they earn a living.  They may even explicitly claim that they vote against their own interests. But that’s not true.

Let’s say our hypothetical benevolent voter belongs to Group A, maybe he or she works or invests in the tech sector.  Group A’s political alliance with the less fortunate members of Group H or J can bring their chosen candidate more voters, because let’s face it, Group A is a smaller demographic group than Groups H and J. 

But such an alliance can also supply members of Group A with a sense of moral satisfaction. Because multi-group coalitions are inevitably hierarchical, members of the more privileged group can feel that they’ve protected the interests of groups who reside much lower than they do on the totem pole. And in return, whether they admit it or not, members of Group A often expect gratitude and party loyalty from members of Groups H and J.

But ironically, when people believe their political power is derived from their benevolence, it behooves them to maintain the social hierarchy from which their sense of righteousness arose.  When sufficient numbers of Groups H and J abandon their alliance with Group A and join a new coalition, they not only undermine the privileged group’s belief in its own benevolence, they threaten its political power. In other words, if enough members of Group H and J abandon the coalition, Group A doesn’t get its own interests met. 

Thus, the vitriol being directed at apostate minority voters is a rearguard effort to reassert hierarchy in a faltering coalition, to bully Groups H and J back into their place at the bottom of the totem pole. In addition to the revenge posts on X, more than a few analysts have suggested—without any evidence—that minorities who voted for Trump did so purely out of animus towards women or African Americans. All that proves is that while terms like racist or misogynist may have come into common use as part of a good faith effort to reform social attitudes, today—with their meanings now stretched beyond recognition—they are just as likely to be used to impose social control over recalcitrant groups and individuals.

One thing is for sure, however: the nasty aftermath of this ugly election has already proven that there were always interests lurking beneath the Democrats’ politics of benevolence.

Hillbilly Elegy as Tragedy

Gnadenhutten Park & Museum, Gnadenhutten, Ohio. (Photo by Gregory Rodriguez)

In 2016, before he was elected to the United States Senate, the Republican vice-presidential nominee, J.D. Vance, published a memoir exploring what he thought was wrong with working-class white culture. On the one hand, Hillbilly Elegy is a tale of economic instability, addiction, and cultural decline. On the other, it’s a classic story of American upward mobility wherein Vance shakes off his people’s pathologies in order to climb the social ladder. It’s a story that’s been told countless times by Americans who’ve climbed their way out of ghettos and barrios. Some authors blame the system. Others focus more on the self-destructive behaviors of those caught at the bottom. Vance tended toward the latter, so much so that The New York Times called his book “tough love” while other critics accused him of “blaming the victim.” Meantime, more than one left-wing reviewer resented the fact that Vance painted a segment of the white population as a pathological minority. They evidently thought that he had crept onto their turf.

I read Hillbilly Elegy as part of my research into white people. While his memoir didn’t particularly impress me, I appreciated Vance’s choice to look at his Appalachian roots through an ethnic, rather than a racial, lens. His book put a human face on Trump’s politics of white grievance. The primary difference was that Vance emphasized the need for cultural renewal, while Trump is always hammering away at who is to blame.

For a generation now, newspapers have been beating the drums of demographic change as if they were looking forward to the day when whites become a minority. And here we are, if not numerically, then culturally. Frankly, I was already tired of the politics of grievance, but now we have a new minority eager to play. But, alas, grievance is the language America speaks, and whoever thinks violence is foreign to any aspect of life in America has never actually been to America.

Three years ago this week, I was on a research trip in Vance’s home state of Ohio when I stumbled on a sign marking the birthplace of the first white child in the state. The discovery was all the more unsettling given that I was in Gnadenhutten, where, in 1782, 160 Pennsylvania militiamen massacred 96 pacifist Christian Indians.

Vance called his bestseller an elegy. I called my book of essays on white people a tragedy.  It’s time to take white anger seriously or it will burn us all up. 

You can find Whiteness: An American Tragedy on Amazon. 

When Satire is Too on the Nose

I finally saw American Fiction tonight. It is a mildly amusing, and all too accurate, satire about a black fiction writer who, in a fit of pique, submits to an intellectual marketplace that puts a premium on minstrelsy by writing a novel that trades in racial stereotypes. The movie ridicules the liberal white reading public for thinking “they want the truth” but really want to “feel absolved.” So trapped in their virtuous imaginations, so thoroughly cleansed of sin, these readers look to minorities to provide them with “authentic” and “raw” stories to make them feel alive, or at least a little unsafe. And so the marketplace demands stories that draw on the ancient trope of the savage, both noble and not-so-noble, victim and victimizer.  Plus, these days, it’s just so important to “listen to black voices” and “center diversity,” even if the desired product is mere “trauma porn.”

The movie mocks the publishing world for indulging in racial fads—or “reckonings,” as the journalists call them—in an effort to remain current and to not get canceled. For reasons I suspect are calculated, the movie holds its fire when it comes to minority writers who play this demeaning game for the accolades and compensation. I mean, they, too, have to get paid.

The bigger message of American Fiction, however, may be that fighting against white expectations can sometimes twist your own identity into knots.  Unfortunately, there’s no resolution in the end, which was disappointing.  Maybe the director saw the problem so clearly that he just figured it was insurmountable.  Or maybe, like his lead character, he was trying too hard not to give us the ending he feared his white audience wanted.  Still, the movie is a worthwhile critique of the freshly inclusive creative marketplace that needs members of every group to play their assigned roles now more than ever. 

Gentlemen, Include Me Out

(Photo by Quinn Dombrowski)

July 4 is the commemoration of the victory of the colonial periphery over the imperial center, not in an effort to reform or reinvent the empire, but to leave it, and start a new one. 

In these contentious times, when Americans are at loggerheads over control of government and culture, it might be helpful to remember that sometimes—not always— independence is the answer.

Independence means freeing yourself from the clutches of decadent institutions. It means getting a chance to put your heads together to establish new ones, and mustering all your courage and creativity to forge new destinies on your own terms. Oh, and in the spirit of freedom, it also means letting others do the same.

On this Fourth of July, I wish for all my compatriots a renewed belief in independence, in breaking away, in starting anew. 

Today, I will lift a hot dog in all our honor and recall the quintessentially American wisdom of Samuel Goldwyn when he said, “Gentlemen, include me out.” 

Happy Fourth. 
