The other day, I stumbled across an interesting essay in Christianity Today about what the collapse of membership in evangelical churches might mean for politics in the American South. The upshot was that it might not mean as much as you’d think. But what was most interesting in the analysis was how the author, Daniel K. Williams, an historian who studies the intersection of religion and politics, challenged the prevailing notion that secularization—or, more accurately, dechurching—transforms people into hyper-rational high modernists. Sure, there are a fair number of intellectual types who reject the religion they’d formerly practiced and become strict devotees of the Enlightenment, but most people who become alienated from their faiths don’t reinvent their worldviews from top to bottom.
I usually cringe when I read anything about religion in the non-religious American press. More often than not, journalists treat faith as if it were first and foremost a set of “beliefs,” not appreciating that the average churchgoer generally isn’t willing to throw down over the fine points of theology and dogma. When citizens of modern liberal democracies choose to worship in formal religious venues, it’s because doing so provides them with a source of solace and wisdom when confronting the trials and tribulations of life and helps orient them toward answers to the most difficult questions, particularly those surrounding death and the hell that is other people. Of course, the fine points of theology–and liturgy–are not insignificant. They’re what make someone feel part of one denomination as opposed to another. But sociologically and politically speaking, what’s most important in any religion is the Weltanschauung, the comprehensive conception of the world and the place of humans within it, that it provides followers. Any given theology serves to undergird this overall understanding of the world, notions about the meaning of life and what attitudes and behaviors best help you survive it. This general “moral orientation,” as Williams calls it, can live on even after people leave their churches, “even if it survives only in a distorted form.”
This explains why the political views of lapsed Catholics in the Northeast are still generally liberal. They still retain the “theology of communal beliefs” of the Church they left behind. In the same way, Williams argues, lapsed Southern evangelicals are not likely to suddenly become political liberals but instead will retain the “individualistic moralism” that defines–and even predates–evangelicalism in the South.
At this point, then, we’re not talking about articulated beliefs in the supernatural but more implicit assumptions about the nature of reality. An unchurched American who was raised by Baptists from Oklahoma is likely to have different assumptions about what humans owe one another or how they should generally behave than does a child of lapsed Catholics–or even mainline Protestants–from Texas. The worldview that may have once been instilled by religion becomes a more secular lens through which an individual views his fellow humans.
After mining survey data to compare the political views of churched and unchurched evangelicals, Williams hits pay dirt. The biggest contrast between the two groups came in the “area of personal trust in other people.” When asked, “Do you think most people would try to take advantage of you if they got a chance or would they try to be fair?” 54% of white Protestant Southerners who attended church no more than once a year said that most people would try to take advantage of them. The response to the same question by Southern Protestants who attended church every week was almost the opposite: fully 62% said that most people would “try to be fair” and not take advantage of them.
Similarly, when asked “Would you say that most of the time people try to be helpful or that they are mostly just looking out for themselves?”, 58% of the once-a-year churchgoers chose the cynical answer. Again, the responses of weekly churchgoers were almost the opposite: 57% said that most of the time people “try to be helpful.” Stripped of any religiously inspired notions of divine or human grace, unchurched evangelicals were left with what Williams calls “a deeply suspicious individualism.”
This makes me wonder whether West Coast-variety secular progressivism—with its decidedly elevated focus on racial and gender discrimination—also elicits a similar kind of mistrust in people. Neo-Civil Rights social campaigns highlighting the need to take care in cross-racial or cross-gender relationships can certainly make people properly conscientious about how they treat others who are unlike them. However, it’s conceivable that they can also instill mistrust. Particularly in an era in which minorities and women are encouraged to complain about any real or perceived mistreatment, I wonder if such a social climate could also make people more likely to assume—to paraphrase the aforementioned survey question—that most people unlike them would try to take advantage of them rather than be fair. It reminds me of the scene in Annie Hall when Woody Allen’s character calls a television executive an antisemite because, Allen insists, the exec asked him, “Jew eat lunch?” rather than “Did you eat lunch?” Allen’s character’s paranoia, of course, has clear historical origins. And the scene is poignant in addition to being funny, because even if this executive wasn’t an antisemite, others might well be. And when do you trust people, and when is it right to assume the worst? In any case, it’s fair to ask whether secular progressivism is spreading a “deeply suspicious communalism,” not entirely unlike the cynicism Williams found metastasizing in the South.
***
I’ve been reading a lot about the ideological mishmash that animated America’s founders, namely liberalism, republicanism, and the strains of Christianity and Deism that were clearly part of the mix. Despite historians’ best efforts to cast the American Revolution as some sort of intellectually driven movement, there was no single ideological through line that united the patriots or the framers. The Constitutional Convention itself was more a series of hard-won compromises over competing interests than it was an intellectual debate about what constituted the best form of government.
I love that the delegates are called Framers. I realize it’s because they framed—or shaped—the Constitution, but what fascinates me is how much the Constitution itself was a frame for a nation that had not yet been born in any substantive sense. Francis Hopkinson, a signer of the Declaration of Independence and a member of the Continental Congress, called the Constitution a “new roof” that unified citizens of diverse and divided states. It’s significant that he saw it as a roof rather than a foundation. Nonetheless, that roof was, according to the late Princeton historian John M. Murrin, “an ingenious contrivance” that gave a fragile, embryonic American national identity a generation or two for interstate economic links to begin to tie together a real national community.
More than a half century later, Abraham Lincoln characterized the Constitution, which was a largely amoral set of procedural rules, as a picture frame designed to enhance the beauty of a work of art. Lincoln believed that the Declaration of Independence, which stated that all men are created equal, was the “primary cause of our great prosperity.” It was the “great fundamental principle upon which our free institutions rest.” As such, the Constitution could only be understood in tandem with the Declaration. Borrowing the biblical image of an “apple of gold in a picture of silver,” he compared the Constitution to “the picture of silver, subsequently framed around” the Declaration. “The picture,” he wrote, “was made, not to conceal, or destroy the apple; but to adorn, and preserve it. The picture was made for the apple—the apple not for the picture.”
Of course, the Declaration of Independence was a piece of war propaganda written to justify an act of secession and garner both domestic and international support. But that’s another story altogether. The point here is that the brilliant men who wrote the Constitution created a federal government but not a nation or a shared culture, both of which had to emerge through ongoing cooperation and conflict among the country’s inhabitants themselves.
I say inhabitants, because citizenship—along with its attendant rights—was defined legally and did not include all inhabitants. We all know the great trope of American civic life, that the Revolution is a work in progress, that the circle of citizenship widens through struggle over time. But citizenship is not the same as culture, and sharing a sense of national fate is not the same as sharing a worldview about the meaning and purpose of life.
That’s one of the many weaknesses of the current focus on social inclusion. Not only does it assume that there is a single culture into which everyone wants to be included; it also assumes that this single culture was made by someone else before you were invited to participate. At the very least, you’d think that in the name of democracy, all Americans should be encouraged to ask what exactly they’re being asked to include themselves in and whether there’s an escape hatch. Or at least an edit button.
More than 230 years after the Constitution was ratified, America still hasn’t congealed into a single culture. And while that fact can, at times, be a recipe for friction—political and otherwise—it isn’t necessarily a bad thing. Indeed, if you really believe in diversity, as so many Americans claim to these days, you should hope that the inhabitants of this enormous nation never allow themselves to be compelled to live in one single homogeneous culture. It seems to me that the only way to lower the unhealthy levels of social mistrust in contemporary America is not to try to shove everybody into one box but to learn to accept that some people will never see the world the way you do.