Ignorance and the Decline of Responsible Government in America


Since at least the time of Julius Caesar’s coup in 49 B.C., man has surrendered his freedom to various strongmen who have offered to compromise liberty in exchange for swift political results.  This tradeoff, a lapse in that “Spirit of Liberty,” which Edmund Burke felt was the “greatest security of the people,” is the despot’s quickest and most ordinary way to power.

As Burke wrote in his 1791 “Letter to a Member of the National Assembly”:

Men are qualified for civil liberty in exact proportion to their disposition to put moral chains upon their own appetites … in proportion as their soundness and sobriety of understanding is above their vanity and presumption – in proportion as they are more disposed to listen to the counsels of the wise and good, in preference to the flattery of knaves. Society cannot exist, unless a controlling power upon will and appetite be placed somewhere; and the less of it there is within, the more there must be without. It is ordained in the eternal constitution of things, that men of intemperate minds cannot be free.

Without caution and self-control, and a reverence for the principles of the law, true liberty is impossible – even, and perhaps especially, when the pursuit of greater liberty is the goal of those needlessly spurning the law.  Society’s need for self-control comes to be supplied by dictators and strongmen in proportion as the citizenry’s capacity for self-government declines.

It is ignorance, the wellspring of tyranny, which most often renders a population unable to practice good government and unable to resist demagoguery.  When “soundness and sobriety of understanding” give way in the face of “vanity and presumption,” social rot inevitably ensues.  And, unfortunately, the American body politic is, according to years of polling and studies, exceptionally ignorant on the topics of geography, politics, and history.

In 2011, the World Values Survey reported that 44% of non-college graduates and 28% of college graduates in the United States supported “having a strong leader who doesn’t have to bother with congress and elections” – a sign, if nothing else, of widespread exasperation with the conventions of Western representative government.  Yet America itself was conceived, as John Adams declared in the 1780 Constitution of the Commonwealth of Massachusetts, as “a government of laws and not of men,” and therefore the precedence of principles before personalities is at the very heart of the American experiment – as it is, ultimately, at the heart of all good government.  The results of this survey point to an unfamiliarity with what can happen when representative societies entirely reject what Max Weber described in 1922 as “rational-legal legitimacy” in favor of “charismatic,” personality-based legitimacy.  In light of other studies that show a widespread ignorance of the Supreme Court, the Constitution, the Bill of Rights, and even who the current vice president is, we can see at least in part why these views are so pervasive.

Perhaps the most noteworthy finding among the studies that examine the political knowledge of Americans is that the confidence of a subject’s opinion often increases as familiarity with the issue in question decreases.  For example, in 2014, in the aftermath of Putin’s invasion of Crimea, Daniel Larison of The American Conservative reported on a survey carried out by three political scientists, which found that respondents’ support for military intervention in Ukraine rose in direct correlation with their difficulty in locating Ukraine on a map:

The further our respondents thought that Ukraine was from its actual location, the more they wanted the U.S. to intervene militarily[.] … [W]e found that the less accurate our participants were, the more they wanted the U.S. to use force, the greater the threat they saw Russia as posing to U.S. interests, and the more they thought that using force would advance U.S. national security interests[.] … Our results are clear, but also somewhat disconcerting: The less people know about where Ukraine is located on a map, the more they want the U.S. to intervene militarily.

In fact, as a 2017 article in Foreign Affairs said about this same survey, “[o]nly one in six could identify Ukraine on a map; the median response was off by about 1,800 miles.”  “It shouldn’t come as a surprise,” Larison later wrote in a 2017 reworking of his 2014 American Conservative piece, “that less knowledge about a foreign country corresponds with a preference for more destructive and irresponsible policies in relation to that country. If someone doesn’t even know where a given country is, it is unlikely that he knows much about the surrounding region or the probable consequences of military action.”

It should not take a sociological genius to see that this phenomenon, especially combined with the forces of the internet and modern mass media, makes for a polity ripe for mass manipulation and political demagoguery.

In a 2016 piece for the New York Times entitled “How Stable Are Democracies? ‘Warning Signs Are Flashing Red,’” Amanda Taub wrote:

Across numerous countries, including Australia, Britain, the Netherlands, New Zealand, Sweden and the United States, the percentage of people who say it is “essential” to live in a democracy has plummeted, and it is especially low among younger generations.

While the article, which cites Harvard lecturer Yascha Mounk’s and University of Melbourne political scientist Roberto Stefan Foa’s research on “democratic deconsolidation,” is of course suffused with the standard anti-Trump and anti-conservative biases, it holds several useful pieces of information, if for no other reason than that it is an admission, straight from the horse’s mouth, that the Millennial generation is historically disenchanted with representative and constitutional values.  Across Western nations, the citizenry has, with each succeeding age bracket, progressively lost faith in the democratic process.

Taub continues:

Drawing on data from the European and World Values Surveys, the researchers found that the share of Americans who say that army rule would be a “good” or “very good” thing had risen to 1 in 6 in 2014, compared with 1 in 16 in 1995.

And, further:

The researchers calculated that 43 percent of older Americans believed it was illegitimate for the military to take over if the government were incompetent or failing to do its job, but only 19 percent of millennials agreed. The same generational divide showed up in Europe, where 53 percent of older people thought a military takeover would be illegitimate, while only 36 percent of millennials agreed.

America and other Western nations have scored as politically “deconsolidated” on what Taub describes as the “Mounk-Foa test.”  According to this test, these nations exhibit several similarities to Venezuela during the period prior to the ascent of Chávez, including a decreased belief in democratic values, openness to non-democratic forms of government, and support for “antisystem parties and movements.”  The idea that Trump’s election was in fact a democratic response to the non-democratic impulses of Clinton – older Americans, who, the research found, were more democratic on average, played an especially important role in electing Trump – is, for some reason, never seriously considered in her article.  For all the arguments made by Harvard academicians and The New York Times in favor of liberal cultural superiority, the social milieu of this new “deconsolidated,” “social justice warrior” generation could hardly be described as one of traditional conservatism.

In psychology, the Dunning–Kruger effect is a cognitive bias whereby the ignorant or those of “low ability” are deluded into believing they possess expertise or wisdom in a given field.  The subject’s very ineptitude and lack of experience is what prevents him from discerning his true level of ability.  As we have seen, if such a bias is allowed to range unchecked, it can grow from personal ignorance into a trend in the political life of an entire nation.  And if, as Edmund Burke said, we fail to “listen to the counsels of the wise and good, in preference to the flattery of knaves,” then we shall certainly no longer be “qualified for civil liberty” and will become, inevitably, unripe “for liberty on any standard.”  We shall witness the decline of responsible government in America, drowned out by radical demagogues and idiotic policy choices, and abetted by an increasingly blithe, unaware populace that supports both.


The New Struggle: Religion, Morality, and American Law


Until today, America has largely avoided the vicious church vs. state struggles which have so ravaged nations like France and Spain since the 1789 French Revolution. Western nations, from that time until today, have witnessed numerous political battles over the place of the church in public life, battles which usually had less to do with the particulars of theology than with longstanding economic and legal tensions. The European and Latin American debates over the relationship between church and state—such as those in nineteenth century Belgium, Mexico, and Italy—were complicated and dramatic affairs, which saw liberals and conservatives take a wide spectrum of positions.

American government, however, has historically sidestepped this conflict by simultaneously eschewing a state religion and fostering an informal relationship with religion, even to the point of presuming religious philosophies in its lawmaking. Indeed, as the Pennsylvania Supreme Court ruled in Commonwealth v. Nesbit in 1859, “Law can never become entirely infidel; for it is essentially founded on the moral customs of men and the very generating principles of these is most often religion.” A religious people who, of their own volition, decided against erecting any establishment of religion, the American people succeeded in avoiding the quagmire of European church politics—they had apparently solved one of history’s great political dilemmas.

Yet, this blessing, in one sense, seems to finally be evaporating. In modern Western society, even institutions as basic as marriage and values as basic as mere faith in God have come under attack, and American politicians have begun to receive criticism for adopting any moral principles whatsoever in their policymaking. Instead of a church conflict, in short, we are witnessing the dawn of a conflict over morality itself. The presupposition of moral absolutes in law, once supported as a matter of course and not viewed as necessarily connected with any particular religious creed, is now increasingly excoriated as a religious manipulation of public law. The religious undergirding of morality, accepted as a truism throughout Western history, has become the main target of modern secularist ideology.

History has demonstrated, time and time again, the truth of the principle laid out in Commonwealth v. Nesbit, and it will surely continue to do so. This article will seek to demonstrate that interpreting American law and constitutional principles through the lens of that modern, secular fanaticism which seeks to drive all morality and religion out of the public square is completely out of consonance with American legal precedent, and represents a clear break with the traditions of American society. This ideology will inaugurate, and in many cases already has inaugurated, an ideological conflict that threatens to rip apart the very foundations of society. It will replace the old anticlerical struggle, which sought only to remove institutional clerical influence from government, with a struggle against any moral principles in government whatsoever, which can only end in total disaster.

To begin with, a mention of two of history’s key clerical vs. statist struggles is in order. The Spanish Civil War (1936-1939), one of history’s foremost examples of a church-state conflict, was an ideological war as vicious as any religious conflict fought in Baroque Europe, imparting scars that divide many Spanish families to this day. Even many conservatives, such as Winston Churchill and William F. Buckley, Jr., who were glad to see the traditionalist—albeit authoritarian—Nationalists win, were often much happier with the defeat of the communist-aligned Spanish republic than with the triumph of Generalissimo Francisco Franco. Many also decried the corruption of the Spanish government and church in the decades leading up to the war. It was a brutal struggle which often lacked a clear division between right and wrong.

Perhaps deeper in the Anglo-American popular memory, however, are France’s revolutionary wars of the 1790s and 1800s. During this time, France witnessed, among other atrocities, the systematic killing of thousands, the execution of clergy and nuns, the razing of ancient churches, and the arbitrary suspension of ancient laws, while across the continent innocent people were defrauded of their lands and properties. At the peak of its “reign of terror” (1793-1794), each newly ascendant revolutionary faction turned against its predecessor and guillotined its rivals in an act of collective political suicide, accusing its foes of not being “revolutionary” enough. Such, it seems, is the natural fate of all such violent utopian movements—the Russian Revolution of 1917, whose communist perpetrators admired and sought to imitate those depredations, is only one in history’s long line of additional examples.

Such, then, was the “violent birth” that brought the European continent into modernity, a phenomenon that, mercifully, seems to have bypassed the United States—principally, perhaps, thanks to the good sense and prudence of its founders, the legal codes they erected, and the principles they propounded in documents such as the Federalist Papers.

The template of church-state relations laid out in American law is set in contrast (and was constructed in response) to this general European conflagration, this centuries-long struggle that the founding fathers were only too wary of. The great strength of the American concept of church-state affairs, as Alexis de Tocqueville laid out in Democracy in America (1835), following his travels through the country in 1831, lay in the conviction—among the American citizenry—that the vitality of religion is grounded not in its connection to the state, but in its legal freedom from it. They felt that the separation of church and state allowed the church full freedom of action. It was an outlook that went beyond the mere letter of the law, and was sincerely held by millions of private citizens.

As he relates in his classic treatise, de Tocqueville found that essentially all the Americans he queried on the issue in the early 1830s—clergy and laymen alike—were in total accord on the question of church-state relations. Americans felt that religion could persist just as strongly, perhaps more strongly, without state involvement (which they often equated with state control). In the American mind, that very Constitution which banned an established national church in no way hampered religion, but in fact encouraged it by freeing religious institutions from the corrupting influence of the state.

Instead of the bitterness that European peasants often directed at their churches because of corrupt, powerful, government-sponsored clergymen, American citizens in the early days of the republic viewed their clergy, generally speaking, with a fundamental positivity and even friendliness. Nineteenth-century American churchmen were, as a rule, respected by the citizenry as impartial community advocates and honored by the government as independent leaders of American society, and were usually strong spokesmen for political liberty. During this period there was a free mingling of government officials and religious clergy that took place without any legal coercion, a phenomenon which represented a true innovation in the course of human history. American presidents, governors, and legislators met, as they still do today, with prominent clergymen for assistance and advice on important matters, both religious and otherwise. This included leaders like President Lincoln himself, who made John Hughes, the first Catholic Archbishop of New York, his personal agent “with full powers to set forth the Union cause in Europe,” in the words of biographer Carl Sandburg.

For generations, this amicable modus vivendi made the United States a nation that, at least in one sense, was at philosophical unity and religious peace. Philosophical compromise, peace, and unity—unlike in the vicious all-or-nothing climates of revolutionary France or revolutionary Spain—became the default state of American society. Notwithstanding the prejudice of some private citizens, the faithful, those of different faiths, and those without discernible faith came together in the American alembic, finding peace because they agreed on fundamental practical points of personal behavior and church-state relations. Churches flourished and grew, competing just as they had in Europe for centuries, but with the stain of blood removed from their dealings with one another. America, in this way, had solved one of the key trials of history, becoming, as one of its national mottoes declares, a true novus ordo seclorum—a new order of the ages, emancipated, as it were, from the legal misfortunes and sufferings of the past ages of humanity.

Yet it seems that the cycle of history may have finally caught up with the United States. That dreaded European clerical vs. anti-clerical struggle may, in a manner of speaking, be on the verge of arriving on American shores—albeit in a form quite different from the one it took in the eighteenth and nineteenth centuries. For while America has defeated the beast of the European-style church struggle, it has begun a debate not over which church will lead the nation but over which basic code of morals, if any, will order society.

While the American population was always of a varied religious makeup, it was nonetheless, at least until roughly a half century ago, in general agreement on the validity of the basic tenets of the Judeo-Christian worldview. Many of the issues at the center of today’s cultural debate—gay marriage, abortion, feminism, divorce, the usage of religious symbols on government property—were not, generally speaking, matters of dispute until the relatively recent past. This was in large part because Americans were, overwhelmingly, of the same opinion on the principles that undergirded these issues and felt that the traditional outlook on topics such as marriage and gender was not necessarily connected to any particular religious institution or creed. (And, if certain individuals held differing views, they usually did not raise formal objections—even though they had full legal latitude to do so.) The traditional stances on such topics were accepted and held as something more basic than even what one would call “social norms”: they were social truisms. The office of Chaplain of the Senate (in Lincoln’s day, Mr. Phineas Gurley), the affixation of “In God We Trust” to U.S. currency (which began in 1864), and the adoption of that same phrase as our national motto (under President Eisenhower) are all examples of this general cultural outlook and its persisting legacy. This was seen as non-contentious for most of U.S. history, inasmuch as these views were not seen as attached to a state-favored religion. The traditional, original understanding of the separation of church and state in the First Amendment was that it condemned, strictly speaking, the legal establishment of a state church, not any hint of religion in government, as many revisionists presume today. If it were so oriented, then essentially every American leader since President Washington himself—who famously declared that “religious principles” were indispensable to the well-being of the nation in his 1796 farewell address—would be acting in contempt of the nation’s basic laws.

This relationship between Christianity, law, and society was often understood, as the Supreme Court ruled in Church of the Holy Trinity v. United States in 1892, as a manifestation of the legally embedded “general Christianity” of the American people (in that particular case, the “general Christianity” of the common law of Pennsylvania), who were described, in the text of the court decision itself, as a “Christian people.” Common law, it was argued, guaranteed a place for the general creed of Christianity without reference to any one denomination; the American people were a “Christian people” who refrained, by law, from undertaking any religious persecutions or preferential treatment. The United States was a nation that, while inhabited by Jews, Christians, and non-aligned people alike, nonetheless acknowledged the practical Judeo-Christian traditions embedded in its laws, institutions, and very social fabric. This viewpoint was echoed in several other cases, including the aforementioned Commonwealth v. Nesbit in 1859.

Thus, one way in which the United States had avoided the European clerical vs. anti-clerical struggle was by allowing religious beliefs to suffuse its entire society and government—the American people were, after all, eminently religious (as de Tocqueville had noted)—while refusing to legally connect itself, on the national level, with any religious body. The clergymen knew that this arrangement was one that preserved “domestic tranquility” and kept their churches largely free from the corruption of extraneous political influence. The religious forces among the people took full advantage of a system that, unlike the total separation of the French republic’s principle of laïcité, allowed religious faith a free hand to attempt to influence the government. American freedom was not a simple, sterile, domesticated freedom, but a true, radical, legally founded freedom that ensured liberty for individual citizens and churches alike. For generations, this arrangement was believed, by what seems to have been the great, preponderant swath of the people, to be fully compatible with, and even congenial to, the principles set forth in the Constitution. To illustrate the legal nature of this relationship, it may be helpful to reference several more court rulings from across American history.

One of the most dramatic judicial formulations of this view of the relationship between “non-established” faith and the law came in 1846, with the South Carolina ruling in City of Charleston v. Benjamin. The decision read, in part:

“Christianity is part of the common law of the land, with liberty of conscience to all. It has always been so recognized…The observance of Sunday is one of the usages of the common law recognized by our U.S. and State Governments…it is the foundation of those morals and manners upon which our society is formed; it is their basis. Remove it and they would fall…it [morality] has grown upon the basis of Christianity.”        

Again, nearly a hundred years later in United States v. Macintosh (1931)—a case dealing with the naturalization of citizens and the grounds for conscientious objection—the Supreme Court’s decision contained the outright declaration that the people of the United States were, by tradition and convention, a people of Judeo-Christian views who, nonetheless, allowed freedom of conscience to non-Christians under their Constitution. The decision (while it may stun the ear of many today) stated, in part:

“We are a Christian people… according to one another the equal right of religious freedom and acknowledging with reverence the duty of obedience to the will of God.”

A corollary to this legal tradition came with the important Zorach v. Clauson decision in 1952, decided under Chief Justice Fred M. Vinson. This decision, like several previous ones, argued that the government should not merely permit the free exercise of religion, but should actively cooperate with religious authority. The decision reads, in part:

“The First Amendment…does not say that in every and all respects there shall be a separation of Church and State…Otherwise the State and religion would be aliens to each other—hostile, suspicious, and even unfriendly… We are a religious people whose institutions presuppose a Supreme Being…When the State encourages religious instruction or cooperates with religious authorities…it follows the best of our traditions [my italics].”

It is clear, however, that the climate presumed in these court rulings is far removed from our experiences today; we have, in many ways, been living under a drastically different paradigm, legally as well as socially, since the 1960s. Whatever one’s opinion on today’s “hot button” social issues, anyone who has monitored the news for any length of time during the past half-century can see that the disappearance of common agreement on basic moral issues has introduced chaos into American society. And it is, in fact, this very chaos that is metastasizing into America’s own version of the church vs. state struggle.

With contemporary social and moral “development” has come a questioning of the very nature of truth itself; again, the court can furnish us with a ready-made example. Planned Parenthood v. Casey famously ruled in 1992 that “central” to freedom was the ability to “define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.” This sentiment, and the relativistic philosophy that undergirds it, is entirely against the verdict of numerous prior Supreme Court cases and, moreover, repugnant to the basic integrity of law itself. How can the law stand if one has total freedom to create a new definition of reality wherein the law holds no power?

Planned Parenthood v. Casey and its close cousin Roe v. Wade are self-contradictory and poorly reasoned pieces of jurisprudence (even legal scholars of various political stripes dispute the logic behind Roe v. Wade’s interpretation of the Fourteenth Amendment), but the contradiction goes far beyond such particulars. We have entered a period in history where the moral scaffolding for humanity’s most basic behaviors has come under attack, with disastrous effects beyond the urbane faculty lounges where the existence of truth is cavalierly thrown into doubt. We are galloping through a chronic version of the “culture wars” which, while appearing to have abated somewhat in the wake of Obergefell v. Hodges, show no signs of stopping anytime soon. Our social and political fractures, far from disappearing, are growing even more salient. The 2016 presidential election furnished a perfect example of this, when, in spite of the undeniably moderate character of Trump’s social views, he was condemned by the left for his “backwardness.”

With violent campus protests, politically inspired riots, increasing moral confusion, and now assassination attempts sweeping the nation, I can only conclude by arguing for a return to those Judeo-Christian spiritual and philosophical values which, in the past, have provided our body politic (and the bodies politic of the Western world generally) with that ontological common ground essential for any mature society. As John Adams wrote in his letter to the Massachusetts militia in October 1798, “Our Constitution was made only for a moral and religious People. It is wholly inadequate to the government of any other,” and as John Quincy Adams, in turn, wrote to his son, “The law given from Sinai was a civil and municipal as well as a moral and religious code; it contained many statutes…of universal application—laws essential to the existence of men in society, and most of which have been enacted by every nation which ever professed any code of laws.”

These sentiments have held true across American history. Indeed, one might say that J. Q. Adams’ words find a natural echo in the judgment expressed in Commonwealth v. Nesbit, that “Law can never become entirely infidel,” all law being based, ultimately and unavoidably, on the spiritual values of mankind and the Truth of God. While it is true, especially since the 1960s, that the court has issued rulings of a decidedly different character than the ones described here, and also that American leaders have expressed a variety of views on this topic far too wide to be given full justice in a single article, the fact that the American founders almost unanimously ascribed tremendous importance to the place of religion in society ought to give any would-be legal secularist pause. The difference between “state secularism” and a “secular state” is a subtle yet essential distinction; in America, the lack of a legal establishment of religion is not an invitation to eject religion from the public square—it is, in fact, just the opposite.

Jack H. Burke is a member of the Fordham University Class of 2017 and a former contributor to the Fordham Political Review.

 




The Enlightenment: A Useful Myth



Starting with David Brooks, here's the view that modern secular "moderates" and liberals take of the Western Enlightenment, which had its darker edges.


