These highlights are from the Kindle version of The Right Side of History by Ben Shapiro.
Table of Contents
- The Modern Era
- American Society’s Twin Pillars
- Foundation for Happiness
- Collective Mobilization & Individuality
- God & Religion
- Education & Academia
- Greek Philosophy
- Science & Faith
- Challenging the Church
- The Death of God
- Nationalism & Revolution
- Nazism & Communism
- Existentialism & Human Motivation
- On Liberty
- On Fascism
- Marcuse & Self-Realization
The Modern Era
For nearly all of human history, life has been nasty, brutish, and short. In 1900, some 10 percent of all infants in the United States died before reaching their first birthday; in other countries, the number was far higher.
Yet we now live in an era in which mothers can expect to survive pregnancy and childbirth (the mortality rate among pregnant women has dropped 99 percent). Babies can be expected to survive infancy and then live another eight decades; we live in an era in which the vast majority of the American population lives in climate-controlled spaces with plenty of food, a car, and at least one television.
In July 2017, Pew Research found that 47 percent of self-described liberal Democrats said they’d have a tough time staying friends with those who voted for Trump; 13 percent of conservatives said the same, but it’s difficult to say whether that number might have been reversed if Trump had lost. It’s also worth noting that fully 47 percent of Clinton voters said they didn’t have a single close friend who voted for Trump. A more telling statistic: 68 percent of Democrats said it was “stressful and frustrating” to talk to political opponents; 52 percent of Republicans agreed.
Just 27 percent of Americans trust banks; just 20 percent of Americans trust newspapers; just 41 percent of Americans say they trust organized religion; that number is 19 percent for the federal government overall, and 39 percent for the health care system. Only 30 percent of Americans trust the public schools, 18 percent trust big business, and 9 percent trust Congress. We still trust our police, but those numbers have dropped over the past decade, particularly among Democrats. The only thing we still seem to trust is the military—which makes sense, since it provides for our common defense.
As for our democracy, fewer and fewer people like it. An October 2016 poll showed that 40 percent of Americans said they had “lost faith in American democracy,” with another 6 percent stating that they never had faith to begin with. No wonder only 31 percent of those polled said they would “definitely” accept the results of the election if their candidate lost.
The upper middle class in the United States grew from 12 percent of Americans in 1979 to 30 percent as of 2014. Income mobility hasn’t changed significantly in the United States since the 1970s.
We’re more racially equal than ever before in our history—more equal than any other society in human history. In 1958, just 4 percent of Americans approved of black-white intermarriage; as of 2013, that statistic was 87 percent. In that year, 72 percent of white Americans thought race relations were good, and so did 66 percent of black Americans; that statistic had remained relatively stable from 2001 through 2013.
American Society’s Twin Pillars
This book argues that Western civilization, including our modern notions of values and reason and science, was built on deep foundations. And this book argues that we’re tossing away what’s best about our civilization because we’ve forgotten that those foundations even exist.
We believe freedom is built upon the twin notions that God created every human in His image, and that human beings are capable of investigating and exploring God’s world. Those notions were born in Jerusalem and Athens, respectively.
Jerusalem and Athens built science. The twin ideals of Judeo-Christian values and Greek natural law reasoning built human rights. They built prosperity, peace, and artistic beauty. Jerusalem and Athens built America, ended slavery, defeated the Nazis and the Communists, lifted billions from poverty, and gave billions spiritual purpose. Jerusalem and Athens were the foundations of the Magna Carta and the Treaty of Westphalia; they were the foundations of the Declaration of Independence, Abraham Lincoln’s Emancipation Proclamation, and Martin Luther King Jr.’s Letter from Birmingham Jail.
Civilizations that rejected Jerusalem and Athens, and the tension between them, have collapsed into dust. The USSR rejected Judeo-Christian values and Greek natural law, substituting the values of the collective and a new utopian vision of “social justice”—and they starved and slaughtered tens of millions of human beings. The Nazis rejected Judeo-Christian values and Greek natural law, and they shoved children into gas chambers. Venezuela rejects Judeo-Christian values and Greek natural law, and citizens of that oil-rich nation have been reduced to eating dogs.
We are in the process of abandoning Judeo-Christian values and Greek natural law, favoring moral subjectivism and the rule of passion. And we are watching our civilization collapse into age-old tribalism, individualistic hedonism, and moral subjectivism. Make no mistake: we are still living off the prosperity of the world built by Jerusalem and Athens. We believe we can reject Judeo-Christian values and Greek natural law and satisfy ourselves with intersectionality, or scientific materialism, or progressive politics, or authoritarian governance, or nationalistic solidarity. We can’t.
Foundation for Happiness
Politics is about working to build the framework for the pursuit of happiness, not the achievement of it; politics helps us establish the preconditions necessary for happiness, but can’t provide happiness in and of itself. The Founding Fathers knew that. That’s why Thomas Jefferson didn’t write that the government was granted power to grant you happiness: it was there to protect your pursuit of happiness. The government existed to protect your rights, to prevent those rights from being infringed upon. The government was there to stop someone from stealing your horse, from butchering you in your sleep, from letting his cow graze on your land.
To Aristotle, “good” wasn’t a subjective term, something for each of us to define for ourselves; “good” was a statement of objective fact. Something was “good” if it fulfilled its purpose. A good watch tells time; a good dog defends its master. What does a good human being do? Acts in accordance with right reason. What makes human beings unique, says Aristotle, is our capacity to reason, and to use that reason to investigate the nature of the world and our purpose in it.
Happiness isn’t rolling around in the mud at Woodstock, nor is it a nice golf game after a rough week at work. Happiness is the pursuit of purpose in our lives. If we have lived with moral purpose, even death becomes less painful.
What do we need to generate the moral purpose that provides the foundation for happiness? We need, in my estimation, four elements: individual moral purpose, individual capacity to pursue that purpose, communal moral purpose, and communal capacity to pursue that purpose. The only foundation for a successful civilization lies in a careful balance of these four elements.
Collective Mobilization & Individuality
The key phrase—the beginning of Western civilization—lies in Genesis 1:26: we are all made in God’s image. All of us, not just kings or potentates. That means we all have inherent value, and that our mission in life is to draw close to something beyond ourselves.
Voltaire famously stated, “I want my lawyer, my tailor, my servants, even my wife to believe in God, because it means that I shall be cheated and robbed and cuckolded less often. . . . If God did not exist, it would be necessary to invent him.”
Having strong social relationships strengthens the immune system, extends life (more than does quitting smoking), speeds recovery from surgery, and reduces the risks of depression and anxiety disorders. In fact, a massive longitudinal Harvard study found that the single best predictor of lifelong happiness was the presence of close relationships: satisfaction with relationships at age fifty was actually more predictive of long-term health than cholesterol level.
Like Aristotle, the founders believed in social organizations fostering virtue: a country without such social ties could not survive in freedom. They also agreed that the Judeo-Christian tradition had to provide a basis for sound values for individuals living in a free community: as John Adams stated in a letter to the Massachusetts militia, “We have no Government armed with Power capable of contending with human Passions unbridled by . . . morality and Religion. Avarice, Ambition, Revenge or Gallantry, would break the strongest Cords of our Constitution as a Whale goes through a Net. Our Constitution was made only for a moral and religious People. It is wholly inadequate to the government of any other.”
The best countries—and the best societies—are those where citizens are virtuous enough to sacrifice for the common good but unwilling to be forced to sacrifice for the “greater” good.
In 2012, the Democratic National Convention featured a video with the slogan, “Government’s the only thing we all belong to.” That belief has been the defining feature of tyrannies the world over: the utopian notion that if we all pull our oars in the same direction, at the behest of a centralized government, we’ll be able to accomplish more together.
It’s tempting to take our ardor for collective mobilization and use it as a state-wielded club to force individual virtue, or to force large-scale change. Tyranny rarely begins with jackboots; it usually begins with ardent wishes for a better future, combined with an unfailing faith in the power of mass mobilization.
God & Religion
Biblical sacrifices aren’t designed merely to appease a higher power. They’re designed to change us, to teach us something.
Perhaps the most important sentence ever penned was this, from Genesis 1:27: “God created man in His image, in the image of God He created him; male and female He created them.” No longer was divine choice restricted to great leaders: in fact, God mocks such pretensions in Genesis 6:2, when He decides to flood the earth after the “sons of the gods”—the rulers—begin running roughshod over the rights of commoners, prompting God to scoff at the arrogant humans who “are mortal.”
The Bible tells us what God expects of us and tells us that we have the duty to fulfill those expectations; it tells us that we are special, and that we are loved by an infinitely good, caring, and powerful Being. It tells us that we have a duty to reach out to Him. The Bible makes God accessible; it brings God down to earth. In doing so, it offers man the opportunity to raise himself.
The soul with which God endowed man seeks the Divine through reason—the uniquely human quality that lifts human beings above animals, and places us at the foot of God’s throne.
Education & Academia
There is a battle currently raging on college campuses regarding the role of the university. Is it to create a safe space for students to “find themselves”? Is it a place to experience the wonder of a wide variety of thought? Or is it a place to inculcate the basic thought underpinning Western civilization? In the past, the answer was clearly the last option: people went to college to be steeped in the classics. The Founding Fathers were well versed in Latin and Greek; their writing is replete with references to ancient literature.
In 1900, half of all American public high school students took Latin classes. But many current scholars and students find the classics trite and boring, old and hackneyed—ethnocentric and culturally stagnating. Jesse Jackson famously marched at Stanford in the 1980s, arm in arm with students, chanting, “Hey, hey, ho, ho, Western Civ has got to go.” As of 2010, not a single top university required students to take a course in Western civilization; only sixteen even offered such a course.
Western civilization, in the view of many on the radical Left, was a bastion of imperialism and racism; students should be devoted to learning about those shortcomings, rather than about the glories of ancient philosophy. In this view, study of ancient philosophy and civilization dramatically limits our understanding of the West’s innate evils.
The promulgators of multiculturalism in education all too often aren’t promoting breadth of learning, but lack of learning. This attitude reaches its apotheosis in the anti-classics movements we see on college campuses like Reed, where one student group lobbies against basic humanities courses because they allegedly “perpetuate white supremacy. . . . The texts that make up the [Humanities 110] syllabus . . . are ‘Eurocentric,’ ‘Caucasoid,’ and thus ‘oppressive.’”
This is a dramatic, deliberate misreading of the history of Western civilization—the greatest force for good in world history. That statement is not meant to ignore the myriad evils in which Western civilization has participated. But Western civilization has freed more people than any other, by a long shot; it has reduced poverty, conquered disease, and minimized war. Western civilization is responsible for the economic betterment of the global population, and for the rise in human rights and democracy.
The ancient Greeks gave us three foundational principles: first, that we could discover our purpose in life from looking at the nature of the world; second, that in order to learn about the nature of the world, we had to study the world around us by utilizing our reason; and finally, that reason could help us construct the best collective systems for cultivating that reason. In short, the Greeks gave us natural law, science, and the basis of secularly constructed government. Jerusalem brought the heavens down to earth; Athens’s elevation of reason would launch mankind toward the stars.
Plato and Aristotle both believed in the notion of objective truth as well. But Plato and Aristotle disagreed with regard to what constituted objective truth: the Forms, or knowledge of the physical world. In the end, this disagreement would wind up creating the basis for the scientific method: deduction would present human beings with a scientific hypothesis; facts presented by empirical evidence would become the basis for judging that theory; the hypothesis would then be accepted or rejected or changed. Aristotle’s establishment of logical rigor with regard to empirical observation would provide the basis for all further scientific thought.
The Greek vision of citizenship focused centrally on the place of the individual within the polis, the city-state; Hellenic thought focused heavily on how to shape individuals to best serve as citizens, and how to cultivate virtues that would be useful in such citizens. Judaism, however, had another commitment: the commitment to individual and collective service to Divine law.
Christianity universalized the message of Judaism. The Gospels were deliberately written in Greek, not the Aramaic used by the Jews of the period. Jesus’s story was meant to extend to the entire world. Because Jesus was no longer a Jewish figure in the Christian view, but the material incarnation of the divine, that meant that Jewish law could be abandoned in favor of universalism.
The Middle Ages saw technological revolution in agriculture, the rise of commerce, and the institution of new forms of art ranging from polyphonic music to Gothic architecture; it also saw new developments in the art of war, with technological advances that would allow the West to defeat its enemies in the course of coming centuries. While many historians tout the power of Islamic civilization during this time period—and Islamic civilization did thrive on the Arabian Peninsula particularly—when Islamic civilization came up against Western civilization at the Battle of Tours, Islamic forces were soundly defeated.
If reason supports the notion of an intelligent God who crafts nature and stands behind its ever-present glory, Aquinas posited, then human beings can examine the natural world as a pathway to understanding Him. God made nature; to discover nature is to investigate the works of God. In fact, God wanted man to do this—God wanted man to seek Him everywhere. And God granted human beings the power of free will and reason to do so—as Aquinas celebrates, “man acts from judgment, because by his apprehensive power he judges that something should be avoided or sought . . . forasmuch as man is rational it is necessary that man have a free-will.”
In Aquinas’s thought, Jerusalem and Athens are reunified. God orders us to use our reason, and reason impels us to discover the natural law—laws designed by God.
Aquinas is completely comfortable with the notion of scientific discovery and progress; he openly states that if the astronomers of his day were proved to be wrong, that would not refute any of his metaphysics, since “perhaps the phenomena of the stars are explicable on some other plan not yet discovered by man.”
Aquinas’s faith in human reason—and his faith that human reason would not be able to tear down the revelation of God—led to a consonance that would blossom into the scientific revolution. The development of Western science was rooted in the notion that man’s task was to celebrate God through knowledge of His creation. Contrary to the propaganda of a postmodern atheist movement, nearly every great scientist up until the age of Darwinism was religious. The Scholastic movement produced the earliest roots of the scientific method, all the way up through the discovery by Nicolaus Copernicus (1473–1543) of a heliocentric solar system.
Science & Faith
Perhaps the greatest exponent of the Scholastic method was Roger Bacon (1219–1292), a Franciscan friar who devoted himself to understanding the natural world. Like Aquinas, Bacon was a devoted Aristotelian who suggested gathering facts before coming to conclusions. He wrote prolifically on optics, alchemy, and astronomy; he suggested revising the Julian calendar, which he found inaccurate; he even set down the first European formula for gunpowder. The age of scientific progress didn’t begin with the Enlightenment. It began in the monasteries of Europe.
Christianity, like all religions, focuses on the spiritual to the exclusion of the physical. And that failure to take into account the drive for betterment in the physical world would be used as a club wielded against Christianity itself before long.
The secularist myth holds that religion held back science for millennia. The reverse is true. Without Judeo-Christian foundations, science simply would not exist as it does in the West.
Contrary to popular opinion, new discoveries weren’t invariably seen as heretical or dangerous to the dominion of the Church; in fact, the Church often supported scientific investigation. Nicole Oresme (1320–1382), the discoverer of the Earth’s rotation about its axis, was a bishop of Lisieux and a graduate of the University of Paris. Nicholas of Cusa (1401–1464), cardinal of Brixen, theorized that the Earth was not stationary, but moved through space. Nicolaus Copernicus studied in parochial school and served the church of Warmia as medical adviser; his publication of De revolutionibus, his theory that the Earth moved around the sun and not the other way around, in March 1543, included a letter to Pope Paul III.
Eventually, the backlash to the inclusion of secular knowledge in the Christian worldview—a backlash led by thinkers like Martin Luther (1483–1546) and John Calvin (1509–1564)—led to the Church’s famous persecution of Galileo. Galileo Galilei (1564–1642) famously posited that the Earth moves around the sun, and was forced by the Church to recant for presenting his theory as fact rather than as hypothesis.
Despite his differences with the Church, Galileo never abandoned his own faith that science could be a pathway to God. He wrote, “I say that as to the truth of the knowledge which is given by mathematical proofs, this is the same that Divine wisdom recognizes, [although] our understanding . . . is infinitely surpassed by the Divine. [Yet] when I consider what marvelous things and how many of them men have understood, inquired into, and contrived, I recognize and understand only too clearly that the human mind is a work of God’s, and one of the most excellent.”
Galileo was no exception. He was a representative of the rule: religious men saw a duty to examine the universe, and to do so with the best possible methodology. This philosophy permeated the wisdom of the Enlightenment’s greatest scientists. Johannes Kepler (1571–1630), the discoverer of the laws of planetary motion, explained: “The chief aim of all investigations of the external world should be to discover the rational order and harmony which has been imposed on it by God and which He revealed to us in the language of mathematics.”
Kepler’s philosophy was also that of Isaac Newton (1642–1726): “Opposite to God is Atheism in profession & Idolatry in practice. Atheism is so senseless & odious to mankind that it never had many professors.”
Challenging the Church
The rise of science was coincident with the simultaneous rise of human freedom. The dominance of the Catholic Church over the course of the Middle Ages and Renaissance led some to rebel against the notion of centralized authority altogether. One of the first to do so was Marsilius of Padua (1275–1342), who fought against the notion of papal plenitude of power—the notion that the Church ought to rule in the City of Man as well as the City of God. Marsilius saw that the Catholic Church’s power could threaten secular authorities—and that those secular authorities could then turn on the Church. Instead of theocracy, Marsilius proposed sovereignty of citizens. His philosophy actually bordered on calls for democracy—he suggested that freedom to worship God itself prohibited theocracy. No wonder Pope Clement VI stated that he had almost never read a worse heretic than Marsilius.
The rise of Lutheranism challenged both the spiritual and temporal power of the Church. Luther, in his ardent attempt to reclaim the Bible from what he deemed the thoroughgoing corruption of the papacy, worked to decimate the hierarchy of believers, leaving merely individuals before God, capable of comprehending God’s direct word: “A shoemaker, a smith, a farmer, each has his manual occupation and work; and, yet, at the same time, all are eligible to act as priests and bishops.” In pursuit of that egalitarian vision, Luther discarded the notion of sanctuary from the secular law: “It is intolerable that in canon law, the freedom, person, and goods of the clergy should be given this exemption, as if the layman were not exactly as spiritual, and as good Christians, as they, or did not equally belong to the church.”
The horrors of religious conflict from the mid-sixteenth century culminated in the Thirty Years’ War (1618–1648)—a war that resulted in some eight million deaths—and forced the choice of religious toleration or mass carnage. It was in making this choice that the notion of human rights was born.
This long philosophical journey would come to fruition in the first country in history to be crafted based on philosophy: the United States of America. The Founding Fathers were devotees of Cicero and Locke, of the Bible and Aristotle. They’d done their reading. And they based their new national philosophy on the lessons garnered from that reading: natural law, rooted in reason and enshrined by religion; individual natural rights, balanced by corresponding duties; a limited government of checks and balances designed to protect those rights in accordance with natural law; and inculcation of virtue, to be pursued by individuals and communities, again in accordance with the dictates of natural law.
No better founding document has ever been penned than the Constitution of the United States, the product of compromise and of virtue. That virtue took the form of prudence—the practical wisdom of The Federalist Papers has not yet been surpassed in political thought. And that virtue took the form of justice—the rule of law, not of men, and the creation of a system where each receives his due.
The evils of their time did not leave our founders unaffected. The founding was rife with self-contradiction: that great exponent of liberty, Jefferson, a man who called slavery “a cruel war against human nature,” was a slaveholder and the father of six children by a slave, Sally Hemings; Madison, another slaveholder, said that slavery based on “mere distinction of colour” was “the most oppressive dominion ever exercised by man over man.”
The Death of God
Charles Darwin’s On the Origin of Species (1859) provided the first scientific grounding to the notion of a world without God, and a world beyond the mind of man.
Darwinism was seen by the intelligentsia of the time as a final permission to break with the ways of the ancients. Finally, at long last, the superstitions of religion could be put aside; finally, at long last, the legacy of the ancient Greeks could be escaped.
Fyodor Dostoyevsky (1821–1881) worried deeply about mankind unbound from moral obligation. He saw in the rise of an atheistic world the face of the Marquis de Sade (1740–1814), the famed French sadist, rapist, and pedophile who embraced passion, discounted human responsibility, and saw in his own pleasure the highest good. De Sade infamously dismissed God and added, “We rail against the passions, but never think that it is from their flame that philosophy lights its torch.”
Dostoyevsky feared the materialism that had come to dominate European thought. In the famous “Grand Inquisitor” chapter of The Brothers Karamazov, in which Ivan Karamazov tells his tale of a Spanish inquisitor grilling Jesus, Dostoyevsky suggested that the day had come when human beings would give up on meaning in favor of worldly goods: “Dost Thou know that the ages will pass, and humanity will proclaim by the lips of their sages that there is no crime, and therefore no sin; there is only hunger?” Dostoyevsky suggested the cure for hunger would be the dictator—the man who satiates hunger will be worshipped as a deity.
The search for meaning, untrammeled by Judeo-Christian values and Greek telos, freed from moral responsibility by scientific determinism, would burst forth in a conflagration that would set the whole world on fire, Dostoyevsky predicted. The result would be blood and suffering, a maelstrom of horror, followed by an epoch of emptiness. God’s death, Dostoyevsky thought, was man’s death as well.
The death of God, Friedrich Nietzsche (1844–1900) saw, left no room for purpose through reason. Like Hume, Nietzsche believed that morality through reason alone was a lie, a cover for instinct wearing the costume of reason: “Your decision, ‘this is right,’ has a previous history in your impulses, your likes and dislikes.” Nietzsche, in other words, had consolidated the lessons of the eighteenth and nineteenth centuries, and ripped the mask of nicety from the face of the Enlightenment.
Nietzsche knew that the only answer to scientific materialism was radical subjectivity, and with it would come the death of morals. He celebrated that fact, and reveled in the power of the will.
Nationalism & Revolution
The French Revolution was bloody, vicious, and awful. Tens of thousands of people were murdered by the regime between 1793 and 1794, with another quarter million dead in a civil war over a draft designed to fight foreign invasion.
Where, exactly, did the French Revolution—born with dreams of liberty, equality, and fraternity—go so wrong? It went wrong because the Enlightenment of the French Revolution rejected the lessons of the past; it saw in the history of the West mere repression and brutality, and longed for a tomorrow full of visions and dreams based on vague notions of human goodness.
It was the French Revolution that made romantic nationalism a driving force in history. The very definition of citizenship changed in Revolutionary France, from subjects at the beck and call of more powerful actors to citizens with an equal stake in the formation of the general will. But that definition of citizenship quickly devolved into a new form of subject status: in France, citizens owed their rights to the state.
Nationalism turns toxic when it fails to reach that moral minimum—when it tyrannizes its own citizens, or locks people out based on immutable characteristics. Nationalism turns poisonous when it becomes imperialism—when it suggests that it represents a universalism that can override the legitimate rights of other states, or when it uses national interest as an excuse for conquest on behalf of a “volk.”
The Bible contended that man could not live by bread alone; the French Revolution contended that without bread, nothing else mattered.
The term bureaucracy is a mashup of the French word bureau, meaning the material used to cover desks, and the Greek term kratos, meaning power. So the term itself literally meant “desk power” or “office rule.”
Didn’t individual rights come into direct conflict with a small clique of all-knowing experts, ruling from above? The man who solved this conundrum was Auguste Comte (1798–1857).
In 1822, Comte wrote his Plan for the Scientific Work Necessary to Reorganize Society—his suggestion then, as throughout his career, was that human knowledge could lead to the imposition of rational laws that could reorganize all of humanity. Expertise was the basis of governance. And all inquiries into transcendental values were worthless, except insofar as they served man. Science was philosophy was religion. All were the same.
Nazism & Communism
The worst sins of the nineteenth and twentieth centuries sprang from various combinations of romantic nationalism, collectivist redistributionism, and supposedly scientific governance.
The most obvious example, of course, came in Germany. The regime of Otto von Bismarck was characterized by its embrace of romantic nationalism, which culminated in the unification of Germany—and Bismarck’s subsequent focus on Kulturkampf, the “cultural struggle” for German solidarity.
When Germany surrendered after World War I and the Kaiser was ousted in a coup, romantic nationalism did not dissipate: it was buried lightly under the soil, where it festered. In the aftermath of World War I, the German national myth of the “back-stabbing” outsiders who had given away a battle victory spread far and wide. Germany had been betrayed, the logic went, and only a united Germany, beyond democracy and liberalism, could rise again to defeat its enemies.
The Nazi regime promulgated the most extreme romantic nationalism in world history—hundreds of thousands of Germans united in lockstep, cheering wildly at the sight of a dictator, greeting each other by hailing Hitler, hanging pictures of the Führer over their mantels.
The appeal of such ideals may be confusing, but George Orwell brilliantly summed them up in a 1940 essay on Hitler’s Mein Kampf: “Hitler, because in his own joyless mind he feels it with exceptional strength, knows that human beings don’t only want comfort, safety, short working-hours, hygiene, birth-control and, in general, common sense; they also, at least intermittently, want struggle and self-sacrifice, not to mention drums, flags and loyalty-parades.”
Meanwhile, concurrent with the rise of romantic nationalism in Germany, the ideals of collectivist redistributionism took precedence in the East. The end of World War I marked the final breath of the tsarist regime in Russia—and with it, the rise of Marxism.
Sounding a lot like Bernie Sanders, Lenin wrote in 1917, “Democracy for an insignificant minority, democracy for the rich—that is the democracy of capitalist society.” Instead, Lenin sought on the one hand “immense expansion of democracy, which for the first time becomes democracy for the poor, democracy for the people, and not democracy for the money-bags,” and on the other hand, “a series of restrictions on the freedom of the oppressors, the exploiters, the capitalists. We must suppress them in order to free humanity from wage slavery, their resistance must be crushed by force; it is clear that there is no freedom and no democracy where there is suppression and where there is violence.”
Stalin would carry Lenin’s bloody legacy further, of course—by the end of his life, Stalin was responsible for the murder of tens of millions of people under his rule, including five million people during the forcible starvation of Ukraine for agricultural collectivization from 1931 to 1934 alone.
Mao openly bragged about “burying alive 46,000 scholars.” During the Cultural Revolution, Mao’s forces committed atrocities upon intellectuals—and mirroring the USSR’s gulags, Mao built a system of laogai that housed tens of millions of dissidents over the decades. Today, the North Korean gulag state mirrors the glorious heritage of its communist predecessors.
Until the USSR’s fall, many on the mainstream Left believed it to represent a viable ideology.
Despite the fall of the USSR, the desire to find a new meaning in the collective remains, both in the United States and abroad. The romanticism about communism has never truly died on the American Left—in 2017, the New York Times ran op-eds about why “women had better sex under socialism,” “when Communism inspired Americans,” and why “socialism’s future may be its past.”
During the FDR administration, economic policy was set from the top; ignoring the injunction by economically laissez-faire thinkers that no set of individuals can know more than the entire market at large, FDR and his cadre of geniuses lengthened the Great Depression by nearly a decade by manipulating the currency, setting wages and prices, and bullying those who objected into silence. As Harold L. Cole and Lee E. Ohanian of UCLA concluded, “The economy was poised for a beautiful recovery, but that recovery was stalled by these misguided policies.” Those misguided policies included FDR setting the price of gold based on his lucky number. Treasury Secretary Henry Morgenthau wrote in his diary, “If anybody ever knew how we really set the gold price through a combination of lucky numbers, etc., I think they would be frightened.”
Ironically, as Cole points out, “The fact that the Depression dragged on for years convinced generations of economists and lawmakers that capitalism could not be trusted to recover from depressions and that significant government intervention was required to achieve good outcomes.”
Teddy Roosevelt wrote a letter in 1913 stating, “Society has no business to permit degenerates to reproduce their kind. . . . Some day we will realize that the prime duty, the inescapable duty, of the good citizen of the right type is to leave his or her blood behind him in the world; and that we have no business to permit the perpetuation of citizens of the wrong type.”
Magna Carta, the first great charter of Western liberties, was signed by King John in 1215, and set limits to monarchic powers based on “regard to God and for the salvation of our soul, and those of all our ancestors and heirs, and unto the honour of God and the advancement of his holy Church.”
Existentialism & Human Motivation
Existentialism truly began in the nineteenth century with Søren Kierkegaard (1813–1855), a Danish philosopher bothered by the problem of Enlightenment reason, which he saw as arrogant—the notion that a universal ethical system could be discerned by human beings was a fool’s errand, the idea that history was an unerring unfolding of Hegelian dialectics far too simplistic.
All truth was subjective, according to the existentialists, not merely moral truth. This was the view of Karl Jaspers (1883–1969), a German philosopher who wrote, “All knowledge is interpretation.” It was also the view of Martin Heidegger (1889–1976), who suggested that the essence of being human was being—not reason or passion, but existence.
Science had begun, in the Francis Bacon philosophy, as an aid toward the betterment of man’s material conditions; it had morphed over time into an aid toward the betterment of man’s moral condition, though not the source of morality itself. But now, with God out of the picture and the collective implicated in the worst crimes in human history, science was handed the task of creating a new morality, a new law. The existentialists had reduced human purpose to creation of subjective truth; science provided the last remnant of objective truth in Western thought.
The first serious advocate of the position that human beings were no longer rational, free actors was Sigmund Freud (1856–1939). Freud was a charlatan, a phenomenal publicist but a devastatingly terrible practicing psychologist. He was a quack who routinely prescribed measures damaging to patients, then wrote fictional papers bragging about his phenomenal results. In one 1896 lecture, he claimed that by uncovering childhood sexual trauma he had healed some eighteen patients; he later admitted he hadn’t cured anyone. Freud himself stated, “I am actually not at all a man of science, not an observer, not an experimenter, not a thinker. I am by temperament nothing but a conquistador—an adventurer, if you want it translated—with all the curiosity, daring and tenacity characteristic of a man of this sort.”
But Freud’s radical theories about human nature became world famous. He submitted that religion was but a form of “childhood neurosis” from which the world had to recover. He suggested that the roots of religion lay in an ancient event during which a group of prehistoric brothers had killed their father. Dreams were a form of wish fulfillment, behavior was a manifestation of unconscious desires; in general, people were governed by forces beyond their control. Mirroring Plato, Freud posited a tripartite soul—Plato suggested reason, spirit, and appetite, while Freud suggested superego (moral reason), ego (life experience mediating between appetite and reason), and id (appetite).
But where Plato suggested that man should work to ally spirit with reason to overcome appetite, Freud suggested that working to uncover unconscious forces shaping our id would be the best possible solution. In other words, Freud believed that we were all governed by forces we couldn’t understand, absent psychoanalytic intervention.
Neo-Enlightenment philosophers like to connect religion with slavery, overlooking that the abolitionist movement in the West was almost entirely led by religious Christians—and ignoring that the global movement against slavery was led by the West (slavery was only legally abolished in China in 1909, and slavery was only legally ended in Saudi Arabia in 1962).
The same holds true of the movement against Jim Crow. Martin Luther King Jr. quoted the Bible far more than he quoted David Hume—and with good reason. It was the prophet Amos’s dream that animated King’s: “I have a dream that one day every valley shall be exalted, and every hill and mountain shall be made low, the rough places will be made plain, and the crooked places will be made straight; ‘and the glory of the Lord shall be revealed and all flesh shall see it together.’”
The traditions of individual liberty didn’t spring into being in the West miraculously, from nothing. They sprang from the tension between Jerusalem and Athens. Western civilization is a bridge suspended over the waters of chaos. Removing that tension collapses that bridge into the roiling river below.
I agree with a lot of Enlightenment ideals, particularly regarding individual liberty and natural rights, as we’ve discussed. But the new scientific Athenians will have to make common cause with the devotees of Jerusalem, rather than making war on them. The same holds true in reverse. For, as it turns out, there are larger philosophical threats to Western civilization that require our attention.
Unfortunately, reason is no longer in vogue. That’s why when I visited Berkeley to speak, protesters outside chanted “SPEECH IS VIOLENCE!” It’s why I—an Orthodox Jew, and the leading antagonist of the racist alt-right—have been routinely castigated as a Nazi. Subjectivity rules the day.
Europe had been devastated by war; Europe had discarded religion long ago. But in the United States, the post–World War II situation looked bright. Unlike in Europe, religious practice remained incredibly strong. As of 1950, about three-quarters of Americans were members of a church, synagogue, or mosque; in 1954, almost half of all Americans said they had attended church, synagogue, or mosque in the last seven days. Over nine in ten Americans identified as Christian.
In the aftermath of World War II, the American dream was still very much alive. And that dream was never merely a white picket fence, a dog, and two kids out in suburbia. It was a dream of cultural enrichment and common purpose. As Fred Siegel reports, between 1940 and 1955, local symphony orchestras increased 250 percent; in 1955, thirty-five million people paid to attend symphonies as opposed to fifteen million paying to attend baseball games. Even early television became part of the cultural education of the public: NBC presented a three-hour production of Laurence Olivier’s Richard III; as of 1951, there were twenty-five thousand members of the Great Books discussion groups, with “50,000 Americans a year . . . buying collections of the writings of Plato, Aristotle, the Founding Fathers, and Hegel” at serious cost to themselves.
Nazism didn’t arise from consumerism. It arose from communal purpose overriding individual purpose, and individual capacity abandoned in favor of worship of the communal capacity of the state. Nazism, in other words, lay a lot closer to Marxism than capitalism did.
But Fromm and thinkers like him suggested that the solution to the supposedly inevitable slide from dull consumerist conformity to horrible fascism lay in complete rebellion.
Marcuse & Self-Realization
The leading advocate of that rebellion was Herbert Marcuse (1898–1979). Marcuse, one of the progenitors of the so-called New Left, preached that the prevailing order had to be torn out root and branch. In 1955, coincident with the rise of Kinsey’s thought, Marcuse penned Eros and Civilization, in which he argued that repressive sexuality had damaged mankind, and that only freeing man of his Victorian mentality regarding sex could build a better world.
No wonder Marcuse’s popular slogan ran “Make love, not war.” Students in Paris during the 1968 revolt carried banners reading MARX, MAO, AND MARCUSE.
The capstone to his theory came in the form of censorship—what he called, in Orwellian fashion, “repressive tolerance.” Marcuse suggested that certain forms of speech had to be barred so that they could not emerge victorious, toppling critical theory itself. According to Marcuse, “the objective of tolerance would call for intolerance toward prevailing policies, attitudes, opinions, and the extension of tolerance to policies, attitudes, and opinions which are outlawed or suppressed.” Freedom, Marcuse said, was “serving the cause of oppression”; oppression, therefore, could serve the cause of freedom. Speech could be labeled violence—Marcuse called to “reexamine the issue of violence and the traditional distinction between violent and non-violent action”—and violence speech. In essence, “Liberating tolerance, then, would mean intolerance against movements from the Right and toleration of movements from the Left . . . it would extend to the stage of action as well as of discussion and propaganda, of deed as well as of word.”
Religion, Greek teleology, and capitalism all have something in common: none of them cares particularly much about “your bliss.” Religion suggests that your self-realization lies in consonance with God, and that any attempt to placate your ego through pursuit of personally defined happiness is bound to fail. Religion suggests that “your bliss” does not exist: only God’s bliss does. Greek teleology is utterly unconcerned with your personal definition of self-realization; the only thing that counts is whether you are acting virtuously in accordance with right reason. And capitalism cares far less about how you’re feeling than about your ability to create products and services someone else wants.
By calling self-realization the highest good, then, the New Left had cast out the specter of the roots of Western civilization and replaced them with a call to action. What was that call to action? Forming alliances directed at tearing down the system. The theory went like this: self-esteem is the key good. But self-esteem cannot be achieved while there are structural impediments to that self-esteem.
Those structural impediments came in the form of sexism, racism, and other forms of bigotry. Such bigotry didn’t have to be expressed outwardly—the structures of society themselves were institutionally biased against victim groups. And members of those victim groups couldn’t achieve self-realization so long as those institutions remained standing.
The only way for members of these victimized groups to restore their self-esteem would come by banding together to tear down the system.
Professor Kimberlé Crenshaw of Columbia University came up with a term to describe this coalition of victims: intersectionality. According to Crenshaw, human beings are members of various groups: racial groups, gender groups, religious groups, sexual orientation groups. And we can describe their “lived realities” by referring to the intersection between those groups. Thus, a black female lesbian Muslim has a different lived reality than a white male heterosexual Christian. Furthermore, we can identify the level of difficulty someone has experienced in life simply by referencing the various groups of which she is a member. The more minority groups of which you are a member, the more you have been victimized; the more you have been victimized, the more your opinion about the innate institutional bigotry of the United States ought to carry weight.
Those who refuse to abide by tribal dictates of intersectionality—people who insist that they are not victims of American society simply by dint of their skin color—are deemed sell-outs, Uncle Toms. Thus Clarence Thomas is not legitimately black because he doesn’t vote Democrat; Nikki Haley isn’t legitimately a woman because she is a pro-life Republican. According to Ta-Nehisi Coates, when black people call for individualistic thinking that strays from traditional Democratic ideology, that means supporting “white freedom”: “freedom without consequence, freedom without criticism, freedom to be proud and ignorant.”
In order to promote discussions about intersectionality, systems of oppression must be curbed—including the speech of others. Discussions must end. Reason must be thrown out the window since, owing to our different life experiences, we cannot understand one another.
Science, too, must take a backseat. Science might undercut the intersectional argument by providing evidence that not all suffering springs from institutional discrimination. Say, for example, that social science shows a high correlation between single motherhood and crime—and that single motherhood is especially predominant among American blacks. Or say that there are group differences in IQ, and that those differences may be at least partially heritable. Or say that men and women are biologically different, and that this difference explains differences in pay—since women choose different types of jobs than men and make different decisions with regard to the number of hours worked, their pay tends to be lower on average than that of men.
These basic facts become subject to scrutiny since science itself is a construct of the system. That’s the argument of Donna Hughes in the Women’s Studies International Forum, who explains, “The scientific method is a tool for the construction and justification of dominance in the world . . . the new methodological techniques were invented by men who were interested in explaining the inheritance of traits in order to support their political ideology of natural human superiority and inferiority.”
Sam Harris met with ire when he dared to point out that IQ differentials between groups exist. He did not say they were entirely genetic; he did not draw any policy inferences from IQ studies (for the record, Harris was a Hillary Clinton supporter). Nonetheless, Ezra Klein’s Vox chided him for citing studies, with scientists Eric Turkheimer, Kathryn Paige Harden, and Richard Nisbett suggesting that Harris had engaged in “pseudoscientific racialist speculation.” When confronted by Harris about this, Klein immediately retreated to the confines of identity politics: “These hypotheses about biological racial difference are now, and have always been, used to advance clear political agendas.” Thus, these hypotheses must never be discussed, since they fall afoul of identity politics considerations. And even engaging in scientific discussion is a form of identity politics, according to Klein—citing studies and fighting back against attempts to censor those studies is merely engaging in a tribalism of white scientists.
The National Science Foundation, a federal funding agency for science, says that it wants to pursue a “diverse STEM workforce”—not the best scientists of all races, but a specifically diverse group. To that end, the NSF spent millions funding projects on implicit bias research, one of the least-verified, most-hyped attempts to ferret out secret racism ever attempted, as well as $500,000 on studying intersectionality. Science departments around the country are seeking not those with the highest scores or the best credentials, but those with special “contributions to diversity.” As Heather Mac Donald points out, the American Astronomical Society has now asked PhD programs to stop using the Graduate Record Exam (GRE) in physics for applicants, since too few women were doing well. The same is happening in medicine, where schools have been encouraged to stop using the Medical College Admission Test (MCAT) for ethnic minorities. The impact: “From 2013 to 2016, medical schools nationally admitted 57 percent of black applicants with a low MCAT of 24 to 26, but only 8 percent of whites and 6 percent of Asians with those same low scores, according to Claremont McKenna professor Frederick Lynch.” How it helps patients to have less qualified but more ethnically diverse heart surgeons remains unexplained.
Professor Heather Heying faced similar treatment when she pointed out that men are taller than women. Heying’s husband, Professor Bret Weinstein, lost his job at Evergreen when he refused to leave campus after black students demanded that white teachers not teach on a particular day; students called him a racist and led a takeover of campus buildings. College speakers ranging from Charles Murray to Heather Mac Donald to Christina Hoff Sommers have been run off campus, usually following violent protests, for the sin of citing statistics. Better false statistics and bad social science than to violate someone’s sense of self-esteem.
All of this is deeply damaging to precisely the people who are supposed to be freed by it. Intersectional thinking promotes a victim mentality entirely at odds with the pursuit of fulfillment and success. If you are told repeatedly that your self-esteem is threatened by the system and the structure, and that even statistics and science must not offend you—if you are taught that your bliss matters more than objective truth—you become weak and fragile, unable to cope in the real world.
Focusing on right-able wrongs is worthwhile; blaming all disparities on discrimination leads to more political polarization and individual failure. Studies show that perceived discrimination is heavily connected with “lower grades, less academic motivation . . . and less persistence when encountering an academic challenge.” That’s certainly a case for fighting discrimination. It’s also a case for not exaggerating its extent, or silencing conversations in order to pander to sensitivities.
Individual capacity has been abandoned in this worldview—individuals, after all, are mere creations of the systems into which they have been born. Collective purpose, too, has gone by the wayside—after all, it’s the system keeping you down.
Perhaps the most polarizing and puzzling story in the Bible is the story of Abraham’s binding of Isaac. It’s a deceptively simple story: God tells Abraham to sacrifice his son, Abraham acquiesces without argument, Abraham takes Isaac to the top of a mountain to slaughter him, an angel intervenes and stops the killing, and Abraham substitutes a ram for Isaac rather than killing his son. But the story raises serious questions, obviously. Is God barbaric? Richard Dawkins says yes: “By the standards of modern morality, this disgraceful story is an example simultaneously of child abuse, bullying in two asymmetrical power relations, and the first recorded use of the Nuremberg defence: ‘I was only obeying orders.’” Dawkins says that the story can’t be an allegory for anything praiseworthy. And it can’t teach anything moral: “what kind of morals could one derive from this appalling story?”
As Ronald Reagan put it, “Freedom is never more than one generation away from extinction. We didn’t pass it to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same, or one day we will spend our sunset years telling our children and our children’s children what it was once like in the United States where men were free.”