These highlights are from the Kindle version of The Age of Entitlement by Christopher Caldwell.

In the mid-1960s, at a moment of deceptively permanent-looking prosperity, the country’s most energetic and ideological leaders made a bid to reform the United States along lines more just and humane. They rallied to various loosely linked moral crusades, of which the civil rights movement, culminating in the 1964 Civil Rights Act, provided the model.

The changes of the 1960s, with civil rights at their core, were not just a major new element in the Constitution. They were a rival constitution, with which the original one was frequently incompatible—and the incompatibility would worsen as the civil rights regime was built out. Much of what we have called “polarization” or “incivility” in recent years is something more grave—it is the disagreement over which of the two constitutions shall prevail: the de jure constitution of 1788, with all the traditional forms of jurisprudential legitimacy and centuries of American culture behind it; or the de facto constitution of 1964, which lacks this traditional kind of legitimacy but commands the near-unanimous endorsement of judicial elites and civic educators and the passionate allegiance of those who received it as a liberation. The increasing necessity that citizens choose between these two orders, and the poisonous conflict into which it ultimately drove the country, is what this book describes.

The Civil Rights Act of 1964, pushed through Congress by Lyndon Johnson in the immediate aftermath of John F. Kennedy’s death, was meant to deliver the coup de grâce to Jim Crow, and to end the black marches and police crackdowns in Mississippi and Alabama that television viewers were seeing almost weekly.

Over time, more of the country’s institutions were brought under the act’s scrutiny. Eventually all of them were. The grounds for finding someone or something guilty of discrimination expanded. New civil rights acts—notably the Voting Rights Act of 1965 and the Fair Housing Act of 1968—brought new rights for black citizens and new bureaucracies to enforce them.

Race is the part of the human experience in which American schoolchildren are most painstakingly instructed. Their studies of literature, of war, of civics, are all subordinated to it.

The NAACP not only staged events, it scripted them. The plaintiffs it hand-picked to carry them out were chosen for their sympathy and skill. One example is Rosa Parks. Over decades, Black History Month has taught millions of schoolchildren to think of her as a “tired seamstress,” whose need to rest her weary legs in the white section of a Montgomery, Alabama, city bus unleashed a storm of spontaneous protest. But she was considerably more than that.

Five months before the Montgomery bus boycott began, she had attended the Highlander Folk School in Monteagle, Tennessee, an academy that had trained social agitators and organizers for the Congress of Industrial Organizations. She was an organizer of considerable sophistication, one of the intellectual leaders of the Montgomery NAACP chapter.

There is a reason most countries are not multi-ethnic countries and why most of those that have tried to become multi-ethnic countries have failed. Where a shared heritage is absent or unrecognized, as it is in the contemporary United States, all the eggs of national cohesion are placed in the basket of the constitution. Hence a paradox: With the dawn of the civil rights era, the U.S. Constitution—the very thing that made it possible for an ethnically varied nation to live together—came under stress.

The problem is that rights cannot simply be “added” to a social contract without changing it. To establish new liberties is to extinguish others.

In 1964, on the eve of the Civil Rights Act, only 16 percent of Americans said that mass demonstrations had helped the cause of racial equality—versus 74 percent who said they had hurt it. Sixty percent even disapproved of the March on Washington, at least in the days leading up to it, while only 23 percent approved.

Starting in the early 1960s, an astonishing spike in crime, in which blacks made up a disproportionate share of both perpetrators and victims, took on aspects of a national emergency. The emergency would pass through various stages: the looting episodes in Memphis that preceded the assassination of Martin Luther King on April 4, 1968, and a new wave of deadly riots that followed it, the Attica Prison Revolt of 1971, the New York blackout of 1977, the crack epidemic of 1986, the Los Angeles “Rodney King” riots of 1992, O. J. Simpson’s acquittal in his 1995 murder trial. After that, crime rates fell in general, but the overrepresentation of blacks in the criminal statistics never went away.

The legislation of the mid-1960s made legal equality a fact of American life. To the surprise of much of the country, though, legal equality was now deemed insufficient by both civil rights leaders and the government.

Once its ostensible demands had been met, the civil rights movement did not disband. It grew. It turned into a lobby or political bloc seeking to remedy the problem according to what the legal scholar Alan Freeman would call the victims’ view: “lack of jobs, lack of money, lack of housing.” The federal government made it a central part of its mission to procure those things for blacks. The results were disappointing on almost every front—naturally, since the country had never signed up for such a wide-ranging project.

Americans were by no means opposed to black advancement—but they had accepted the government’s assurance that de jure racism was the main obstacle to it. They were probably surprised when the advance in blacks’ fortunes slowed after 1964, relative to its rate in the two decades after the Second World War.

In Griggs v. Duke Power Co. (1971), the Supreme Court justices asked whether a power plant in North Carolina could give aptitude tests to its employees. Title VII (Section 703) of the Civil Rights Act had said they could. But Chief Justice Warren Burger and a unanimous Court decided they could not, if such tests disadvantaged blacks in any way: “Good intent or absence of discriminatory intent does not redeem employment procedures or testing mechanisms that operate as ‘built-in headwinds’ for minority groups,” Burger wrote.

Government could now disrupt and steer interactions that had been considered the private affairs of private citizens—their roles as businessmen or landlords or members of college admissions boards. It could interfere in matters of personal discretion. Yes, this was for a special purpose—to fight racism—but the Griggs decision made clear that the government was now authorized to act against racism even if there was no evidence of any racist intent. This was an opening to arbitrary power. And once arbitrary power is conferred, it matters little what it was conferred for.

Truth was among the first casualties of the affirmative action regime. At the simplest level the term “affirmative action” meant discarding prevailing notions of neutrality in order to redistribute educational and employment chances on the basis of race. The idea that it could be a permanent solution to the problem of racial prejudice required doublethink. “Affirmative action requires the use of race as a socially significant category of perception and representation,” as Kimberlé Crenshaw and her colleagues put it, “but the deepest elements of mainstream civil rights ideology had come to identify such race-consciousness as racism itself.” Just half a decade into the civil rights revolution, America had something it had never had at the federal level, something the overwhelming majority of its citizens would never have approved: an explicit system of racial preference.

The civil rights movement was a template. The new system for overthrowing the traditions that hindered black people became the model for overthrowing every tradition in American life, starting with the roles of men and women.

Second-wave feminism began in 1963. Betty Friedan published The Feminine Mystique a few months before President Kennedy was shot. Her preoccupations were not those of the “first wave” of nineteenth-century abolitionists, prohibitionists, and suffragists. She did not philosophize about the inequalities, incompatibilities, and quarrels that the feminist Robin Morgan called “a five-thousand-year-buried anger.”

In President Dwight D. Eisenhower’s view, the great boon of the Highway Act of 1956 was that it made the country easier for U.S. military vehicles to traverse and thus easier to defend against attack. It incidentally tied the country together in ways that promoted trucking and automotive vacationing and eviscerated many small towns and urban neighborhoods.

The historian George Marsden recalled his 1950s childhood as a time when the United States “had been thrown into a position of world leadership.” “Thrown into”! Americans, like many victorious peoples, mistook the fruits of conquest for the rewards of virtue. They repressed, for the most part, their memory of the Korean War, the deadly stalemate into which Truman had drawn them in 1950.

Today The Feminine Mystique reads like a collection of magazine articles—exciting, insightful, gripey. It is hard to say exactly what it is about as a whole.

In the 1970s, a generation of women would come of age who could take advantage of the new opportunities, not having already wagered their happiness on a traditional mating and family-forming system. Universities grew, admitted more women, and insulated a critical mass of elite women from child-rearing expectations. It was they who would mobilize against “sexism.” The word did not exist when Friedan’s book came out. It was coined in 1968 by the writer Caroline Bird, who had enrolled at Vassar in 1935, three years before Friedan (then Bettye Goldstein) went to Smith.

Look at the TV commercial and marketing campaign that Eastern Air Lines ran in the summer of 1967. Entitled “Presenting the Losers,” it announced that the company had hiring standards so rigorous that only one in twenty applicants for a job as stewardess was hired. Rather than describe those standards, it paraded a dozen young women across the screen and invited the viewer to check them out, accompanied by a narrator’s contemptuous assessments: “She’s awkward…. Not very friendly…. She bites her nails…. She wears glasses…. Oh!… Aww, she’s married.” One is too tall, one too short, another chews gum. Eastern’s standards were wholly physical. Sexual. The feminine sexuality they reflected was preposterously, cruelly, pointlessly narrow. Yet a woman who did not want to find herself left alone in life would do well to conform to it.

The journalist Gloria Steinem (Smith College, 1956), who on a journalistic assignment in 1963 went undercover as a scantily clad Playboy Club “bunny,” saw women’s liberation as patterned on black liberation.

Workplace feminism exacerbated inequality. It increased the number of intra-class marriages, and it undermined the New Deal culture of the “family wage”—the common-sense assumption among Americans of all classes that a wage paid to a “working man” went to support his wife and children as well. In its imperfect way, the family wage had compensated the housewife for her unwaged work. Whether she got sufficient control over the part of the wage that she flat-out earned is a separate, intra-familial question and potentially a genuine feminist grievance.

Feminism offered corporations an excuse (what the political philosopher Nancy Fraser called a “legitimation”) for breaking the implicit contract to pay any full-time worker a wage he could raise a family on. It was feminism that provided, under pressure of the recessions of the 1970s, a pretext for repurposing household and national budgets. Instead of being used for reproduction (understood as both family-forming and investment), those budgets would now be consumed.

The Supreme Court made abortion legal in 1973 with its 7–2 ruling in Roe v. Wade. The justices must have assumed that the future course of thinking on matters of reproductive rights was easy to discern. They were wrong. Roe was the most politically divisive Supreme Court decision in 116 years.

In the last poll Gallup took before Roe v. Wade, in December 1972, 46 percent said they would favor a law permitting abortions during the first three months; 45 percent said they would oppose it. Polls taken in the days after Roe v. Wade show that Americans had a sense that abortion was bad, but lacked a moral framework that would allow them to think about abortion logically and confidently. They were concerned about dangers to a mother’s health (for which they favored allowing abortion by 91 to 8 percent), rape (they favored abortion by 81 to 10 percent), and birth defects (abortion was okay by 82 to 15 percent). They wanted, it seems, to guard against “risk.”

Female adultery had throughout history been a more serious trespass than male adultery, not because of any irrational sexism but out of a rational instinct for survival. When a man strays, he risks placing a child in some other household. When a woman strays, she risks introducing a creature with ulterior allegiances into the heart of her own. Such straying was the cause of a good deal of anguish and murder, and many of the best novels in the European tradition. But where contraception had reduced the incidence of such problems, abortion outright “solved” them. There need never be any progeny from an adultery. A certain rein on sexuality, particularly female sexuality, was snapped.

A mythology that moralizes sex may do something to shelter a delicate flame. It is hard to say exactly what, but there must be a reason that flourishing, fertile, creative societies tend to be conservative about sex.

As early as the 1920s, the English philosopher Bertrand Russell had warned that the establishment of welfare states risked turning not just the economy but everything upside-down, because the state would replace the father as protector and provider. Breaking the traditional family structure might look rational, modern, and sensible.

Nonetheless, Russell wrote, if this should occur, we must expect a complete breakdown of traditional morality, since there will no longer be any reason why a mother should wish the paternity of her child to be indubitable. … Whether the effect upon men would be good or bad, I do not venture to say. It would eliminate from their lives the only emotion equal in importance to sex love. It would make sex love itself more trivial. It would make it far more difficult to take an interest in anything after one’s own death. It would make men less active and probably cause them to retire earlier from work. It would diminish their interest in history and their sense of the continuity of historical tradition.

In the winter of 1976, a convention of the American Bar Association came within two votes of endorsing the legalization of prostitution. Pornography, meanwhile, remained confined to disreputable red-light districts. It is a measure of how rare dirty movies were in 1978 that X-rated videocassettes sold in “men’s magazines” for about a hundred dollars apiece.

The sudden victory of Ho Chi Minh’s troops over the French in 1954 had led, under the Geneva Accords, to the partition of the country into a communist North and a pro-American South. But Vietnam was never as divided as it looked. Ho’s Communists would likely have won the national election that the peace accords called for, had the United States permitted one.

In 1950, Harry S. Truman, using a UN Security Council resolution as a substitute for declaring war, had committed the United States to defend the southern part of Korea, divided at the 38th parallel, after an invasion from the Communist North. He wound up paying for it with his presidency. General Douglas MacArthur swiftly recovered almost the entire country, South and North, from North Korean forces in a campaign of tactical mastery, but when he had driven the North Koreans almost to the Yalu River along the Chinese border, China entered the war with hundreds of thousands of troops. The Chinese and North Korean counterattack drove U.S. forces back to the other end of the peninsula and took tens of thousands of lives. The war ended in a stalemate, with a demilitarized zone back at the 38th parallel.

For decades, reinvigorating Western religion and philosophy with a dose of Asian spiritualism had been a favorite project of Western intellectuals: Aldous Huxley, Alan Watts, F.S.C. Northrop, the Beat novelists and poets. The generations that came of age in the 1960s and 1970s were, in this sense, religious.

Americans of the sixties and seventies sought out places where the twentieth century had not done its awful work on the national character: The University of Minnesota professor Robert Pirsig worked throughout the 1960s on Zen and the Art of Motorcycle Maintenance (1974), which mixed a memoir of mental illness and a reflection on occidental philosophy with a picaresque account of a long motorcycle journey with his son.

The Baby Boom generation was the most culturally stable (or as we would say today, the “least diverse”) generation in American history. The 1970 census, the first in which all the Boomers were present, was the only one since the founding of the republic two centuries before in which the foreign-born population was below 5 percent. Immigration still came largely from Europe. The three largest sources of foreign-born were Italy (1,009,000), Germany (833,000), and Canada (812,000). Twenty-first-century readers, accustomed to much higher numbers, might need to be told that those were total resident populations, not annual arrivals. There were only 760,000 Mexicans in the country, slightly more than the number of people from the United Kingdom, Poland, or the Soviet Union. Apart from Canada, Mexico and Cuba were the only non-European countries in the top ten. By 2010, there would be no European countries at all on the top ten list. Here it is: Mexico, China, India, the Philippines, Vietnam, El Salvador, Cuba, Korea, the Dominican Republic, and Guatemala.

The years from roughly 1968 to 1971, when, as noted, the American population was less “diverse” than at any time in its history, were also the high-water mark of the country’s post-war culture, certainly of its popular music. In the four-and-a-half months of 1970 that separate the albums Workingman’s Dead (June) and American Beauty (November), the Grateful Dead alone cranked out a dozen songs that have endured through the generations.

It was not only that Richard Nixon had been forced from office in a scandal. The three great progressive endeavors of the preceding decades—civil rights, women’s liberation, the attempt to impose a liberal order on the world militarily—had all been resoundingly repudiated by the public. Post–Civil Rights Act, violent crime and drug abuse in inner cities were at record highs. Post–Ms. magazine, legislatures were rescinding the ERA ratifications they had only recently passed. Post–Vietnam War, Soviet troops entered Afghanistan and revolutionary governments came to power in Nicaragua and Iran.

The American public had come to see the political project of the 1960s as dangerously utopian. Voters brought former California governor Ronald Reagan to power to put an end to it. Instead, in ways that neither his supporters nor his detractors have ever fully understood, he rescued it.

William F. Buckley, Jr., the founder and editor of National Review, was impressed with Ayn Rand, but he described future Federal Reserve chairman Alan Greenspan as having been “absolutely bewitched” by her in the early 1950s, when he attended her salons at 36 East 36th Street in New York. Decades later, the future Republican speaker of the House, Paul Ryan, was similarly smitten as a high school student in Wisconsin. “I give out Atlas Shrugged as Christmas presents, and I make all my interns read it,” he would later say. In 1991, when the Book of the Month Club and the Library of Congress asked readers to name the book that had most influenced their lives, Atlas Shrugged ranked second only to the Bible.

That Reagan learned to sound certain conservative notes about sex in the 1970s, and even gave a barn-burning speech against abortion in 1983, should not distract from his pre-eminence as a sexual progressive. In 1967, as governor of California, he signed the furthest-reaching liberalization of abortion in American history. In 1969 he introduced no-fault divorce statewide with his Family Law Act. By the time of his campaign for the presidency in 1980, it would have been fair to say Reagan had done more than any politician of either party to build up the institutions of post-feminist sexual liberation.

Reagan’s libertarian vision had as much of Martin Luther King’s “dream” in it as it did of Ayn Rand’s capitalism. It was sunny and it was progressive. It assumed that an untrammeled thriving was possible, if only a few hardened sticks-in-the-mud and pessimists could be kept from undermining it. Skepticism was the equivalent of oppression. Bearers of bad news were indistinguishable from enemies of the people. This vision did not appeal to all minds, but it seemed to strike a chord in all hearts. That is why Reagan won every state but Minnesota when he ran for re-election in 1984.

In social life, questioning limits means not bowing down to anything. In economics, questioning limits means not paying for anything. At first, the American Baby Boomers appeared to be doing with little effort what other generations had only managed to do by the sweat of their brow. But that was an illusion. What they were doing was using their generation’s voting power to arrogate future generations’ labor, and trading it to other nations and peoples for labor now. Reaganism meant Reaganomics. Reaganomics meant debt.

One night in the autumn of 1974, when the top income tax rate stood at 70 percent, the business economist Arthur Laffer, then of the University of Chicago, sketched out a graph with a felt-tip pen on a cocktail napkin. He was having dinner at the Two Continents restaurant, across the street from the White House, with Wall Street Journal opinion writer Jude Wanniski and two aides to President Gerald Ford, Richard Cheney and Donald Rumsfeld. Laffer was claiming that, on top of the collateral inefficiencies they provoked, high tax rates could actually cost the government money. The “Laffer Curve” was not terribly rigorous. Its origins in a restaurant rather than a classroom are evident from the napkin itself, which is now in the possession of the National Museum of American History.

Wanniski would become the great salesman and impresario of so-called supply-side economics. He urged it on Republican politicians as a political strategy two years before he laid it out as an economic theory. With the toastmaster’s language that was his trademark, he argued in 1976 that Republicans kept losing elections because Democrats forced them to “embrace the role of Scrooge,” while Democrats played the role of Santa Claus. “The first rule of successful politics,” he wrote, “is Never Shoot Santa Claus.”

Dwight Eisenhower warned in his 1961 farewell address, “We—you and I, and our government—must avoid the impulse to live only for today, plundering, for our own ease and convenience, the precious resources of tomorrow.”

The rhetoric that brought Reagan two landslides was, among other things, a sign that Americans were unwilling to bankroll with their taxes the civil rights and welfare revolution of the 1960s and the social change it brought in its train.

Once debt was used as a means to keep the social peace, it would quickly run into the trillions.

Consider affirmative action—unconstitutional under the traditional order, compulsory under the new—which exacted a steep price from white incumbents in the jobs they held, in the prospects of career advancement for their children, in their status as citizens. Such a program could be made palatable to white voters only if they could be offered compensating advantages. A government that was going to make an overwhelming majority of voters pay the cost of affirmative action had to keep unemployment low, home values rising, and living standards high. Reaganomics was just a name for governing under a merciless contradiction that no one could admit was there: Civil rights was important enough that people could not be asked to wait for it, but unpopular enough that people could not be asked to pay for it.

By 1989, the year Reagan left office, according to an estimate by the economist Roy H. Webb of the Federal Reserve Bank of Richmond, the government’s unfunded liabilities (mostly for Social Security, Medicare, and veterans’ benefits) had reached $4–5 trillion, and would rise exponentially if nothing were done. Nothing was done. By the time of the 2016 election, a calculation of those liabilities similar to Webb’s ran to at least $135 trillion.

Periods of fiscal irresponsibility are often not immediately recognizable as such. Outwardly they can even look like golden ages of prosperity, because very large sums dedicated to investment are freed up for consumption.

Mass immigration can help a confident, growing society undertake large projects—the settlement of the Great Plains, for instance, or the industrialization of America’s cities after the Civil War. But for a mature, settled society, mass immigration can be a poor choice, to the extent that it is a choice at all.

The Hart-Celler immigration reform of 1965 is sometimes overlooked amid the tidal wave of legislation that flowed through Congress that year. It overturned the “national origins” system, passed under the Immigration Act of 1924 and reaffirmed in 1952, that had aimed to keep the ethnic composition of the United States roughly what it was. Even in the mid-1960s, immigrants from Britain and Germany made up more than half of national “quota” immigration—and those countries plus Ireland, Italy, and Poland accounted for almost three quarters.

The Hart-Celler bill would alter the demography of the United States. It would also alter the country’s culture, committing the government to cut the link that had made Americans think of themselves for three centuries as, basically, a nation of transplanted Europeans.

In the three-and-a-half centuries between its discovery and 1965, the United States had received 43 million newcomers (including a quarter-million slaves). In the half-century that followed Hart-Celler, it would get 59 million.

“People of color” was a harbinger of what later came to be called “intersectionality,” a philosophical-sounding term for the political strategy of bundling different minorities into a coalition.

A big problem with immigration was that it bred inequality. Its role in doing so was as significant as that of other factors more commonly blamed: information technology, world trade, tax cuts. In 1995, the economist George Borjas, writing in the Journal of Economic Perspectives, modeled the actual effects of immigration on Americans. He found that while immigration might have caused an increase in economic activity of $2.1 trillion, virtually all of those gains—98 percent—went to the immigrants themselves. When economists talk about “gains” from immigration to the receiving country, they are talking about the remaining 2 percent—about $50 billion. This $50 billion “surplus” disguises an extraordinary transfer of income and wealth: Native capitalists gain $566 billion. Native workers lose $516 billion.

Immigration, like outsourcing and tighter regulation of unions, allowed employers to pay less for many kinds of labor. But immigrants came with other huge costs: new schools, new roads, translation (formal and informal), and health care for those who could not afford it. Those externalities were absorbed by the public, not the businessmen who benefited from immigration.

Immigrants caused a revolution in the way Americans ate—more because of the new savings that could be had from immigrant labor than because of the cuisines immigrants brought. (That is, Starbucks is as much a creation of the immigrant economy as El Taco Rico.) Inexpensive landscape gardeners made possible an explosion of golf courses and an extraordinary beautification of the country’s suburbs.

The decades after Ronald Reagan’s arrival in power were a time of globalization and globalism. Globalization means the internationalization of the division of labor. Globalism means the political promotion of globalization, and the breaking and remaking of institutions to facilitate it.

In the year after the Kennedy assassination, the University of Chicago historian Daniel Boorstin warned about the way “we have used our wealth, our literacy, our technology, and our progress, to create the thicket of unreality which stands between us and the facts of life.” Computers were part of the thicket of unreality. If not kept in their place, they would deaden souls and turn people into robots. Martin Luther King, Jr., spoke of our “computerized plans of destruction” in Vietnam, and even when computers weren’t dealing death they were a kind of death.

Great Britain and France collaborated on a short-lived experiment in supersonic flight, the Concorde, but fewer than two dozen were ever built. The United States, more focused on military applications, never participated. Its SR-71 “Blackbird” reconnaissance plane, developed in secret in the early 1960s in the Lockheed “skunk works” by engineer Kelly Johnson, set the world air speed record in 1976—a record that was still standing four decades later.

Acquired knowledge obeys a Malthusian logic: Each new fact brings a handful of new questions, which, when answered, bring a handful more. Facts grow arithmetically, but questions grow geometrically. The result is a deterioration of certitude. The closer we get to the truth, the less confident we are in our possession of it.

The old word for this state of affairs was relativism. The French adage Tout comprendre, c’est tout pardonner (“To understand all is to forgive all”) summed it up. The new word for it was “postmodernism.”

But there was a difference between relativism and postmodernism. Relativism was a kind of tolerance. Postmodernism was a kind of insurgency. Postmodernism described narratives, from communism to mainline Christianity, that—interesting though they might be as myths—were losing their power to bind people into communities and spur them to action. Postmodernism also vied with those narratives: It was a project to delegitimize them. Every institution it penetrated, it politicized.

Postmodern writers and critics naturally took up one of the great obsessions of the 1960s and ’70s: authenticity. Back then, the counterculture had been divided into many bickering currents, but there was one viewpoint that all of them shared: a dismay that the United States was so fake. Uniform housing developments that either made claims to be what they were not (“Heritage Boulevard”) or purported to celebrate what they had actually destroyed (“Oak Meadows”), a Procrustean educational culture (although worse was still to come), factory-made food—these were powerful symbols of the dishonest basis on which the culture was built. The postmodernists’ way of addressing these issues was the one that artist Andy Warhol had pioneered in the 1960s: They saw that, most of the time, Americans not only tolerated fakeness but embraced it.

When Ronald Reagan arrived in office, a handful of states had a holiday celebrating Martin Luther King, Jr. Many more had chosen not to have one. Reagan signed a federal King holiday into law in 1983. Glorious and tragic though King’s life was, the felt need to commemorate it was waning in the 1980s.

Though many whites saw the King day as a “black holiday,” there was a case for white people to honor him as well. It was like the case that the abolitionist Charles Francis Adams, Jr., writing decades after the Civil War, had laid out for calling Robert E. Lee an American hero and not just a Southern hero—that he had forsworn violence even as many of his allies were urging it. King, similarly, had created a middle ground where the two races could meet.

When President Gerald R. Ford first officially recognized Black History Month in 1976, calling upon the public to “honor the too-often-neglected accomplishments of black Americans,” it seemed a negligible development, a perfunctory bit of officialese. The New York Times gave the story two small paragraphs at the bottom of the fifth column on page 33 of a Wednesday morning paper. Yet by the twenty-first century, black history was far from neglected. It was fair to say that ethnic studies had taken over not just college curricula but even primary and secondary school history teaching. In 2008, education professors from Stanford and the University of Maryland asked 2,000 eleventh and twelfth graders to name the ten most significant Americans who had never been president. Three standbys of Black History Month—Martin Luther King, the anti-segregationist protester Rosa Parks, and the escaped slave Harriet Tubman—ranked 1, 2, and 3, far ahead of (for example) Benjamin Franklin, Emily Dickinson, Mark Twain, Thomas Edison, and Henry Ford.

Political correctness, or P.C., as it was called by everyone except its adherents, was a grab bag of political stances descended from queer theory, critical race theory, critical legal studies, post-colonial studies, and various other new academic schools of thought. It aimed at the redesign of institutions and philosophies so that they might recognize, accept, vindicate, validate, and console groups deemed disadvantaged: blacks, women, gays, immigrants.

The origins of P.C. were political, not intellectual. They lay in feminist and anti-racist activism. Of all the battles that pitted students against forces of order in 1968—from demonstrations against administrative high-handedness at Columbia University to clashes with police at the Chicago Democratic convention—the most consequential, in retrospect, was the five-month strike launched by the Black Student Union and the Third World Liberation Front (TWLF) at San Francisco State University that fall. At the end of it, the university established the first ethnic studies departments in the United States. Thus began a process that would saturate the national culture with racial and gender politics.

These ethnic studies departments, which had spread to virtually all universities by the end of the 1970s, aimed not so much at understanding power relations among ethnic groups as at transforming them.

Traditionally there had been extraordinary expressive freedom in the United States. “Free speech, free press, free exercise of religion are placed separate and apart,” wrote Supreme Court justice William O. Douglas. “They are above and beyond the police power; they are not subject to regulation in the manner of factories, slums, apartment houses, production of oil and the like.” This conception of First Amendment freedoms was being eroded.

In early press accounts, the enforcers of P.C. appeared as hate-filled and totalitarian. They reminded the Berkeley philosopher John Searle, who covered various campus battles for the New York Review of Books, of Nazis. “The objective of converting the curriculum into an instrument of social transformation,” he wrote, “…is the very opposite of higher education. It is characteristic of the major totalitarian regimes of the twentieth century—leftist and rightist.”

Patrick J. Buchanan, the former speechwriter to Nixon and Reagan, was alone among conservative political figures in correctly assessing the power of political correctness. In 1992, he challenged George H.W. Bush for the Republican nomination and won a stunning 38 percent in the New Hampshire primary. Nostalgic, protectionist, isolationist, often misunderstood, Buchanan argued on behalf of the declining American “woodwork,” and warned that there was a “war for the soul of America” going on. His campaign was about globalization, which his friend and supporter Samuel Francis defined as “the managed destruction of the nation, its sovereignty, its culture, and its people.”

In the quarter-century after Reagan, conservatives lost every battle against the substance of political correctness. Almost no claim made for expanded rights or recognitions for women or minorities would be deemed to have “gone too far.” Already in 1988, Stanford had replaced its Western Culture requirement with a more flexible, multicultural rubric called “Cultures, Ideas, and Values.” The five hundred mostly minority students who on January 15, 1987, had marched with Rev. Jesse Jackson down Palm Drive in Palo Alto, chanting “Hey, hey! Ho, ho! Western Culture’s got to go!” were snorted at on campus and mocked in newspapers from coast to coast. Then they won. Change spread beyond the universities to corporations, foundations, and government offices. CNN founder Ted Turner ordered his company’s personnel to refer to things outside the United States as “international” rather than “foreign,” threatening to levy fines (which he would donate to the United Nations) on those who disobeyed. Political correctness was not a joke after all. It was the most comprehensive ideological capture of institutional power in the history of the United States.

“Subversive” scholars were supporting the very same things the government was mustering all its budgetary and enforcement power, and the corporate and foundation sector all its funding and ingenuity, to bring about. Rarely did professors now seek to subvert (as they had in the past) promiscuity or atheism or pacifism. Today’s “subversive” opinions—that there ought to be more blacks in positions of authority, that a gay relationship is just as good as a straight one—were given special protection by civil rights laws, and there were now hundreds of thousands of people at all levels of government and business who had been trained to impose them.

Once social issues could be cast as battles over civil rights, Republicans would lose 100 percent of the time.

“Political correctness” was a name for the cultural effect of the basic enforcement powers of civil rights law.

Only with the entrenchment of political correctness did it become clear what Americans had done in 1964: They had inadvertently voted themselves a second constitution without explicitly repealing the one they had. Each constitution contained guarantees of rights that could be invoked against the other—but in any conflict it was the new, unofficial constitution, nurtured by elites in all walks of life, that tended to prevail. This was a recipe for strife.

The real estate market is heavily supported by government. Big tax breaks—from deductions for mortgage interest to homeowners’ exemptions from capital gains tax—subsidize home owning over renting. And “government-sponsored enterprises” (GSEs) purchase, guarantee, and securitize home loans, issuing trillions in debt to do so. They create a deep secondary market in mortgages, making possible that socially transformative product—the 30-year mortgage at a low and preferably fixed rate—on which rests the way of life of the American middle class.

On October 28, 1992, at the end of an election campaign shadowed by the outbreak in Los Angeles that spring of the deadliest race riots in a quarter-century, President George H. W. Bush signed a Housing and Community Development Act. It inaugurated the process we have seen at many junctures in this book: the sudden irruption of civil rights law and diversity promotion into an area from which it had been mostly absent, in this case mortgage finance. The GSEs would now have “mission goals” of supplying “affordable housing,” particularly in “underserved areas”—the government’s euphemism for ethnic-minority neighborhoods.

That meant lowering underwriting standards. GSEs would now be able to “establish a down payment requirement for mortgagors of 5 percent or less” and “approve borrowers who have a credit history of delinquencies” as long as the borrower could show that his credit had been reasonably good for twelve months.

Bill Clinton, elected in 1992, made this mission his own. Starting in the summer of 1994, he crusaded against the dearth of private housing credit in poor, black, urban neighborhoods. He used the term “redlining,” which described the practice, illegal for decades and subjected to intense federal scrutiny since the Fair Housing Act of 1968, of systematically denying credit to black neighborhoods. The word would have meant little to Clinton’s white supporters, but it was part of the basic vocabulary of the civil rights movement. It imparted to blacks the incendiary accusation that the lower rates of mortgage lending to black people were due not to their poverty but to a conspiracy of racist bankers.

There was no evidence of redlining. Clinton’s allegation rested on a study from the Boston office of the Federal Reserve, the shortcomings of which were laid out almost immediately on the front page of the New York Times business section. Once the effects of poverty were adequately modeled and two outlying banks put aside, there was no discrimination to be found. There was inequality, certainly, but inequality was not the same thing as discrimination.

By the time Clinton left office, the Department of Housing and Urban Development (HUD) required that low-income loans make up 50 percent of the GSEs’ portfolio. Republicans never objected.

By 2006, 46 percent of new homeowners were making no down payment at all on their houses, and banks had trillions of dollars in loans on their books that would never have been made, absent government pressure. No well-informed accountant thought these loans could survive an economic downturn, and they did not. The politicization of poor people’s mortgages in a single country brought the world to the brink of economic disaster.

Raghuram Rajan, born in Bhopal and trained at MIT at the same time as Simon Johnson, had warned central bankers before the crisis that many American financial innovations meant to minimize risk, such as derivatives and swaps, were in fact amplifying it. (After the crisis Rajan would become governor of the Reserve Bank of India.) With extraordinary social sensitivity, Rajan linked the financial crisis to the gradual rise of American inequality.

To inquire too closely into borrowers’ creditworthiness would leave bankers in danger of falling afoul of anti-discrimination laws, particularly after Clinton’s redlining initiative.

The debt crisis was a sign that the United States was nearing the limits of its ingenuity in finding new ways to borrow money. Without extra money, the country could no longer afford to pay for both a pre– and a post–Great Society social order. When we speak of “polarization” today, what we mean is the conflict between those two social orders, newly reactivated under conditions of curtailed resources.

Obama was a black candidate with a difference. He had been raised in Hawaii, his father a Kenyan scholarship student he never really knew and his Kansan mother an activist scholar of international development. By his own account he was an “eighth or ninth cousin” of Confederate president Jefferson Davis.

His half-siblings were far-flung: one in China, George the mechanic in Nairobi. But Obama had no black American cousins. He was the descendant of American slave owners but not of American slaves.

Obama’s presidency extinguished an important illusion on which the consensus for civil rights law had rested: the illusion that its more intrusive tools would be temporary. There would be no need for affirmative action, Americans trusted, once we were all standing on Martin Luther King’s mountaintop. In the then most recent Supreme Court examination of affirmative action, Grutter v. Bollinger (2003), the University of Michigan Law School had assured the court that it “would like nothing better than to find a race-neutral admissions formula” and would get rid of affirmative action as soon as it did. That assurance was taken seriously. Justice Sandra Day O’Connor wrote in her opinion: “We expect that twenty-five years from now the use of racial preferences will no longer be necessary to further the interest approved today.”

Amazon was able to destroy the country’s traditional bookstores and chains, thanks to an extraordinary tax subsidy, conferred by both judicial shortsightedness and congressional design. A 1992 Supreme Court decision had given states the right to tax only companies that had a “physical presence” there. The justices did not envision that a retailer using emerging technologies might grow into a half-trillion-dollar company by cannibalizing local stores while having a physical presence almost nowhere. The 1998 Internet Tax Freedom Act banned taxes on internet access and email “postage,” as well as special levies at any level for online shopping.

In 2010 the investor Warren Buffett won praise for backing President Obama’s suggestion of a 30 percent minimum income tax for those making over a million dollars a year, a guideline that Obama tried to endear to the American public by attaching Buffett’s name to it. Buffett admitted he paid a rate of only 17.4 percent (about $6 million) on his income of $40 million a year. It took the supply-side economist Arthur Laffer to point out—correctly—that in 2010 Buffett’s net worth had actually risen not by $40 million but by $10 billion, a figure that was 250 times as large as the one Buffett cited. And, because it was in the form of unrealized capital gains, it was almost completely sheltered from income tax. Buffett’s effective tax rate was not 17.4 percent but 6 one-hundredths of 1 percent. He was proposing to raise it to 12 one-hundredths of 1 percent. The so-called Buffett rule was a means not of tapping Buffett’s wealth but of sheltering it.

Philanthropy, the Princeton historian Olivier Zunz wrote, is “a capitalist venture in social betterment, not an act of kindness as understood in Christianity.”

During the presidential campaign of 2008, two of Barack Obama’s friends and advisors from the University of Chicago collaborated on a book called Nudge, which used behavioral economics to justify an activist state. The law professor Cass Sunstein would go on to run the Obama White House’s Office of Information and Regulatory Affairs. The economist Richard Thaler, an early collaborator of Daniel Kahneman, would take up a similar role as an adviser to the Behavioural Insights Team set up for Britain’s Conservative prime minister David Cameron.

Thaler and Sunstein laid out the baleful consequences of poorly designed choosing systems and suggested ways to fix them. Schoolchildren carrying their trays through lunch lines don’t usually mull over which dessert they prefer; they often grab the first thing their eyes alight on. So why not put at eye level apples and pears rather than Cheetos and Funyuns?

Nudge challenged the “harm principle” that John Stuart Mill had laid out in his book On Liberty (1859). Mill held, basically, that the state ought to intervene in people’s lives only if there is a risk of their harming others. One of Mill’s reasons for not intervening is that people tend to be the best judges of what is good for them. Sunstein disagreed: What is good for them, he wrote elsewhere, is “largely an empirical question.”

But Sunstein was leaving out the first and most important step. What is good for people, at least for free people, depends on their priorities. The question becomes empirical only once a priority has been set. The important political matter is who gets to set it. Thaler and Sunstein were disguising the arrogant project of setting priorities as a modest and commonsensical project of measuring outcomes.

In a lot of languages the word for “vote” is the same as the word for “voice” (голос, Stimme, voix). Democracy is the system in which the voice of the people is sovereign and beyond appeal. Nudge puts conditions on that sovereignty. In this sense its principles are those of civil rights law, in which the voice of the people is sovereign, once it has been cleared of the suspicion of bias. If people are blocked by bias from understanding the “true” nature of human relations, authorities have the license, even the obligation, to overrule them. Civil rights thus does not temper popular sovereignty, it replaces it. What we call political correctness is the natural outcome of civil rights, which makes fighting bias a condition for the legitimacy of the state. Once bias is held to be part of the “unconscious,” of human nature, there are no areas of human life in which the state’s vigilance is not called for.

Civil rights was always this way: dignity was an integral and non-negotiable part of what was demanded, and a government interested in civil rights must secure it, no matter what the cost in rights to those who would deny it.

In order to provide gays with their public dignity, authorities were going to have to alter, delegitimize, and eliminate a lot of institutions in which people had made large moral investments.

While it used a rhetoric of exclusion traditionally deployed by downtrodden ethnic minorities, gay marriage was also the single cause that most united the richest and best-connected people on the planet. It united them politically more than either tax rates or financial regulation did. The Human Rights Campaign, which lobbied for gay rights legislation and backed gay rights litigation, was bankrolled by Amazon, American Airlines, Apple, Citibank, Coca-Cola, Dell, Goldman Sachs, Google, Hershey, Hyatt, IBM, Intel, Lexus, Macy’s, MasterCard, Microsoft, Morgan Stanley, Nationwide, Nike, Northrop Grumman, Orbitz (“Visit GayOrbitz.com”), Pepsi, Pfizer, Pottery Barn, Prudential, Shell, Starbucks, Target, Tylenol, UPS, Whirlpool, and Williams-Sonoma—to give only a selective list of its very largest contributors.

The investors George Soros and Michael Bloomberg, tech billionaires Bill Gates and Jeff Bezos, entertainers David Geffen and Brad Pitt, Republican financiers Paul Singer and Seth Klarman—all backed gay marriage with millions in donations. Support for gay marriage in Silicon Valley was almost unanimous. Google’s employees gave 96 percent of their campaign contributions, and Apple’s 94 percent, to oppose California’s anti–gay marriage Proposition 8. There were no equivalents on the other side. When it was discovered in 2014 that Mozilla chief executive officer Brendan Eich, the creator of the JavaScript programming language, had given $1,000 to support Prop 8 six years before, a storm of outrage forced his resignation.

The reforms of the 1960s had created mechanisms for forcing social change even against the democratically expressed wishes of the electorate. As Rosa Luxemburg had written of the Russian Revolution, “The real dialectic of revolution stands the parliamentary cliché on its head: The road leads not through majorities to revolutionary tactics, but through revolutionary tactics to majorities.”

Media jobs in the old days had been spread nationwide and linked to the political cultures of their diverse readerships. “The Sioux Falls Argus Leader is stuck in South Dakota just as the owners of hydroelectric plants in the Rockies are stuck where they are,” the journalists Jack Shafer and Tucker Doherty pointed out. But three quarters (73 percent) of twenty-first-century internet media jobs were either in the northeast corridor, on the West Coast, or in Chicago. Those places were rich and overwhelmingly Democratic: 90 percent of people working in the reconfigured news industry lived in a county that Democrats would win in the 2016 presidential election.

As enacted in 1964, civil rights had meant a partial repeal of the First Amendment. It had withdrawn the right to freedom of association long implicit in the freedom of assembly. But now, when authorities and judges hit difficulties or resistance in advancing civil rights, they were tempted to insist on ever-larger derogations of First Amendment rights, perhaps believing themselves to be working in the “spirit” of civil rights. Civil rights meant affirmative action. Civil rights meant political correctness. And in the wake of court-ordered gay marriage, it quickly became apparent that civil rights would mean court-ordered approval of gay marriage.

At some point in the half-century leading up to 2014, the United States had ceased to be a classic, open-ended democratic republic. It had taken on features of a “managed democracy” along European lines.

Starting with the passage of the Civil Rights Act in 1964, whites were “racialized” in a way they never fully understood. They were the people whom American politics was not about. They were excluded—at least as claimants—under civil rights law. As civil rights spread to cover groups other than blacks, the term “people of color” marked whites off as the only people so excluded, and legitimized that exclusion. In this sense the United States had re-created the problem that it had passed the Civil Rights Act to resolve: It had two classes of citizens.

That was the civil rights project’s great Achilles’ heel. It eventually drove a critical mass of whites to conceive of themselves as a race, whether they wished to or not. The process through which most white Americans came to understand the dynamic of the system was not fast, but it was inexorable. It took fifty years, years that saw a steep decline in their social status and a degradation of their way of life.

By the time of the 2016 election, which it did much to decide, the opioid epidemic that had begun with OxyContin was killing not 1.5 or 2 but 20 Americans per 100,000. In New Hampshire, Ohio, and Pennsylvania it was killing almost 40 per 100,000, and in West Virginia it was killing 50. Yet until the Republican candidate began to mention it, the airwaves were nearly silent about it. So were the newspapers.

This was a health emergency of maximum gravity, comparable to the outbreak of AIDS in the early 1980s. The tepid response had to do with how drugs were tied up with race and class. Since to be alarmed about heroin was to be alarmed on behalf of poor white people, Americans were hesitant—perhaps “frightened” would be a better word—to be seen to take account of it. Unlike blacks in the decades after the Vietnam War, twenty-first-century suburban and rural whites were not protagonists of the nation’s official moral narrative.

In 1975, rural America had meant banjos, bait shops, and cornbread. By the election of 2016, it meant SNAP cards, internet pornography, and OxyContin.

Far from suppressing the Southern obsession with race and “blood,” civil rights conveyed it to every nook and cranny of the country. The entire culture—all journalism, all art, all books—was suffused with it.

At one point in 2015, the Washington Post called the black incendiary Ta-Nehisi Coates the country’s “foremost public intellectual”—and it was probably right, since race was getting to be the sum total of what the country’s intellectual life was about.

The social scientist Andrew Hacker cited as evidence of rank-and-file Republicans’ racism their agreement with the statement “If blacks would only try harder they could be just as well off as whites.” But of course disagreeing with that statement would have been even stronger evidence of racism.

Nathan Glazer had foreseen such misunderstandings in the mid-1970s, when certain radical civil rights policy makers had assumed that “any deviation from statistical parity could, and should, be interpreted as owing to unconstitutional discrimination.” All difference, all inequality, was becoming actionable. But now there need not even be difference or inequality at all. Racism was becoming an official narrative in newspapers, on television, and on the internet.

Belonging to a racial minority conferred moral authority.

It turns out to be a difficult and unnatural thing to replace a system of prejudice with a system of real equality and respect. It’s a lot to ask of people. As Friedrich Nietzsche understood, it is far easier, for former perpetrators and former victims alike, simply to transvalue the prejudices—so you wind up with the old world turned upside down.

In 1992, after the acquittal of four Los Angeles police officers whose beating of black fugitive Rodney King had been captured on video, black and immigrant youths ransacked much of South Central Los Angeles, leaving 63 dead and thousands injured. The riots’ climax came when a gang of young men were videotaped from a traffic helicopter hauling the hapless white truck driver Reginald Denny out of his cab, crushing his skull with a concrete block, and doing a celebratory dance in front of the camera.

The murder trial of O. J. Simpson in 1995 hardened this division. Simpson, in his youth the greatest running back in the history of American football, in middle age a beloved movie star and television personality, had been accused of stabbing to death his white ex-wife Nicole Brown Simpson and her friend Ron Goldman, and had fled arrest. The evidence against Simpson—blood at the scene, blood in his car, blood in his house, the prints of his rare Bruno Magli shoes, and seemingly incontrovertible DNA testing—pointed to an open-and-shut case. But Simpson’s lawyers turned it into a way for a largely minority jury to send a message.

Simpson’s lead defense lawyer, Johnnie Cochran, had more rights in the trial—both statutorily and informally—than his prosecutorial counterparts: The Simpson defense could appeal the verdict if, for instance, it found the state was purging the jury of blacks (under the 1978 case People v. Wheeler). That the prosecution had no such recourse allowed the defense to purge the jury of whites.

In 2009, blacks, who made up 13 percent of the population, accounted for almost half the arrests (49.2 percent) for murder and more than half (55.5 percent) for robberies.

On August 9, 2014, 18-year-old Michael Brown was shot dead by policeman Darren Wilson after a confrontation in a housing project in Ferguson, Missouri, a fast-changing but relatively well integrated town in St. Louis County. The incident sparked three waves of riots, the first one lasting two weeks, the second one coming three months later when a grand jury chose not to indict Wilson. There was additional unrest on the first anniversary of the shooting.

The controversy was, in hindsight, unwarranted. According to an investigation into Wilson’s conduct by the civil rights division of the Obama administration’s Justice Department, the 289-pound Brown, high on THC and accompanied by his friend Dorian Johnson, had stolen several boxes of cigarillos from an Indian-owned variety store, manhandling the diminutive owner when he protested. The incident was captured on video. When Wilson, making his rounds in a squad car, encountered the pair a few moments later, Brown moved to the driver’s window, blocking Wilson’s exit, punched him in the face, reached into the car, and got a hand on Wilson’s gun. Wilson fired a shot into Brown’s hand. He pursued Brown when he ran and shot Brown only when Brown turned and charged him.

The Justice Department’s novel-length report, full of lab work, cell phone records, and dozens of interviews, showed that there was no case for indicting Wilson for any kind of police misconduct. Brown’s DNA was found on Wilson’s collar, shirt, and pants. Gunpowder soot was found on Brown’s hand, and evidence of change in the skin from heat discharge was “consistent with Brown’s hand being on the barrel of the gun.” On detail after detail, in multiple interrogations, Wilson’s account matched that of the material evidence.

A mythological account spread nonetheless. Brown, it was said, had been raising his hands to surrender. He had been saying “Hands up, don’t shoot!” Most, perhaps all, of these stories had their beginnings in tales that Brown’s accomplice Johnson (known as Witness 101 in the civil rights investigation) had told in the aftermath of the shooting. Johnson claimed that Wilson had shot Brown in the back and then killed him with a volley of shots as he stood with his hands up, pleading that he was unarmed.

In November, days after the grand jury’s decision not to indict Wilson and the riots that followed, black players on the St. Louis Rams took the field for a Sunday-night football game with their arms raised in a hands-up-don’t-shoot pose. Ta-Nehisi Coates raged, in a short book that would become a number-one bestseller, about the difficulty of telling his son why “the killer of Mike Brown would go unpunished.” The country’s official culture was now squarely on the side of the protests, even if any neutral reading of the evidence showed that Brown’s killer, Wilson, should have gone unpunished.

Yale and other outposts of the American credentialocracy now belonged to the student protesters and to “activists” like them. The Christakises canceled their spring courses and resigned their Silliman positions shortly thereafter. The student coalition Next Yale, which described itself as “an alliance of Yale students of color and our allies,” had demanded the Christakises’ resignation, the installation of a bias-reporting system, the renaming of three of Yale’s colleges after non-whites, more non-white psychologists, the institution of “racial competence and respect training,” and millions for ethnic studies programs. Yale met most of their demands and added $50 million to hire professors “who would enrich diversity.” Brown University, after a similar uprising, earmarked $100 million to create a “more just and inclusive campus.”

The physicist Max Planck once wrote, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” Even radical movements that have appeared to fail often go on wreaking institutional change slowly and quietly.

Yale may have been an “intellectual space” under the old constitution. Under the new one, how could it be? Once Yale changed its rules to recruit minority students through affirmative action, as the law required it to, upholding purely meritocratic standards (as Kimball urged) would have been a contradiction in terms. Following rules of decorum (in which the Christakises placed so much faith) was only a roundabout route to the same disappointment. An institution could claim to be upholding its old standards—Yale did—but it could not actually uphold them. It had taken on political responsibilities that overrode its educational ones. The protesters were wise to this. Their incredulity at the Christakises’ failure to “understand” the real terms of their service at Yale was surely unfeigned.

The faster racism and privilege are dismantled, the greater the psychological need to point to racism and privilege.

On American and British campuses, so-called no-platforming—the move to “deny fascists, organized racists and other haters the freedom to spread their poison”—was on the rise. Only an eighth of Americans over the age of 70 believed that “government should be able to prevent statements that are offensive to minority groups.” But a quarter of Americans aged 35 to 70 believed it, and fully 40 percent of adults under 35 did.

An idea of Americans as something other than a people had begun to take hold of the political class: Ours was a “creedal” nation, a country united not by race or by history but by belief in certain ideas. This sounds like open-mindedness, but if not managed carefully, it can turn into the opposite. A country you can join by simply changing your mind is a country you can fall out of by doing the same. On literally dozens of occasions as president, Barack Obama described highly specific political opinions, always those of his own party, as expressing “who we are” as Americans. Not since the McCarthy era had Americans been told that to disagree with the authorities was to forfeit one’s membership in the American nation.

The Civil Rights Act of 1964 was, as we have noted, a legislative repeal of the First Amendment’s implied right to freedom of association. Over decades it polarized the political parties and turned them into something like secret societies, each of them loyal to a different constitutional understanding. Democrats, loyal to the post-1964 constitution, could not acknowledge (or even see) that they owed their ascendancy to a rollback of the basic constitutional freedoms Americans cherished most. Republicans, loyal to the pre-1964 constitution, could not acknowledge (or even see) that the only way back to the free country of their ideals was through the repeal of the civil rights laws.