We Used to Be Wise


How the United States Dismantled the Structures That Made It Work, and the Fifty-Year Campaign That Did It

This is the new book — the whole thing, start to finish, for paid readers first.

I'd love your help before it goes out into the world. Read it like a friend looking over my shoulder. Anything that trips you up, bores you, or could be sharper — tell me. Comments, notes, typos, hunches. All of it helps.

You're the first readers, and that means something to me. Thank you.

— Joe

Arrakis Publishing, Inc.

Copyright © 2026 Joe Zeigler

All rights reserved.

No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the publisher, except in the case of brief quotations embodied in critical reviews and certain other noncommercial uses permitted by copyright law.

Published by Arrakis Publishing, Inc.

Crystal River, Florida

First Edition, 2026

ISBN: 979-8-9952539-6-9

arrakispublishing.com

Printed in the United States of America

 

Contents

Introduction: The Road Not Taken

Chapter One: The Memo

Chapter Two: The Floor

Chapter Three: One Side of the Story

Chapter Four: Who Pays

Chapter Five: The Casino

Chapter Six: The Panels

Chapter Seven: The Handshake

Chapter Eight: The Ballot

Chapter Nine: The Auction

Chapter Ten: The Stolen Court

Chapter Eleven: Your Word

Chapter Twelve: What We Built

Chapter Thirteen: The Supply Chain to the Grave

Chapter Fourteen: The Bill

Chapter Fifteen: The Map

Chapter Sixteen: The Loan

Chapter Seventeen: The Pipes

Chapter Eighteen: The Revolving Door

Chapter Nineteen: The Reckoning

Conclusion: The Question

 

Introduction: The Road Not Taken

We used to be wise.

Not virtuous. Not fair. The exclusions were real and their costs are still compounding. But wise in a measurable way. We understood what happens when banks gamble with depositors' money, because we had seen it. Congress wrote a law. The law worked for 66 years. We understood what happens when workers have no floor and no power to push back, because we had the graves. We built institutions. For 30 years, wages at the bottom grew faster than wages at the top. We understood what happens when alliances go conditional, when courts get captured, when concentrated wealth purchases political outcomes, because all of it had happened before and we had documented exactly what it cost.

The wisdom was not philosophical. It was empirical. We paid for it in specific catastrophes, and we kept the receipts.

Then, over 50 years, the people who did not like what the receipts showed spent a great deal of money making sure everyone forgot.

This book is the accounting.

In 1979, Jimmy Carter put 32 solar panels on the roof of the White House West Wing. Solar thermal, not photovoltaic. Functional. Symbolic. Deliberate. He dedicated them on June 20 and gave a speech about the sun that nobody could embargo. He was describing a choice. The country could either build the infrastructure for energy independence or remain dependent on oil that any regime sitting on top of it could turn into a weapon at any moment. The panels were a beginning. They were also a statement that some roads are better than others and that the man in charge understood the difference.

Reagan had them taken down in 1986 during a roof resurfacing. They ended up in a warehouse in Virginia. A journalist found them years later, still in rows, still working. They are in a college cafeteria in Maine now, heating water, exactly as Carter said they would be.

The road Carter was pointing toward is the road this book is about. Not that road specifically, though energy policy is one chapter. The pattern. Fifty years of choosing, over and over, the road that served the fewest people most completely over the road that would have served the most people adequately. The choice was not always made by presidents or Congress. It was made by an infrastructure of think tanks and donor networks and legal foundations and lobbying operations and media organizations, built deliberately, funded generously, and deployed with patience the other side has never matched.

The patience is the most important word in that sentence.

Short-term democratic politics, the kind that runs on two-year House cycles and four-year presidential ones, cannot match an adversary that plans in decades. The dismantling project was never in a hurry. It lost elections and kept moving. It lost news cycles and kept building. It lost cultural moments and kept funding the legal infrastructure that would outlast them. By the time the public understood what was happening, the instruments to reverse it had been captured.

That is the story this book tells. It opens with a number.

In 1971, the top 1 percent of households in the United States held around 9 percent of national wealth. In 2024, they held around 31 percent. The bottom half of United States households went the other direction. They held around 4 percent of national wealth in 1971 and around 2.5 percent in 2024. The graph of those two lines, one rising, one flattening toward zero, is the graph of what this book is about. The lines did not separate by accident. People wrote memos. They funded foundations. They bought courts. They rewrote the rules that governed both lines. Then they told the country it had happened naturally.

It did not happen naturally.

You already know some of this. You have watched a neighbor lose a house in 2008 and the people who caused the crisis receive bonuses paid by the public. You have watched a relative ration insulin. You have watched an election where the outcome in your state was settled months before ballots were counted because the district maps had made it so. You have watched a president stand beside the leader of a hostile foreign power and take that leader's word over the word of United States intelligence services. You have noticed that none of this is punished, that the careers of the people responsible continue, that the structures that were supposed to prevent this were either absent or captured. You have been told that what you were seeing was not the pattern it appeared to be. This book is the pattern.

The reason to read this book is not to be convinced that inequality grew. That fact is not contested. The reason to read this book is to understand how. Not the standard story, which attributes the gap to globalization and automation and the inevitable gravity of markets. That story is incomplete in a specific way. It leaves out the politics. It treats the distribution of globalization's costs as a law of physics rather than a series of decisions made by identifiable people with identifiable interests. Germany globalized. Japan globalized. The Nordic countries globalized. They made different choices about who would absorb the costs. The United States distribution was a choice. The choice had authors.

This book names them.

Naming them matters. A system that lies about its own origins cannot be corrected by people who believe the lies. If the current state of the country is the result of forces nobody chose, no remedy is possible. If the current state of the country is the result of choices made by specific institutions funded by specific people pursuing specific interests, the remedy is political. The first conclusion ends the conversation. The second one starts it. The difference between them is the difference between despair and work.

The book treats the reader as an adult. It does not reassure. It does not offer optimistic endings where the evidence does not support them. It does not flatter the reader's prior beliefs and does not invent new ones to flatter. It presents a record. The record is what it is. The record is also enough. A country that can no longer describe what happened to it cannot begin to decide what to do about it. Description, accurate and complete, is the first political act available to a country in this condition. It is also the act the infrastructure this book describes has spent 50 years preventing.

What follows is the work.

The mechanism begins with a memo. In August of 1971, a tobacco lawyer named Lewis Powell wrote a memorandum for the United States Chamber of Commerce explaining how concentrated wealth could capture the institutions that governed it. Courts. Universities. Media. Legislatures. Regulatory agencies. The memo was not a secret. It was a blueprint. Two months later, Nixon put its author on the Supreme Court. The first chapter of this book tells that story. The remaining chapters describe what the blueprint built.

The blueprint built a country where 45,000 to 68,000 people die every year from causes that would not kill them in any other wealthy democracy. A country where Black women die in childbirth at more than five times the rate of women in Germany. A country where elections are decided by the drawing of district lines rather than by the ballots cast inside them. A country whose alliances rest on a promise the last president actively questioned and the current one might again. A country that learned after 1929 how to regulate banks, wrote the law, watched it work for 66 years, then repealed it. A country that had by 1979 mapped a path away from fossil fuel dependence and chose not to take it.

Every one of these outcomes has an author. The authors are named in the chapters ahead.

The argument of this book is not that the authors are individually unusual. Most of them are ordinary in the way that most people who do enormous damage are ordinary. They believed in what they were doing. They had theories about why it would help. Many continue to believe the theories and continue to not notice what the theories produced. The book does not require a thesis about their character. It requires only a record of what they did and an honest accounting of what it cost. The record exists. The cost is measurable. The accounting is what the rest of this book is for.

The reader who stays through these chapters will finish with one question. What happens next. The book does not answer it. The book does not know. What the book offers is a diagnosis complete enough that the reader no longer has to argue with the framing. The framing is settled. What remains is the decision about what a country does with a diagnosis of this kind.

The decision is not in this book.

The 20th century taught this country how to survive its own worst instincts. It built the banking rules. It built the wage floor. It built the alliances. It built the court. The 21st century has been the story of unlearning each of those things. The unlearning was funded. It was deliberate. The instrument that funded it is the subject of the next chapter.

Chapter One: The Memo

Lewis Powell, August 23, 1971, and the Blueprint for Everything That Followed

Lewis F. Powell Jr. was a tobacco lawyer. He had been counsel to Philip Morris for more than a decade, arguing in every venue that would hear him that the evidence linking cigarettes to cancer was not yet conclusive enough to justify restricting the sale of the product that was killing his client's customers. He understood the value of manufacturing doubt. He understood the value of organized institutional effort against regulatory accountability. He was also a former president of the American Bar Association, a member of eleven corporate boards, and a distinguished Virginia gentleman who had navigated the desegregation era with more grace than most of his contemporaries. He was the right man to write this particular memo.

Eugene Sydnor, chairman of the education committee of the United States Chamber of Commerce, asked him to write it. The topic was the state of American enterprise. Powell titled the document "Attack on the American Free Enterprise System," and the 34 pages he produced on August 23, 1971, were, in the words of historian Nancy MacLean, "a call to arms for the corporate mobilization that we've seen over the ensuing five decades." Two months after Powell submitted it to Sydnor, Nixon nominated him to the Supreme Court. The Senate confirmed him. He served until 1987. The memo became public only after the nomination, when columnist Jack Anderson reported its existence in the Washington Post. By then it had already been quietly distributed to the Chamber's corporate membership and was being read in boardrooms across the country.

What the Memo Said

Powell was not a paranoid man. The memo had an edge of alarm anyway. He argued that the American free enterprise system was "under broad attack" from a coalition whose power far exceeded its numbers. Leftist professors reshaping young minds. Consumer advocates like Ralph Nader who had become heroes to the press. Journalists whose instinct was to expose rather than celebrate. A regulatory state that had grown teeth in the late 1960s as public concern about corporate behavior, from auto safety to environmental poisoning to pharmaceutical deception, produced actual legislation. The Clean Air Act passed in 1970. OSHA was established. The EPA was created. The Federal Trade Commission was filing cases. Nader's Raiders were everywhere. To Powell, this was not the natural operation of democracy. It was an existential threat.

His prescription was not defensive. "Strength lies in organization," he wrote, "in careful long-range planning and implementation, in consistency of action over an indefinite period of years, in the scale of financing available only through joint effort, and in the political power available only through united action." He was describing, with unusual candor, exactly what the business community needed to build. A permanent, coordinated, well-funded apparatus for shaping public opinion, the judiciary, the legislative process, and the universities. Not a response to specific regulations. A sustained campaign to shift the entire terrain on which regulatory battles were fought.

The specific recommendations matter because they were all implemented. He called for the Chamber to build "a staff of highly qualified scholars in the social sciences" who would produce research supporting business interests. He called for monitoring textbooks and pressuring universities to present "a balanced viewpoint." He called for funding speakers bureaus that would place pro-enterprise voices on campuses and in public forums. He called for systematic engagement with the courts, "a first-rate staff of lawyers" to litigate in the business community's interests, and "judicial activism of the highest order." He called for sustained engagement with the media, including organized challenges to the broadcast licenses of stations whose coverage was hostile to business. He called for direct political engagement. Funding candidates. Engaging the parties. Pursuing legislative agendas at the state as well as the federal level.

In a private memo to the Chamber of Commerce, Powell had laid out the roadmap for what would become the most consequential institutional transformation in American political life since the New Deal.

The Apparatus Assembles

The memo was circulated within Chamber of Commerce circles in late 1971. At first it drew little public notice. That changed in 1972 when Jack Anderson obtained it and published excerpts, using it to ask whether a man who had written a blueprint for corporate capture of American institutions should be sitting on the Supreme Court. The controversy was short-lived. Powell was already confirmed. But the publicity put the memo in front of a wider audience of exactly the people it was designed to reach. Corporate executives read it. Conservative philanthropists read it. They acted on it.

In 1971, before the memo, 176 corporations had public affairs offices in Washington, D.C. By 1980, nine years later, there were 2,445 companies with Washington offices, supported by 9,000 registered lobbyists and 60,000 trade association employees. That apparatus was built in direct response to Powell's call. A specific strategy, implemented by people who had read a specific document and taken it seriously.

The Heritage Foundation was founded in 1973 by Paul Weyrich with seed money from beer magnate Joseph Coors, who was reportedly moved to act in part by the Powell memo. The American Enterprise Institute, a modest operation until then, received enormous infusions of corporate funding and expanded its staff and output dramatically. The Cato Institute was founded in 1977 with Koch money. The Pacific Legal Foundation, the first of the corporate-funded public interest law firms Powell had called for, was founded in 1973, housed in the Sacramento Chamber of Commerce building, with 80 percent of its income from corporations and corporate foundations. ALEC, the American Legislative Exchange Council, was founded in 1973 by Paul Weyrich to produce corporate-drafted legislation at the state level. Its mechanics are described later in this chapter.

In 1973, the Chamber of Commerce organized two Powell Memo Task Force meetings to accelerate the agenda. One was held at Disney World and was attended by Gerald Ford, then Republican House minority leader and soon to be president, along with CBS president Richard Jencks, ABC executive James Hagerty, and newspaper magnate Edward Scripps II. A second meeting in Dallas featured a speech by a young television producer named Roger Ailes, who had made his reputation managing Richard Nixon's television strategy in 1968. Ailes would go on to launch Fox News in 1996. His presence at a Powell Memo Task Force meeting in 1973 was not incidental. He was already embedded in the network Powell had called into existence, doing the media work that network required.

The Federalist Society

In 1982, a group of conservative law students at Yale, Harvard, and the University of Chicago founded the Federalist Society for Law and Public Policy Studies. Its funding came from the Olin Foundation, the Scaife Foundations, and the Koch networks, the same donor infrastructure that had funded Heritage and Cato in the years after Powell. Its founding mission was to challenge the perceived liberal dominance of American legal education and to build a network of conservative and libertarian lawyers who could be placed in influential positions throughout the legal system.

What the Federalist Society built over the following four decades was precisely the judicial apparatus Powell had called for. A pipeline from ideologically vetted law students to prestigious clerkships with conservative judges, from those clerkships to positions in Republican administrations and conservative law firms, from those positions to the federal bench, and from the federal bench to the Supreme Court. The Society did not place secret conservatives on the Court. It operated openly and proudly. But it produced a network so thorough that by the time of the Trump administration, it had become the effective judicial nominations office for Republican presidents. Every Supreme Court justice Trump nominated, Gorsuch, Kavanaugh, Barrett, had been vetted by the Society. More than 200 of his federal court nominees were Federalist Society members or had been approved by the network. Powell had called for "judicial activism of the highest order." He got it.

Koch and the Long Game

Powell wrote his memo in 1971. In 1974, Charles Koch, CEO of Koch Industries, at the time the second-largest privately held company in the United States, began publishing and speaking in ways that pushed the Powell agenda harder. He argued that corporations were being forced to subsidize universities hostile to their agenda and that CEOs were putting too few strings on their philanthropic giving. Koch had read Powell's memo and concluded it did not go far enough. Over the following decades, Koch and his brother David spent more than $1.5 billion building the political infrastructure to implement the Powell agenda at a scale Powell had not imagined. The Koch network, channeling money through a labyrinth of nonprofit organizations designed to obscure the source of funds, raised $578 million and spent $548 million in the 2024 election cycle alone. Charles Koch transferred more than $5 billion of Koch Industries stock into his political network's nonprofit organizations between 2020 and 2022.

The network's model was not just campaign contributions. It was the construction of permanent institutional capacity across every domain Powell had identified. Think tanks that produced the economic arguments for deregulation and tax cuts. Legal foundations that litigated for those positions in the courts. Campus organizations that built the pipeline of young conservatives into positions of influence. State-level policy organizations in all 50 states that translated national agenda items into state legislation. And the funding infrastructure that connected all of it and sustained it between election cycles. This is what a serious political project looks like when it operates at scale across decades. It does not look like a conspiracy. It looks like an institution.

What the Memo Produced

Some historians argue the memo is overrated as a causal factor. That the institutions it described would have been built anyway, because the interests behind them were powerful and the political conditions were favorable. Partly true. The Koch brothers did not need Powell to tell them that concentrated private wealth should be deployed in defense of policies favorable to concentrated private wealth.

But the memo did something specific that mattered. It provided a framework for coordination. Individual corporations acting in their own immediate interests tend to focus on their specific regulatory and legislative problems. The Powell memo argued for collective action across the entire terrain. Not just fighting specific regulations, but reshaping the institutions through which all regulations were produced. It named the universities, the media, the courts, and the legislature as simultaneous targets. It called for patience across "an indefinite period of years." It said explicitly that strength lies in organization and scale and consistency. These are not instinctive behaviors for executives trained to focus on quarterly results. They are strategic behaviors that require a framework to sustain. The Powell memo provided the framework.

Fifty years after it was written, the results are measurable. The top marginal income tax rate has fallen from 70 percent to 37 percent. Union density in the private sector has fallen from 35 percent to 6 percent. The Supreme Court has a 6-3 conservative supermajority whose members were selected through the pipeline the Powell network built. Campaign finance limits have been eliminated through legal strategies the Powell network developed and executed. ALEC model bills have become law hundreds of times in state legislatures across the country. The regulatory state that Powell feared has been substantially weakened through a combination of budget cuts, legal challenges, and the placement of industry-aligned officials in the agencies responsible for regulating their industries.

Powell was nominated to the Supreme Court less than two months after he submitted the memo. From the bench, he spent 15 years doing exactly what he had called for from the outside. Engaging the courts as a vehicle for expanding the constitutional rights of corporations and limiting the authority of government to regulate them. His 1978 opinion in First National Bank of Boston v. Bellotti held for the first time that corporations had First Amendment rights to spend money on ballot initiatives, a precedent that would eventually lead, through a chain of decisions the Powell network helped construct, to Citizens United v. Federal Election Commission in 2010. Powell did not live to see Citizens United. He did live to see the foundation being laid.

The memo was not a conspiracy document. It was a strategic plan. The difference matters. Conspiracies are hidden and fragile. They collapse under scrutiny. Strategic plans are public and sturdy and get implemented more effectively when more people understand them. The Powell memo was circulated to business executives who were supposed to read it and act on it. Many of them did. The institutions it called for were built openly, funded transparently enough for tax purposes, and operated in public view. What they produced is the political terrain of the United States in the early twenty-first century. You are living in the results. The question this book is asking is whether you would like to continue.

The State-Level Machine

While the Federalist Society worked the federal judiciary and the Heritage Foundation worked Congress and the media, ALEC worked the states. The American Legislative Exchange Council operates on a model that has no equivalent in American politics. State legislators pay fifty dollars a year to join. Corporations pay between seven thousand and twenty-five thousand dollars annually, plus additional fees for task force participation. In return, the corporate members and legislators sit together in closed-door task force meetings where they vote on model legislation that the legislators then introduce in their state houses. ALEC's own internal documents describe the organization as a business. "ALEC must begin to function more like a business, and recognize that it has a product that it provides to a defined customer base for a profit. ALEC's product is policy, and its customers are state legislators and private sector members."

What this means in practice is that corporations draft or propose legislation, vote on it with legislators as co-equals in the room, and then those legislators introduce the resulting bills at home, usually without disclosing where the language came from. The Center for Media and Democracy obtained 850 ALEC model bills in 2011 through a document leak. The scope was astonishing. Bills to weaken labor unions, loosen environmental regulations, mandate photo ID requirements for voting, privatize prisons, block renewable energy standards, limit consumer lawsuits against corporations, and prevent cities from passing minimum wage ordinances. Between 2010 and 2018, ALEC-connected model legislation was introduced in state legislatures approximately 2,900 times. More than 600 such bills were enacted into law. An average of roughly 70 a year across that eight-year window.

ALEC receives more than 98 percent of its revenues from corporations, corporate trade groups, and corporate foundations. Its funding has included Koch Industries, ExxonMobil (which donated $1.4 million from 1998 to 2009), and foundations funded by the Coors, Scaife, Bradley, and DeVos families. It has one Democrat out of 104 people in legislative leadership positions. It describes itself as nonpartisan. The pretense is maintained because it is useful. The reality is that ALEC functions as a lobbying operation for the interests of its corporate funders, conducted at the scale of all fifty state legislatures simultaneously, with the costs hidden behind the fig leaf of a nonprofit educational organization.

Roger Ailes and the Television Plan

Roger Ailes was 27 years old when he met Richard Nixon on the set of The Mike Douglas Show in 1967. Nixon appeared as a guest. Ailes was the executive producer. Nixon complained that television was a gimmick. Ailes replied that if Nixon thought so, he would lose again. Nixon hired him. Ailes spent the following years working as a Republican media consultant, eventually managing the television strategy for Nixon's successful 1968 campaign and later for Reagan and George H.W. Bush.

In 1970, Ailes and other Nixon aides drafted a memo titled "A Plan for Putting the GOP on TV News," which stayed buried in the Nixon Presidential Library until Gawker obtained it in 2011. The memo proposed sidestepping the "prejudices of network news" by delivering "pro-administration" video content directly to local television stations, which would broadcast it as news. The plan described its audience with a contempt that would have been unsettling if stated publicly: "People are lazy. With television you just sit, watch, listen. The thinking is done for you." Nixon wrote in the margin: "This is an excellent idea." The network did not get built in the Nixon years. Watergate intervened. But the intellectual blueprint was in place. Ailes spent the intervening decades as a Republican media consultant, attending Powell Memo Task Force meetings in 1973, working for Big Tobacco's PR campaigns, and eventually becoming the founding president of Fox News, which launched on October 7, 1996.

Fox News was not the first attempt at this project. In the early 1970s, Joseph Coors, the same beer magnate who funded Heritage, funded Television News Incorporated, an Ailes-helmed news service that supplied local stations with video content that promoted the Republican line while presenting itself as impartial journalism. It failed commercially. It established the model. Fox News succeeded because Rupert Murdoch had the capital to sustain it through early losses and the regulatory environment had changed enough to allow the consolidation the Telecommunications Act of 1996 would soon accelerate. By 2002, Fox News was the most watched cable news channel in the country. It has remained so since.

The Scale of the Achievement

By any honest measurement, the Powell network succeeded beyond what Powell himself probably imagined in 1971. Federal lobbying spending topped $4.5 billion in 2024, compared to nothing organized on this scale in 1971. The number of corporate public affairs offices in Washington went from 176 to 2,445 in the nine years following the memo. The think tank infrastructure Powell called for now encompasses dozens of well-funded organizations producing thousands of policy papers a year. The legal infrastructure he envisioned now includes not only the Federalist Society but a constellation of public interest law firms litigating for corporate interests in every federal circuit. ALEC operates in all fifty states. The Koch network alone raised and spent more than half a billion dollars in the 2024 cycle.

None of these organizations describe themselves as implementing the Powell agenda. Each describes itself as advancing sound policy, protecting constitutional rights, promoting economic freedom, or educating citizens about the benefits of limited government. The descriptions are not entirely false. The Federalist Society produces rigorous legal scholarship. Heritage Foundation analyses occasionally contain useful empirical work. ALEC model bills sometimes address genuine problems in state law. But the direction of effort has been consistent for 50 years. Lower taxes on concentrated wealth. Weaker regulatory authority over corporations. Expanded corporate constitutional rights. Restricted voting access for populations that tend to vote against the interests of concentrated wealth. A judiciary selected for alignment with all of the above. That consistency is not coincidental. It is what deliberate, long-term strategic effort looks like when it works.

Lewis Powell wrote the memo because he believed the system he valued was under threat. He was right that organized pressure could shift institutional outcomes. He was right that patience and coordination and funding at scale were required. What he was wrong about was the description of the threat. In 1971, the American system that had produced 30 years of broadly shared prosperity was not under attack from the left. It was working. The regulations he found burdensome were protecting workers and the environment. The unions he saw as destabilizing were maintaining the wage levels that had produced a middle class. The progressive tax structure that he found confiscatory was funding the research and infrastructure on which American productivity depended. What was under threat was not the American system. What was under threat was the ability of concentrated private wealth to operate without democratic accountability. Powell's memo was a plan to protect that ability. That is what the network built.

Sources, Chapter One

The Powell memo is available in full at the Powell Archives and has been analyzed extensively. The definitive account of its influence is Jacob Hacker and Paul Pierson, "Winner-Take-All Politics" (Simon & Schuster, 2010). For Nancy MacLean's analysis, see "Democracy in Chains: The Deep History of the Radical Right's Stealth Plan for America" (Viking, 2017). The 176-to-2,445 Washington office data is from Al-Jazeera's interactive "The People vs. America" documentary series. For the Federalist Society, see Amanda Hollis-Brusky, "Ideas with Consequences: The Federalist Society and the Conservative Counterrevolution" (Oxford University Press, 2015). For ALEC's operations, see the Center for Media and Democracy's ALEC Exposed database. For the Koch network's spending, see the New York Times's analysis of the network's 2024 tax filings (December 2025) and the Koch political operation spending database maintained by OpenSecrets. Roger Ailes's attendance at the 1973 Powell Memo Task Force meeting is documented in David Sirota's podcast "Master Plan" (2024) and in The Nation's analysis "The Powell Memo Helped Create Project 2025" (September 2024). The Ailes 1970 Nixon-era memo "A Plan for Putting the GOP on TV News" is documented at the Nixon Presidential Library and was reported by Gawker in 2011 and analyzed by Rolling Stone.

Chapter Two: The Floor

Labor, Unions, and the Systematic Removal of the Countervailing Power

Clara Lemlich and What the Uprising Built

On March 25, 1911, a fire broke out on the eighth floor of the Asch Building in New York City's Greenwich Village. The building housed the Triangle Shirtwaist Company, which employed roughly 600 workers. Most were young immigrant women. Many were teenagers. Almost all were Jewish or Italian. All of them worked for wages that were not enough, in conditions that were not acceptable, with protections that were nonexistent. Within 18 minutes, 146 of them were dead. They burned. They suffocated from smoke. They fell nine stories to the sidewalk because the alternative was to burn. Witnesses on the street below watched them fall. Some witnesses jumped from the viewing platform of a nearby elevated railway to avoid seeing it.

The doors to the stairways were locked. Max Blanck and Isaac Harris, the "Shirtwaist Kings" of the trade press, the men who had made their fortunes by finding the minimum possible price for human labor and setting it just slightly above that, had ordered the doors locked during working hours. The stated reason was to prevent theft of shirtwaists. The other reason, unstated, was to keep union organizers out of the building. In 1909 and 1910, the women of the garment industry had gone on strike. They called it the Uprising of the 20,000. Blanck and Harris had refused to negotiate, refused to recognize the union, and maintained their practice of locking the doors. The doors were still locked when the fire started on a Saturday afternoon in March 1911.

The Uprising began when a 23-year-old Ukrainian immigrant named Clara Lemlich interrupted a meeting at Cooper Union where labor leaders were debating the merits of a general strike. She demanded the floor. She spoke in Yiddish. She moved that the strike begin. The crowd took the traditional Jewish oath, "if I turn traitor to the cause I now pledge, may this hand wither from the arm I now raise," and the general strike was on.

The strike lasted fourteen weeks. Workers at more than 700 shops walked out. Most of the factories eventually signed agreements. Blanck and Harris refused. They hired replacement workers. They hired private detective agencies to harass picketers. They had strikers arrested. They maintained the locked doors and the long hours and the wages they had set. When the fire came in 1911, the doors were still locked and the wages were still whatever Blanck and Harris chose to pay and the union was still excluded because Blanck and Harris had decided they preferred it that way and the law gave them the power to maintain that preference.

The New York state legislature had passed a workmen's compensation law in 1910, which would have made employers liable for workplace injuries and deaths. The law was challenged in court and declared unconstitutional. The ruling came down on March 24, 1911. The day before the Triangle fire. Blanck and Harris were acquitted of manslaughter in December 1911. They collected $60,000 from their fire insurance, which had insured their property for more than it was worth. They paid $75 per victim in civil settlements to the families. They then opened another factory, around the corner, and recreated the same conditions. Same locked doors. Same overcrowded workrooms. Same absence of sprinkler systems. The fire safety legislation passed in response to Triangle did not stop them. Nothing stopped them until 1918, when they finally dissolved the business.

The law was eventually replaced and upheld. The factories were eventually regulated. The unions that emerged from those struggles were eventually broken. The laws that made the breaking possible were eventually passed. The cycle has been running for more than a century. Regulation, resistance, rollback, catastrophe, regulation again. Understanding it as a cycle rather than as progress in a single direction is necessary to understanding what has to be done next.

The Lesson and the Law

The Triangle fire produced an immediate legislative response. Frances Perkins, who would later become Franklin Roosevelt's Secretary of Labor and the architect of the New Deal's labor protections, witnessed the fire from the street. She later said it was the moment that politicized her. The New York State Factory Investigating Commission was formed within weeks, and over the following four years produced thirty-six new laws reforming the state labor code, covering fire safety, maximum hours, sanitation, child labor, and the rights of workers to organize. New York's response became a model for other states and eventually for federal legislation. The principle embedded in the response was simple. The market, left to itself, will not protect workers. The government has to do it. The government needs enforcement authority to do it. And the enforcement authority needs to be used.

The Wagner Act of 1935, the National Labor Relations Act, created the legal framework for collective bargaining in American industry. It prohibited employers from firing workers for union activity. It established the National Labor Relations Board to supervise elections and adjudicate disputes. For the first time, it gave workers the legal right to organize and strike without being immediately replaced or imprisoned. The act was not charity. It was a response to decades of evidence that without the countervailing power of organized labor, employers set wages at whatever the market allowed and working conditions at whatever the law, or the absence of law, permitted. The Triangle fire was one data point. There were thousands of others. The coal mines. The steel mills. The meatpacking plants. The textile factories. The Wagner Act encoded a lesson that had cost lives to learn.

The immediate result was an explosion of union membership. In 1935, three million workers were unionized, about eight percent of the non-agricultural workforce. By 1945, union membership had reached 14 million. By 1954, it peaked at about 35 percent of the workforce. More than one in three American workers was a union member. The contracts those workers negotiated set the wage floor for the entire economy. The period economists call the Great Compression, when wages grew fastest at the bottom and middle of the income distribution, when the share of income going to the top declined, when the middle class expanded, was the period of peak union density. The correlation is not coincidental.

The Counter-Campaign

The Taft-Hartley Act of 1947 was the first significant congressional rollback of the Wagner Act. Passed over Truman's veto, it prohibited secondary boycotts, workers in one industry striking in solidarity with workers in another. It allowed states to pass "right-to-work" laws that permitted workers in unionized shops to decline union membership while still receiving the union-negotiated wages and benefits. The right-to-work provision was designed to undercut union finances, because unions were required to represent all workers in a shop regardless of whether those workers paid dues. By making dues payment optional while keeping representation mandatory, Taft-Hartley created a classic free-rider problem that gradually drained the financial resources of unions in states that adopted the provision.

The strategic campaign against unions that accelerated after the Powell memo was more sophisticated than Taft-Hartley. It operated on multiple fronts. Legal challenges weakened the Wagner Act's enforcement mechanisms. Corporate lawyers perfected techniques for delaying union elections until organizers lost momentum. Consulting firms specialized in running anti-union campaigns inside workplaces. State legislatures passed additional right-to-work laws. NLRB appointments during Republican administrations produced board majorities hostile to the agency's mission. Each of these developments individually was modest in its effect. Together, across 50 years, they produced the collapse of private sector union density from 35 percent to 6 percent.

Reagan's firing of 11,345 air traffic controllers in August 1981 was the public signal of what the new environment permitted. The controllers were members of PATCO, the Professional Air Traffic Controllers Organization, and had struck for better working conditions and higher pay. Federal workers did not have the legal right to strike, but previous administrations had treated strikes as labor disputes to be negotiated rather than as crimes. Reagan treated PATCO's strike as the latter. He fired the striking controllers, banned them from federal employment, and broke the union. The message to private-sector employers was clear. An aggressive anti-union stance was now politically viable. The practical consequences followed. The number of illegal employer actions in union elections, firing organizers, retaliating against union supporters, refusing to bargain in good faith, rose steadily through the 1980s and 1990s as enforcement weakened and penalties were inadequate to deter the violations.

The Compensation Gap

What happened to wages when unions were broken is not a mystery. The Economic Policy Institute has documented that from 1948 to 1973, worker productivity in the United States grew 96.7 percent and typical worker compensation grew 91.3 percent. The two lines tracked each other closely. Workers' share of the productivity gains was roughly proportional to their contribution to producing those gains. Between 1973 and 2023, productivity grew 80.9 percent. Typical worker compensation grew 29.4 percent. The gap between productivity gains and compensation gains widened dramatically, and that gap flowed to the top of the income distribution.

Where exactly did it flow? The top 1 percent captured approximately 20 percent of national income in 2022, up from 10 percent in the 1970s. CEO compensation at the largest American corporations averaged roughly 21 times worker pay in 1965. By 2020, it averaged 351 times worker pay. These are not ratios that reflect differences in productivity. They reflect differences in bargaining power. When workers had unions, they negotiated a share of the productivity gains. When they did not, they did not.

Amazon and the Present

The decline of union power is sometimes described as a historical story, as if it happened in the past and its effects are over. The effects are not over. The Amazon warehouse in Staten Island, New York, unionized in April 2022 after workers at the facility voted 2,654 to 2,131 in favor of union representation. It was the first Amazon facility in the United States to successfully unionize. The Amazon Labor Union, an independent organization founded by fired Amazon worker Chris Smalls, won despite the company spending millions on union-avoidance consultants, mandatory anti-union meetings, and the full range of legal and illegal tactics that corporate America deploys against organizing efforts. Amazon then refused to negotiate a contract with the elected union, filed appeals with the NLRB, and has continued the operational practices the union was formed to address. Three years after the election, there is no contract. The workers voted for union representation. They have not received it, because the legal framework for enforcing their vote has been eroded to the point where the vote itself no longer produces the outcome it was designed to produce.

The Amazon warehouse in Bessemer, Alabama, voted against unionization in April 2021. The NLRB subsequently found that Amazon's conduct, including illegal surveillance and pressure on workers, had been egregious enough to taint the vote; the results were thrown out and a second election was held in 2022. The second election was also contested, and Amazon has continued to litigate the question. Meanwhile the workers who supported the union effort have been disciplined, fired, or pressured out through the normal mechanisms of employer power in a workplace where the legal protections for organizing are, at this point, largely theoretical.

What happened to Clara Lemlich's 1909 uprising and the Wagner Act that followed is that they were built, deployed, and then systematically dismantled. The dismantling was not complete. The legal framework for organizing still exists. The NLRB still functions, after a fashion. Unions that already exist can still negotiate contracts, though often under conditions that have deteriorated substantially. But the floor that Clara Lemlich and Frances Perkins and the Triangle workers' deaths had established, the floor below which wages and working conditions could not fall, has been lowered by 50 years of deliberate effort. What the workers at the Amazon warehouse in Staten Island discovered when they tried to exercise the rights the Wagner Act established is that the rights have become substantially more difficult to exercise than the statute suggests. The formal rights exist. The mechanisms for enforcing them have been weakened enough that the formal rights do not translate reliably into the actual improvements in wages and working conditions the Wagner Act was designed to produce.

What Remains to Be Rebuilt

The Protecting the Right to Organize Act, the PRO Act, has been introduced in every Congress since 2019. It would strengthen penalties for employer violations of labor law, close loopholes in the definition of who is an employee versus an independent contractor, protect secondary boycotts, and override state right-to-work laws. It has passed the House multiple times. It has never received a Senate vote, because the 60-vote threshold for overcoming a filibuster has been out of reach, and because the interests that benefit from the current framework have consistently organized themselves more effectively than the interests that would benefit from changing it.

Rebuilding labor power does not require re-creating the industrial unions of the 1950s in their exact form. The economy has changed. The composition of the workforce has changed. What the workers of Amazon warehouses and Starbucks stores and delivery platforms need is not identical to what the steelworkers and autoworkers of the 1950s negotiated. What remains true is that without legal frameworks that give workers effective, enforceable rights to organize, negotiate collectively, and compel employers to deal with them in good faith, the wage gains workers can achieve through individual negotiation are limited by the bargaining power they hold as individuals, which is approximately zero in most industries. The Wagner Act worked because it encoded a recognition that the individual worker does not have meaningful bargaining power against an employer and that collective action is the only mechanism by which that asymmetry can be addressed. That recognition remains correct. The framework that encoded it has been systematically weakened. Rebuilding it is not nostalgia. It is arithmetic.

Sources, Chapter Two

The Triangle Shirtwaist Fire is documented in David Von Drehle, "Triangle: The Fire That Changed America" (Grove Press, 2003), and in the extensive archive maintained by Cornell University's Kheel Center for Labor-Management Documentation and Archives (trianglefire.ilr.cornell.edu). Clara Lemlich's leadership of the Uprising of the 20,000 is documented in Annelise Orleck, "Common Sense and a Little Fire: Women and Working-Class Politics in the United States, 1900-1965" (University of North Carolina Press, 1995). For the Wagner Act and its consequences, see Nelson Lichtenstein, "State of the Union: A Century of American Labor" (Princeton University Press, 2002). The productivity-wage gap data is from the Economic Policy Institute (epi.org). Reagan's firing of PATCO controllers is documented in Joseph McCartin, "Collision Course: Ronald Reagan, the Air Traffic Controllers, and the Strike That Changed America" (Oxford University Press, 2011). For the Amazon Labor Union's election and Amazon's response, see Kim Kelly's "Fight Like Hell: The Untold History of American Labor" (Atria Books, 2022) and contemporaneous reporting in the New York Times and More Perfect Union. CEO pay ratios are tracked by the Economic Policy Institute's annual "CEO Compensation" report.

Chapter Three: One Side of the Story

The Fairness Doctrine, the Telecommunications Act, and the Manufacture of Incompatible Realities

What the Fairness Doctrine Was

The Fairness Doctrine was established by the Federal Communications Commission in 1949. Its premise was simple. Broadcasting licenses were grants of access to a public resource, the electromagnetic spectrum. In exchange for that access, broadcasters had an obligation to present controversial public issues in a manner that was honest, equitable, and balanced. The doctrine did not require equal time for all viewpoints. It required that broadcasters cover controversial matters of public importance and that when they did, they present contrasting viewpoints. It was an attempt to use the regulatory authority attached to broadcast licensing to ensure that publicly granted broadcasting privileges served the public interest rather than the interests of whoever owned the transmitter.

Broadcasters hated it. It required them to give airtime to perspectives their owners disagreed with. It made ideologically committed broadcasting commercially risky, because the FCC could respond to one-sided coverage by requiring equal time for the other side, and equal time was expensive. The practical effect was to make commercial radio and television cautious about taking strong positions on contentious political issues. This caution was either a responsible exercise of the public trustee obligation of broadcasting or an intolerable restriction on the First Amendment rights of broadcasters, depending on your perspective. The broadcasters and their attorneys took the second view. Conservative media entrepreneurs took the second view. The Reagan FCC took the second view. In 1987, on a 4-0 vote, it abolished the doctrine.

The Telecommunications Act and the Clear Channel Explosion

The Fairness Doctrine's abolition in 1987 removed the regulatory requirement for balance in broadcasting. The Rush Limbaugh Show launched nationally in syndication the following year. Limbaugh had been doing local radio in Sacramento, and his format, three hours a day of unapologetic conservative commentary delivered with entertainment value and contempt for liberals, was exactly what the post-Fairness Doctrine environment made possible. By the early 1990s, Limbaugh's show was carried on more than 600 stations and claimed an audience of 20 million listeners a week. He did not need to present the other side. The Fairness Doctrine was gone. He could present his side exclusively, as often as he liked, with no requirement that any airtime be devoted to contrasting perspectives.

The Telecommunications Act of 1996 removed what remained of the constraints on how much of the broadcast spectrum a single company could own. Before the act, there were strict limits on how many stations any single owner could hold nationally and in any individual market. The act eliminated the national cap entirely and loosened the local market limits dramatically. The results were immediate. In the year after the act's passage, 2,045 radio stations changed hands in transactions worth about $13.6 billion. Clear Channel Communications, which had owned 40 radio stations in 1996, grew to own 1,240 stations by 2002, a 3,000 percent increase in six years, driven by $30 billion in acquisitions. By the early 2000s, Clear Channel operated in 89 of the top 100 markets in the United States. The total number of radio station owners dropped by more than 1,100, nearly 30 percent, between 1996 and the early 2000s. Over the entire period since the act, about 10,000 radio station transactions worth $100 billion have taken place.

The consolidation had predictable consequences for content. Programming decisions that had previously been made by local station managers familiar with local communities were now made by corporate programming departments optimizing for national audiences and advertising efficiencies. Local news coverage, city councils, school boards, county commissioners, local elections, local business, local crime, was expensive and produced content that could not be repurposed across markets. It was cut. Rush Limbaugh's program, Glenn Beck's program, Sean Hannity's program, and the other syndicated conservative talk shows produced by Clear Channel's Premiere Radio Networks could be broadcast simultaneously on a thousand stations at minimal marginal cost. Local content could not. The economics of consolidation drove a homogenization of content that systematically eliminated local political journalism and replaced it with national conservative talk radio.

The act had been sold to Congress and the public as a measure that would increase competition and reduce prices for consumers. A study of its outcomes found that cable and local phone rates went up rather than down after passage. The telecommunications industry lost roughly 500,000 jobs rather than gaining the 1.5 million jobs industry representatives had promised. The diversity of viewpoints available over the publicly owned airwaves declined rather than expanded. The promises were not honored. The consolidation happened anyway, because the act had been written by people whose primary constituents were the media companies that would benefit from it.

The Collapse of Local Journalism

The damage to local journalism extends beyond radio. American newspapers have been collapsing for two decades, driven primarily by the collapse of classified advertising revenue to internet platforms and secondarily by the concentration of display advertising in national digital media. Between 2005 and 2023, about 3,300 local newspapers, roughly a third of all local newspapers in the country, closed or merged. About 43,000 journalism jobs were lost. The United States now has more than 200 counties with no local news source at all, and a further 1,500 counties served by a single weekly newspaper. The News Deserts project, founded at the University of North Carolina and now housed at Northwestern's Medill School, estimates that more than 55 million Americans, one in six, live in communities with no meaningful local news coverage.

The consequences are not theoretical. Research by political scientists has found that counties that lose their local newspaper experience lower voter turnout, less electoral competition, and worse government: higher municipal borrowing costs, less efficient service delivery, less oversight of local officials. A 2019 study found that municipal borrowing costs rose significantly in communities after local newspapers closed, because the absence of local journalism reduced accountability for local officials and made bond markets more uncertain about local governance quality. The market failure of local journalism is producing measurable degradation of local democratic governance.

The Epistemological Consequence

The Fairness Doctrine was based on a premise that is not fashionable in contemporary First Amendment law but that is empirically defensible. Democracy requires a shared information environment. When public resources, the broadcast spectrum, are used to distribute information, those resources carry an obligation to serve the democratic function that shared information provides. The doctrine did not mandate that all broadcasters be neutral. It mandated that they present more than one side of controversial issues.

Its abolition, combined with the consolidation the Telecommunications Act enabled, combined with the founding of Fox News and the rise of nationally syndicated conservative talk radio, produced an information environment in which significant portions of the American public have limited access to information from sources other than those optimized to confirm their existing beliefs and identities. This is not a liberal complaint about conservative media, though the right has built its media infrastructure more deliberately and more effectively than the left. It is an observation about the function that shared information serves in democratic governance. You cannot deliberate collectively about shared problems if the parties to the deliberation are operating from incompatible sets of facts. You cannot adjudicate disagreements by reference to evidence if the parties to the disagreement do not share a methodology for evaluating evidence.

The 2020 election and its aftermath provided the starkest illustration. The claim that the election was stolen was factually wrong; it was examined by courts, election officials, and investigators, and no evidence was found to support it. It was nonetheless believed by a large and consistent majority of Republican voters for years after the election. The persistence of the belief is explicable only in the context of an information environment in which the primary sources of political information for many Republican voters, Fox News, conservative talk radio, social media algorithms optimized for outrage, had an institutional interest in maintaining the audience's belief that everything outside their approved media environment was deceptive, and that only their preferred sources told the truth. Having spent 30 years building an information environment on that premise, they should not have been surprised that their audience believed it.

The Fairness Doctrine was not a sufficient remedy for the epistemological problem democracy faces in the twenty-first century. It was designed for broadcast media in an era before cable and internet, and its specific mechanisms would not translate directly to the current media environment. But the premise underlying it is more relevant now than when it was first established. Democratic governance requires a shared factual reality. When private actors use public resources to distribute information, they incur obligations to that democratic function. The abolition of the doctrine did not cause Fox News. Fox News was built deliberately, by people who had a plan for it going back to 1970. But the abolition removed the only regulatory mechanism that had, however imperfectly, required broadcasters to acknowledge that there was more than one legitimate perspective on contested public questions. The effect of that removal, compounded across 30 years, is visible in the current state of American political discourse.

What Limbaugh Actually Did

Limbaugh's reach was indisputably enormous. By 2019 he claimed an audience of 15 million per episode, and at his peak his contract with Clear Channel's Premiere Radio Networks was worth about $400 million over eight years. He was the most commercially successful radio host in American history.

What Limbaugh built was a media format and a political phenomenon. The format, extended political commentary delivered with confidence, entertainment value, and contempt for liberals, was copied across the AM radio dial by dozens of imitators. The political phenomenon was more consequential. Limbaugh created a daily information environment for millions of conservatives, with a consistent message. Mainstream journalism was not merely biased. It was untrustworthy. The Democratic Party was not merely wrong. It was malevolent. The conservative movement was at war with forces trying to destroy the country they loved. What Limbaugh produced was identity construction, not reporting. Listeners did not tune in primarily to learn facts. They tuned in to have their understanding of the political world confirmed and their tribal affiliation reinforced.

The political consequences were measurable. Research by Kathleen Hall Jamieson and Joseph Cappella, documented in their 2008 study of Limbaugh's audience, found that regular listeners held more conservative positions on policy questions, were more likely to vote Republican, and were more resistant to factual corrections from mainstream sources, even when those corrections were accurate. The effect was not primarily about persuasion. It was about inoculation. Listeners immersed in an information environment that consistently characterized mainstream media as untrustworthy became resistant to information that came from mainstream sources, even when that information was accurate. The inoculation effect is durable and has been documented in political science research examining media use and belief updating.

Fox News as a Political Machine

Fox News was not just a television network. It was, from its founding, a political machine. It functioned simultaneously as a megaphone for the Republican Party's message, a platform for Republican politicians and candidates, a revenue source for conservative media personalities who were also political actors, and a shaping force on Republican primary politics that had no equivalent on the Democratic side. Roger Ailes, who ran the network from its founding until 2016, had spent his career as a Republican media consultant before becoming a news executive. He never abandoned the consultant's approach to political communication. Identify the audience's fears and grievances. Give those fears and grievances a name and a villain. Tell the audience that you are the only ones telling them the truth.

The effect on Republican politics was profound and has been documented. Research by economists Stefano DellaVigna and Ethan Kaplan found measurable increases in Republican vote share in cable markets where Fox News launched, evidence that the network changed votes, not just views. Subsequent research has confirmed that Fox News pushed Republican voters toward more extreme positions, increased polarization, reduced bipartisan legislative cooperation, and functioned as an accountability mechanism in Republican primaries. Republican politicians who deviated from Fox-approved positions faced hostile coverage that energized primary challengers. The network operated as a veto player in Republican internal politics. A Republican senator who voted for a bill that Fox News decided to oppose would face saturated negative coverage directed at the most engaged Republican primary voters. It was a direct constraint on legislative behavior.

The internal communications of Fox News, revealed in the Dominion Voting Systems defamation lawsuit settled in 2023, provided an extraordinary window into how the network understood its own function. Hosts and executives who privately expressed doubt about claims of 2020 election fraud, claims they had reason to believe were false, decided to amplify those claims publicly because they feared losing viewers to competitors who were making them. The calculation was explicit in the communications. Telling the truth about the election would cost them audience. Losing audience would threaten the business. Truth was subordinate to audience retention. The audience that Fox News had spent 25 years cultivating had been taught to believe that mainstream journalism was untrustworthy. The consequence of telling them something they did not want to hear was that they would go elsewhere, to sources even less constrained by factual obligation. Fox News had built a machine for manufacturing and maintaining audience loyalty, and then found itself trapped by it.

Social Media and the Acceleration

The shift from the broadcast era to the social media era did not solve the epistemological problem. It intensified it. The economic model of social media platforms, advertising revenue tied to engagement, engagement maximized by content that provokes strong emotional reactions, strong emotional reactions most reliably produced by content that confirms existing beliefs and triggers outrage at perceived threats, created an algorithmic system for distributing information that had no analog in previous media history. The Fairness Doctrine was designed for a world in which a small number of licensed broadcasters distributed information to large audiences. Social media inverted the model. A small number of unlicensed platforms distributed information produced by enormous numbers of users, using algorithms optimized for engagement rather than accuracy.

The consequence was the acceleration of the epistemic fragmentation that commercial talk radio and partisan cable news had begun. Facebook's own internal research, leaked in the Wall Street Journal's 2021 Facebook Files investigation, found that the platform's algorithm amplified outrage and polarizing content because that content generated more engagement. The research found that misinformation was more engaging than accurate information. It spread faster. It was shared more. It provoked more comment. The platform's recommendation systems were directing users toward increasingly extreme content as a function of maximizing time on site. The platform's own researchers documented these effects and warned about them. The platform's leadership declined to make the changes that would have reduced them, because those changes would have reduced engagement, and reduced engagement would have reduced advertising revenue.

This is not a technology problem. Technology will not fix it. It is a political economy problem. The incentive structure of advertising-funded social media platforms rewards the distribution of emotionally engaging content. Emotionally engaging content tends to be partisan, outrage-inducing, and identity-confirming rather than accurate, careful, and challenging. The information environment these incentives produce is hostile to democratic deliberation, which requires the willingness to engage with accurate information even when it is uncomfortable and to update beliefs based on evidence. A political system cannot function well when a significant portion of its participants are immersed in information environments specifically designed to prevent belief updating.

What Local Journalism Was

The decline of local journalism is not primarily a story about the internet destroying newspapers. It is primarily a story about the relationship between journalism and democratic accountability. When The Patriot Ledger of Quincy, Massachusetts, covered the city council, the council members knew they would be quoted, their votes would be documented, and their constituents would read about it. This accountability was not occasional. It was continuous. The reporters knew the beat. They knew which council members voted with the developer and which voted against. They knew the history of contested decisions and the relationships among the parties. This is what local journalism provided that no national outlet can replicate. Institutional memory. Proximity. Continuity. The democratic accountability it produced was not glamorous. It was not Watergate. It was the routine functioning of a system in which people who made public decisions understood they were being observed.

When that observation ends, the behavior changes. The Princeton study of municipal bond markets, which found that cities pay higher borrowing costs after their local newspapers close, is measuring something real about how institutional accountability functions. Bond markets price risk. When local governments lose the journalistic oversight that makes them accountable for their decisions, the bond markets conclude that the risk of fiscal mismanagement has increased, and they are right. The 43,000 journalism jobs lost between 2008 and 2020 were not 43,000 people who had been writing celebrity gossip. They were people who covered city councils and school boards and county commission meetings and the proceedings of courts and the decisions of planning departments. Their absence is not noticed until something goes wrong that nobody saw coming.

The information environment the Powell network spent 50 years building, Fox News, conservative talk radio, the think tank network that produces the intellectual justifications, does not primarily inform its audience. It organizes it. It gives millions of people a shared identity, a shared set of enemies, a shared narrative about what is happening to the country and who is responsible. This is enormously politically powerful. It is also epistemologically dangerous, because a political movement organized around a shared narrative rather than shared facts can sustain itself indefinitely against evidence. The narrative explains the evidence away. The evidence is part of the conspiracy. The sources are biased. The experts are captured. What you are left with is a political movement that is organized and motivated and almost entirely unpersuadable by argument. Fact-checking cannot solve this. It requires rebuilding the institutions, local journalism, public media, civic education, that create the shared factual baseline democratic deliberation requires.

Sources, Chapter Three

The history of the Fairness Doctrine is documented at the FCC website and comprehensively in Kathleen Hall Jamieson and Joseph Cappella, "Echo Chamber: Rush Limbaugh and the Conservative Media Establishment" (Oxford University Press, 2008). Roger Ailes's 1970 Nixon-era memo "A Plan for Putting the GOP on TV News" was first reported by Gawker in 2011 and analyzed by Tim Dickinson in Rolling Stone's "How Roger Ailes Built the Fox News Fear Factory" (2011). For Television News Incorporated's history, see the Wikipedia article on Roger Ailes. The Clear Channel growth figures, 40 to 1,240 stations, are from the Wikipedia article on iHeartMedia and multiple contemporaneous news accounts. The 10,000 station transactions and $100 billion figure is from Salon's analysis of radio consolidation (2001). The promise of 1.5 million jobs versus actual loss of 500,000 is documented in the EBSCO Research entry on the Telecommunications Act of 1996. For local newspaper collapse, the University of North Carolina's News Deserts project maintains the most comprehensive ongoing data. For the political consequences of local newspaper closures, see Danny Hayes and Jennifer Lawless, "News Hole: The Demise of Local Journalism and Political Engagement" (Cambridge University Press, 2021) and the Princeton study on municipal borrowing costs by Gao, Lee, and Murphy (2020).

Chapter Four: Who Pays

Progressive Taxation, the Supply-Side Lie, and the IRS They Gutted on Purpose

In 1981, the United States made a choice about who should bear the cost of what government builds. Everything in this chapter follows from that choice.

What the Money Built

The federal income tax became law in 1913 with ratification of the Sixteenth Amendment. The rates were modest at first. A maximum of seven percent on incomes above half a million dollars. The wars changed that. By World War II, the top marginal rate had climbed to 94 percent on income above $200,000. The system was not as simple as the headline rate suggests. It was progressive, meaning only the income above each threshold was taxed at the marginal rate. Deductions, exemptions, and avoidance strategies meant effective rates were substantially lower than statutory rates. But the principle was clear. Those who had accumulated the most wealth would pay the highest fraction of additional income at the margin.

What those rates purchased, over the two decades following the war, was the most ambitious domestic investment program in American history.

The GI Bill, the Servicemen's Readjustment Act of 1944, sent nearly eight million veterans to college or vocational training. By 1947, veterans accounted for nearly half of all college admissions in the United States. Between 1944 and 1952, 4.3 million home loans worth $33 billion were guaranteed. The suburbs were built on this money. The middle class was built on this money. Two million veterans graduated with degrees in science, technology, engineering, and mathematics, supplying the workforce that would drive the postwar economic expansion. The GI Bill has been described by historians as the most transformative piece of domestic legislation since the Homestead Act. It worked because it was funded. The tax revenues existed to pay for it.

The Federal Aid Highway Act of 1956, signed by Eisenhower, built 41,000 miles of interstate highway over the following two decades. Eisenhower had seen what autobahns did for German military mobility and understood what a modern highway system would do for American commerce and defense. The project cost roughly $425 billion in today's dollars. It was funded through federal fuel taxes and general revenues. Revenues a progressive tax structure made available. The interstate system transformed American commerce, reduced transportation costs, enabled just-in-time supply chains, and connected the country in ways that produced decades of economic returns. It was built with public money. It worked.

Public universities expanded dramatically in the postwar decades, their costs heavily subsidized by state governments whose revenues flowed, in part, from progressive federal taxation and grants. The National Science Foundation, established in 1950, funded the basic research that produced the internet, GPS, the human genome project, and dozens of other technologies whose economic returns have dwarfed the investment by orders of magnitude. The National Institutes of Health funded the medical research that extended American lifespans. NASA put men on the Moon. The investment case for progressive taxation in this era is not complicated. The money went in. The returns came out. The returns were enormous.

Not all of this investment reached everyone equally. The GI Bill's benefits were systematically denied to Black veterans through discrimination in lending, segregated colleges, and racially restricted suburbs. The interstate highway system, in many cities, was routed deliberately through Black neighborhoods, destroying communities that had been built over generations. The public investment of the postwar era was real and its returns were real, but its benefits were unevenly distributed in ways that had lasting consequences for racial wealth gaps that persist to this day. Acknowledging this honestly does not undermine the argument for public investment. It makes the argument for more equitable public investment. Investment that actually reaches everyone it is supposed to reach.

The 91 Percent

Dwight Eisenhower served as president from 1953 to 1961. He was a Republican. The top marginal income tax rate during his entire presidency was either 91 or 92 percent. Eisenhower did not campaign to reduce it. He did not make its elimination a priority. He built things with it. The interstate highways. Expanded Social Security. Federal funding for public schools. The National Defense Education Act. On his way out the door, he delivered his famous warning about the military-industrial complex, a speech that reflected his understanding that some concentrations of economic and political power were incompatible with democratic governance.

What did the 91 percent rate actually mean in practice? Not that anyone paid 91 percent of their total income. The rate applied only to income above the threshold, roughly $200,000 for individuals, about $2 million in today's dollars. Below that threshold, lower rates applied. Effective tax rates for the very wealthy were substantially below the headline figure because of deductions, exclusions, and the fact that much income at the top was not in the form of wages at all. The Tax Foundation has argued that the 1950s wealthy did not pay dramatically more of their income in taxes than the wealthy of today. The Roosevelt Institute has responded that this argument makes a false comparison. The truly wealthy of the 1950s, people with income equivalent to today's top earners, were rarer and poorer than their modern counterparts, precisely because the high rates and strong unions had compressed the distribution. Had 2010s-style wealth concentration existed in the 1950s, those individuals would have faced effective rates far higher than the historical data suggest.
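The arithmetic of marginal rates is easy to misread, so a sketch may help. The brackets below are hypothetical, a two-bracket simplification rather than the actual 1950s schedule, which ran to more than twenty brackets; the point is only that the top rate applied to the top slice of income, not to the whole.

```python
# Illustrative only: a simplified two-bracket schedule, not the real 1950s code.
def tax_owed(income, brackets):
    """Marginal-rate tax: each rate applies only to the slice of income
    that falls inside its bracket."""
    owed = 0.0
    for lo, hi, rate in brackets:
        if income > lo:
            owed += (min(income, hi) - lo) * rate
    return owed

# Hypothetical schedule: 25% on the first $200,000, 91% on everything above.
brackets = [(0, 200_000, 0.25), (200_000, float("inf"), 0.91)]

income = 500_000
t = tax_owed(income, brackets)
print(t)            # 323000.0
print(t / income)   # effective rate 0.646, well below the 91% headline
```

Even before a single deduction, the effective rate on a half-million-dollar income under this schedule is about 65 percent, not 91.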

What is not disputed is the correlation. In the years when top marginal rates were highest, inequality was lowest. In the years when top marginal rates have been cut most sharply, inequality has grown most sharply. The mechanisms are multiple. High marginal rates directly reduce post-tax income concentration. They change the incentive structures for executive compensation negotiations, because there is less to fight over when most of the top income will be taxed away. They fund the public investments that benefit the bottom and middle of the income distribution more than the top. Whatever the weight of each mechanism, the correlation across the twentieth century is not ambiguous. It is among the most durable findings in the political economy of inequality.

The Napkin

In December 1974, economist Arthur Laffer had dinner at the Two Continents Restaurant in the Washington Hotel with Dick Cheney, Donald Rumsfeld, and journalist Jude Wanniski. They were discussing President Ford's proposed tax increase. Laffer, according to Wanniski's 1978 retelling of the evening, grabbed a napkin and sketched a curve. On one end of the horizontal axis, a tax rate of zero, which obviously generates zero revenue. On the other end, a tax rate of 100 percent, which also generates zero revenue because nobody will bother working if the government takes everything. Somewhere in between those extremes is the revenue-maximizing rate. It follows, the argument goes, that if the current rate is above the optimum, cutting it will increase revenue.

This is a mathematically valid observation as far as it goes. The problem is that the curve tells you nothing about where on the curve you are. The entire policy argument depends on whether current rates are above or below the revenue-maximizing point. In the United States in the late 1970s and early 1980s, when the top marginal rate was 70 percent, most mainstream economists believed the country was well below the point at which further cuts would increase revenue. Subsequent research has generally confirmed this. Estimates of the revenue-maximizing top marginal rate for the United States range from about 60 to 75 percent, substantially above where it has been at any point since 1981. Reagan's own budget director, David Stockman, later acknowledged that the supply-side proponents had taken the Laffer curve "literally and primitively," that the "whole California gang" had expected "additional revenue to start to fall, manna-like, from the heavens," and that this was not grounded in any serious economic analysis.
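The napkin's logic can be made concrete with a toy model. Assume, purely hypothetically, that the taxable base shrinks as rates rise according to a single behavioral-response parameter. The revenue curve is then zero at both ends, exactly as Laffer drew it, but where its peak sits depends entirely on that parameter, and the napkin does not supply it.

```python
# Toy Laffer curve under an assumed behavioral response: the taxable base
# shrinks as (1 - r)**e for some elasticity-like parameter e.
def revenue(r, base=100.0, e=0.5):
    """Revenue at tax rate r (0 to 1): zero at both endpoints, as on the napkin."""
    return r * base * (1.0 - r) ** e

rates = [i / 100 for i in range(101)]

# The peak moves with e (analytically, it sits at r = 1 / (1 + e)).
for e in (0.25, 0.5, 2.0):
    peak = max(rates, key=lambda r: revenue(r, e=e))
    print(e, peak)   # peaks at 0.8, 0.67, 0.33 respectively
```

Depending on an assumption the sketch never states, the "optimal" rate here ranges from 33 percent to 80 percent. That is the whole policy argument, hiding in one unmeasured parameter.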

George H.W. Bush, running against Reagan for the Republican presidential nomination in 1980, called the supply-side argument "voodoo economics." He was right. He knew he was right. When he became Reagan's running mate he stopped saying so. When he became president he raised taxes in 1990 because the deficit left by the Reagan cuts demanded it, and it cost him his re-election. Supply-side economics has always worked better as a political formula than as an economic one. As a political formula, it is close to perfect. Tell wealthy voters and their corporations that cutting their taxes will benefit everyone, and you have created a perpetual coalition of donors who will fund your campaigns indefinitely. As an economic claim, it has failed every time it has been seriously tested.

The Forty-Year Track Record

Ronald Reagan took office in January 1981 with the top marginal income tax rate at 70 percent. By the time he left office in January 1989, it was 28 percent. This was the most dramatic reduction in top marginal rates in American history. It was sold on three claims. Growth would accelerate. The benefits would trickle down to working Americans. The resulting economic expansion would increase tax revenues, making the cuts self-financing.

Forty years of evidence have returned a verdict on all three claims.

Growth did not accelerate. The average annual growth rate of the United States economy in the 1980s was 3.1 percent. In the 1990s, after Clinton raised the top rate back to 39.6 percent, it was 3.2 percent. In the 2000s, after the Bush cuts, it was 1.8 percent. In the 2010s, after the Obama partial restoration and before the 2017 Trump cuts, it was 2.3 percent. The claim that cutting top rates accelerates growth has not been confirmed by the data.

The benefits did not trickle down. Between 1979 and 2021, the share of national income going to the top 1 percent roughly doubled, from about 10 percent to about 20 percent. The share going to the top 0.1 percent more than tripled. Real wages for the bottom half of American workers grew at about 0.3 percent per year, when they grew at all. The productivity gains of the period, which were substantial, flowed to the top. What trickled down was not prosperity. It was the stagnation of wages for everyone below the top fifth of the income distribution.

The cuts did not pay for themselves. The Congressional Research Service reviewed six decades of tax data in 2012 and found no statistically significant relationship between top marginal tax rates and economic growth. The Congressional Budget Office scored the 2017 Trump tax cuts as adding $1.9 trillion to the deficit over a decade. They did. The Bush tax cuts of 2001 and 2003 added roughly $1.5 trillion to the deficit before the financial crisis made everything worse. Every supply-side cut has produced a deficit. No supply-side cut has been self-financing.

The 2017 Cut

The Tax Cuts and Jobs Act of 2017, passed on strictly partisan lines and signed by Trump on December 22, 2017, was sold as a tax cut for the middle class and as a driver of business investment. The corporate rate was cut from 35 percent to 21 percent, a 40 percent reduction. The top individual rate was cut from 39.6 percent to 37 percent. The estate tax exemption was doubled to $11 million for individuals and $22 million for couples, meaning that families could pass down up to $22 million tax-free, an amount that shelters substantial intergenerational wealth rather than a modest family legacy.

The distributional effects were documented. The Tax Policy Center's analysis found that by 2027, with the individual cuts scheduled to expire in 2025 while the corporate cuts were made permanent, approximately 83 percent of the benefits would flow to the top 1 percent. The promised business investment boom did not materialize. Corporate profits did not translate into capital investment at the rates the bill's supporters had predicted. Instead, American corporations used roughly $1 trillion of the windfall to buy back their own stock, a form of capital return to shareholders that directly benefited owners of corporate equity, who are concentrated at the top of the income distribution, while producing no corresponding increase in productive capacity or worker compensation. The Congressional Budget Office found that the act increased deficits by $1.9 trillion over 10 years. The ratio of public debt to GDP has risen substantially in the years since.

The IRS They Gutted on Purpose

Meanwhile, the Internal Revenue Service, the agency responsible for collecting the taxes the political class had agreed should be owed, was systematically starved. Between 2010 and 2020, the IRS budget was cut by approximately 20 percent in real terms. Staffing fell from roughly 95,000 employees to about 75,000. The audit rate for high-income taxpayers, those with incomes above $1 million, fell from 8.4 percent in 2010 to 2.2 percent by 2019. The audit rate for millionaires became lower than the audit rate for recipients of the Earned Income Tax Credit, a benefit claimed primarily by working families with incomes below $50,000. The IRS was auditing poor families at a higher rate than it was auditing people whose incomes were twenty to fifty times higher, because poor families' returns were simpler, the legal arguments were more uniform, and the enforcement actions could be handled through letters rather than through the complex examinations that high-income returns require.

Complex returns require expertise. The IRS had lost that expertise. Staff with experience in partnership audits, international tax, and high-income examinations retired or left in substantial numbers, and they were not replaced. The 2022 Inflation Reduction Act appropriated $80 billion over 10 years to rebuild IRS enforcement capacity, an investment projected to return several times its cost in collected tax revenue. Before that money could be fully deployed, the second Trump administration, through the Department of Government Efficiency, dismantled much of the rebuilding effort in the first months of 2025. DOGE fired experienced enforcement staff, canceled modernization contracts, and publicly framed IRS enforcement as a partisan campaign against conservatives, a framing that was factually false and politically effective.

What this produces is a tax system that is progressive on paper and regressive in practice. The statutory rates apply to the income the IRS can document. When the IRS cannot document the income, the rates do not apply to it. The ProPublica investigation of 2021, based on IRS data leaked to the publication, documented what had been long suspected but now definitively shown. The very wealthiest Americans, with the most sophisticated tax planning, often pay effective tax rates lower than middle-class families. Jeff Bezos paid zero federal income tax in 2007 and 2011. Elon Musk paid zero in 2018. Measured against wealth growth rather than reported income, the comparison ProPublica called the true tax rate, Michael Bloomberg paid an effective rate of 1.3 percent between 2014 and 2018, and Warren Buffett, who has publicly advocated for higher taxes on the wealthy, paid 0.1 percent over the same period. These rates are legal. They reflect the interaction of the tax code, the IRS's diminished capacity to enforce it, and the tax planning infrastructure available to people wealthy enough to afford it.
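The gap between those two ways of stating an effective rate is pure arithmetic. The figures below are invented for illustration, not drawn from anyone's actual return; they show how one and the same tax bill can look like a substantial rate against reported income and a negligible one against wealth growth, which is the comparison the ProPublica analysis made.

```python
# Hypothetical numbers: the same tax bill, two denominators.
def effective_rate(tax_paid, denominator):
    """An 'effective rate' is just tax paid divided by a chosen base."""
    return tax_paid / denominator

tax_paid        = 25_000_000       # federal income tax over the period (assumed)
reported_income = 110_000_000      # taxable income reported (assumed)
wealth_growth   = 10_000_000_000   # growth in net worth over the period (assumed)

print(effective_rate(tax_paid, reported_income))  # ~0.227, about 23% of income
print(effective_rate(tax_paid, wealth_growth))    # 0.0025, 0.25% of wealth growth
```

Neither number is wrong. They answer different questions, and the second question, what fraction of economic gain is actually taxed, is the one the statutory rates never reach when the gain is unrealized and unaudited.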

What the Counter-Story Would Require

A tax system that actually achieves the progressivity the statutes purport to establish would require three things. A statutory rate structure more like the one that existed before 1981, with top rates substantially higher than 37 percent and an estate tax exemption substantially lower than $22 million. An IRS with the budget and staff to actually audit high-income returns at rates comparable to the rates at which middle-income returns are audited. And a political commitment to maintaining both of those conditions against the sustained, well-funded effort of the people who benefit from the current arrangement to undermine them.

None of these is technically difficult. Each is politically hard, because the people who benefit from the current arrangement have more political power than the people who would benefit from changing it. That asymmetry is not a natural condition. It is the direct product of the institutional apparatus this book has been describing. The think tanks that produce the arguments for supply-side tax policy are funded by the people who benefit from supply-side tax policy. The political campaigns that elect the politicians who vote for supply-side tax policy are funded by the people who benefit from supply-side tax policy. The judicial appointments that produce the legal rulings that restrict efforts to reverse supply-side tax policy flow from an apparatus funded by the people who benefit from supply-side tax policy. The entire system is self-reinforcing. Breaking it requires sustained counter-effort at a scale that has not been organized in the four and a half decades since Reagan's election.

The specific numbers that measure the consequences of 44 years of supply-side policy are available and precise. The top 1 percent's share of national income has roughly doubled. The top 0.1 percent's share has more than tripled. Real wages for the bottom half of workers have barely grown. Public debt has risen dramatically, primarily because the revenues needed to fund the things the public wanted government to do were cut while the spending continued. The infrastructure has been allowed to decay. The universities have been priced out of reach for much of the middle class. The retirement security of working Americans has been systematically weakened.

These are not metaphors or predictions. They are the documented outcomes of policy choices made by specific people for specific reasons over a specific period of time. The people who made the choices have been explicit, in their own writings, about their goals. The goals have largely been achieved. The country the choices produced is the country you are living in.

Sources, Chapter Four

The history of the federal income tax and the tax rates at various points in American history are documented at the Tax Foundation (taxfoundation.org/federal-income-tax-rates-brackets-history) and at the IRS website. For the Laffer curve's origin and Reagan tax cuts, see David Stockman, "The Triumph of Politics: Why the Reagan Revolution Failed" (Harper & Row, 1986). For distributional analyses of the Trump 2017 cuts, see the Tax Policy Center, "Distributional Analysis of the Conference Agreement for the Tax Cuts and Jobs Act" (December 2017). The CBO score of the 2017 cuts is CBO publication "The Budget and Economic Outlook: 2018 to 2028" (April 2018). ProPublica's investigation of the secret IRS files by Jesse Eisinger, Jeff Ernsthausen, and Paul Kiel, beginning June 8, 2021, documents the effective tax rates of the wealthiest Americans. For the IRS budget and audit rate analysis, see the Center on Budget and Policy Priorities and the Treasury Inspector General for Tax Administration reports. The Congressional Research Service study on top marginal rates and growth is Thomas L. Hungerford, "Taxes and the Economy: An Economic Analysis of the Top Tax Rates Since 1945" (Congressional Research Service, 2012).

Chapter Five: The Casino

Glass-Steagall, Its Repeal, and What We Learned Again About Banks Behaving Like Banks

The Law That Remembered

In 1933, the United States was four years into the Great Depression. Banks were failing. Depositors were losing their savings. The financial system that was supposed to allocate capital and protect wealth had become the mechanism for destroying both. Congress responded with the Banking Act of 1933, known after its co-sponsors as the Glass-Steagall Act. It did a simple thing. It separated commercial banking from investment banking. Institutions that accepted deposits from ordinary citizens and made loans to ordinary businesses were prohibited from engaging in speculative securities trading. Investment banks that underwrote securities and made bets on financial markets were prohibited from accepting deposits. The two activities were walled off from each other because experience had taught that when they operated under the same institutional roof, the investment banking side of the business used depositor funds for speculation, lost them, and brought the whole institution down.

The act also established the Federal Deposit Insurance Corporation, which guaranteed deposits up to a specified amount per account, initially $2,500. Together, deposit insurance and the separation of commercial and investment banking produced 66 years of financial stability in the United States. Not because bankers became virtuous. Because the structural incentives changed. Commercial banks competed on prudent lending and customer service. Investment banks competed on their ability to raise capital for productive enterprise. Neither could use the other's business model to transfer risk to depositors who had not signed up for it.

The Pecora Commission hearings of 1932 and 1933, which investigated the causes of the 1929 crash and the subsequent bank failures, produced a detailed record of the abuses that had led to Glass-Steagall. Ferdinand Pecora, the chief investigator, documented how investment bank affiliates of commercial banks had engaged in pump-and-dump schemes, sold worthless securities to unsuspecting depositors, and used depositor funds to prop up speculative positions. The stories were specific. The institutions were named. The testimony was extensive. Congress passed Glass-Steagall with overwhelming majorities because the evidence was overwhelming.

The Forty-Year Erosion

Glass-Steagall did not fall all at once. It was eroded gradually through a series of Federal Reserve interpretations that expanded the definition of what commercial banks could do. In 1987, the Fed allowed commercial banks to earn up to 5 percent of their revenue from underwriting securities, a decision the board approved over the objection of its chairman, Paul Volcker. In 1989, under Alan Greenspan, the limit was raised to 10 percent. In 1996, it was raised to 25 percent. Each of these expansions was challenged by consumer groups and defended by the banking industry. Each was approved by a Fed that had come to view the separation as an anachronism rather than a necessary protection. The philosophy that financial markets would regulate themselves through competitive discipline had captured the regulators. The structural protections Glass-Steagall had provided were disassembled in pieces.

The final repeal came in 1999. Citigroup, formed by the 1998 merger of Citicorp, a commercial bank, and Travelers Group, which included the investment bank Salomon Smith Barney, was operating in a gray area the expanded Fed interpretations had opened up. The merger had been announced in April 1998 and was permitted only under a temporary waiver of the Glass-Steagall provisions that still technically prohibited it. Rather than unwind the merger when the waiver expired, the industry pushed for legislative repeal. The Gramm-Leach-Bliley Financial Services Modernization Act was signed by Bill Clinton on November 12, 1999. It repealed the Glass-Steagall provisions separating commercial and investment banking. The industry got what it wanted. The wall came down.

Senator Byron Dorgan, a Democrat from North Dakota, was one of the few senators to oppose the repeal. In his floor speech, he said the act would produce a financial crisis within a decade. "I think we will look back in ten years' time," he said, "and say we should not have done this but we did because we forgot the lessons of the past." He was off by a year. The crisis came in 2008.

The Culture That Won

The most important consequence of the Glass-Steagall repeal was not immediately visible in any balance sheet or regulatory filing. It was cultural. Nobel laureate Joseph Stiglitz identified it precisely. When investment banking and commercial banking were merged under a single institutional roof, the investment banking culture won. This is not a metaphor. It is a description of what happens when two organizational cultures with fundamentally different risk tolerances and profit orientations are forced to compete for resources and status within the same institution.

Commercial banking culture, the culture that had prevailed in deposit-taking institutions since Glass-Steagall, was conservative by design. Commercial bankers made money by lending money, collecting interest, and not losing their principal. They were paid to assess risk carefully, to know their borrowers, to build long-term relationships with the communities they served. The downside of bad decisions was real and personal. Loans went bad. Banks lost money. Reputations suffered. The upside of good decisions was steady and modest. A profitable, stable institution over decades.

Investment banking culture was different. Investment bankers made money by originating transactions, collecting fees, and moving risk off their books as quickly as possible. The business model was not about the long-term performance of the underlying assets but about the volume of deals done and the fees collected on each one. Risk was someone else's problem once the transaction was complete. The upside of a good year was enormous. Bonuses dwarfed anything possible in commercial banking. The downside was that if the bets went bad, the losses fell on the institution, on its counterparties, on the broader economy. The incentive structure rewarded short-term transaction volume and punished risk aversion.

When the two cultures merged, the investment banking culture's incentive structure of high fees, high volume, and risk transfer proved more attractive to ambitious employees and more profitable in the short term. Commercial bankers who wanted to advance their careers learned to think like investment bankers. Institutions that wanted to compete with the most profitable parts of the industry restructured themselves around the investment banking model. The result was that banks which had held deposits and made conservative loans for decades began originating mortgages not to hold them but to bundle them into securities and sell them to investors, collecting the origination fees and transferring the credit risk. The customer's creditworthiness mattered only long enough to generate the paperwork. After that, the loan was someone else's problem.

The Machine

The securitization of mortgages was not itself a new idea. Fannie Mae and Freddie Mac had been packaging mortgage loans into securities for decades, under regulatory supervision and with explicit guarantees about the quality of the underlying loans. What was new after the Gramm-Leach-Bliley repeal was the extension of securitization to mortgages that Fannie and Freddie would not touch. The subprime. The no-documentation. The adjustable-rate loans with teaser rates that would reset dramatically after an initial period. The NINJA loans, No Income, No Job, No Assets, made without verifying that the borrower had any of the three.

The machine worked like this. Mortgage brokers originated loans to borrowers they knew could not sustain them in the long run, collecting origination fees and selling the loans to banks. The banks bundled the loans into mortgage-backed securities, collected structuring fees, and sold them to investors. Rating agencies, paid by the banks, assigned investment-grade ratings to tranches of these securities that did not deserve them, because the mathematical models used to assess the credit quality of the bundles assumed that housing prices could not fall simultaneously across the entire country. They had never done so in the postwar period. The models extrapolated from a world in which Glass-Steagall still functioned, in which the originators of mortgages held them on their own books and therefore had a reason to care whether borrowers could pay them back.

By 2006, one-third of all mortgages originated in the United States were subprime or no-documentation loans. In 2005, 43 percent of first-time homebuyers made zero down payment. The five largest investment banks had leveraged themselves to ratios of 30 or 40 to one, borrowing $30 or $40 for every $1 of their own capital to amplify their returns. The SEC had explicitly permitted this in 2004, granting an exemption to the net capital rule for the five largest firms: Goldman Sachs, Lehman Brothers, Merrill Lynch, Bear Stearns, and Morgan Stanley. The exemption allowed them to take on debt levels that would have been considered dangerously reckless in any other era. All five of those firms failed or required government support in 2008.

September 2008

The housing bubble peaked in 2006. Home prices began falling in 2007. By August of that year, pressures were emerging in the market for asset-backed commercial paper, the short-term funding that much of the securitization machine depended on, as investors became nervous about exposure to subprime mortgages. In March 2008, Bear Stearns collapsed and was acquired by JPMorgan Chase in a transaction facilitated by the Federal Reserve. In July, Congress authorized the Treasury to backstop Fannie Mae and Freddie Mac, and on September 7 the government placed them in conservatorship. The two companies together owned or guaranteed roughly half of the American mortgage market.

On September 15, 2008, Lehman Brothers, the fourth-largest investment bank in the United States, filed for Chapter 11 bankruptcy protection. It was the largest bankruptcy filing in American history. $639 billion in assets. $613 billion in debts. The firm had survived the Civil War, two world wars, the Great Depression, and every financial crisis of the previous century and a half. It could not survive the concentrated exposure to mortgage-backed securities and the leverage ratios that the post-Glass-Steagall environment had made possible and the SEC had explicitly permitted. Treasury Secretary Hank Paulson and Federal Reserve Chairman Ben Bernanke spent the weekend trying to arrange a rescue, approaching Barclays and Bank of America as potential buyers. Neither would take on the full scope of the losses. The negotiations collapsed, and at 1:45 in the morning, Lehman filed.

The Dow fell 504 points the day of the filing, its worst single-day decline in seven years. The day after, the Federal Reserve lent $85 billion to American International Group, AIG, which had insured trillions of dollars' worth of mortgage-backed securities through credit default swaps and now faced obligations it could not meet. Investors began withdrawing money from money market funds at a rate not seen since the Depression: $144 billion in the week that followed. The short-term lending markets that financed the day-to-day operations of ordinary businesses froze. The United States was within days of a complete seizure of the credit system.

Congress passed the Troubled Asset Relief Program, TARP, on October 3, authorizing $700 billion for the purchase of toxic assets and the recapitalization of the banking system. The nation was, as the Treasury Department later acknowledged, losing nearly 800,000 jobs a month. Household wealth had fallen by 17 percent, more than five times the decline in 1929. From peak to trough, gross domestic product fell 4.3 percent, making the Great Recession the deepest since World War II. Unemployment reached 10 percent. Home prices fell more than 20 percent on average nationally. Six million households lost their homes to foreclosure. The total loss in household wealth, retirement accounts, home equity, and savings was roughly $11 trillion.

Who Was Rescued and Who Was Not

The banks were rescued. The households were not.

In Stockton, California, the foreclosure crisis hit with a force that turned an ordinary working-class city into a symbol. Stockton had grown rapidly during the housing boom, fueled by subprime mortgages that allowed buyers who could not afford conventional loans to purchase homes at the peak of inflated prices. When the bubble collapsed, the city's homeownership rate, which had been a source of civic pride, became a liability. The unemployment rate rose above 20 percent. The city filed for bankruptcy in 2012, the largest municipal bankruptcy in American history at the time. Libraries closed. Police were laid off. Streetlights went dark to save money on electricity. The people who had taken out mortgages they were told they could afford, on terms the originators had designed to stay affordable only while rates stayed low, lost their homes when the rates reset. The mortgage originators had collected their fees and sold the loans. The investment banks had collected their structuring fees and sold the securities. The rating agencies had collected their fees and assigned investment-grade ratings. Stockton paid.

The racial dimensions of the subprime mortgage crisis were not incidental. Investigative reporting and academic research documented that mortgage lenders systematically steered Black and Latino borrowers into subprime loans even when those borrowers qualified for conventional mortgages. The practice, called reverse redlining, was the precise inverse of the original redlining. Instead of refusing to lend in minority neighborhoods, lenders targeted minority borrowers with the most expensive and risky products. The result was that Black and Latino households, who had been building wealth through homeownership at higher rates than ever before in the early 2000s, lost that wealth at higher rates than white households when the crash came. A Pew Research Center study found that the median wealth of Hispanic households fell 66 percent between 2005 and 2009, and the median wealth of Black households fell 53 percent. The median wealth of white households fell 16 percent. The crash did not distribute its pain equally. It landed hardest on the people who had been told, for years, that they were finally getting their share of the American prosperity they had always been excluded from.