The Post World War II Boom: How America Got Into Gear
In the summer of 1945, as World War II drew to a close, the U.S. economy was poised on the edge of an uncertain future.
Since President Franklin D. Roosevelt’s call in late 1940 for the United States to serve as the “arsenal of democracy,” American industry had stepped up to meet the challenge. U.S. factories built to mass-produce automobiles had retooled to churn out airplanes, engines, guns and other supplies at unprecedented rates. At the peak of its war effort, in late 1943 and early 1944, the United States was manufacturing almost as many munitions as all of its allies and enemies combined.
On the home front, the massive mobilization effort during World War II had put Americans back to work. Unemployment, which had reached 25 percent during the Great Depression and hovered at 14.6 percent in 1939, had dropped to 1.2 percent by 1944—still a record low in the nation’s history.
A new assembly line at the Detroit Tank Arsenal, operated by Chrysler, turned out 28-ton tanks by mass-production methods.
Even before the war ended, U.S. business, military and government officials began debating the question of the country’s reconversion from military to civilian production. In 1944, Donald Nelson of the War Production Board (WPB) proposed a plan that would reconvert idle factories to civilian production. Powerful military and business leaders pushed back, and plans for widespread reconversion were postponed.
But with the war wrapping up, and millions of men and women in uniform scheduled to return home, the nation’s military-focused economy wasn’t necessarily prepared to welcome them back. As Arthur Herman wrote in his book Freedom’s Forge: How American Business Produced Victory in World War II, U.S. businesses at the time were still “geared around producing tanks and planes, not clapboard houses and refrigerators.”
What Caused the Great Depression?
Throughout the 1920s, the U.S. economy expanded rapidly, and the nation’s total wealth more than doubled between 1920 and 1929, a period dubbed “the Roaring Twenties.”
The stock market, centered at the New York Stock Exchange on Wall Street in New York City, was the scene of reckless speculation, where everyone from millionaire tycoons to cooks and janitors poured their savings into stocks. As a result, the stock market underwent rapid expansion, reaching its peak in August 1929.
By then, production had already declined and unemployment had risen, leaving stock prices much higher than their actual value. Additionally, wages at that time were low, consumer debt was proliferating, the agricultural sector of the economy was struggling due to drought and falling food prices, and banks had an excess of large loans that could not be liquidated.
The American economy entered a mild recession during the summer of 1929, as consumer spending slowed and unsold goods began to pile up, which in turn slowed factory production. Nonetheless, stock prices continued to rise, and by the fall of that year had reached stratospheric levels that could not be justified by expected future earnings.
Great Depression: American Social Policy
America in the 1920s was a prosperous nation. Savings during the decade quadrupled. 1 A “housing boom” enabled millions of Americans to own their own home. By 1924, about eleven million families were homeowners. Automobiles, electricity, radio, and mass advertising became increasingly influential in the lives of average Americans. Automobiles, once a luxury for rich Americans, now gave industrial workers and farmers much greater mobility. Electricity put an end to much of the backbreaking work in the American home. Electric refrigerators, irons, stoves, and washing machines eventually became widespread. 2 On the farm, electric tools such as electric saws, pumps, and grinders made farmers more productive. By 1922, radios were common sources of news and entertainment for American families. With improvements in transportation and communication came increases in the mass advertising industry. In addition to all of this, corporations increasingly offered workers fringe benefits and stock-sharing opportunities. 3
The Great Depression
The overall prosperity of the United States in the 1920s overshadowed the chronic poverty of certain vulnerable populations. These were the same populations that had always been at risk in American history: children, older Americans, minorities, female-headed families, people with disabilities, and workers with unstable or low-paying jobs. According to James T. Patterson, author of America’s Struggle Against Poverty: 1900-1994, about one-fourth of the population in southern rural areas consisted of poor sharecroppers and tenant farmers. 4 Over a third of these small farmers were African Americans. This is what Patterson refers to as the “old poverty.” 5 The “new poverty” began with the famous stock market crash of 1929 and the onset of the Great Depression. This is when many middle and upper-income families first experienced poverty in America. These were hard-working people who fully shared the values and ideals of the American dream, people who had enjoyed the strong economy of the 1920s and had bought the homes, refrigerators, and automobiles. The sudden and severe downturn of the American economy left many of these people in shock and denial. Some became suicidal. Between 1929 and 1933, unemployment in the United States jumped from 3.2 percent to 24.9 percent, almost a quarter of the official labor force. 6 This represented 12.8 million workers. 7 Unemployment in some cities was as high as 80 percent, 8 out of 10 workers. 8 During this period, consumer spending declined 18 percent, manufacturing output dropped 54 percent, and construction spending plummeted 78 percent. Eighty percent of production capacity in the automobile industry came to a halt. By 1932, many politicians, businessmen, and journalists started to contemplate the possibility of massive revolution in the United States. 9 In fact, thousands of the most desperate unemployed workers began raiding food stores. 
Reminiscent of the food riots during the breakdown of the feudal system in Europe, this looting became widespread by 1932. Demonstrations by the poor demanding increased relief often resulted in fights with the police. In places like Harlem, the “sit-down strike” became part of the strategy during these relief demonstrations. A Pittsburgh priest named Father James R. Cox attracted 60,000 people to a protest rally; 12,000 of these followers later joined Cox in Washington to protest in front of President Herbert Hoover. When 5,000 war veterans demonstrated in Washington in the spring of 1932, Hoover sent none other than General Douglas MacArthur and Major Dwight Eisenhower to break up the rally. One observer describes the treatment of the veterans:
“The police encircled them. There was some brick throwing. A couple of police retaliated by firing. A man was killed and another seriously wounded….To my right…military units were being formed….A squadron of cavalry was in front of this army column. Then, some staff cars, and four trucks with baby tanks on them, stopped near the camp. They let the ramps down and the baby tanks rolled out into the street….The 12th Infantry was in full battle dress. Each had a gas mask and his belt was full of tear gas bombs….They fixed their bayonets and also fixed the gas masks over their faces. At orders, they brought their bayonets at thrust and moved in. The bayonets were used to jab people, to make them move….The entire block was covered by tear gas. Flames were coming up, where the soldiers had set fire to the buildings housing protesters to drive these people out.” 10
The Political Response—Franklin D. Roosevelt and The New Deal
Upon taking office, Franklin D. Roosevelt (FDR) was told by one observer that, given the present crisis, he would be either the worst or the greatest president in American history. Roosevelt is said to have responded: “If I fail, I shall be the last one.” 11 By the time Franklin Roosevelt was elected in 1932, the traditional ideologies and institutions of the United States were in a state of upheaval. 12 Americans who had grown up promoting the ideology of the “deserving and undeserving poor” and the stigma of poor relief were now standing in line for relief. Private nonprofit organizations such as Community Chests, although valiant in their effort, were overwhelmed with requests, unable to meet the needs of their communities. State and local governments, ultimately responsible for their poor throughout American history, now looked for financial assistance. What was needed was an expanded institutional partnership between the federal government and the other sectors of American society in promoting social welfare. In the past, the federal government had been active in other areas such as railroad development and war veteran pensions. However, the American belief, as earlier expressed by President Franklin Pierce to Dorothea Dix, was that the federal government should not be involved in providing poor relief. 13 But now the size of this national crisis required a national solution. The federal government was in the best position to initiate and coordinate national efforts among public, private, and nonprofit sectors of society. As the crisis deepened, progressive leaders and average Americans increasingly demanded that the federal government take greater responsibility in relieving and preventing poverty.
Two of the more radical policy proposals to address the Great Depression were put forth by Senator Huey Long of Louisiana and Dr. Francis Townsend of California. 14 Long (who was later assassinated) proposed a “share the wealth” program where millionaires would be taxed to fund pensions for anyone over 60 years of age. The cost of the program, to be funded by an income tax, was projected to be $3.6 billion, a colossal amount of money at the time. Townsend proposed a special sales tax to pay every American citizen over 60 (except convicted felons) $200 per month. The total cost of the proposal was estimated to be $2.4 billion. About 25 million people signed petitions in support of Townsend’s plan! Consequently, the Roosevelt Administration established a two-tier federal system of insurance and relief programs. But to address the social unrest throughout the nation, Roosevelt took immediate action to create job opportunities. He did so by establishing several federal agencies and programs. 15 One was the Federal Emergency Relief Administration (FERA), which was created by the Federal Emergency Relief Act in 1933.
As its name suggests, FERA was given primary responsibility for managing the effort to distribute federal relief funds to individual states. The relief funds were used to sustain unemployed families during the immediate crisis. The Civil Works Administration (CWA) was actually part of FERA. This federal program created jobs in public works. These public sector jobs included road repair, the digging of drainage ditches and the maintenance of local parks. The Public Works Administration (PWA), created in 1933, also focused on public works. However, in contrast to the CWA, it focused on complex public works such as dams and airports. Another program started in 1933 was the Civilian Conservation Corps (the CCC, of course!). The target population of this program was unemployed youth. That is, the Civilian Conservation Corps provided jobs for youth in various parks. The U.S. Army was used to supervise the youth. Furthermore, Congress passed the Wagner-Peyser Act in 1933. This legislation provided federal funding to individual states to develop employment offices. Only 23 states had such services before 1933. And finally, though not directly job-related, emergency food programs were set up to prevent starvation. For instance, surplus agricultural goods were distributed to the poor. Also, a relatively small-scale “food stamp” program was established for needy federal workers.
Federal reforms during the FDR Administration also included reforms to stabilize the economic sector. 16 These included creation in 1933 of the National Recovery Administration (NRA). This controversial program, which was declared unconstitutional by the Supreme Court in 1935, temporarily threatened capitalist ideology by directly intervening in the “supply and demand” workings of the market. More precisely, this federal initiative sought to stabilize the economy by establishing wage and price agreements to curb the slashing of prices and wages during the depression. To further support product prices, production quotas were established to deter the “dumping” of surplus inventories of products on the consumer market. Similarly, the Agricultural Adjustment Agency was created to curtail farm production in order to maintain higher farm prices (and prevent further bankruptcies in the farm sector). Also established in 1933 was the Federal Deposit Insurance Corporation (signified by the FDIC window sticker at your local bank). A primary responsibility of this entity was to restore public confidence in the banking system. The FDIC worked with participating banks to insure consumer bank deposits against bank insolvency. The federal government also collaborated with banks to address the millions of farms and homes threatened with foreclosure. For example, the federal government directly purchased from banks and refinanced (at a lower interest rate) the mortgages of needy farmers through passage of the Emergency Farm Mortgage Act and the Farm Relief Act. Both were enacted in 1933.
A year later the National Housing Act established the Federal Housing Administration (FHA). Through this program the federal government insured home mortgages and home improvement loans, allowing banks to refinance the loans of needy families at lower interest rates. Additional economic reforms included the establishment of the Tennessee Valley Authority (TVA) in 1933 and the Securities and Exchange Commission (SEC) in 1934. The goal of the TVA was to facilitate economic development in that region of the country. To this end, dams and generating plants were constructed, providing inexpensive electric power to the region. The TVA also developed flood-control projects, manufactured and sold fertilizer, and reforested large tracts of land. Regarding the Securities and Exchange Commission, many people felt that rampant speculation in the stock market played a significant role in causing the stock market crash and subsequent depression. Therefore, the Securities and Exchange Commission took on the responsibility of regulating speculation abuses by investors and stockbrokers.
Question for Discussion: Presidents and Disabilities

Franklin D. Roosevelt is generally considered to be one of the three greatest presidents in American history, along with Lincoln and Washington. FDR also happened to have a disability, coping with infantile paralysis or “polio” throughout much of his adult life. Because the disease left his legs paralyzed, he could not walk without assistance. 17 Yet, during his campaign for president, FDR traveled 13,000 miles by train and made 16 major speeches. 18 Throughout his presidency, people were amazed at his energy and optimism. He held office longer than any president in American history, leading the United States through two of its biggest crises in the Twentieth Century, the Great Depression and World War II. Could Roosevelt be elected president today? How would the press cover his disability? How would the voters react to a candidate who could not walk without assistance?

This first set of reforms, as previously stated, was an emergency stop-gap measure. From November of 1934 to November of 1936, the Roosevelt Administration implemented a second set of reforms meant to define an ongoing responsibility of the federal government, a responsibility for social welfare similar to that found in European nations. 19 The major piece of legislation passed during this period was the Social Security Act of 1935.
This legislation constituted a package of social programs consisting of both insurance and poor relief (later referred to as “public assistance” or “welfare”). With respect to insurance, the act contained both unemployment insurance and old age pensions (commonly known as “Social Security”). Unemployment insurance was very unpopular with business leaders. To illustrate, as late as 1931, Henry Ford persisted in blaming mass unemployment on individual laziness. He claimed there was plenty of work for those who wanted it! 20 Yet, by packaging unemployment insurance with more popular programs such as old age pensions, Roosevelt was able to pass the legislation. The Social Security Act also contained several federal poor relief programs. Meant to be a continuing federal responsibility, these programs included Old Age Assistance, Aid to the Blind, and Aid to Dependent Children (ADC). 21 ADC, as the name suggests, targeted relief to poor children in single parent families. It was not until 1950 that the single parent became officially eligible for assistance also. Note that prior to the New Deal, relief was a tool used by social workers to rehabilitate. 22 To get relief, a person had to accept rehabilitation services from a social worker (including a significant dose of moral instruction!) With the New Deal, poor relief became a right of American citizens meeting certain eligibility standards, including, of course, financial need. In other words, poor relief became, not a “means” to rehabilitation, but rather, an “end in itself.” The Social Security Act promoted cooperation between the federal government and the states in providing poor relief through the use of “matching funding formulas.” 23 That is, for every dollar of state funding expended in the Old Age Assistance, Aid to the Blind, and Aid to Dependent Children programs, the federal government contributed a specified percentage of funding.
Yet, the legislation allowed each state to determine eligibility standards and levels of benefits. Also contained in the legislative package were a number of smaller scale health and human service programs.
These included child welfare and maternal health programs in Title V of the act and public health programs in Title VI of the legislation. During this second round of reforms, the Roosevelt Administration continued to confront massive unemployment and labor unrest. Numerous strikes took place throughout the country. To support the rights of union organizers, the Wagner Act was passed in 1935. 24 This legislation established the National Labor Relations Board. The board enforced the right of workers to start their own unions. For instance, specific procedures for starting unions were outlined, including voting procedures for choosing a collective bargaining agent. The Roosevelt Administration also implemented major federal initiatives during this “second New Deal” that were later terminated. 25 One was the Works Progress Administration (WPA), which replaced the Federal Emergency Relief Administration created at the start of the New Deal. About 85% of program participants were receiving poor relief. Program eligibility was limited to one member of each family. Because this was typically a male, the program was considered by some to be discriminatory. In any case, the WPA employed two million people a month building libraries, schools, hospitals, parks, and sidewalks. 26
Eleanor Roosevelt was a strong advocate of a major program located within the WPA called the National Youth Administration. 27 A forerunner of modern student financial assistance, this program allowed high school and college students to finish their education by providing part-time public sector jobs. It also established rural camps where youth could learn trade skills. The WPA also funded several projects which put people in the arts to work. 28 For example, the New Deal established the Federal Theater Project, which created jobs for actors and playwrights and entertainment for laborers. In addition, a Federal Writers Project and a Federal Art Project were funded. In so doing, writers were put to work preparing items such as tourist guides to American states and cities, while artists painted murals on the walls of public buildings. After 1936, the Roosevelt Administration met greater opposition to its reform agenda from Republicans and conservative Democrats.
There were several reasons for this opposition. 29 First of all, the New Deal had not succeeded in ending the depression. The national economic troubles continued despite the broad array of reforms. Secondly, many political and business leaders felt uncomfortable with Roosevelt’s continued deficit spending. (To fund the New Deal and stimulate economic growth, the Roosevelt Administration spent more than the federal government was actually receiving in tax revenue.) A third reason for the opposition to further reform was the fear of socialism in America. The New Deal, with its massive public employment and national poor relief programs, was a fundamental change in America’s institutional structure, a change that threatened the ideology of the nation’s conservative leaders. Adding to this fear was the growing power of labor unions across the country. Roosevelt, after all, had supported legislation (the Wagner Act) to facilitate this development, despite the opposition of business leaders. All of these developments led to a growing resentment by conservative Republicans and Democrats of Roosevelt’s Administration, the so-called “brain trust.” Hence, the growing opposition to additional social reform. Despite this opposition, the Roosevelt Administration did manage to get the Wagner-Steagall Housing Act passed in 1937. 30 This act established the U.S. Housing Authority, which provided low-interest loans to local government for the development of public housing. Another late New Deal success was the Fair Labor Standards Act, passed in 1938. This legislation established minimum wages and maximum work hours. (Remember that both minimum wages and maximum work hours were part of the policy agenda of the earlier Progressive Era.) However, to appease southern interests, the legislation did not cover farm labor.
The Role of Social Work in the New Deal
By the beginning of the Great Depression, social work in the United States had experienced much growth and maturation as a professional discipline. Responding to the criticism that social work was made up of kind-hearted people doing activities that almost anyone could do, Mary Richmond’s 1917 publication, “Social Diagnosis,” provided a “body of knowledge” for professionalization. 31 The book emphasized casework techniques that focused on the person in their environment. That is, although Richmond held the sociological perspective that individual problems were rooted in the social environment (unemployment, etc.), her book adopted a medical model process of differential diagnosis of individual cases. Based on this careful collection of client information, treatment would then consist of some combination of individual and environmental change. (It should be noted, however, that Richmond was not a great enthusiast for “wholesale” social reform, preferring instead “retail” interventions.) As the decade of the 1920s progressed, the social work profession increasingly reflected the conservative trend across the nation. 32 Times were good; jobs were plentiful. Once again, social problems such as poverty and unemployment were traced to the individual.
Psychiatric social work, led in part by Smith College, became the rage within the profession. In the process, the psychoanalytic work of Sigmund Freud, which became popular nationally, provided social workers with needed theory and individual treatment methods. In the 1920s, society viewed individual dysfunction as a sign, not of immorality so much as of emotional disorder. As John Ehrenreich put it, individual need was not a matter for Saint Peter as much as it was for Saint Sigmund. In any case, the emphasis on casework facilitated the professionalization of social work for numerous reasons. 33 Casework was much less threatening to the middle and upper classes than cause-related social work, better known as social reform. In fact, business and professional people were a ready clientele for psychoanalysis. To establish itself as a profession, social work needed the support of these middle and upper-income groups. It needed their fees for service; it needed their sanction. Thus the profession of social work, with its growing emphasis on casework, fit the social, economic, and political needs of the conservative and prosperous 1920s.
By 1929, there were 25 graduate schools of social work. 34 Several professional organizations had been established, including the American Association of Social Workers in 1921. In addition, to further research-based knowledge, several professional journals were developed, including “The Compass,” which was later renamed “Social Work.” When Franklin Roosevelt took office, he made several social workers prominent figures in his administration. This is despite the fact that the profession as a whole was reluctant to return to a social reform (i.e., “macro”) emphasis. 35 Private nonprofit organizations remained the dominant provider of casework by social workers. Yet, during the New Deal, public agencies primarily distributed relief funds to the needy. This is where the action and the jobs were to be found. And, as stated, social workers played major roles in policy development. FDR’s wife, Eleanor Roosevelt, was probably the most influential person in the White House. Although she did not hold a “social work” degree, Eleanor received on-the-job training working in New York settlement houses. 36
In fact, her approach to the role of First Lady reflected the settlement philosophy of “research and reform.” Her trips around the nation and the world collecting information for her husband are legendary. She attracted much press coverage and seemed to be everywhere. She was his eyes and ears, his data collector. He knew he could count on her to bring back detailed information concerning public sentiment and social need. All of this “research” was a prerequisite for developing the social policy of the New Deal. Harry Hopkins, a social worker with settlement house experience, was the next most influential person to the President. In fact, it was Eleanor who first observed Hopkins as a passionate, young social worker in New York and referred him to her husband. 37 After managing Roosevelt’s relief program in New York, Hopkins was selected to head the Federal Emergency Relief Administration, and later its successor, the Works Progress Administration. 38
A third prominent member of the Roosevelt Administration with social work training and settlement house experience was Frances Perkins. Perkins was the first woman appointed to the President’s Cabinet in U.S. history, serving as Secretary of the Department of Labor. 39 Early in her career, she worked at two Chicago settlement houses, Hull-House and Chicago Commons. 40 In 1909, she attended the New York School of Philanthropy (which would become the Columbia University Graduate School of Social Work) to learn survey research methods. A year later, she received her Master’s Degree in Political Science from Columbia University. Before becoming Labor Secretary, Perkins had headed Roosevelt’s New York State Industrial Board, a position in which she advocated for safer factory and labor standards. 41 Other influential social workers in the Roosevelt Administration included Grace Abbott, Paul Kellogg, Adolph Berle, Henry Morgenthau, Jr., and Eduard Lindeman. 42
In addition to these prominent policy development roles, the New Deal created thousands of new “rank-and-file” jobs in social work. In fact, the Federal Emergency Relief Act required that every local public relief administrator hire at least one experienced social worker on their staff. 43 This requirement introduced social work ethics and methods into every county and township in America. During the 1930s, the number of employed social workers doubled, from about 30,000 to over 60,000 positions. This job growth created a major shift in social work practice from primarily private agency settings and clinical roles to public agencies and social advocacy. The New Deal also expanded the scope of social work from a primarily urban profession to a nationwide profession practicing in rural areas as well.
Did You Know?
Harry Hopkins, a social worker, was so respected by President Franklin Roosevelt that, before Hopkins’ health started to deteriorate, some believed that Roosevelt was grooming him to be the next President of the United States. 44 During World War II, Roosevelt sent Hopkins to be his special representative in talks with both Winston Churchill and Joseph Stalin.
Successes and Failures of the New Deal
The New Deal had many shortcomings. 45 As stated earlier, it was World War II that did the most to solve unemployment during the Great Depression. And although the Social Security Act contained some relatively small health programs, the New Deal as a whole established no major national health program. Furthermore, to appease southern politicians and get some reform legislation passed, Roosevelt did relatively little to help African Americans. 46 Many of these citizens were employed as domestic servants, migrant workers, and farm laborers. New Deal legislation concerning old age pensions, unemployment insurance, and minimum wages did not cover workers in these occupations. Perhaps most regrettable from an ethical standpoint, the New Deal contained no anti-lynching legislation, even though the beating and lynching of black citizens was still a common occurrence in some parts of the nation.
If America as a nation suffered during the Great Depression, African Americans and other minorities suffered worst of all. 47 Eleanor Roosevelt was probably the most powerful political ally of African Americans during the Roosevelt Administration. As historian Doris Kearns Goodwin has noted, Franklin Roosevelt thought in terms of what could be done politically, while Eleanor thought in terms of what should be done ethically. 48 While inspecting conditions in southern states for her husband, Eleanor discovered discrimination against African Americans in several New Deal programs. For instance, African Americans in southern work relief programs under the WPA received lower wages than their white counterparts. As a result, Eleanor made sure that black leaders received a hearing at the White House, resulting in a 1935 executive order from the President barring discrimination in WPA programs.
In the context of the times, actions such as these showed African Americans that Franklin and Eleanor Roosevelt did care about them. More importantly, this advocacy gave young African Americans a glimpse of the potential power of the federal government regarding civil rights. Whatever its shortcomings, the New Deal prevented many Americans, black and white, from starving to death during the Great Depression. While challenging the ideologies of the status quo in the United States, it reformed national institutional structures to meet the massive needs of millions of Americans in poverty. In doing this, the New Deal created a major federal health and human service system in addition to the services of local public and private agencies. The Social Security Board, set up to administer the Social Security Act, later became the United States Department of Health, Education, and Welfare. 49 And the Social Security Act became, and still is, the foundation of the American health and human service system.
Personal Profile: Mary McLeod Bethune
Mary McLeod Bethune, the daughter of former slaves, became head of the Division of African-American Affairs within the National Youth Administration in 1936. She used this position to advocate for the needs of African Americans during the Great Depression, directing a more equitable share of New Deal funding to black education and employment. 50 Born in 1875 in Mayesville, South Carolina, Bethune received a scholarship to Scotia Seminary for Negro Girls in Concord, North Carolina. She later attended the Moody Bible Institute in Chicago from 1894 to 1895. 51 In 1904, she founded the Daytona Educational and Industrial School for Negro Girls in Daytona Beach, Florida, a school that later merged with the Cookman Institute of Jacksonville to become Bethune-Cookman College. An educator, organizer, and policy advocate, Bethune became one of the leading civil rights activists of her era. 52 She led a group of African American women to vote after the 1920 ratification of the 19th Amendment to the Constitution (giving women the right to vote). In her position in the National Youth Administration, she became the highest paid African American in the federal government and a leading member of the unofficial “Black Cabinet” of the Roosevelt Administration. She later became the first African American woman to have a monument dedicated to her in Washington, D.C.
Critical Analysis: Business, the Great Depression, and the New Deal
Given the primary role that the private for-profit market plays in American social welfare, the Great Depression represented the greatest failure of the business sector in American history. As a result of the massive economic collapse in the wake of the stock market crash in 1929, the federal government assumed a much larger role in promoting social welfare. This new partnership among U.S. institutional sectors developed quickly, at times over the opposition of business leaders. To illustrate, both the U.S. Chamber of Commerce and the National Association of Manufacturers considered the Social Security Act too radical. 53 Yet, there was much less opposition to the Social Security Act (with its employer contributions) than the Roosevelt Administration expected. In fact, some prominent business leaders such as Gerard Swope of General Electric and Marion Folsom of Eastman Kodak publicly supported the legislation. At the same time, many social reformers attacked the Social Security Act and other New Deal legislation for being too moderate, too sexist, and too racist. Were they correct? Should the New Deal have replaced, rather than cautiously reformed, many U.S. institutions? Were Roosevelt and the New Deal too accommodating to the interests of conservative business and political leaders? Did America miss a fundamental opportunity for significant progress in terms of social and economic justice?
Social Policy in Post-War America
Economic Context: Automobiles, Suburbs, and Corporate Social Responsibility
The late 1940s and the decade of the 1950s witnessed an increasingly strong U.S. economy. The victory of the United States and its Allies in World War II left the United States economy positioned for world leadership. The economic infrastructures of Europe, Japan, and the Soviet Union had suffered tremendous destruction during the war, while the United States’ economy, boosted by war production, recovered from the Great Depression. As the nation entered the 1950s, the U.S. economy boomed, facilitated by federal government policies, especially in the automobile and housing industries. In fact, there was a large, pent-up demand for most products. General Motors was the world’s largest, richest corporation and would soon pass the billion-dollar mark in gross revenues. 54 The Interstate Highway Act of 1956 provided billions of dollars for highway construction, thereby fueling the demand for automobiles by a growing population. Millions of Americans saw the opportunity to keep their urban industrial jobs while living in the suburbs. Once again, the federal government (working in partnership with the private banking industry) made possible low-interest home mortgages for these consumers, mortgages guaranteed by federal agencies such as the Veterans’ Administration and the Federal Housing Administration.
In addition, developer William J. Levitt began mass-producing affordable homes for middle-class Americans. While the economy grew, American businesses began to shift their priorities for charitable giving. Experiences of the Great Depression, New Deal, and World War II prompted American businesses to increasingly direct donations to community groups other than the traditional health and human services of the local community chests. The transition was facilitated by a 1953 ruling of the Supreme Court of New Jersey. The ruling legitimized corporate charitable giving, not only in the traditional terms of “direct benefit” to the corporation, but also in terms of the broad social responsibilities of corporations to the nation. 55 Prior to this court ruling, corporate charitable gifts could be legally justified to stockholders only if the donation was a direct benefit to the corporation or its employees. For example, a donation by a railroad company to a local YMCA that provided housing for railroad workers was legal. The ruling interpreted “direct benefit” to mean a benefit to the free enterprise system and not solely to the corporation or its employees.
Thus, a legal precedent was established for corporate giving to a wider range of causes, including educational, cultural, and artistic organizations. At the same time, American corporations were becoming more aware of their responsibility to a wide range of community groups. 56 Throughout the 1930s, the business sector faced resentful, hostile public opinion as a result of the collapsed economy and widespread suffering. The subsequent New Deal legislation, as previously stated, was perceived by business as an enormous threat to the free market system. In addition to the unprecedented increase in the federal government’s responsibility for national social welfare, the business sector feared future increases in government regulation. Thus, business was presented with the option of acknowledging its broader social welfare responsibilities on a voluntary basis or through increased government regulation. As in the Progressive Era, business leaders responded to the threat of further regulation with a renewed emphasis on management professionalism and corporate social responsibility. 57
The idea of business management as the trustee for society at large was increasingly stressed in the business sector. Business management became more responsive to multiple groups in its environment: stockholders, employees, retirees, consumers, government, and local communities. For example, in 1954, General Electric became the first corporation to match employee and retiree contributions to charity with a corporate donation (i.e., “matching gifts”). 58 Furthermore, this broad range of stakeholders began efforts to hold corporations more accountable for their policies and social impact (eventually resulting in the “consumer movement” and “ethical investing”).
The Political Context: McCarthy and the Red Scare
Although the federal government worked with the business sector during the 1950s to build homes and highways, there was relatively little new social reform passed at the federal level. 59 Major New Deal programs such as Social Security survived the conservative political climate of the 1950s thanks to strong support by America’s growing middle class. However, the administrations of Harry Truman (1945-1952) and Dwight Eisenhower (1953-1960) were relatively dormant with respect to major new social reform. The legislation that was passed included the 1946 National School Lunch Program, the 1946 National Mental Health Act (providing grants to states for mental health services), and the 1954 School Milk Program. 60 One of the primary reasons for the lack of major new social reform during this period was the national concern about the growth of communism. As indicated earlier, some of the big government programs of the New Deal had been criticized for being communistic.
American labor unions, to varying degrees, were influenced by Communist members. By this time, moreover, the Soviet Union and China had emerged from World War II as military powers capable of rivaling the United States around the world. Events such as the postwar Soviet expansion in Eastern Europe alarmed a U.S. population that had recently witnessed the global aggression of Adolf Hitler. 61 At the same time, Communist Parties were gathering strength in countries such as France and Italy. 62 Consequently, the spread of communism became the number one voter concern. 63 Perhaps even more alarming to U.S. political leaders were government reports that the Soviet Union, in its quest for world domination, was secretly developing atomic weapons and sponsoring espionage activity in the United States. President Truman responded to (and fueled) this “Red Scare” by setting up the Federal Employee Loyalty Program in 1947. 64 The program’s goal was to eliminate subversive employees in the U.S. government.
In the same year, the House Un-American Activities Committee (which included a young Congressman named Richard Nixon) began a series of investigations of Communist infiltration of American labor unions, government, academia, and the motion picture industry. During these investigations, a senior editor from Time magazine, Whittaker Chambers, admitted to being a former member of the Communist Party and identified a former top U.S. State Department official and Secretary-General of the founding United Nations conference, Alger Hiss, as a Communist doing espionage work for the Soviet Union. The Red Scare became even more frightening in 1949 when President Truman announced that the Soviet Union had detonated an atomic bomb and when Mao Tse-tung declared communist sovereignty over the entire Chinese mainland. Then in 1950, Alger Hiss was found guilty of perjury in denying that he had committed espionage for the Soviet Union. 65 By the time Senator Joseph McCarthy claimed later that year to have a list of Communists working in the U.S. State Department, the Red Scare had become hysterical.
Implications for the Social Sector and Social Work
This sociopolitical environment generated much public support for a “Cold War” anti-communist foreign policy. Yet, it also turned public support against further social reform. 66 The writings of Karl Marx were banned from bookstores. Universities refused to invite “controversial” speakers. Radical militant unions were expelled by the Congress of Industrial Organizations (“CIO”). In the end, this anti-communist sentiment along with a strong economy resulted in relatively little interest in major social legislation by the Truman and Eisenhower Administrations. The conservative trend of the 1940s and 1950s was, again, reflected in the social work profession. That is, the focus of social work returned to professional status and to individual treatment (i.e., casework) rather than the social reform of the New Deal era. 67 In 1952, the Council on Social Work Education was established, providing a standard accrediting body, and three years later several professional organizations were merged to form the National Association of Social Workers (NASW). Furthermore, during the 1950s, a “psychosocial” orientation to casework evolved, merging techniques from competing schools of thought (“diagnostic” versus “functional”).
Based in part on the writings of Heinz Hartmann, Melanie Klein, Paul Federn, and Anna Freud, therapists began to pay more attention to ego functions. More attention was also given to use of the client-therapist relationship in the present (as opposed to the recovery of repressed unconscious material) and to issues of separation, through the use of “termination” in therapy. (See the writings of Margaret Mahler, René Spitz, and John Bowlby.) In addition, foreshadowing the age of “managed health care,” caseworkers began examining techniques associated with brief therapy. Finally, Erik Erikson’s 1950 publication, Childhood and Society, brought increased interest by social workers in psychosocial development across the lifespan. In summary, the emphasis of the 1950s in social work was casework. Then came the 1960s!
ContentSelect
For more information on related social work topics, use the following search terms: The New Deal; Federal Art Project; Franklin D. Roosevelt; Federal Writers Project; Federal Emergency Relief Administration; Fair Labor Standards Act; Civil Works Administration; Wagner-Steagall Housing Act; Civilian Conservation Corps; Mary Richmond; Social Security Act of 1935; Sigmund Freud; National Labor Relations Board; Eleanor Roosevelt; Works Progress Administration; Harry Hopkins; National Youth Administration; Frances Perkins; Federal Theater Project; Mary McLeod Bethune; Red Scare.
Suburbanization in the United States after 1945
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard, and set in a neighborhood outside the urban core came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grass-roots conservatism, to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross-section of America itself. This encompassing shift marked two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies, exclusionary practices, and reinforced by grassroots political movements. By the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation captured the continued complexity of America’s suburbs.
1945–1970: Era of Mass-Suburbanization
Postwar Real Estate Development
Postwar suburbia was built upon a prewar metropolitan landscape characterized by “segregated diversity,” a heterogeneous mix of landscapes, functions, and populations that emerged in the late 19th century. Prewar commuter suburbs with lush landscaping and large houses abutted farms and orchards, modest streetcar suburbs, and Main Street shopping districts. Elsewhere, smokestacks broke the rural skyline alongside worker housing. As geographers Richard Harris and Robert Lewis conclude, “Prewar suburbs were as socially diverse as the cities that they surrounded.” 1 Ironically, this heterogeneous landscape, and especially the open spaces lying between and beyond it, was the setting for a massive wave of postwar suburbanization that was characterized by similarity and standardization. 2
This history originated in the chaotic transition to peacetime society after 1945. World War II migrations, military deployment, and demobilization compounded a housing shortage that dated back to the Depression. In 1945, experts estimated a shortage of 5 million homes nationwide. Veterans returned to “no vacancy” signs and high rents. As late as 1947, one-third were still living doubled up with relatives, friends, and strangers. American family life was on hold. 3
The solution to this crisis emerged from a partnership between government and private enterprise that exemplified the mixed Keynesian political economy of the postwar era. The Federal government provided a critical stimulus to suburbanization through policies that revolutionized home building and lending, subsidized home ownership, and built critical suburban infrastructure, such as the new interstate highway system. 4 Private enterprise, for its part, applied new mass production techniques and technologies tested during the war to ramp up home building. Key to this partnership was a New Deal–era agency, the Federal Housing Administration (FHA). At the heart of FHA policy was a mortgage insurance program that took the risk out of home lending and made the long-term (25–30 years), low-interest home mortgage the national standard. The FHA also granted low-interest construction loans to builders and established basic construction guidelines that set new nationwide building standards. Along with a companion program in the Veterans’ Administration (VA) created by the 1944 GI Bill, the FHA stimulated a flood of new construction that brought the price of home ownership within the reach of millions of families. “Quite simply,” concludes historian Kenneth Jackson, “it often became cheaper to buy than to rent.” 5 Jackson also notes that these programs had a pro-suburban bias. FHA and VA requirements for standard setbacks, building materials, lot sizes, and other features ruled out loans to large sections of urban America while giving preference to new homes on the suburban fringe. By the 1950s, as many as one-third of home buyers in the United States received support from the FHA and VA programs, and home ownership rates rose from four in ten U.S. households in 1940 to more than six in ten by the 1960s. The vast majority of these new homes were in the suburbs. 6
Equally important to the postwar boom was a revolution in construction. In response to pent-up demand and new federal supports, a cohort of builder-developers modernized home building to achieve mass production. The new builders were young, bold, and creative; many were the children of immigrants. Using techniques pioneered by prewar builders, such as Fritz Burns of Los Angeles, and refined through work on large war construction projects, contractors streamlined home building, employing standardized parts and floor-plans, subassembly of doors and windows, and subdivision of labor to minimize the need for skilled or unionized workers. 7 The scale of building took off (see Figure 1). Whereas “large builders” in prewar America might have built 25 homes per year, by the late 1940s, large firms were building several hundred homes per year. Annual housing starts leaped upward from 142,000 in 1944 to an average of 1.5 million per year in the 1950s. 8
Figure 1. Pioneer mass-builder Fritz Burns developed Westchester in the late 1930s, devising many of the mass-production techniques adopted by postwar suburban builders throughout the United States.
Emblematic of the new builders was William J. Levitt, who joined his immigrant father in the construction business in the 1930s. After experimenting with mass production during WWII, in the late 1940s the Levitts built what would become the most famous housing development of the age, the 17,000-home Levittown on Long Island, New York. By the mid-1950s, Levitt was the nation’s largest builder, with an annual production of more than 2,000 houses. 9 While large-scale builders such as Levitt attracted the lion’s share of media attention, more typical were smaller-volume and custom builders who constructed fewer than 250 homes per year, but they too turned out homes uniform in appearance and amenities, reflecting the broad standardization of the industry and the landscapes it was producing. 10
The typical postwar home of the late 1940s was the “minimum house,” a reference to the FHA’s minimum building standards. They were small, often cramped for families in the midst of a baby boom, but they were considered entirely modern with their up-to-date appliances, mechanical systems, and utilities (with costs for everything neatly rolled into a 25-year mortgage). The average home in 1950 was 983 square feet (down from 1,140 in 1940). It had 5 to 6 rooms—typically two bedrooms, one bathroom, a living room and kitchen on a single floor. 11 The size and simple construction of these homes encouraged owners to remodel as their families changed. At Levittown, Long Island, the “Cape Cod” house model included a half-story “expansion attic” upstairs. By the mid-1950s and 1960s, as consumers demanded more space, builders increased home sizes, introducing open floor plans and new designs such as split-levels and expansive, horizontal ranch homes for buyers at different price points. By the mid-1950s, mass suburbs that had started out with a mix of incomes were sorting out into neighborhoods and communities that were increasingly homogenous in terms of class. 12
The media hailed developers like Levitt as “community builders” because they not only subdivided land and built houses but created whole communities from scratch. Despite these accolades, historian Dolores Hayden points out that the drive for profit pushed community planning to the back burner in much of postwar suburbia. Developers often set aside space for civic facilities, but local taxpayers were responsible for the cost of parks, playgrounds, libraries, and other public amenities. Levittown, Long Island, for example, was built without public sewers or even adequate septic tanks, forcing homeowner/taxpayers to make expensive upgrades after the Levitts moved on. Smaller builders were often even more frugal. Thus, for many new suburbanites, the companion to low housing prices was a high tax bill. The ongoing struggle of many suburbanites to build a sense of community in the unfinished civic landscape of mass suburbia was a legacy of this era.
New residential suburbs represented just one element of the postwar suburban trend. By the early 1950s, commercial developers, corporate headquarters, big retailers, and other businesses were also migrating to the suburban fringe, setting the stage for a wholesale reorganization of metropolitan economies by the end of the century. Aided by federal tax policies such as accelerated depreciation that subsidized new buildings over the maintenance of existing ones, retailers like Macy’s and Allied Stores opened new suburban branches to capture consumer dollars that traditionally flowed to their downtown stores. Architect-developers like Victor Gruen—who designed many early suburban shopping centers, including the nation’s first indoor shopping mall, the Southdale Center in suburban Minneapolis, in 1956—evangelized the new shopping center as a modern civic center, a privately-built “public” space that would replace the traditional downtown. 13 Corporate headquarters and other offices also began a slow shift to suburban locations. Attracted by the prestige value of elite suburbia, Fortune 500 companies such as General Foods, Reader’s Digest and Connecticut General Life Insurance built landscaped campus “estates” in suburbs such as Westchester County, New York, and Bloomfield, Connecticut, during the 1950s, signaling a trend that peaked in the 1980s. 14 In the Washington, D.C., area, government agencies also shifted to suburbia, led by the Central Intelligence Agency, which broke ground on its new campus headquarters in Langley, Virginia, in 1957. 15
The City-Suburb Divide: Urban–Suburban Inequality in the Postwar Era
At the metropolitan scale, the suburban shift in population and investment shaped divergent futures for U.S. cities and suburbs—a drain of people and resources from cities to their suburbs that the columnist William Laas labeled simply, “suburbitis.” 16 Older industrial centers, especially, faced serious challenges. The relocation of factories to suburbs and other lower cost locations sapped resources that had sustained city neighborhoods since the 19th century. Urban job losses stoked unemployment and poverty. Declining tax revenues forced cutbacks on infrastructure, schools, and other services, which reinforced the cycle of suburbanization. By the 1960s, commentators pointed to a full-fledged “urban crisis.” Meanwhile, the suburbs boomed. 17
The metropolitan political structure of the United States played a hand in this divergence. As independent political entities, urban and suburban municipalities competed for business, people, and tax dollars. Cities and suburbs both used municipal powers such as land use and tax policy to maximize economic advantages within the town limits, but in the postwar period, suburbs held a clear advantage. In California’s East Bay region, for instance, the historian Robert Self shows that suburban civic leaders shaped zoning policies, infrastructure spending, and tax rates to capture flows of people and capital. They attracted new factories and other investment and bolstered services for local residents, while excluding unwanted groups such as blue-collar workers, African Americans, and other people of color. By contrast, the city of Oakland faced waves of capital flight, job losses, and growing tax and service burdens for a population that included rising numbers of African Americans and Latinos, who were prevented from moving by racial barriers in the suburban housing market. 18 The fortunes of cities and suburbs remained linked throughout the postwar decades, but the balance of prestige and power within metropolitan regions had clearly shifted.
Race, Ethnicity, and Exclusion
Mass suburbanization had equally dramatic consequences for race in postwar America. Suburbia beckoned with opportunity for millions of whites, but it remained rigidly segregated and broadly exclusive throughout the postwar decades. Mass suburbs supported ethnic and racial assimilation, where Italians, Poles, Greeks, Jews, and other European-Americans found a common social ground that solidified their identity as “whites.” 19 The beneficiaries of racially structured federal policies, millions of “not yet white ethnics” (as historian Thomas Sugrue described them) attained symbols of white, middle-class status, such as college educations, pensions, small businesses, and homes of their own. 20 Mass suburbia tied these benefits together in a coherent spatial package, providing a setting for common experiences, aspirations, and interests. And because these communities were predicated on the principle of racial exclusion, the new suburbs reinforced solidarities of race while downplaying the significance of ethnic, religious, and occupational differences. Further reinforcing this merger of race and suburbia were the ever-present images in the national media of happy, white families celebrating the postwar suburban dream.
At the same time, African American, Asian American, and Latino families battled for access to the suburbs, challenging not only the presumed whiteness of suburbia but the ideology of white supremacy implicit in postwar suburban ideology. In response, white suburbanites in concert with other crucial players—including government—created a web of discrimination that secured links between race, social advantage, and metropolitan space. Mechanisms of segregation included collusion by real estate brokers, homebuilders and lenders, discriminatory federal housing guidelines, local neighborhood associations, municipal land use controls, and the threat of violence. FHA underwriting guidelines, for example, explicitly required racial segregation until the early 1950s. In most cases, that spelled exclusion from a program that did so much to lift millions of whites into the middle class. By 1960, African Americans and other people of color had received just 2% of FHA-insured mortgages. 21 Added to the barriers of institutional racism, recent historical studies suggest that acts of violence and intimidation against nonwhite neighbors—including arson, bombings, death threats, and mob assaults—numbered in the hundreds during the decades after World War II. In the Chicago suburb of Cicero, for instance, rumors that a black family had rented a local apartment in 1951 provoked a mob to ransack the building. This bleak side of postwar urban history led historian Arnold Hirsch to refer to the 1940s and 1950s as “an era of hidden violence.” 22 African Americans were targeted in most of these attacks, but discrimination also affected Asian Americans and Latinos, albeit in less predictable, more capricious ways. In one well-publicized instance, a Chinese American couple, Grace and Sing Sheng, responded to the objections of white neighbors, who opposed their purchase of a house in suburban San Francisco in 1952, by suggesting a vote.
Opponents prevailed 174–28 in the informal canvass, and the disillusioned Shengs decided to move elsewhere. 23
Despite such obstacles, growing numbers of minority families found footholds in postwar suburbia. Between 1940 and 1960, the number of African American suburbanites increased by 1 million, amounting to 2.5 million by 1960—approximately 5% of the total suburban population. Regional variations typified this movement. In the South, where African Americans had lived on the metropolitan fringes for decades, developers built more than 200,000 new homes and apartments by 1960. In many cities, explicit planning for “Negro expansion areas” preserved access to places where blacks could move without upsetting segregation or provoking violence. Developments such as Collier Heights in west Atlanta, Washington Shores near Orlando, and Hamilton Park in north Dallas created suburban-style footholds for a growing black middle class. 24 Outside the South the proliferation of new suburban municipalities, each with control over local land use, limited construction for minority families. As a result, African Americans and other nonwhites struggled to find housing in existing city and suburban neighborhoods. Older black communities in suburbs such as Evanston, Illinois, Pasadena, California, and Mount Vernon, New York, welcomed new residents. In every region, most of these new suburbanites settled near existing minority communities, with the result that racial segregation expanded in metropolitan America even as court decisions and mass mobilization for civil rights upset the legal structures of Jim Crow. Movement into white neighborhoods was fiercely contested. And communities of color—in suburbs and cities alike—faced ongoing pressures such as school segregation, poor services, redlining, lax zoning enforcement, and reckless “slum clearance” that forced residents to organize politically as racial communities, sharpening the connection between race and place of residence.
For whites and nonwhites alike, race emerged as part of the physical structure of the metropolis, reinforced by the separate and unequal spaces that they occupied. 25
Social Life of Postwar Suburbanites
The social history of postwar suburbia remains fairly understudied by historians. It is defined as much by sociologists and journalists who observed suburbanites at the time as by historians who have produced case studies of individual suburbs. Such studies offer useful starting points, especially the detailed accounts of social life that focused on the iconic mass-produced suburbs of the Levittowns and Park Forest, Illinois. Although they were hardly typical postwar subdivisions, they attracted the lion’s share of scrutiny. 26
Demographics established an important basic context. Right after World War II, new suburbs attracted a remarkably homogenous population, composed of relatively young, white married couples with kids. Heterosexual families with distinct roles for men and women were the accepted norm. In 1953, just 9% of suburban women worked outside the home, compared to 27% nationally. 27 In the era’s best known development, Levittown, New York, residents were all white, ranged from 25 to 35 years old, were married less than seven years, and had an average of three children. The husband was employed and the wife was a homemaker. 28 There was notable religious and ethnic diversity, however, with a mix of Protestants, Catholics, and Jews, while approximately 15% were foreign born. 29 Suburbs like Levittown attracted both white- and blue-collar workers, who together, through their capacity to buy homes and the consumer goods of suburban life, defined the expanding American middle class. 30 Approximately 41% of men and 38% of women had some college education; many of the men were veterans. 31 Over time, however, suburbanites of different incomes sorted into different communities, creating greater socioeconomic homogeneity within suburbs, but growing class stratification and inequality across suburbia as a whole. As historian Lizabeth Cohen points out, the rising trend toward “hierarchy and exclusion” meant that residents by the 1960s “participated in more homogeneous, stratified communities [and] the contact they did have with neighbors connected them to less diverse publics.” 32
The mass-produced suburb became the subject of intense public scrutiny, representing a proliferating built form that appeared to present a new, untested social canvas. Observers wondered: was this setting producing new patterns of life and behavior? Under this microscope, certain salient themes emerged about suburban social life. For one, postwar suburbanites were active participants in their neighborhoods. A number of accounts documented this pattern, but perhaps the most influential portrait came from Fortune editor William H. Whyte in his 1956 bestseller The Organization Man. The final third of that book provided a detailed portrait of Park Forest, Illinois, which he dubbed the “dormitory” of the organization man and characterized as a “hotbed of Participation” with a capital “P.” 33 Whyte was one among many journalists from mass-circulation magazines like Harper’s, Fortune, and Look who delved into social experience on the new suburban frontier.
Whyte found neighbors who were closely connected, and immersed in a culture of borrowing and lending, participation in local clubs and civic groups, and social intimacy. Neighbors were not merely acquainted. They bonded on multiple levels—in the minutiae of the everyday demands of raising children and running homes, in mutual concerns about local civic issues, and even in intellectual and spiritual life. 34 Suburbia seemed to encourage a habit of joining. Similarly, studies of the Levittowns reveal that early residents relied upon one another, especially the many isolated, carless housewives. Neighbors gave each other rides, formed babysitter co-ops, gathered regularly for television viewing parties, and created a nurturing social environment. 35 Even yard work became a socializing experience, as neighbors shared tips with one another. 36 Without fences to separate neighbors, children freely circulated from one backyard to the next, and acted as a kind of social “glue” for their parents. 37 As a child in Levittown, New York, Martha Mordin later recalled, “Living here was like being in an extended family. There were lots of mothers. If you couldn’t talk to your own mother, you could talk to someone else’s mother.” 38 One Levittowner concluded simply, “Had we stayed in the city, I never would have joined anything.” 39 Even sociologist Herbert Gans, whose participant-observer study of Levittown, PA, was meant to challenge an emerging suburban critique of hyperactive socializing and conformity, conceded that suburbanites engaged in high levels of community participation. 40
The centrality of women was another feature of postwar suburban social history. In these dormitory suburbs, husbands typically commuted to work during the day, leaving their wives at home to dominate daily life in the community. The postwar return to domesticity was driven by powerful media imagery and platitudes from national leaders that valorized the housewife, infusing her role as household consumer and manager with patriotic overtones in the context of the Cold War. 41 Popular magazines schooled suburban women in the ways of scientific housekeeping, pushing products and the latest techniques for cleaning, entertaining, and child rearing. Yet despite this image of quiescent domesticity, women were active community participants in suburbia—as the “telephoners, organizers, and arrangers of community life.” 42 They built social networks, joined clubs, and engaged in politics. Although suburban men tended to dominate positions of local leadership, women did much of the everyday work to keep social and civic life vibrant. 43
“Culture Wars” over the Postwar Suburbs
The spread of mass suburbs touched off a virtual “culture war” in America between suburbia’s boosters and its critics. This debate pulled in a range of participants, from advertisers, real-estate developers, and politicians to journalists, academics, and filmmakers. In the course of debating the relative merits of mass suburbia, each side put forth vivid—if often distorted—images of suburban life, swinging wildly between the extremes of utopia and dystopia. The suburban portrayals and images they generated had a deep and lasting impact on the ways that many Americans came to view the suburbs, even up to our own day.
On one side were the boosters—business interests and politicians with a stake in selling suburban homes and the consumer goods to fill them. To them, the suburbs represented the fulfillment of the postwar “American dream”—a warm, happy place filled with healthy families and friendly neighbors, living cozy lives in homes brimming with the latest products and appliances. Magazines, television commercials, and real-estate developers peddled this image tirelessly, depicting contented white families thriving in suburbia. A logical synergy infused their efforts. Real-estate interests plugged the homes themselves, while shelter magazines ran articles on suburban living alongside vivid advertisements for refrigerators, range-tops, television sets, cleaning products, and other household goods. These ads invariably depicted happy homemakers set against a backdrop of gleaming, modern suburban interiors. The picture came full circle on television sitcoms like Leave It to Beaver and Father Knows Best, which offered benign, family-centered stories of generational quarrels and reconciliation, all sponsored by advertisers eager to tap into the lucrative suburban market. The result was a reinforcing web of suburban salesmanship. 44 Political leaders, too, celebrated suburban living, linking suburban consumption to the health of the republic itself. And they elevated the suburban home to a gleaming symbol of American superiority during the Cold War. In the so-called “Kitchen Debate” of 1959, which took place at an exposition of U.S. products in Moscow, Vice President Richard Nixon sparred with Soviet Premier Nikita Khrushchev over the merits of capitalism versus communism while standing in a six-room, $14,000 ranch house assembled by a Long Island subdivider and furnished by Macy’s. Nixon used this slice of everyday suburban life as the ultimate Cold War propaganda weapon. 45
On the other side were the critics, who believed suburbia was inflicting profound damage on the American character. Academics, novelists, filmmakers, and designer-planners, among others, blamed mass suburbia for some of the most disturbing social trends of the era. Homogeneous suburban landscapes, they believed, spawned homogeneous people, who followed the dictates of blind conformity. Bland, monotonous, isolating landscapes oppressed women and pushed bored kids toward juvenile delinquency. Female-centric suburban life distorted gender relations and left men emasculated. And the list went on. Not only did suburbia trivialize life, as Lewis Mumford wrote, but it fostered “the temptation to retreat from unpleasant realities, to shirk public duties, and to find the whole meaning of life in the most elemental social group, the family, or even in the still more isolated and self-centered individual. What was properly a beginning was treated as an end.” 46 The final result was a devastating turn away from civic obligation. Even suburban family life was lambasted, portrayed as the polar opposite of the carefree innocence depicted on popular television sitcoms. Novelists and filmmakers in particular depicted all manner of suburban domestic dysfunction: alcoholism, adultery, inept parenting, wounding anxieties, deeply troubled marriages, and fraught sexuality, all concealed beneath a smiling public face. These themes animated such classic postwar films as Mildred Pierce (1945), Rebel Without a Cause (1955), The Man in the Gray Flannel Suit (1956), and The Graduate (1967), as well as the fiction of Richard Yates, John Cheever, and John Updike. 47
Several social scientists in the 1960s set out to challenge what they called “the myth of suburbia” created by this polemical discourse. Sociologists like Bennett Berger, Herbert Gans, and William Dobriner found that moving to suburbia did not actually change people in the ways both critics and boosters suggested. Instead, suburbanites continued to make life choices based upon such factors as class, ethnicity, religion, and personal preference. 48 Their scholarship challenged the notion that environments shape human behavior (environmental determinism). 49 Yet for the most part, their voices were drowned out by the shriller, more lurid suburban depictions filling the bookstores, airwaves, and movie houses.
Scholars have examined the political culture of postwar suburbia since the mass suburban boom began, tracing critical trends that have shaped U.S. politics at large, including ideals of localism, meritocratic individualism, homeowner entitlement, and aversion to general taxation. In the 1950s, political scientist Robert Wood explored the fragmented municipal landscape of suburbia and the localism that had characterized its political culture since the 19th century. This ideal of localism manifested in campaigns around municipal incorporation and zoning controls, annexation, taxation, school policy, and local services from potholes to public swimming pools. Other political identities blossomed at the local level. Many suburbanites derived their core political identity—as white, middle-class, taxpaying homeowners—within the context of their suburban neighborhoods, often politically independent municipalities. Suburbanites made a direct connection between their role as taxpayers and their right to a particular quality of life, delivered through services like good schools and safe streets. They developed a sense of entitlement to these advantages, which they perceived as the just rewards of their individual efforts to achieve home ownership. Historians, in turn, have exposed the limitations of this thinking by underscoring the broad web of governmental policies that subsidized and privileged white suburban homeowners. 50 Nevertheless, the notion of homeowner entitlement based on meritocratic individualism remained a core element of postwar suburban political culture, one that transcended party lines and mobilized suburbanites against an array of perceived threats, ranging from communists to free-spending liberals, the urban poor, excluded minorities, and inefficient government. 51 Their local efforts in the postwar years came to influence national politics and the political parties.
The rich scholarship on suburban politics produced by historians in recent years challenges an earlier image of suburban civic banality painted by some postwar critics, and highlights the national importance of suburban politics.
Recent scholarship has documented the crucial role suburbia played in the rise of postwar conservatism, while newer studies have linked suburbs to centrist and liberal politics. Two important works on suburban conservatism centered on California. Historian Lisa McGirr’s study of Orange County demonstrated how this prosperous region—characterized by a high-tech defense industry, an all-white, well-educated populace, and Christian evangelicalism—was a potent breeding ground for modern-day conservatism. In the late 1950s and 1960s, these “suburban warriors” coalesced into a remarkable grassroots movement that attacked communism locally and globally, opposed big government, and supported the protection of property rights and Christian morality. Michelle Nickerson’s study of Los Angeles documented a similar movement of suburban women who rallied against communism and racial integration, particularly in public education. The infrastructure they created—study groups, newsletters, bookstores, and clubs—represented a crucial formative aspect of a maturing Republican Party. Both studies argued that localized suburban politics in this era deeply shaped conservatism at the national scale. 52 Another cluster of studies highlighted racial politics in postwar suburbia, and the dogged efforts of suburbanites to resist federal civil rights mandates to integrate neighborhoods and public schools. Evoking the language of “white rights,” colorblind meritocratic individualism, and homeowner entitlement, suburbanites across the nation—and across the partisan spectrum—resisted school desegregation, court-ordered busing, open housing laws, and public housing, in battles that began locally but ultimately influenced federal policy and the national parties.
As Matthew Lassiter and others have shown, this bipartisan suburban movement elevated the issues of metropolitan politics onto the national stage by asserting the interests of suburban taxpayers, etching deeply entrenched patterns of inequality across metropolitan areas. The Republican Party was the first to connect with this voting bloc at the national level, using it to win electoral majorities in seven of the ten presidential elections from 1968 to 2004, but Democrats likewise supported suburban political mandates during the postwar years. 53
Recent scholarship has also explored the presence of liberal and progressive politics in postwar suburbs, through grassroots movements for affordable housing, pacifism, and desegregation in housing and schools. Sylvie Murray’s study of families in the suburban fringe of eastern Queens, New York, for instance, shows that women played critical roles in liberal political causes. Residents like a young Betty Friedan mobilized to enhance their quality of life, which included a vision of integrated schools and multicultural neighbors, and they sought the active hand of government to achieve these ends. Lily Geismer’s study of Boston reveals the strength, and also the limits, of liberal activism in suburbia. White liberals in suburbs like Brookline and Newton actively supported a racially open housing market, but they rarely challenged the high economic hurdles that ensured that most people of color could not afford these neighborhoods. 54
Local politics in the first two Levittowns exemplified these and other themes. In the 1950s, conflicts over the Levittown, NY, schools broke out between advocates of progressive versus traditional education, refracting larger differences between liberals and conservatives. 55 More dramatic was the race riot that erupted in 1957 when the first black family moved into Levittown, PA. Their arrival was facilitated by a small, dedicated group of local activists, mainly leftists and Quakers, who were committed to civil rights. When William and Daisy Myers stepped forward to become black pioneers in Levittown, they sparked massive grassroots resistance. For several weeks, hundreds of white residents gathered in the evenings, hurling rocks and yelling epithets (see Figure 2). They burned a cross and sprayed “KKK” on the property of Myers supporters. Yet other white residents stepped up to support the Myers family. As the conflict turned Levittown into a “civil rights battleground,” the event illustrated the presence of both pro- and anti-integrationists in the suburb. Eventually, emotions simmered down. Yet as Tom Sugrue shows, the end result signaled the limits of racial liberalism in places like Levittown, where blacks remained a minuscule percentage of the population for decades, as well as in other metro areas where sharp class and race inequality persisted. 56
Figure 2. A crowd of approximately 400 people lines the street a block from the Myers’ home after state police had moved them back. The crowd is protesting the first black family moving into Levittown, 1957.
Suburbanites played key roles in other political movements as well. Levittowners engaged in environmental activism by the late 1960s, part of a broader push among suburbanites nationally. 57 As ironic as this may have appeared to those who saw Levittown as the epitome of denuded nature, many Levittowners in fact had a discernible connection to their natural environment, fostered both by the developers and by their own labors in yards and gardens. When Levittown began feeling the effects of local factory pollutants and encroaching overdevelopment, residents united in action. This new politics coalesced around Earth Day in 1970, then fanned out into a range of grassroots activities. Residents gathered for garbage clean-up days, opposed a proposed nuclear power plant, held environmental teach-ins, circulated petitions, and picketed the nearby U.S. Steel plant over industrial pollutants. 58 Suburban politics in the postwar years thus encompassed a range of political players who embraced such wide-ranging impulses as environmentalism, racial liberalism, and feminism.
1970–Present: Growth and Diversification
Land Development and Real Estate
Changing economic conditions in the United States after 1970 reshaped suburbia, as they did much of American life. The decline of manufacturing across the industrial heartland, the rise of service employment, high-tech growth, deregulation, and globalization altered the context for metropolitan life, resulting in regional shifts, growing economic volatility, polarization of wealth and income, and unstable futures for cities and suburbs alike.
New trends in the housing market set the tone for events that affected American suburbs through the following decades. New patterns of real-estate investment in the 1960s and 1970s, for example, foreshadowed the coming merger of real-estate development and global financial markets that triggered recurrent housing volatility, including the global financial crisis of 2008. After 1960, new real estate investment trusts (REITs), investment banks, major pension and insurance funds, and Fortune 500 corporations poured capital into metropolitan real-estate development. 59 Financial deregulation at the state and federal levels opened up additional sources of investment for real-estate lending, and government-sponsored financial entities, such as FNMA (colloquially known as “Fannie Mae”), facilitated the flow of money to real estate through the sale of mortgage-backed securities to investors worldwide. By 2015, mortgage-backed securities were valued at almost $9 trillion, representing almost two-thirds of funds invested in the U.S. mortgage market. 60
Flush with new capital, real-estate firms dramatically increased the scale and scope of development after 1960. In places like California, development schemes reached gargantuan proportions that made even postwar developments like Levittown look small by comparison. Miles of pristine coastline and 10,000-acre cattle ranches succumbed to the bulldozer, giving birth to future suburban cities such as Irvine, Thousand Oaks, Temecula, and Mission Viejo, and laying the template for the pervasive sprawl that characterized modern suburban life (see Figure 3). The period saw the rise of the first truly national development firms, corporate real-estate enterprises such as Ryan and Pulte Homes, Kaufman and Broad, and Levitt, which had operations in multiple U.S.—and even international—markets. By the early 21st century, firms such as these and their successors were building tens of thousands of units per year, replicating standardized architecture and community planning across the United States. At the peak of the housing bubble in 2005, the five largest builders each closed on more than 30,000 houses for the year. 61 Their robust activity symbolized the dramatic expansion of suburban areas and the rise of the suburban population in the United States, with a majority of Americans living in suburbia by 2010 (see Table 1).
Table 1. The Growth of Metropolitan and Suburban Areas in the United States, 1940–2010
Metropolitan Area Population (includes city and suburbs)
a. Metropolitan areas for 1970 and 2010 defined according to Census definitions for 1970 and 2010 (urbanized counties adjacent to a central city or cities of 50,000 or more). See U.S. Bureau of the Census, Census of Population and Housing: 2000, vol. I, Summary of Population and Housing Characteristics, pt. 1 (Washington: Government Printing Office, 2002), Appendix A-1; Office of Management and Budget, “2010 Standards for Delineating Metropolitan and Micropolitan Statistical Areas; Notice,” Federal Register 75, no. 123 (June 28, 2010), 37249–37252.
Sources: U.S. Census Bureau, 2010 Census, Summary File 1, American FactFinder; U.S. Bureau of the Census, Census of Population: 1970, vol. I, Characteristics of the Population, pt. 1, U.S. Summary, section 1 (Washington: Government Printing Office, 1973), 258.
Figure 3. Irvine, California, exemplified the massive scale of suburban developments after 1960, as well as their multiple functions—including suburban housing, offices, retail, and industry. Irvine housed more than 60,000 people by 1980, and 212,000 by 2010.
The scale and scope of real-estate investment and development ushered in new levels of volatility in U.S. real-estate markets after 1970, volatility that had been subdued during the long postwar boom. The early 1970s witnessed the first in a series of modern boom-and-bust cycles that rocked the housing market into the 21st century. Builders set the all-time record for U.S. housing starts—2.36 million in 1972—and posted the largest three-year total in U.S. history between 1971 and 1973. The bust, when it came in 1974, sparked widespread bankruptcies and job losses, helping to drag the U.S. economy into recession. With frightening regularity, housing market crises returned in the early 1980s, the early 1990s, and the late 2000s, creating a nauseating series of economic rollercoaster rides for homeowners and renters alike. Across these same decades, the United States witnessed a generalized rise in housing prices. Driven by growing suburban land-use restrictions, a shortage of buildable land in many metro areas, and increased supplies of mortgage capital, the long upward price spiral was a boon to property owners who hung on for the long term, but it produced a burden of rising property taxes and a growing crisis in affordability. Aspiring homeowners faced outsized home prices, pushing them to work more hours, drive greater distances, and take on heavier loads of debt to purchase a suburban home. By 2013, nearly 40 million American households were paying more than 30% of their income for housing, and as many as 10% of homeowners were paying upwards of 50%. 62 In booming suburban markets, such as Orange County, California; Long Island and Westchester County, New York; and Fairfax County, Virginia, children were priced out of the suburban areas where they grew up.
Public and private efforts to expand home ownership in the 1990s and 2000s met this reality with increasingly risky credit arrangements—interest-only and stated income loans, and ballooning adjustable-rate mortgages that were almost unheard of in postwar suburbia. These arrangements were at the center of the housing meltdown and global economic crisis in 2008–2010 . For households left out of the market—blue-collar families, singles, young couples, many people of color—the spiral in housing prices was an obstacle to building wealth and a continuing source of economic disparity in America’s post-Civil Rights era.
After 1970, the economic ascendancy of suburbia that had been building since 1945 reached maturity. In a landmark 1976 study, geographer Peter Muller explored the rise of the “Outer City,” his term for the hubs of retail activity, office parks, super-regional shopping malls, and gleaming business headquarters that clustered along the nation’s metropolitan highway interchanges. Muller concluded that “suburbia” was now the “essence of the contemporary American city,” no longer “sub” to the “urb.” 63 By the early 1970s, suburban employment outnumbered city jobs for the first time, and suburban “edge cities”—such as the Washington, D.C., Beltway; Schaumburg, Illinois; Boston’s Route 128 corridor; Seattle’s high-tech suburbs such as Redmond and Bellevue; and California’s Silicon Valley—played a central role in the nation’s economy. By the 1990s, fewer than one in five metropolitan-area jobs were located within three miles of the old central business district, whereas almost half were located ten or more miles from that center. 64 While many central cities rebounded in the 2000s with new high-tech and innovation clusters of their own, they now represented just one part of the complex, poly-nucleated metropolitan economies that Muller explored in the 1970s. Analysts increasingly recognized these metropolitan economies as the drivers of the nation’s economy, competing independently in the global marketplace against other metro areas worldwide. By 2010, this metropolitan ascendance was evident: the 100 largest metro areas in the United States were responsible for three-quarters of the nation’s gross domestic product (GDP). 65
After 1970, these trends played out across a pervasive physical pattern of suburban sprawl. Between 1982 and 2012, metropolitan regions ballooned in area, with real-estate development consuming 43 million acres of rural land, an area larger than Washington State. By 2002, according to one estimate, the United States was losing two acres of farmland per minute to suburban development. 66 Low-density, auto-dependent development characterized fast-growing regions such as Atlanta and Los Angeles, where residents endured multi-hour commutes from new “drive ‘til you qualify” subdivisions on the rural fringe. Even in comparatively slow-growing metro areas such as Pittsburgh and Detroit, rates of suburban sprawl outpaced population growth. 67 By the early 21st century, Americans were driving more miles, spending more time in the car, and using more energy than ever before. Per-capita automobile miles driven tripled between 1960 and 2003, resulting in gridlock, taxi-parenting, and hectic patterns of family life on the road that became common features of modern suburbia. 68 The fiscal costs of sprawl were equally significant. As U.S. suburban areas spent billions on new infrastructure, existing urban systems deteriorated, deepening disparities between growing suburbs and declining urban and inner-suburban places. Even on the fringe, however, low-density sprawl frequently failed to pay for itself. As Myron Orfield shows, fast-growing “exurban” areas were among the most fiscally stressed metropolitan communities, with major imbalances between tax revenue and needs for services. 69 To make ends meet, thousands of suburban governments employed “fiscal zoning,” restricting apartments, raising minimum lot sizes, and setting aside large areas for big-box retailers to boost tax revenues. As many Americans looked toward more sustainable ways of living, the sprawling landscapes of modern suburbia represented a major policy challenge. 70
In contrast to the era of postwar “sitcom suburbs,” recent decades witnessed the construction of more varied types of suburban housing. In response to rising home prices, land shortages, and the growing crisis in affordability, corporate real-estate firms built millions of new suburban condominiums, attached homes, and apartments after 1970. Many of these were built as part of common interest developments (CIDs): planned neighborhoods governed by homeowners’ associations and ruled according to strict covenants, conditions, and restrictions (CC&Rs). Home to fewer than 1% of the population in 1970, communities governed by private associations housed as many as one in five Americans by 2015, and homeowners’ associations dominated the market for new construction. In 2014, for example, 72% of new single-family homes were built as part of a homeowners’ association. 71 For builders, CIDs squeezed more units (and dollars) out of finite acreage. Suburban municipalities, for their part, welcomed CIDs—despite their higher densities and more affordable housing types—because private amenities such as pools, parks, and playgrounds reduced public expenditures. Finally, in a context of rising home prices, townhome and condo developments were among the few affordable housing options for many families. For first-time homebuyers, retirees, empty nesters, and families without children, condominiums and townhomes provided a flexibility lacking in postwar “sitcom suburbs.” At the same time, the rise of CIDs signaled a shift away from the suburban dream of a single-family home in communion with nature. While average house sizes grew (2,450 square feet in 2014), lot sizes shrank. 72
The proliferation of gated CIDs after 1990 raised additional debates about privacy, exclusivity, and social division across metro America. Research by anthropologist Setha Low suggested that rather than making residents safer, gated communities tended to intensify fears of crime and social distrust. 73 Such concerns were brought into deadly focus in 2012 by the shooting of Trayvon Martin, an unarmed black teenager killed by a neighborhood watch volunteer in the gated enclave near Orlando, Florida, where he was staying.
One of the most striking features of American suburbs since 1970 has been rapid social diversification, marking a return to suburbia’s historic diversity. 74 After 1970, a wide cross-section of Americans settled the suburbs, including singles, divorced adults, gays, lesbians, the elderly, the poor, and, perhaps most significantly, an array of ethnic and racial groups. The proportion of working women also rose substantially, shattering earlier images of suburban housewives trapped at home. As more Americans settled in the suburbs, suburbia looked increasingly like America itself.
A confluence of forces underlay these changes. Trends affecting suburban women and families included the aging of the baby boomers, the rise of feminism, and the economic slumps and soaring inflation of the 1970s that pushed many women into the labor force. The suburban influx of racial and ethnic groups was spurred by new waves of immigration from Asia and Latin America in the wake of the 1965 Hart-Celler Act, and by the passage of the federal Civil Rights Act of 1964 and Fair Housing Act of 1968, which improved minority job prospects and curbed housing discrimination. These factors created new demographic realities, while policy changes opened fast-growing suburban areas to groups that had once been fervently excluded.
One change involved the suburban family, earlier typified by a working husband, homemaker wife, and the requisite two or three children. By 1970, scholars noted an increase in divorced, separated, and single adults living in the suburbs, as well as an uptick in working women. One study of the suburbs of Nassau County, New York, found that these trends accelerated from 1960 to 1980. By 1980, two of every five adults lived in a non-nuclear family arrangement (single, separated, divorced, or widowed). Families were having fewer children, and more than half of married women with children aged six to seventeen worked outside the home. The author attributed the changes to an aging population, liberalized divorce laws, and the 1970s economic downturn, which “propelled more married women into the labor market.” 75 These trends continued over the next decades. By 2000, the suburbs contained more nonfamily households (29%)—mostly young singles and elderly people living alone—than married couples with children (27%). There were also substantial proportions of married couples with no kids under 18 (29%), and rising numbers of single parents, divorced adults, unmarried partners, and adult relatives living in suburban homes. 76 By 2010, 75% of suburban homes did not contain a married-couple family with kids, exploding the older image of the “Leave It to Beaver” domicile. 77 And while statistical data on the social geography of LGBTQ (lesbian, gay, bisexual, transgender, queer/questioning) people are elusive, a range of evidence suggests that gays and lesbians also migrated to suburbia. 78 One catalyst was a shift in federal housing and borrowing eligibility guidelines by the U.S. Department of Housing and Urban Development (HUD), which altered its definition of family away from heterosexual “marital or biological attachments” toward a more pluralist concept that would include “any stable family relationship,” including LGBTQ households. 79
The powerful image of heterosexual “married with children” families in suburbia was giving way to more complex family structures that mirrored national social change.
Ethnic and racial diversification was also significant. While African Americans, Latinos, and Asian Americans comprised just under 10% of the suburban population in 1970, by 2010 they represented 28%. Minorities have propelled the bulk of recent suburban population gains in the 100 largest metropolitan areas, as demographer William Frey has noted. 80 Even more striking are data on specific population groups. For example, from 1970 to 2010 the number of black suburbanites climbed from 3.5 million to nearly 15 million, comprising 39% of all African Americans. Even faster growth occurred among Latinos and Asians, who endured less severe housing discrimination than blacks. By 2010, 46% of Latinos and 48% of Asians nationally resided in the suburbs. And in the largest 100 metro areas, the proportions were even higher—62% of Asian Americans and 59% of Latinos. Immigrants comprised a significant portion of new suburbanites as well. By 2013, suburbia housed 50% of foreign-born residents in the United States, and numbers were even greater in the biggest metropolitan areas, where most immigrants lived. 81
Even the iconic postwar suburbs reflected these changes, though in different ways. In Park Forest, the site where William Whyte documented social conformity in his 1956 best-seller The Organization Man, liberal activists initiated a program of “managed integration” in the 1960s and 1970s to recruit African-American neighbors gradually. As a strategy designed to stave off white flight, the approach seemed to work at first: from 1970 to 1990, the proportion of blacks rose from just 2.3% to 24.4% of the local populace. From 2000 to 2010, however, racial resegregation accelerated: large numbers of whites left the suburb, and the proportion of African Americans rose from 39.4% to 59.8%. 82 Lakewood, a postwar mass-produced suburb in southern Los Angeles County, drew national attention in the 1950s when, with 17,500 homes, it became the largest development in the country, surpassing even Levittown. Lakewood eventually became a site of robust multiethnic diversity: by 2010, the population was 40.9% white, 30.1% Latino, 16% Asian, and 8.3% black, making it one of L.A.’s most racially balanced cities. 83 The irony was thick. The very suburbs once reviled for their monotonous landscapes—which supposedly churned out monotonous, conforming people—became the staging ground for racial and ethnic diversity. In some ways, this was no surprise: these communities, originally built to be affordable, maintained that quality once nonwhite buyers gained the economic wherewithal to become suburban homeowners and the barriers to racial segregation fell. In Levittown, New York, by contrast, whites maintained an overwhelming majority, comprising more than 80% of the population as late as 2010. 84 All three places illustrate trends in suburbia since 1970—growing diversity alongside persistent racial segregation.
Figure 4.1. Park Forest, Illinois, 2010 (total population: 21,975).
Figure 4.2. Lakewood, California, 2010 (total population: 80,048).
Figure 4.3. Levittown, New York, 2010 (total population: 51,881).
Another notable trend was the rise of class inequality across suburbia, with growth in both poor and rich suburbs. While poor people had long resided in the periphery, after 1970 a different set of pressures accelerated the trend. A crucial factor was economic restructuring, which created an “hourglass economy” characterized by high- and low-paying jobs, a shrinking middle class, and falling income levels for most Americans. The effects of these structural changes reverberated across suburban space. Many inner-ring suburbs contended with aging housing and infrastructure and the high service needs of an increasingly poor, immigrant populace. Deindustrialization in the 1970s and 1980s, in turn, devastated older industrial suburbs, which suffered a difficult combination of job loss, white flight, and environmental degradation in the wake of industry’s departure. 85 Suburban poverty accelerated after 2000, as Elizabeth Kneebone and Alan Berube have shown, driven by two economic recessions and the continued effects of restructuring and globalization. 86
In addition to these economic forces, public policy also played a role. Reversing years of policies that protected the rights of suburbs to exclude the poor, the federal government gradually promoted the dispersal of low-income families into suburban areas through policies such as the 1974 Section 8 voucher program, which granted a housing allowance to people with qualifying incomes who could then choose their own housing on the open market—thus untethering them from public housing projects concentrated in poor urban areas. Fair-share housing initiatives and modest inclusionary zoning and affordable housing programs also helped open the suburbs to poorer people. 87 As a result, the number of poor people in the suburbs climbed. In the 1980s and 1990s, poor populations increased in both cities and suburbs; in the 1990s and 2000s, however, the rate of increase in suburbs was twice that of cities. During the 2000s, moreover, for the first time more poor people lived in suburbs than in cities—signaling that metropolitan America had “crossed an economic Rubicon.” By 2010, 55% of the metropolitan poor lived in the suburbs, while one in three poor Americans overall lived in the suburbs, “making them home to the largest and fastest-growing poor population in the country.” 88
At the same time, affluent suburbs proliferated, especially around high-tech and financial hubs like Silicon Valley, California; Boston’s Route 128; and Fairfield County, Connecticut. Wealthy executives, tech workers, and professionals clustered in these upscale areas, their outsized salaries driving real estate prices to stratospheric levels. Despite the much-heralded return to the city by millennials and the “creative class” (i.e., workers in “knowledge intensive industries” such as computer science, medicine, the arts, and education), as late as 2014 the super-rich remained concentrated in tony suburbs of the bi-coastal economy, in places like Greenwich, Connecticut; Coral Gables, Florida; and Newport Beach, California. 89 The middle class, meanwhile, faced mixed prospects in suburbia. 90
In the 21st century, American suburbs have come to house a cross section of America itself, including the poor, the rich, and a broad array of racial and ethnic groups and family types. Inequality was reproduced across suburbia, while ethno-racial diversity set the stage for emerging forms of suburban lifeways and politics.
Suburban variations led to disparate social experiences, yielding a mosaic of suburban social histories after 1970. While it is impossible to offer a synthesis of this history due to wide variations among suburbanites themselves as well as the nascent nature of the scholarship, certain salient themes have emerged on social life and suburban ideals in the post-1970 era.
One group of scholars has emphasized a trend of social disconnection, especially among whites. After the intensive sociability of the 1950s and 1960s, suburbanites after 1970 appeared to swing to the other extreme—social alienation and detachment. This was apparent in ethnographer M.P. Baumgartner’s book The Moral Order of a Suburb. Conducting field work in a suburb of New York City in the late 1970s, Baumgartner was interested in exploring how people handled conflict in their town. What she found was a culture of tolerance and avoidance. The suburb lacked “social integration,” and instead was defined by a sense of indifference between neighbors. Avoidance as a strategy was thus logical: “It is easy to end a relationship that hardly exists.” 91 She attributed this lack of neighborhood connectedness to the privatism of families; the high mobility of homeowners, which made it hard for them to form lasting bonds; and the compartmentalizing of social life (at work, at church and synagogue, and at school). Other scholars extended this theme in exploring fear and privatism in suburbia, characterized in the extreme by the rise of privatized, gated neighborhoods. 92 By 2000, observers from political scientist Robert Putnam in his landmark book Bowling Alone to proponents of the New Urbanism agreed that suburbs fostered social and civic disconnection. 93 It was no coincidence that this change occurred at the moment many suburbs were diversifying. One study of Pasadena, California, over this time period found that racial integration in the 1970s had variable effects on community engagement among suburbanites: pushing some whites into their own insular social communities, reorienting the nature and purpose of local clubs and organizations as they saw their numbers decline, and creating some pockets of multiracial social vibrancy. 94 This comports roughly with the findings of some political and social scientists, who observed decreased levels of “social capital” in communities experiencing ethno-racial diversification. 95
At the same time that some suburbanites were retreating, others created new cultures and lifeways in the suburbs. Scholarship on ethnic suburbia, in particular, documented this from several angles. Anthropologist Sarah Mahler investigated the lives of working-poor Salvadoran immigrants living in substandard housing on Long Island, New York. The enormous economic pressures they faced, from the challenge of surviving on low wages while also supporting families in El Salvador, altered the social dynamic in the Salvadoran community, away from co-ethnic reciprocity toward more individualistic survival. “Burdened with debt and remittance responsibilities,” Mahler writes, immigrant suburbanites “frequently must wring this surplus out of their own deprivation, forgoing everything but an ascetic existence.” 96 A more robust ethnic culture developed in the middle-class suburbs of the west San Gabriel Valley, California, where Asian and Latino residents fostered community life and values around ideals of racial inclusivity. Wendy Cheng describes this as a “moral geography of differentiated space . . . a world view that challenged and opposed whiteness as property.” 97 Remaining in these communities as white residents fled, Asian American and Mexican American residents valued the comfort and familiarity of interracial spaces. Groups like the Boy Scouts reflected this multiethnic sensibility, which in turn stimulated high levels of participation by both boys and their parents. 98
Figure 5. Suburban childhood, 1989. Mark Padoongpatt, age 6, the son of Thai immigrants, stands in front of his suburban home in Arleta, a neighborhood in Los Angeles’ San Fernando Valley inhabited by Mexican Americans, Salvadorans, Filipinos, Vietnamese, Thais, African Americans, and Anglos.
Figure 6. Suburban strip in Tukwila, Washington, 2014. These shops catered to the community’s African immigrant residents.
Figure 7. Hsi Lai Buddhist Temple, located in a suburban residential area of Hacienda Heights, California, 2014.
Robust displays of ethnic culture and identity were strongest in what geographer Wei Li described as “ethnoburbs,” defined as “suburban ethnic clusters of residential and business districts . . . [that] are multiracial/multiethnic, multicultural, multilingual, and often multinational communities.” 99 In ethnoburbs, ethnic culture is constantly refreshed by the transnational flow of immigrants, capital, and businesses. In contrast to older sociological models that considered the suburbs a site of Americanization, ethnoburbs reinforced and sustained ethnicity within suburbia (see Figures 5–7).
Ethnoburbs appeared across the country, in places like the San Gabriel Valley and Silicon Valley, California; Langley Park, Maryland; Palisades Park, New Jersey; Upper Darby, Pennsylvania; and Chamblee, Georgia. In ethnic suburbs, some residents forged new suburban ideals around such values as robust public life, defying long-standing suburban traditions of privatism. For example, Thai residents of the east San Fernando Valley, California, held lively weekend food festivals at the Wat Thai Buddhist Temple, a quasi-public community space. Asian-Indian residents of Woodbridge, New Jersey, celebrated days-long Navratri festivals under enormous tents, with music, dancing, and vendors selling traditional Indian food and dress. Along Whittier Boulevard, which traversed the suburbs of Montebello, Pico Rivera, and Whittier outside of Los Angeles, Mexican-American youth developed a cruising culture tied to the use of suburban public space. 100 These practices helped cultivate community and, as Mark Padoongpatt writes of the Wat Thai festivals, “fostered a public sociability that went against dominant and even legal definitions of suburbia.” 101
Politics across Diverse Suburbia
Suburban politics after 1970 came to reflect these differences as well, revealing political leanings as varied as suburbanites themselves. One powerful strand worked to sustain suburban privilege. The solid tradition of tax-averse homeowner politics remained strong, and in the post-civil rights era white suburbanites, especially, increasingly deployed a discourse of colorblind meritocratic individualism to defend their rights, claiming that suburbs were open equally to all and that race and class played no role in who lived where. In general, this politics worked to protect suburbanites’ fiscal resources, to defend their quality of life, and to maintain class and racial segregation. Suburban citizens framed these efforts in terms of their hard-earned rights as taxpaying homeowners, which they felt were under siege by free-spending liberals, minorities, the urban poor, inefficient government, and even drug pushers. This political agenda manifested in several ways. One was a full-fledged tax revolt movement. In 1978, California taxpayers resoundingly passed Proposition 13, a measure that placed severe limits on property tax rates. This campaign led the way for similar tax revolts in other states and helped propel former California governor Ronald Reagan, a fervent supporter of Prop 13, to the White House in 1980. Reagan embraced many of the core principles of this campaign—cutting taxes and government power—suggesting the national resonance of suburban political ideals. 102 Second, many suburbanites opposed initiatives seeking to close the gap between cities and suburbs in terms of wealth, opportunity, and race. Across the country, suburbanites mobilized against busing for school integration, open housing, affordable housing, and Section 8 tenants. 103 In a similar way, nonpartisan suburban NIMBY (Not In My Back Yard) campaigns proliferated against public and nonprofit projects such as group homes, AIDS clinics, daycare centers, garbage dumps, and nuclear power plants. 104 Their actions suggested that suburbanites sought to reap the benefits of metropolitan belonging while minimizing its burdens. A third manifestation was suburbia’s role in the war on drugs. As recent work by Matt Lassiter shows, the drug war produced a policy approach that perceived a binary of “white suburban addict-victims and minority ghetto predator-criminals.” This construct reinforced in American political culture a tendency to demonize urban minorities while rendering white suburbanites as innocent victims, an oversimplification that belied more complex realities. 105 The cumulative effect of these efforts, many of them successful, was to reinforce inequality across metropolitan space. Moreover, these efforts attracted the attention of politicians at the national level, who increasingly cultivated suburban voters’ support through federal policies and judicial appointments that supported suburban prerogatives. During the Richard Nixon presidency, for instance, the federal government limited its support for fair housing, metropolitan school integration, and the dispersal of affordable housing. 106
In the wake of civil rights laws that broke down explicit racial barriers in the housing market, suburban exclusion increasingly pivoted on class, fueling class segregation since 1980. 107 Local governments played a crucial role in this. Some suburbanites withdrew into privately owned and governed residential enclaves, known as Common Interest Developments (CIDs), or what Evan McKenzie labeled “privatopias” for their capacity to concentrate local resources under tight local control. 108 In CIDs, civic and social belonging was restricted to select groups defined by ownership as opposed to citizenship. Suburbanites also used local zoning and building regulations as colorblind tools to exclude low-income residents through tactics like “exclusionary”—or “snob”—zoning, environmental protection codes, land trusts, historic preservation, and no-growth activism, which effectively shut out affordable housing. “Snob” zoning, for example, required large lots and floor areas and limited construction to single-family homes while prohibiting apartments and other multifamily dwellings. While housing and civil rights activists recognized this trend as early as the 1960s, it intensified over the following decades. These local initiatives were pushed not only by whites but also by some affluent Asian Americans who recognized value in suburban exclusivity. 109 During the 1970s, the U.S. Supreme Court in two cases upheld these broad municipal powers, defining “general welfare” in terms of the existing residents of a community and making it all but impossible for new, poorer residents to enter and gain a civic voice.
As race disappeared from the rhetoric of suburban exclusion, it was replaced by a class-oriented discourse of property values, landscape aesthetics, tax rates, congestion, and environmental protection, often captured by the catchall phrase “quality of life.” One result was a crisis in affordable housing, with deep ramifications for African Americans, Latinos, and other minorities who generally earned less money than whites. By harnessing the power of local government, suburbanites maintained exclusionary practices using new tools and approaches. 110
This suburban outlook continued to influence the political parties and their agendas at the national level. The Republicans remained aligned with this suburban worldview, and by the 1990s the Democrats too—traditionally a city-based party—recognized the importance of the “suburban vote” and altered their ideology and platforms to win over this critical bloc. For the Democrats, this adjustment was enormous, forcing the party to balance its traditional commitment to the urban poor, minorities, and labor (and their demand for public programs) against a new commitment to middle-class suburban voters (and their aversion to taxes and social welfare spending, and their reluctance to imperil their own property values). Some saw this adjustment—known variously as the “third way” or the “New Democrats”—as the effective death of liberalism; others saw it as a realistic shift toward the political center. In either case, the influence of suburban political culture on the party’s shifting values was profound. 111 In her study of Boston’s Route 128 suburbs, Lily Geismer described a hierarchical set of values among suburban liberals that justified their support of progressive causes like racially open housing while simultaneously opposing affordable housing. As progressive citizens who lived in suburbia, they viewed themselves as somehow apart, as “separate from, and not responsible for, many of the consequences of suburban growth and the forms of inequality and segregation that suburban development fortified.” 112 This captured a central dilemma of suburban Democrats.
Parallel to this suburban politics of defensive self-interest, a contrasting strand of progressive, social justice politics grew in the suburbs, particularly those experiencing ethno-racial change. As social diversification increased, so did new political agendas and forms of political organization, revealing “progressive potential in places once dismissed as reactionary.” 113 Progressive organizations included the Suburban Action Institute, established in 1969 to wage legal battles against exclusionary zoning, and Long Island’s Workplace Project and the Southwest Suburban Immigrant Project of Chicago, which campaigned to secure better education, workplace rights, and immigration reform. One study deployed Henri Lefebvre’s concept of the “right to the city” to analyze progressive suburban activism. It focused on Maywood, California, southeast of Los Angeles, a suburb of working-class Latino immigrants (including the undocumented) who claimed rights by virtue of inhabitance in particular places. They mobilized around the issue of immigrant rights, challenging the local police practice of using DUI (driving under the influence of alcohol) checkpoints to identify and criminalize undocumented immigrants, who were charged high fees for towing, impounding, and fines, which amounted to “a municipal tax on immigrants.” 114 A grassroots movement successfully challenged this policy and went on to win seats on the city council, which ultimately declared Maywood a “sanctuary city.” 115 Maywood and neighboring Latino suburbs also waged environmental justice campaigns. 116 Other progressive initiatives were launched in places like Alviso and Richmond, California; Silver Spring, Maryland; Shaker Heights, Ohio; and suburbs around Cincinnati and Chicago. 117
The cumulative effects of suburban expansion since 1970, ranging from the toll on the environment and the fiscal drain on both cities and outer suburbs to the stubborn persistence of class and race segregation and the everyday burdens of long commutes and social isolation, stimulated a wave of reform. Initiatives were wide-ranging, some winning more public favor than others. All of these efforts sought to mitigate the effects of suburban sprawl through more equitable, diverse, and sustainable forms of metropolitan development.
Some of these initiatives stemmed from the growing recognition that metropolitan areas had become the drivers of the national—and global—economy. As such, they held more importance than ever as the fulcrum of the nation’s economic health. Scholars like Bruce Katz, Mark Muro, and Jennifer Bradley argue that the stakes of metropolitan well-being are high because metro areas compete against other global metropolises in a race for capital and investment. Only those with “future growth plans that minimize traffic gridlock, pollution, ugly sprawl, and environmental devastation” can hope to succeed. Because the national economy hinges on vibrant, high-functioning metros, they contend, the federal government must reorient its economic development policies toward enhancing metro areas’ power and resources (e.g., by funneling infrastructure money directly to metro areas instead of the states). 118
Other regional reformers extended this logic, arguing that metro-wide equity is crucial to metropolitan health and competitiveness in the global marketplace. Recognizing the negative effects of suburban political balkanization, which gives individual suburban municipalities the power to act in their own narrow self-interest and veto wider social obligations, these reformers sought ways to overcome this suburban intransigence. They crafted programs that operated on a regional scale and emphasized the mutual benefits to all metropolitan players, suburban and urban alike, with regional equity and prosperity as the intertwined end goals. Urban analysts such as David Rusk, Myron Orfield, Peter Dreier, Manuel Pastor, and Chris Benner argued that metropolitan regions work best when class disparities are lessened, poverty is reduced, and communities across the board share both the benefits (like jobs) and obligations (like affordable housing) of metropolitan citizenship. As one study noted, “[C]ities and suburbs have become interdependent parts of shared regional economies. A number of recent studies have indicated that problem-ridden cities and declining suburbs go hand in hand. In other words, suburban islands of prosperity cannot exist in a sea of poverty.” 119 Poverty and inequality, they contend, drag down an entire metro region. For the good of all metropolitan players (e.g., rich, poor, businessmen, and workers), equity is a prerequisite if a metropolis stands any chance in the global economic race. 120
One plan to level the metropolitan playing field was proposed by legislator and legal scholar Myron Orfield, based on initiatives he spearheaded in Minneapolis–St. Paul during his tenure in the Minnesota State Legislature (1991–2003). Orfield’s approach was predicated on his detailed demographic analysis of American suburbs, which showed a wide range from prosperous to severely fiscally stressed. All suburbs, he argued, stood to benefit from greater regional equity. To achieve this, he called for regional tax-base sharing that would lessen wasteful competition among suburbs and gradually equalize their resources, for regionally coordinated planning of housing and infrastructure, and for the formation of strong, accountable regional governing bodies. Orfield couched this as a “win–win” for cities and all suburbs, based on their shared interests in metropolitan success. 121
A related reform movement is known as “Smart Growth.” This approach calls for the close coordination of metropolitan land-use planning to support efficient and environmentally friendly development. Seeking to stop the relentless push of outward sprawl, it supports higher-density, mixed-use developments closer to existing communities and job centers; metro growth boundaries; the preservation of open space for parks, farmland, and native habitat; and in-fill projects. The rationale is to move away from wasteful and environmentally draining sprawl toward denser, more environmentally sustainable development. 122 Oregon was a pioneer in the Smart Growth movement, passing the nation’s first statewide land-use act in 1973, which established growth boundaries for metro areas such as Portland. Other regions followed with similar legislation, including Minneapolis–St. Paul in 1994 and Maryland (which passed a Smart Growth Act in 1997), as well as Florida, Arizona, New Jersey, and Pennsylvania. In 1998 alone, 240 state and local ballot initiatives addressed land use and growth, with voters approving more than 70% of these initiatives; by 2000, more than 550 growth-related initiatives had appeared on ballots, 72% of which passed. 123
Figure 8. The Del Mar Station project in Pasadena, Ca., exemplifies New Urbanism principles. It is a transit-oriented development that combines apartments (including 15 percent affordable units), retail, restaurants, and a plaza, all adjacent to a Metro station. In 2003, it won a Congress for New Urbanism Charter Award.
An influential offshoot of Smart Growth is New Urbanism, a movement of designers, architects, developers, and planners that coalesced in the late 1980s. 124 In 1993, they founded the Congress for New Urbanism to promote the principles of compact, mixed-use, walkable developments—tenets that completely inverted the design of post-WWII suburbs. As their charter states, “We advocate the restructuring of public policy and development practices to support the following principles: neighborhoods should be diverse in use and population; communities should be designed for the pedestrian and transit as well as the car; cities and towns should be shaped by physically defined and universally accessible public spaces and community institutions; urban places should be framed by architecture and landscape design that celebrate local history, climate, ecology, and building practice.” 125
Smart Growth and New Urbanism are not without their critics. Some decry their tendency to promote gentrification, drive housing prices upward, and provide insufficiently for low-income residents. Because Smart Growth often limits the amount of developable land, it tends to help established homeowners by driving up their property values while locking everyone else out. Describing Smart Growth in Los Angeles, Mike Davis characterized it as homeowner exclusivism, “whether the immediate issue is apartment construction, commercial encroachment, school busing, crime, taxes, or simply community designation,” with only the flimsiest link to environmentalism. 126 The same squeeze on land can promote gentrification. Smart Growth pioneer Portland, Oregon, for example, landed at the top of recent lists of metro areas with accelerating gentrification. Ringed by strict growth boundaries, the city became denser, and housing prices and rents spiked, fueling gentrification. The trend hit the African American community especially hard. The city’s core lost 10,000 black residents from 2000 to 2010; historically black neighborhoods like King, Woodlawn, and Boise-Eliot transitioned to majority white. The result is what one account dubbed “the racial failure of New Urbanism.” 127
Suburban Crisis, Suburban Regeneration
In recent years, the suburbs came under a new round of criticism, this time perhaps the harshest yet. While bands like Green Day and Arcade Fire railed against the suburbs for killing youthful freedom and joy, echoing generations of suburban critics, writers like Fortune magazine’s Leigh Gallagher took it a step further by declaring “the end of the suburbs” in her 2013 best-selling book. The alarm was justifiably stoked by the Great Recession of 2007–2009, which devastated millions of American families who lost their homes to foreclosure or saw their suburban home values plummet. Many questioned the wisdom of home ownership, which in turn cast doubt on the viability of suburbia altogether. These concerns, along with worries over sprawl’s negative impacts on climate change and millennials’ desires for more urban styles of living, fueled a back-to-the-city movement. Writers like Gallagher contended this was the end of the line for the suburbs: Americans were finally turning their backs on the form, reversing a long history of sprawling development. “Speaking simply,” she wrote, “more and more Americans don’t want to live there anymore.” 128
Yet different trends suggested otherwise. Immigrants, young families, seniors emotionally attached to their homes, and others continued gravitating toward suburban homeplaces for a host of reasons—whether good schools, nostalgia, ethnic familiarity, jobs, or few good alternatives. Recent data suggest a return of suburban growth after a post-recession slowdown. 129 In turn, contemporary suburbia is showing signs of change, adaptation, and stasis—all at once. As Manuel Pastor noted at a recent roundtable discussion on “Suburban Crisis, Suburban Regeneration,” “the suburbs have a future, but the future ain’t what it used to be.” Some suburbs have transformed into ethnoburbs that support the values and needs of new immigrants, some have spawned social justice movements, others are adapting to aging populations through innovative retrofitting, while still others persist seemingly untouched, clinging to deeply entrenched traditions. 130 The discourse of suburbia’s demise may have attracted much public attention, but it masks the fascinating ways that the nation’s suburbs continue to claim a central, dynamic place in American life.
Discussion of the Literature
The historical scholarship of post-1945 suburbia has flourished in recent decades, pushing the boundaries of urban history scholarship. As suburbia’s role in postwar American life has grown stronger and broader, historians have responded by exploring multiple angles of this influence.
A foundational text is Kenneth T. Jackson’s Crabgrass Frontier: The Suburbanization of the United States (1985), which provided the first comprehensive overview of American suburban history. Adopting a definition of suburbia that emphasized its white, affluent, and middle-class character, Jackson surveyed the major stages of suburban development, starting with the elite 19th-century romantic suburbs, then tracing the gradual democratization of the form from streetcar and automobile suburbs to postwar mass-produced suburbs. While Jackson identified the broad forces that underlay this evolution, his emphasis on federal policy was a seminal contribution, outlining how Washington, D.C., not only subsidized massive postwar suburbanization but created racial and class exclusion in the process. The results were devastating for cities and for the minorities and poor left behind. Along with Robert Fishman in Bourgeois Utopias (1986), Jackson established a normative portrait of suburbs as residential spaces of affluent white privilege. In a 30-year retrospective on Crabgrass Frontier, Dianne Harris noted that because the book established a clear set of characteristics for the suburbs (i.e., racial and economic homogeneity, gender roles, and architectural similarity), historians since have had “a template with which to compare and contrast, and yes, to push back against.” 131 Working around the same time as Jackson and Fishman, historians Carol O’Connor, John Archer, Mary Corbin Sies, and Michael Ebner traced the roots of elite and socially exclusive suburbs and their subsequent trajectories into the 20th century. 132
Iconic postwar suburbs like Levittown were the focus of a cluster of studies that followed, including Barbara Kelley’s analysis of quotidian architectural practices in Levittown, Long Island; Dianne Harris’s edited collection on Levittown, Pennsylvania; and Elizabeth Ewen and Rosalyn Baxandall’s historical survey of Long Island’s suburbs. Like others before them, these works often took a local focus, digging deeply into the culture, architecture, politics, and institutions of specific suburban sites. 133
Other scholars pushed the boundaries of analysis, both geographically and demographically. One important current has been dubbed “the metropolitan turn.” Scholars in this school analyzed suburbs not in isolation but as fully embedded in the metropolitan political economy. These works pulled the lens back to explore not only “the ideological, political, and economic issues that bound city and suburb together in the postwar world” but also the “tensions that divided suburbs as they competed for business, development, and investment in the politically and socioeconomically fragmented metropolis.” 134 Jon Teaford’s pioneering work analyzed the politics and governance of metropolitan fragmentation. Since 1990, Thomas Sugrue, Robert Self, Matthew Lassiter, and Kevin Kruse have produced influential works that investigated the ways suburbs proactively created and protected advantage—in the realms of business growth, politics (from conservative to centrist), wealth, and infrastructure—establishing enduring patterns of metropolitan inequality. 135 The “metropolitan turn” is also exemplified in recent scholarship by Lily Geismer, Andrew Highsmith, Ansley Erickson, Andrew Needham, Allan Dietrich-Ward, and Lila Berman, among others, who explored suburbs within a metropolitan scale of analysis, around issues such as liberalism, schooling, environmentalism, and religion. 136
Scholars have also explored the role of metropolitan spaces in producing social distinctions such as race, gender, and sexuality. From the early postwar years, activist scholars such as Robert Weaver, Charles Abrams, and Clement Vose pioneered a large body of literature documenting discrimination in housing and the disadvantages of racial segregation in U.S. metro areas. In the 1980s and 1990s, scholars extended these insights, exploring the social and spatial production of inequality in metropolitan contexts. Feminist scholars such as Dolores Hayden illuminated the ways that separate and unequal assumptions about gender were built into the spaces of postwar suburbia. 137 Arnold Hirsch, George Lipsitz, and Thomas Sugrue revealed how biased federal housing policies, unequal access to homeownership, and suburbanization helped to forge an enlarged sense of white racial identity in postwar America that was attached to distinct social advantages—what Lipsitz called “a possessive investment in whiteness.” 138 Scholars such as David Freund, Eric Avila, and Robert Self showed that ideas about race and white supremacy became inscribed in spaces ranging from suburban property markets to municipalities, popular culture, and the metropolis as a whole. 139
Another current of analysis pushed demographic boundaries, challenging the assumption that suburbs were white and middle class by definition and arguing for a more expansive profile that incorporated class, race, and ethnic diversity. Revisionist scholars such as Bruce Haynes, Andrew Wiese, Emily Straus, Matthew Garcia, Jerry Gonzalez, and Becky Nicolaides explored histories of African American, Mexican American, and white working-class suburbs. They identified distinct lifeways, cultures, and politics that in some cases stood apart from mainstream white suburbs, though in others replicated their class-driven concerns in the postwar period. 140
This focus on diverse suburbia carried forward in studies of the post-1970 era. This work offers some of the most robust challenges to the trope of suburbia as the domain of white middle-class privilege, reflecting not only a revisionist analytical perspective but also the changing realities of life in suburbs where immigrants, ethnic groups, racial minorities, and the poor have had time to settle in. Geographers and demographers began by mapping out changing demographic patterns in metropolitan areas, establishing a critical baseline for qualitative scholarship. 141 Subsequent scholars explored the internal dynamics and histories of these communities. An early focus was on ethnic suburbs. Pioneering studies by Timothy Fong, Leland Saito, and John Horton explored the explosive racial politics that erupted in Monterey Park, California, when it transitioned from all-white to multiethnic, while scholars like Wei Li and Min Zhou theorized new models of race and space around processes of ethnic suburban settlement. Asian-American suburbanization, in fact, emerged as a particularly robust field of inquiry, perhaps because Asians gained an early foothold in postwar suburbs and became the “most suburban” of all ethnic groups. These studies explored the nature and implications of settlement patterns, spatial practices, transnational connections, political and cultural practices, and internal community dynamics. 142 More recently, historians have explored social justice politics in suburbia, some, such as Lily Geismer, emphasizing the limits of racial liberalism, others identifying vigorous progressive activism around issues like immigrant rights. This latest wave of scholarship, perhaps more than any, offers bold alternatives to the orthodox narrative, recognizing in suburbia multiple politics, cultures, lifeways, and values that reflect the outlook of their diverse inhabitants. 143
Historical sources on postwar suburbia exist in multiple locales, depending on the scale of analysis. For localized research on individual suburbs, sources often exist in local libraries, historical societies, or state historical societies. Materials may include local newspapers, clip files, real-estate promotional material, oral histories, and records of local institutions. Because local newspapers are rarely digitized, most are available on microfilm or in original paper form. Municipal city halls may contain city council and planning department records, local ordinances, design review board minutes, mayoral papers, and the records of other local governing bodies, though some local public documents have been deposited in local or state archives. Some university libraries also hold material relating to suburban neighborhoods, while some specialized archives—such as the Huntington Library in San Marino, near Los Angeles; the Chicago Historical Society; and Detroit’s Walter P. Reuther Library of Labor and Urban Affairs—contain a wealth of local history materials, maps, booklets, real-estate ephemera, and private and public organization records. For the Levittowns, no intact corporate archive exists, according to Dianne Harris. 144 For Park Forest, Illinois, and Lakewood, California, good holdings exist at the local public libraries.
At the county and metropolitan level, records may be available in county government offices—including property records such as building, deed, and mortgage records, which are indispensable to histories of real-estate development. Regional governing and planning bodies and university libraries may also hold regional reports on metropolitan transit, infrastructure, housing, planning, and the like. On the history of metro-wide politics, around issues such as busing, redevelopment, public housing, and environmentalism, university archives often hold the papers of key individuals, agencies, or advocacy groups. It is worth exploring the special collections in local universities of the metro area under study.
The National Archives holds a number of important collections that reflect federal policies on metropolitan areas, such as the records of the Housing and Home Finance Agency/Department of Housing and Urban Development, which include those of the Home Owners’ Loan Corporation (e.g., individual city survey files and maps, a growing body of which has been digitized on a few websites), the Federal Housing Administration, and the Public Housing Administration.
The built landscape itself is an excellent source for exploring the history of post-1945 suburbia, since much of this landscape is still intact. Homes, commercial districts, parks, streetscapes, job clusters, and physical barriers between segregated suburbs, as well as New Urbanist complexes and physical growth boundaries in Smart Growth cities, are all important markers of the suburban past. In 2002, the National Park Service also issued its own standards for historic preservation of America’s suburbs, called “Historic Residential Suburbs: Guidelines for Evaluation and Documentation for the National Register of Historic Places.” Although somewhat outdated, it reflects the preservation field’s understanding of America’s suburban past.
Labor Unions During the Great Depression and New Deal
In the early 1930s, as the nation slid toward the depths of depression, the future of organized labor seemed bleak. In 1933, the number of labor union members was around 3 million, compared to 5 million a decade before. Most union members in 1933 belonged to skilled craft unions, most of which were affiliated with the American Federation of Labor (AFL).
The union movement had failed in the previous 50 years to organize the much larger number of laborers in such mass production industries as steel, textiles, mining, and automobiles. These, rather than the skilled crafts, were to be the major growth industries of the first half of the 20th century.
Although the future of labor unions looked grim in 1933, their fortunes would soon change. The tremendous gains labor unions experienced in the 1930s resulted, in part, from the pro-union stance of the Roosevelt administration and from legislation enacted by Congress during the early New Deal. The National Industrial Recovery Act (1933) provided for collective bargaining. The 1935 National Labor Relations Act (also known as the Wagner Act) required businesses to bargain in good faith with any union supported by the majority of their employees. Meanwhile, the Congress of Industrial Organizations split from the AFL and became much more aggressive in organizing unskilled workers who had not been represented before. Strikes of various kinds became important organizing tools of the CIO.
Emancipation and Reconstruction
At the outset of the Civil War, to the dismay of the more radical abolitionists in the North, President Abraham Lincoln did not make abolition of slavery a goal of the Union war effort. To do so, he feared, would drive the border slave states still loyal to the Union into the Confederacy and anger more conservative northerners. By the summer of 1862, however, enslaved people themselves had pushed the issue, heading by the thousands to the Union lines as Lincoln’s troops marched through the South.
Their actions debunked one of the strongest myths underlying Southern devotion to the “peculiar institution”—that many enslaved people were truly content in bondage—and convinced Lincoln that emancipation had become a political and military necessity. In response to Lincoln’s Emancipation Proclamation, which declared more than 3 million enslaved people in the Confederate states free as of January 1, 1863, Black people enlisted in the Union Army in large numbers, reaching some 180,000 by war’s end.
Did you know? During Reconstruction, the Republican Party in the South represented a coalition of Black people (who made up the overwhelming majority of Republican voters in the region) along with "carpetbaggers" and "scalawags," as white Republicans from the North and South, respectively, were known.
Emancipation changed the stakes of the Civil War, ensuring that a Union victory would mean large-scale social revolution in the South. It was still very unclear, however, what form this revolution would take. Over the next several years, Lincoln considered ideas about how to welcome the devastated South back into the Union, but as the war drew to a close in early 1865, he still had no clear plan. In a speech delivered on April 11, while referring to plans for Reconstruction in Louisiana, Lincoln proposed that some Black people—including free Black people and those who had enlisted in the military—deserved the right to vote. He was assassinated three days later, however, and it would fall to his successor to put plans for Reconstruction in place.
Baseball as America
By October 1928, the question of the color line in towns like Baltimore had seemingly been answered. The schools were segregated by law, while churches, theaters, and neighborhoods were segregated by custom. Black and white residents ate at different restaurants, slept in different hotels, and even visited their loved ones in separate hospitals. Children played at segregated YMCA branches. Adults attended social and political functions of segregated clubs. But at least one event during that month demonstrates that race relations were never quite as simple as they appeared. That month, the Baltimore Black Sox of the Eastern Colored League defeated an all-white All-Star team composed of some of the best players in the major leagues. Ten thousand fans witnessed the game, and there were no reports of racial violence. Despite efforts to keep black fans away by raising gate prices throughout the day, several thousand black fans saw their team prevail.
The Negro National League was the first commercially successful African American baseball league. In 1924, the champion of this league, the Kansas City Monarchs, defeated the champion of the Eastern Colored League, Pennsylvania’s Hilldale club, and claimed the title in what became known as the “Colored World Series.”
The victory of the Black Sox was not an uncommon scene throughout the 1920s. In fact, Negro League teams had a winning record against the all-white major leaguers who challenged them. The record was ironically aided by organized baseball’s attempt to prevent these games from happening. Following a series of victories by teams like the St. Louis Stars, New York Black Yankees, and Homestead Grays of Pennsylvania over their local major league teams, Commissioner Kenesaw Mountain Landis ruled that major league clubs could no longer challenge black teams. However, these contests were the most popular exhibition games of the season, and they sold tickets and filled ballparks. As a result, white major leaguers simply assembled their own teams of “all stars” composed of players from area teams. Given the desire of players to maximize their share of the gate receipts, these all-star teams often lacked the depth of regular-season pitching rosters. In the end, Landis’s ruling increased the tendency of the Negro League teams to prevail over their white opponents.
One must be careful not to exaggerate these symbolic victories over Jim Crow. Placed in a larger context, these baseball games pale in comparison with the progress that was forged in classrooms and courtrooms. Yet for the thousands who attended these games, especially those laboring behind the color line, these victories had profound meaning. For example, in 1925, an all-black, semipro team in Wichita, Kansas, defeated a team representing the local Ku Klux Klan. The schools of Wichita remained segregated the next morning, but surely those who witnessed the game thought about the larger meaning of the afternoon’s events.
From a sociological point of view, the Monarchs have done more than any other single agent in Kansas City to break down the damnable outrage of color prejudice that exists in the city…[When]…both races sit side by side and root for their particular favorite and think nothing of it, then after a while the same relation may be carried to the workshop, and the ball grounds may be the means of causing someone to be employed where he would not otherwise have been considered, just because “he sat next to me out at the ball park Sunday—he’s a pretty good fellow.”
—Kansas City Call (African American newspaper), October 27, 1922
As a touring exhibit demonstrated nearly a century later, baseball was America in the 1920s. The national pastime mirrored the diversity of the nation and any town with more than a few hundred residents sponsored a team that was the pride of the community. On any given Sunday afternoon, nearly as many Americans could be found at the local ballpark as had attended church in the morning. The teams mirrored the diversity of the congregants. German immigrants in North Dakota and Jewish immigrants in New York City commemorated each Fourth of July by playing the American game, a celebration of their new nation and a proud display of their ethnic unity as they challenged teams from other immigrant groups.
Women’s teams had been competing since Vassar College’s first team took the field in 1866, most famously as part of the touring “Bloomer Girls” teams of the turn of the century. Native American teams toured as well, blurring the lines of sport, showmanship, and accommodation to the expected stereotypes of the white audiences. Japanese American teams like the Fresno Athletics defeated the best college and semipro teams on the West Coast. When not playing for the Yankees, Babe Ruth toured the nation throughout the 1920s as his team of all-stars took on all of these diverse local players. “Organized baseball,” consisting of the Major Leagues and their Minor League affiliates, had drawn the color line since the late nineteenth century, but barnstorming teams such as Ruth’s were more concerned about revenue than the regulations of their commissioner. As a result, Ruth welcomed the competition of African American baseball greats such as Josh Gibson, who many believe was the greatest slugger of the era. Sometimes referred to as the “black Babe Ruth,” Gibson compiled the most impressive career statistics in the history of the sport, leading some scholars of the Negro Leagues to argue that Ruth should instead be called the “white Josh Gibson.” Gibson played among many of the greatest ballplayers of all races in the United States, the Caribbean, and Latin America, but owing to race he was excluded from the Major Leagues. Ruth also played alongside Japanese American stars such as Kenichi Zenimura, the founder of the Fresno Athletics.
Asian Americans on the West Coast formed competitive baseball teams. This 1913 poster advertises a touring team composed of Asian Americans who lived in Hawaii and played against college teams throughout the American West.
In addition, thousands of white and black players from the Major Leagues and Negro Leagues played in Cuba, the Dominican Republic, Mexico, and various Caribbean and Latin American countries each summer. These tours resulted in the discovery of hundreds of great Latino ballplayers, many of whom traveled and played in the United States on international touring teams or as players on Negro League teams. These ballplayers were role models, ambassadors, leading men in their community, and some of the first and most visible activists against segregation as they traveled through the nation.
The celebrity status of a team might erode racial barriers. At other times, black players confronted segregation directly by demanding respect and equal accommodations. However, one must remember that these men were ballplayers, managers, and owners above all else. Team members were most concerned with their ability to play the game they loved, and owners had a vested interest in minimizing racial conflict. They could not afford to take chances with alienating white spectators or demand equal accommodations at the risk of being placed in jail during an important road trip. As a result, the teams worked to avoid confrontation by planning their trips along familiar routes, patronizing black-owned businesses, and staying with black families in small towns without black-owned restaurants and hotels.
A handful of African American teams sought refuge from America’s binary color line by choosing names such as the Cuban Stars, thereby blurring the line between Afro-Caribbean and Afro-American. About fifty Latino players with light complexions and surnames that reflected the European Spanish heritage of many Caribbean islanders were even deemed “racially eligible” to play for Major League teams. The inclusion of foreign and American-born players of Latino heritage further demonstrated the middle ground between black and white. The complexion of most Caribbean islanders was usually too dark to pass as “Castilian” or any of the other creative euphemisms managers sought to apply to a talented ballplayer they wanted to convince the rest of the world was a descendant of European conquistadors. The existence of these charades, as well as several attempts to “pass” a black player as Native American, demonstrated that race was a social construction rather than a scientifically identifiable category.
Review and Critical Thinking
- How does the Sheppard-Towner Act reflect the political environment of the 1920s and government expectations at that time? Why might the AMA choose to oppose such measures, and why would this organization present social welfare programs for women and children as analogous to Socialism?
- Ford became infamous for his negative views of the working class. Why might someone with such views voluntarily pay such high wages?
- Why did labor union membership decline during the 1920s? What were the arguments for and against union membership during this era?
- How did the emerging field of marketing affect the United States during the 1920s? What were the goals of marketers, and how were their tactics different from the ways goods were promoted in previous generations?
- How did baseball reflect American life and culture during the 1920s? How do the Negro Leagues and the experiences of racial and ethnic minorities in sport demonstrate the opportunities and challenges faced by nonwhites at this time?
The Great Depression
But it was to no avail. On Monday, the market continued its sell-off, falling 13 percent further. On Tuesday, October 29, the damage continued. When the closing gavel finally fell just before eight in the evening of that calamitous day, stocks had lost an additional 12 percent. Stunned crowds of investors filled the streets outside the New York Stock Exchange on Wall Street. The Great Depression, the greatest economic and financial crisis in American history, was underway.
The &ldquoBlack Tuesday&rdquo stock market crash has attained iconic status in American lore. But contrary to present-day misconception, the crash neither initiated nor was chiefly responsible for the depression that followed.
Americans were no strangers to economic downturns stretching all the way back to the Panic of 1819. Most such episodes, including the then-recent Panic of 1907, when the stock market fell almost 50 percent from its 1906 high, and the post-war recession of 1918-1921, in which, by some measures, the American economy shrank by a greater degree than it did during the Great Depression, tended to be severe but brief interruptions in otherwise robust economic growth. In late October 1929, there was little reason to believe that the latest market calamity would be otherwise.
But this time, things would turn out very differently. For although this economic downturn, like all other downswings in the business cycle, had been brought about by unwarranted credit expansion on the part of the banking system, spearheaded in this instance by the brand-new Federal Reserve, the actions of bankers and politicians before and after the stock market crash turned a much-needed market correction into an economic apocalypse.
The roots of the Great Depression extended back as far as the Panic of 1907. That panic, primarily a bankers’ affair that resulted in runs on numerous banks and trusts, especially in New York City, was over in a few weeks with minimal impact on the public at large. J.P. Morgan organized bankers and financiers to arrange new lines of credit amongst themselves and buy up stocks of otherwise healthy corporations. The panic was thus solved expeditiously by market forces and affected parties like Morgan acting in their own self-interest. Yet it persuaded many on Wall Street that the time had arrived for America to have a central bank, as England and most of the wealthy nations of Europe had had for many years. Bankers, like Jacob Schiff of the investment firm Kuhn, Loeb, and Co., were vociferous in demanding a central banking authority to stabilize the allegedly chaotic banking system.
Six years later, in 1913, they got their wish when the Federal Reserve Act, which created the Federal Reserve, was passed. The act drew heavily on a plan devised by Senator Nelson Aldrich with the secret support of many of America’s — and the world’s — wealthiest men, like international banker Paul Warburg and the aforementioned Schiff.
The most powerful man at the new Federal Reserve was Benjamin Strong, the strong-willed and secretive head of the Federal Reserve Bank of New York from 1914 until his death in 1928. It was Strong, far more than the several chairmen of the Fed who came and went during his tenure, who was most influential in shaping U.S. monetary policy during the 1920s. Strong had been present at a secret meeting on Jekyll Island, Georgia, in 1910, where what became the Federal Reserve System was planned. Strong was also well connected in international banking circles, especially with Montagu Norman, governor of the Bank of England. It was Strong’s relationship with Norman, probably more than any other factor, that led to the Fed’s inflationary (monetary expansion) policies of the 1920s and set the stage for the 1929 bust.
Montagu Norman, whom economist Murray Rothbard aptly termed “the Mephistopheles of the inflation of the 1920s,” had great difficulty propping up Britain’s postwar finances. Under pressure to restore British currency to the prewar gold standard — which would have required credit contraction to offset the effects of wartime inflation — Norman chose instead to open the money spigots wider. As a consequence, the British pound continued to lose value and, more alarmingly for British bankers, British gold migrated across the Atlantic to the United States, where it found more stable valuation in the U.S. dollar. As long as the disparity between American and British monetary policy continued, so would the flight of gold from a weaker to a stronger currency. From Norman’s viewpoint, something had to be done.
That something was greater financial cooperation between Britain and the United States, with the former taking the lead. As detailed by Rothbard in America’s Great Depression:
The &ldquoisolationism&rdquo of U.S. foreign policy in the 1920s is almost wholly a myth, and nowhere is this more true than in economic and financial matters…. On Norman&rsquos appointment as Governor [of the Bank of England] during the War, Strong hastened to promise him his services. In 1920, Norman began taking annual trips to America to visit Strong, and Strong took periodic trips to visit Europe. All of these consultations were kept highly secret and were always camouflaged as &ldquovisiting with friends,&rdquo &ldquotaking a vacation,&rdquo and &ldquocourtesy visits.&rdquo The Bank of England gave Strong a desk and a secretary for these occasions, as did the Bank of France and the German Reichsbank. These consultations were not reported to the Federal Reserve Board in Washington. Furthermore, the New York Bank and the Bank of England kept in close touch via weekly exchange of private cables.
What did Montagu Norman want of his American counterpart? Nothing less than for the Federal Reserve to inflate the dollar to protect the British pound and allow Norman’s easy-money policy to continue. In other words, if the dollar were devalued in concert with the British pound, the flight of British gold to America would cease. The American public would eventually pay the price for the Fed’s monetary inflation, but at least British and European politicians and bankers would be off the hook.
All of this was sold to Strong and other American bankers as a necessary step to allow Great Britain and other European nations to return to the pre-war gold standard — which never took place. What happened instead were several episodes of coordinated global inflation that allowed the European nations to return, not to a full-fledged gold standard (which would have included the resumption of the minting of gold coins) but to a gold bullion standard, in which only large amounts of currency could be redeemed in exchange for gold bars — suitable for international finance, but irrelevant to ordinary citizens forced to traffic thenceforth in paper money backed only by very dubious guarantees. Gold as a fully convertible international currency had been abandoned, and the age of paper money was ushered in.
This &ldquoclose international Central Bank collaboration of the 1920s,&rdquo Rothbard remarks, &ldquocreated a false era of seemingly sound prosperity, masking a dangerous world-wide inflation. As Dr. [Melchior] Palyi has declared, &lsquoThe gold standard of the New Era was managed enough to permit the artificial lengthening and bolstering of the boom, but it was also automatic enough to make inevitable the eventual failure.&rsquo &rdquo
As for Benjamin Strong&rsquos motives, a memorandum from one of his staffers, cited by Rothbard, speaks volumes about the mentality of international bankers during the 1920s who were setting up the world for a colossal fall:
He [Strong] was obliged to consider the viewpoint of the American public, which had decided to keep the country out of the League of Nations to avoid interferences by other nations in its domestic affairs, and which would be just as opposed to having the heads of its central banking system attend some conference or organization of the world banks of issue…. He said that very few people indeed realized that we were now [i.e., in 1928, when the first signs of trouble were appearing on the horizon] paying the penalty for the decision which was reached early in 1924 to help the rest of the world back to a sound financial and monetary basis.
In other words, Strong and his counterparts overseas were acting with flagrant disregard for the well-being of their respective citizenries and, in many cases, the laws of the land. Because Americans in the 1920s were still deeply suspicious of the motives of the international power elites, Strong carried out his pro-British, internationalist agenda in secrecy. He died in 1928, leaving millions of Americans to foot the bill for years of monetary exuberance in the service of foreign interests. When the inflationary bubble finally burst in late 1929, few ruined investors understood that their losses were part of the price to be paid for the machinations of Benjamin Strong and other international bankers.
The onset of the Great Depression very nearly coincided with the presidency of Herbert Hoover. According to conventional wisdom, it was Hoover, the pro-free market Republican conservative, who was responsible for the Great Depression – by allegedly permitting the chaotic excesses of the market to make things worse after the 1929 crash. Nothing could be further from the truth.
Like most modern-day Republican leaders, Hoover the politician publicly sang the praises of the free market – while working sedulously to hamper the workings of the market with a welter of intrusive new government programs.
Early in the Hoover administration, such interventionism mostly took the form of presidential hectoring of industry leaders, pressuring them to keep wage rates at what the government deemed to be optimal levels, making threats against supposedly wicked stock speculators, and agitating for more public-works projects to create jobs.
But in 1931, things took a turn for the worse. The nations of Europe one by one went off the gold standard entirely, repudiating their obligations to redeem debt in gold into the bargain. Especially calamitous was Britain's renunciation of the gold standard on September 20 of that year, despite the earnest assurances the perfidious Montagu Norman had given the head of the Netherlands Bank just two days previously that England had no such intention. The European crisis touched off by the exodus from the gold standard wrought havoc in American banking and stoked fears among the American public that their own leaders would soon follow Europe's example. Stocks of gold held by American banks declined precipitously as the public redeemed its paper money for gold. Moreover, bank reserve levels dropped as the public, spooked at the prospect of bank failures, converted their savings into legal tender.
National productivity declined steeply throughout the year as businesses closed their doors and unemployment rose. By 1932, Hoover was ready to take more drastic measures. To cover a burgeoning federal deficit, President Hoover agitated for, and Congress passed, a huge tax increase. The Revenue Act of 1932 raised income, corporate, stock transfer, and estate taxes, and restored or created whole new tax categories, including gift taxes and a wide range of new sales taxes on items ranging from gasoline to automobiles to luxury items like furs and jewelry.
Adding to the folly of a huge tax hike in the midst of a depression was a range of new government programs destined to interfere further in the already hobbled American economy. Chief among them was the Reconstruction Finance Corporation (RFC), a credit agency designed to lend public monies to local governments, banks, agriculture, and a wide range of other industries. During 1932 the RFC loaned more than $2 billion to corporations teetering on the brink of insolvency – 80 percent of them railroads and banks. Where private capital tried to penalize such businesses for imprudent investment, the Hoover administration conducted one of the biggest bailouts in history – an event drearily familiar in our day, but more of a novelty in the still comparatively laissez-faire climate of the early '30s. "Any attempt … to save the weaker debtors necessarily prolongs the depression," columnist John T. Flynn pointed out at the time. "The quicker the correction comes, the quicker the regeneration of the road will come."
Besides the RFC, the Hoover administration took a number of other actions, including the creation of a new Home Loan Bank System – the beginning of federal government involvement in the housing industry – and the bullying of stock traders, in effect forcing them to impose regulations on short-sellers, who Hoover believed were especially to blame for the stock market collapse.
Such measures might seem relatively benign in a time when a Congress can spend hundreds of billions of dollars on a new farm bill or maintain near-total regulatory controls on American industry without a peep being raised, but in the 1930s, when the federal regulation of business was still practically unknown, they were revolutionary. While Hoover's mini-New Deal was dwarfed by what was to follow under the Roosevelt administration, it set the tone for a new, overweening role for the federal government in supervising and regulating nearly every aspect of American financial and economic activity.
The American electorate, rightly ascribing the depression&rsquos unnatural length and severity to President Hoover&rsquos policies, hustled him out of office in 1932. His replacement, the former governor of New York, Franklin Delano Roosevelt, wasted no time making his predecessor look benign by comparison.
After declaring a four-day bank holiday the day after his inauguration in early March 1933, Roosevelt (and a compliant Congress) followed with the first of a long series of federal outrages: the Emergency Banking Act. This bill provided for new federal inspection of banks and conferred upon federal inspectors the authority to close down banks deemed insolvent. More ominously, the bill gave the Treasury the authority to confiscate all privately owned gold and compel Americans to accept the government's fiat (unbacked) money in exchange. Thus did the federal government, having driven gold into private hands by devaluing the dollar through inflation, neatly evade responsibility for its own misdeeds. Cowed Americans surrendered their gold to the federal government, and although foreign investors kept the right to exchange American currency for gold until 1971, American citizens were forbidden from owning gold until January 1, 1975, when all restrictions on the private ownership of gold in the United States were finally lifted.
Still worse was to come. Candidate Roosevelt, in his speech accepting the Democratic Party presidential nomination, had promised Americans a "new deal," which he characterized as both "a political campaign" and "a call to arms." Following the Emergency Banking Act, Roosevelt moved to bring agriculture under the federal umbrella, creating a new system of farm subsidies and production controls under the aegis of the Agricultural Adjustment Administration (AAA). Although the AAA was correctly ruled unconstitutional by the Supreme Court in 1936, Roosevelt soon replaced it with other, similar programs which later, more pliant courts refused to invalidate. Centrally planned agricultural production has been a feature of the American economy ever since.
In June of 1933, Congress passed the National Industrial Recovery Act, which, mainly through its centerpiece program, the National Recovery Administration (NRA), set about transforming America into a centrally planned economy along socialist lines. Under the NRA, a welter of new regulations imposed price controls and production standards on numerous goods and services. Although the NRA, like the AAA, was eventually ruled unconstitutional, it too set an unhappy standard for future government regulation of just about every economic activity imaginable.
Under the cover of an acute economic crisis created by government itself, President Roosevelt waged nothing less than a counter-American Revolution, a comprehensive repudiation of the Constitution and of the twin values of federalism and the free market. Probably no American president since has done violence to the Constitution on so many fronts, establishing for future generations the melancholy precedent that the federal government should do whatever it deems proper in the name of upholding the "general welfare." Indeed, the Roosevelt administration slyly used the term "welfare" to describe government handouts and make-work programs that more self-reliant prior generations of Americans had derisively called "the dole." Under the Roosevelt administration, the foreign ideology of socialism made a home in Washington, putting to flight any lingering sympathies for limited government power.
Thanks to the New Deal, the agony of the Great Depression was drawn out until the onset of the Second World War. But the greatest casualty of that sad chapter in American history was not the bankruptcies, the fortunes lost, the livelihoods destroyed, or the millions of Americans plunged into poverty. It was the almost irreparable damage done to the U.S. Constitution. The Great Depression provided the pretext for repudiating sound money, empowering secretive international financial elites, establishing a federal government stranglehold on agriculture, and in general imposing a regime of industry controls that mirrored the so-called "reforms" of fascist and socialist regimes in the Old World.
Although we have so far been spared another downturn as severe as the Great Depression, the unhappy legacy of that time persists, writ large. Today, however, most Americans take the revolutionary fruits of that time for granted: fiat money; Social Security; federal welfare programs; farm subsidies; federal intervention in housing, education, finance, and numerous other economic sectors; federal firearms laws – the Hoover-Roosevelt legacy goes on and on.
Next to war, nothing is so dangerous to liberty as economic turmoil. The American Great Depression, fostered from start to finish by our own federal government, with the help of wily bankers and financiers, allowed government to magnify its powers in the name of rescuing us from ourselves &mdash when in fact it is from government abuse that we need to be rescued, then as now. In the long run, freedom, not government, is the best cure for economic crises. Had the likes of Benjamin Strong, Herbert Hoover, and FDR believed this, we would never have had a Great Depression.