Final paper

7.1 A Return to Normalcy

Some elements of prewar society persisted into the 1920s, including concerns over private economic power and government responsibility for social problems. Racial and ethnic divisions and tensions that had grown in earlier decades endured and even intensified. Overall, though, the decade following World War I represented a shift in temperament and culture for the United States. The idealism and reform impulse of the Progressive era were replaced by conservatism, materialism, and a rising consumer culture. Americans turned away from imperialism and involvement in foreign affairs and back toward isolationism. Among the most striking changes of the 1920s was the state of American politics (Cooper, 1990).

Harding and Coolidge

With his health failing at the end of his second term and struggles over the League of Nations continuing, Woodrow Wilson had ceased to be a viable leader for the Democratic Party by 1920. In the election that year, the Democrats nominated Ohio governor James M. Cox for president, with Franklin D. Roosevelt for vice president. The other commanding national political presence, former president Theodore Roosevelt, had died in his sleep on January 5, 1919. At their convention, the Republicans nominated conservative Ohio senator Warren G. Harding on the 10th ballot. Harding’s running mate, Calvin Coolidge, had most recently served as the governor of Massachusetts.

Newly enfranchised female voters swelled the electorate, so that 8 million more people voted in the 1920 election than had in 1916. They cast their ballots for Harding by a large margin because he was seen as sympathetic to their concerns. During the campaign he sent a personal letter to Carrie Chapman Catt endorsing suffrage and dispatched a campaign staffer to be on hand for the Tennessee legislature vote that ratified the 19th Amendment. The election was a landslide, with Harding earning 16 million votes to Cox’s 9 million. Campaigning from federal prison, Socialist Eugene V. Debs claimed just over 3% of the vote, demonstrating that nearly a million American voters did not find representation of their interests in the dominant parties.

Harding’s administration represented a turn away from reform and toward conservative policies. He argued that the nation needed “not heroism but healing, not nostrums but normalcy, not revolution but restoration,” by which he meant an emphasis on economic growth that would result in community and harmony. He offered America a normalcy that represented an end to reform and war and aimed to substitute them with small-town simplicity full of nostalgia and tradition (Payne, 2009).

In international affairs, Harding rejected both Wilson’s advocacy of membership in the League of Nations and Theodore Roosevelt’s arguments for military leadership, pacts, and alliances, even within the Western Hemisphere. Harding largely avoided discussing the growing interconnections between nations and economies, although he knew well that it was impossible to insulate the United States from the world economy and global politics. Instead, he dealt with international issues quietly while publicly advocating a renewed indifference to foreign affairs, giving Americans the impression that they could accept or reject involvement in world concerns when and where they pleased.

On the domestic front, Harding supported the efforts of conservative Republicans to court big business and subvert the gains made by labor during the war. Harding’s probusiness orientation faced some challenges at the state level when Progressive Republican governors were elected in Wisconsin and Montana. For the most part, however, conservative Republican leaders surged forward with their agenda (Cooper, 1990).

A series of scandals also characterized Harding’s presidency. He appointed his close friends and allies to important political positions, and several members of the so-called Ohio Gang took advantage of their place in the Harding administration to advance their own agendas. It is unclear if Harding was fully aware of the actions of his appointees, since many of the scandals came to light only after his death.

The Teapot Dome affair, involving the lease of navy petroleum reserves in Wyoming and California to private companies without public bidding, was the subject of a congressional investigation. The scandal resulted in the bribery conviction of Harding’s secretary of the interior, Albert B. Fall, who had negotiated the leases. Other Harding administration scandals involved corruption in the Justice Department, perpetrated by his attorney general and former campaign manager Harry M. Daugherty, and in the Veterans’ Bureau, where director Charles R. Forbes was accused of putting his own economic gains ahead of the needs of returning veterans.

A New Economic Vision

In 1921 the nation’s economy was in a severe slump. Demobilization resulted in high unemployment, and returns on investment failed to keep pace with inflation, leaving Americans with less buying power. The end of wartime production brought thousands of layoffs, and the nation entered a period of economic adjustment that demanded intervention. Even Americans who remained employed found that their incomes did not stretch far enough to cover household needs, and the purchase of extra consumer goods was out of the question for most households.

To deal with the economic concerns, Harding called a President’s Conference on Unemployment. Its participants recommended a controversial public works expansion and a bonus bill to reward veterans for their service, but both failed in Congress. Instead, the administration cut taxes and created a budget bureau to oversee and limit the spending of government funds. Once the Federal Reserve slashed interest rates, investment recovered, and by 1923 many industries actually faced a labor shortage (Perrett, 1982).

Harding’s approach to the presidency was in many ways the opposite of his Progressive predecessor’s (McGerr, 2005). He supported more individual freedom and greater limitations on government activism, and he was far more favorable to and tolerant of big business. He demonstrated his convictions by appointing officials to the Interstate Commerce Commission and the Federal Reserve Board who he believed would change those agencies’ policies to be much more supportive of business.

He also strove to enact legislation that gave corporations more power. He signed legislation to restore a higher tariff that supported American production, and he encouraged federal agencies such as the Federal Trade Commission and Interstate Commerce Commission to cooperate with businesses rather than merely regulate them. Harding also supported business by taking a more hands-on approach to breaking labor strikes.

Challenges for Labor

Using both the “carrot” and the “stick,” business in the 1920s sought to erode worker protections and union membership. The stick consisted of punitive tactics: some employers forced newly hired workers to sign so-called yellow dog contracts in which they agreed not to join unions on pain of being fired.

More employers engaged in an open shop movement, arguing that they wanted to give their employees the ability to decide on their own whether to join a union. Mobilizing under what they called the American Plan, these employers declared that the open shop was consistent with American values, freedom, and patriotism. By contrast, they charged unions with limiting freedom by creating closed shop workplaces, where only union members could be employed. They argued that unions restricted production, made unreasonable wage demands, and kept capable workers from reaching their full earning potential. In reality, employers promoted the American Plan to rid their industries of union organization and were successful in holding back the number of workers who could enjoy the benefits of collective bargaining (Goldberg, 1999).

To further discourage unionization, industrialists devised as a carrot the system of welfare capitalism. Designed to instill worker loyalty and encourage efficiency, welfare capitalism was practiced by the largest employers, including Goodyear, International Harvester, and General Electric. The programs included company unions that could bargain for limited workplace improvements but not for wage increases. Some created grievance committees to hear worker complaints. Other features could include profit sharing, life insurance, and company baseball teams.

Labor journalist Louis Francis Budenz, a reporter for Labor Age, railed against the practices of company unions, considering them the gravest threat to workers. In one case, he reported on a construction job whose supposed company union promised, but failed to pay, trained carpenters $12 a day, nearly double the wage union carpenters earned. Budenz asserted that company unions were disingenuous organizations that aimed to draw in unsuspecting workers (Grant, 2006). By the mid-1920s only about 4 million workers were employed by firms that practiced welfare capitalism, but the concept grew throughout the decade (Dumenil, 1995).

Both the American Plan and welfare capitalism accelerated during the postwar recession and caused considerable strife between labor and employers. Although the decade saw many strikes across multiple industries, the probusiness climate ensured that organized labor made few gains.

The Triumph of Big Business

The U.S. economy rebounded from the postwar recession by 1922, thanks largely to a consumer revolution and growth in industries that manufactured automobiles and other durable goods like refrigerators and radios. Following Harding’s sudden death in 1923, Calvin Coolidge succeeded to the presidency. A Republican attorney from Vermont, Coolidge began his political career in Massachusetts, first in the state legislature and then as the commonwealth’s governor. He gained a national reputation as an opponent of organized labor after he moved to break the Boston police strike of 1919.

Coolidge was elected in his own right in 1924 and extended a series of policies favorable to business expansion. He appointed probusiness men to the Federal Trade Commission and the Interstate Commerce Commission and supported a move to raise tariffs to offer more protection for business. Under his watch, Congress also passed three revenue acts, greatly reducing income taxes for most Americans.

In contrast to the Progressive era’s push to regulate large corporations and make them more responsive to environmental and societal problems, the 1920s political climate supported business mergers and did little to restrict or influence business practices. The U.S. Supreme Court and Justice Department protected businesses from organized labor through a series of injunctions and limitations on union organization and strike activity.

The economy grew considerably for the remainder of the decade. Industrial output rose 64%, and the production of automobiles grew from 1.5 million in 1919 to 4.8 million in 1929. Industries incorporated new technologies, including mechanization, assembly lines, and electricity to boost production. Worker productivity grew 43%, and overall output grew 70% (Murphy, 2012).

Henry Ford’s motor company stands as a clear example of the business ethos of the 1920s. Initially operating one plant outside Detroit, Michigan, Ford introduced the moving assembly line and applied Frederick Winslow Taylor’s scientific management to the manufacture of his Model T automobiles. The process reduced the time and cost to produce a car but also created a monotonous and challenging work environment that initially led to massive turnover.

Ford countered by paying workers $5 per day (roughly $15 an hour in today’s money) and reducing the workday to 8 hours. Soon workers were lined up for jobs at the Ford plants. The Ford Motor Company was also one of the first to apply the principles of welfare capitalism, offering workers profit sharing to discourage unionization. Ford also implemented a so-called sociology department to ensure worker loyalty, patriotism, and moral values (Drowne & Huber, 2004).

Ford’s sociology department, also known as the education department, aimed to guide his workers in living moral and upright lives and to embrace a new identity as a “Ford Man.” Ford expected his workers to refrain from using tobacco and alcohol and to avoid interaction with unions, political radicals, and socialists. Immigrant workers received instruction in English and endured a plan of Americanization as a condition of continued employment. Those who demonstrated clean and wholesome habits were likely to see a wage increase. Those who did not want their employer intruding in their personal lives were invited to look elsewhere for a job (Hooker, 1997).

Sick Industries

Although some workers such as those at Ford plants made wage gains in the 1920s, most corporate profits were not passed along to employees. Nor did all segments of the economy benefit from the government’s new probusiness orientation. Although most of the economy recovered from the postwar recession fairly quickly, railroads, coal, textiles, and agriculture continued to struggle. Workers in those industries experienced stagnant wages (Murphy, 2012). Employees at a Gastonia, North Carolina, textile mill, for instance, averaged 70 hours per week. Despite the long hours, men’s wages were a mere $18 per week, and women earned a paltry $9 per week (St. Germain, 1990). Many workers also faced unemployment or underemployment.

The coal industry was another “sick” industry struggling to recover in the postwar decade. Coal was once the main fuel for American factories and mills, but competition from cleaner and abundant oil and hydroelectric power contributed to falling coal prices. The price of coal fell from a high of $3.74 a ton in 1920 to a mere $1.78 in 1929. Mines reduced production or shut down altogether, leaving remote communities unable to participate in the growing consumer economy.

Farmers likewise struggled to find prosperity. Mechanization in the form of tractors, combines, and disc plows increased output, which in turn drove down the prices of staple crops like wheat and corn. Coolidge vetoed congressional proposals to aid the farm crisis, arguing that the government had no constitutional power to intervene in private business (St. Germain, 1990). The agricultural sector continued to limp along well into the 1930s, when the Great Depression reversed attitudes toward government interference in the economy.

Economic Growth and Foreign Policy

America’s emergence as the world’s dominant economic power drew the nation into a host of international affairs during the 1920s. The nation officially sought a foreign policy that aimed to reduce the risk of international conflict and ensure the safety of trade and investment. In practice, however, U.S. foreign interactions often undermined those very goals.

U.S. investment overseas made America the world’s leading creditor nation, and its continued economic success depended on the ability of other nations, especially those in Europe, to repay their war debts of approximately $10 billion. However, Harding and the Congress, focused on nurturing U.S. business development, enacted a series of policies that showed little concern for European recovery following the war’s devastation. Higher tariff rates made it difficult for Britain and France to profit from exports. At the same time, the United States flooded European markets with American manufactured goods. Instead of providing relief and encouraging the commerce needed to reduce the debt, the United States continued on a path that produced further restrictions.

In this climate the United States hosted the first conference aimed at world disarmament. Held in Washington, D.C., from November 1921 through February 1922, the conference brought together leaders from nine nations to consider interests in the Pacific Ocean and East Asia. Among those attending were representatives from China, Japan, Britain, France, Italy, Belgium, and Portugal. Neither Germany nor the new Soviet Union was invited. Supported by peace advocates in America and abroad, the conference resulted in the Washington Naval Treaty, in which the major naval powers agreed to reduce the size of their fleets and limit production of new warships (Goldstein & Maurer, 2012).

The Harding and Coolidge administrations also sought to retreat from involvement in Latin American affairs unless economic ties there forced the United States to intervene. American business interests sought investment in the rich oil fields in South America and encouraged a foreign policy favorable to their plans. The Senate ratified a treaty apologizing to Colombia for American intervention in Panama in 1903 and offered a payment of $25 million in amends. This paved the way for U.S. investment in Colombian and eventually Venezuelan oil fields.

To further cement relations in Latin America, Secretary of State Charles Evans Hughes used the centennial of the Monroe Doctrine in 1923 to assure the nations of the region that the United States intended to be a good neighbor, although at that moment the United States still occupied and controlled the governments of Haiti and the Dominican Republic (see Chapter 6) (Goldberg, 1999).

7.2 The Culture of Modernity

Modernity, or the bureaucratic, industrial, and consumer-oriented society of early 20th-century America, was characterized by an evolving and distinct culture. Following the postwar recession, the nation saw unprecedented prosperity and industrial productivity. The United States stood as the world’s dominant economic power, and at home most Americans enjoyed a higher standard of living and more leisure time. Although some segments of society, such as farmers, coal miners, and African Americans, did not experience as much prosperity, all participated in an emerging culture of modernity.

The Boom of the Consumer Culture and the Consumer Economy

Beginning with the growth of American capitalism and industrialization in the 19th century, a new consumerism began to emerge. Linked to the expanding market economy, consumer culture celebrated goods and services in terms of their financial value. A significant part of modernity in the 1920s was the expansion of a consumer-oriented culture that prioritized acquisition and consumption, associated happiness with accumulating material goods, and made monetary value the most important measure of worth. Consumption rather than hard work came to measure an individual’s worth in society (Leach, 1994).

Drawing more Americans into the consumer culture was key to maintaining the nation’s economic prosperity. Goods produced required a market, and many looked to the American consumer as an important outlet for manufactured products. Businesses soon realized that consumers simply did not have enough money in their pockets to afford everything that they wanted to buy, so they devised a way for customers to enjoy products immediately but pay for them later. This technique for immediate gratification became known as buying on credit, and it ran directly counter to the Victorian ethos of the 19th century, which held that upstanding citizens did not incur debts.

The 1920s turned this idea on its head: purchasing on credit marked one as a strong consumer, and many aspired to purchase modern conveniences to demonstrate their rising economic status. The trend began with more expensive items like automobiles, and it soon extended to other durable goods such as refrigerators and washing machines and even to small consumer goods.

Edward Filene, owner of an upscale Boston department store, recalled handing a doll to a little girl with whom he had been speaking in the toy department of his store. As Filene looked for the girl’s reaction, her mother prompted her: “What are you going to say to the gentleman?” The girl looked Filene in the eye and said, “Charge it!” (as cited in Benson, 1988, p. 100). Soon more and more people began filling their homes with the latest devices, even as they owned fewer and fewer of them outright.

Those with charge accounts were likely to spend more than customers paying cash, especially at department stores such as Filene’s. The U.S. Department of Commerce surveyed the use of store charge accounts in 1928 and found that although charges accounted for a small percentage of total transactions, they often represented as much as 20% of overall sales. Managers treated charge customers well and courted their repeat business (Benson, 1988).

The use and availability of charge accounts continued to increase. Although installment buying (consumer credit), or individual borrowing for consumable goods or services, was evident before World War I, in the 1920s household debt nearly doubled. Thanks to improved manufacturing techniques, the prices for appliances, automobiles, and other household products generally declined across the decade. At the same time, a relaxation of the qualifications for credit saw the amount of goods purchased on time increase enormously (Olney, 1991).

Modern Advertising

New approaches to advertising fueled the consumer economy. The volume of advertising increased tremendously, with popular magazines being the most common marketing medium. Later in the decade, radio joined magazines as an important venue for ads. Moving beyond the utilitarian display of products that previously characterized product marketing, modern advertising firms created colorful ads showing individuals enjoying products.

The ads glorified consumption and leisure, such as in a car manufacturer’s depiction of a lush countryside and the slogan “You find a Road to Happiness the day you buy a Buick” (as cited in Dumenil, 1995, p. 89). A growing number of advertising agencies associated their clients’ products with the modern era, fashion, and progress. They pushed the necessity of owning new household products, including refrigerators, vacuum cleaners, toasters, and radios (Olney, 1991).

The new advertising also played on consumers’ fears and anxieties, brought on by the changes in modern society. Capitalizing on the stress of modern life, Post Bran Flakes promoted a cure for those “Too Busy to Keep Well” (as cited in Dumenil, 1995, p. 89). The makers of Listerine, formerly used only as an antiseptic, advertised its ability to cure halitosis, more commonly known as bad breath. Another technique aimed for a personal approach by including the word you in the text of an advertisement.

Ads connected to other elements of popular culture through celebrity endorsements, linking movie stars and sports figures to products. Finally, the introduction of company spokespersons such as Betty Crocker helped humanize corporations and their products (Dumenil, 1995).

The Automobile in 1920s Culture

The automobile was the most expensive and most desirable durable good of the era, and it became increasingly available to average Americans. Ford’s Model T, or “Tin Lizzie,” remained the best-selling and least expensive car. Aiming to make his vehicles affordable even for the company’s assembly line workers, Henry Ford pursued efficiencies that allowed him to reduce prices continually. Priced at $850 when it first appeared in 1909, the Tin Lizzie could be purchased in the 1920s for $260 (Flink, 1998).

Along with other consumer products, the automobile was an important factor in the postrecession boom. By the 1920s manufacturers collectively produced millions of cars each year using the assembly line and other efficiency techniques first used at the Ford Motor Company.

By the end of the 1920s, more than half of American families owned a car, making a whole new culture possible. Like George Babbitt, the gadget-loving title character of Sinclair Lewis’s 1922 novel, Americans were fascinated with automobile-related gadgets and other developments of the car culture. Gas stations, motels, diners, and other businesses sprang up to serve car-owning individuals and families.

The automobile became the ultimate symbol of leisure, promising owners freedom and mobility. Many passed on more traditional pastimes such as Sunday church services to take a drive in the country. In rural areas especially, having a car opened new options for shopping and leisure (Goldberg, 1999).

Morals, Movies, and Amusement

Motion pictures and other forms of mass media transformed American culture in the 1920s. Movies created and spread a set of common American values, attitudes, and experiences. Beginning around the turn of the 20th century, theaters known as nickelodeons offered one-reel silent films to mostly working-class audiences. By the 1920s the movie industry, located in and near Hollywood, California, expanded to include elaborate multireel feature films. Cities constructed ornate movie theaters or “palaces,” and films began attracting a middle-class audience.

Movie actors such as Mary Pickford, her husband Douglas Fairbanks, Buster Keaton, and Charlie Chaplin became national stars, setting trends for dress and style. One film star in particular, Clara Bow, popularized the flapper image of the so-called Roaring Twenties. Starring in multiple silent movies, including It (1927)—from which she became known as the “It Girl”—Bow was the sex symbol of the age. Copying the stars’ style, young women bobbed their hair, wore short skirts, smoked, and listened to jazz music. The modern, emancipated young women of the 1920s drew the ire of more conservative Americans, who believed that their behavior challenged women’s traditional roles in society (Stenn, 2000).

Working-Class Leisure and Culture

The freedom of middle-class culture did not extend to the nation’s working class. Blue-collar workers, who toiled in skilled or unskilled manual work in manufacturing, mining, or other heavy industries, enjoyed less time for leisure activities. Although the working class earned less money and worked more hours, they did participate in a variety of pastimes. The expanding commercialization of leisure saw many participate in more sedentary activities. Instead of playing sports, they were more likely to attend a semiprofessional baseball game or listen to a band or music on the radio. Amusement parks, nickel theaters, and 10-cent museums catered to men and women and may have created a more homogeneous and less ethnically divided working class.

At the same time, various elements of different ethnic cultures spread among the working class. Whereas some leisure activities divided along racial and ethnic lines, others were adopted more fluidly. White musicians adopted African American music and musical instruments such as the banjo and then fused them into mainstream culture. Jews made up a disproportionate number of entertainers, many of whom were among the most important pioneers in the film industry but worked closely with non-Jewish actors.

Other forms of entertainment reinforced ethnic identities. Films aimed at particular ethnic groups were shown at “race” theaters in African American or Mexican neighborhoods. The growing music recording industry similarly emerged to serve particular audiences. “Race” records were sold at stores in cities with large African American populations such as Chicago, where one owner reported lines forming around the block to purchase the latest release. “Hillbilly” music similarly aimed for an audience of rural southern Whites (Dumenil, 1995).

The Jazz Age

Music was a central part of the 1920s, and jazz was the soundtrack of the decade. It combined traditional African American styles such as the deep soulful feeling of blues with the rhythmic beats of ragtime and, in the process, became a unique American musical form (Burns & Ward, 2000). The improvisational aspect of jazz let musicians spontaneously explore new sonic boundaries, and audiences listening on the radio or dancing in front of big bands experienced the newness and sexual openness of the Jazz Age.

Jazz started in New Orleans but soon spread to Chicago and New York (Martin & Waters, 2006). However, it did not reach a mainstream audience until the Original Dixieland Jazz Band, an all-White group that was hardly the “original” jazz band, became popular. This was one of many 20th-century examples of White Americans capitalizing on and mass marketing African American culture. The mass marketing of African American culture did, however, also pave the way for African American performers such as Joe “King” Oliver’s Creole Jazz Band and many others to tour the United States.

Jazz became a central unifying cultural phenomenon among the youth of the 1920s. Featuring improvisation over structure, jazz broke musical rules, and the way it made racial mingling seem normal challenged the dividing line between Whites and African Americans. From its big-city origins, jazz soon spread and was played in dance halls, roadhouses, and illegal speakeasies across the nation. Radio and phonograph records helped spread the jazz craze to even the most remote towns and farms. It was the music of a younger generation coming of age in modern America, and it sparked a backlash among traditionalists who called it the “devil’s music” and worried that youth would lose their appreciation for classical music.

7.3 Traditionalism’s Challenge to the New Order

A large segment of American citizens pushed back against the march of cultural modernity and sought a more conservative vision for the nation. During their presidencies, Harding and Coolidge presided over a return to economic and political conservatism. A movement to regulate morality accompanied these values.

Although the impulse to dictate moral values was not new to Americans, the Progressive era had strengthened the belief in society’s right to regulate personal behavior (McGirr, 2001). The conservative movement of the 1920s banned alcohol sales and production and fed the rise of militant, fundamentalist Christianity. The 1920s also saw a rebirth of the Ku Klux Klan and a virulent anti-immigrant movement. All of these groups participated in a struggle that pitted a preservationist-oriented Protestantism on one side against modernism, secularism, immigration, and urbanization on the other (McGirr, 2001).

Prohibition

The American tradition of efforts to restrict alcohol consumption stretches back into the 19th century. Temperance activists from groups such as the Anti-Saloon League and the Woman’s Christian Temperance Union had long argued that stopping liquor sales and consumption would make America a well-ordered and industrious society. They also claimed it would reduce domestic violence and increase worker productivity. During the Progressive era, the Prohibition movement to end liquor trafficking gained considerable ground as many localities, counties, and cities voted to go dry.

With the outbreak of World War I in 1914, Prohibitionists gained more ground as anti-German hysteria made the dominant German-owned breweries suspect. In 1916 lawmakers in Congress took up the Anti-Saloon League’s call for a constitutional amendment banning liquor traffic, finally passing the 18th Amendment in December 1917. By that time 19 states had already outlawed alcohol. The states ratified the amendment, and it went into effect January 17, 1920. It banned the production, sale, and transportation of intoxicating liquors (Okrent, 2010).

Although it represented a triumph for conservative values, the amendment was almost impossible to enforce. A follow-up law, the Volstead Act, provided for enforcement and defined intoxicating liquor as any beverage containing more than 0.5% alcohol, bringing beer and wine under the ban along with rum, whiskey, and other hard liquor. Congress was unable to appropriate the funds necessary to enforce the law, and many distilleries simply moved their operations across the Canadian or Mexican borders and continued production. Young people and even middle-class men and women skirted the law, taking pleasure in frequenting illegal saloons known as speakeasies. Illicit drinking became fashionable, representing a modern form of leisure and entertainment (Goldberg, 1999).

“Wets,” who opposed Prohibition and supported responsible alcohol consumption, advocated the controversial amendment’s repeal. In 1923 the New York state legislature repealed that state’s enforcement law. Even where enforcement was funded and supported, it became evident that it was impossible to fully eliminate liquor trafficking. The Prohibition era gave rise to organized crime syndicates that illegally manufactured and sold liquor on a wide scale. This criminal element gained notoriety for violence and frequently made headlines for its grisly activities.

Clearly, Prohibition was not working. The opposition movement grew throughout the decade, and in 1933 the states ratified the 21st Amendment, which repealed the 18th. It is the only constitutional amendment approved specifically to repeal another one.

Fundamentalism and the Scopes Trial

American religious life altered significantly as modernity advanced in the early 20th century. Some sects, including some Presbyterians and the Roman Catholics, moved toward a more scientific interpretation of the Christian Bible that incorporated and accepted such conceptions as evolution and natural selection. Other Protestants insisted on a literal, or fundamental, interpretation of the events depicted in the Bible as historical fact.

The strongest reaction against the new morality of the 1920s also came from these conservative religious groups. They worried that religious modernists would continue to push cultural changes like the acceptance of evolution and biblical criticism. They were also concerned with social changes in American society, including the recent influx of immigrants and what they perceived as the loose morals of many Americans. The term fundamentalism was thus coined to describe a movement to restore traditional values in the face of modern indulgences and relaxed morals.

Leaders of many Protestant denominations grew gravely concerned that the churches themselves stood in danger of being altered by modernists. A series of articles published under the title “The Fundamentals” outlined the fears of leading Protestant theologians that their principles were threatened by unorthodoxy (Dumenil, 1995). The fundamentalist movement aimed to bring lapsed Christians back into the fold and to promote and protect a conservative dogma.

In California, the evangelist Aimee Semple McPherson used modern technology to achieve conservative ends. She combined fundamentalist ideas with charismatic radio broadcasts, becoming a model for later televised evangelists. McPherson and others, like former professional baseball player and Christian evangelist Billy Sunday, spread their conservative Christian message to millions.

Supporting a literal reading of the Bible—especially Genesis, which says that God created the heavens and the earth in 6 literal days—fundamentalists began to argue that evolution, a process understood to unfold over billions of years, should not be taught in public schools. The modern and growing acceptance of Darwinian principles such as evolution and natural selection represented a real and viable threat to conservative Protestant beliefs.

The conflict between fundamentalism and modernity came to a head in 1925 in the so-called Scopes Monkey Trial. John T. Scopes, a Tennessee high school science teacher, had taught theories of evolution to his students despite the fact that the Tennessee legislature had passed a law forbidding it. Scopes had volunteered to violate the law to test the state’s willingness to enforce the ban, and the city officials of Dayton, Tennessee, supported his actions because they hoped a trial would bring national attention to their small town. The American Civil Liberties Union had agreed to defend anyone willing to violate the law so that its constitutionality could be tested in court.

In his trial Scopes was represented by prominent attorney Clarence Darrow, a staunch advocate for civil liberties. Darrow was pitted against famous orator and politician William Jennings Bryan, an outspoken supporter of fundamentalism. Once Bryan was brought in, the trial became a national spectacle, sparking heated debate about science, religion, and the place of humans in the world. The proceedings were reported daily in the national press, and it was the first trial broadcast on radio.

On the seventh day of the trial, Darrow famously called Bryan himself to the stand as an expert on the Bible. Questioning Bryan on the historical accuracy of biblical events, including whether Bryan believed that Eve was actually created from Adam’s rib, Darrow aimed to use scientific evidence to prove that many biblical stories were metaphorical. Bryan accused Darrow of casting ridicule on Christians.

The end result was more anticlimactic than the media-frenzied buildup. The jury found Scopes guilty of teaching evolution, and he had to pay a $100 fine. But the trial deepened the rift between religious fundamentalists and scientific modernists that continues to this day (Larson, 2006).

Immigration Restriction

Other cultural conflicts of the 1920s were cast in ethnic terms. The flood of eastern and southern European immigrants that began arriving in the 1880s made the nation’s industrial growth possible but also sparked recurring patterns of nativism, or anti-immigrant sentiment. Nativism was particularly strong during economic downturns, such as the post–World War I recession.

Some came to apply the term melting pot to the diverse groups of ethnicities and nationalities among the immigrant communities. Rather than blending diverse people into a new type of American, however, the expectation was that immigrants should conform to dominant White Protestant culture. Intense Americanization campaigns that included English education and discouraged the persistence of ethnic culture sought to mold White ethnics into proper citizens, but African Americans and other non-Whites were not deemed capable of assimilation (Dumenil, 1995).

The fact that many of the recent arrivals were Catholic, Jewish, or of some other non-Protestant religion also inflamed both mainstream and fundamentalist Christians, who assumed that to be fully American, one must be Protestant. A movement to restrict and qualify the numbers and types of immigrants began in the Gilded Age with the exclusion of Chinese immigrants. A number of groups favored other restrictions such as literacy tests but generally lost ground to strong business interests, which wanted to keep the door as open as possible to fill their labor needs.

The tide turned to favor immigration restriction during World War I and strengthened afterward. Many questioned the loyalty of German Americans and other Europeans thought to be radical Socialists, Communists, or anarchists. Although the tide of immigrants receded during the war, European refugees began to flood into the United States in the spring of 1920, with as many as 5,000 arriving each day (Dumenil, 1995).

Nativists urged Congress to act immediately, and in 1921 a temporary law placed a quota on the number of immigrants to be admitted from each nation. Under the measure a maximum of 357,803 European immigrants could enter the United States each year. Each nation’s quota was set at 3% of the number of its foreign-born residents counted in the 1910 census. The law favored immigrants from western Europe and severely limited the numbers of new arrivals from southern or eastern Europe. It was designed to last for a single year but was not replaced until 1924.
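The quota formula itself was simple percentage arithmetic. The short sketch below is purely illustrative; the census count it uses is hypothetical, not an actual 1910 figure for any nation.

```python
# Illustrative sketch of the 1921 quota formula (3% of each nationality's 1910 census count).
# The census count below is hypothetical, used only to show the arithmetic.
QUOTA_RATE = 0.03

foreign_born_1910 = 200_000                 # hypothetical count of foreign-born residents from one nation
annual_quota = int(foreign_born_1910 * QUOTA_RATE)

print(annual_quota)                         # 6000 admissions allowed per year from that nation
```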

Nativists feared that a reopening of immigration would erode American culture and society, and so they fought to make the restriction both permanent and more exclusive. One fear was that the dominant culture based on Protestant values might be replaced or challenged due to the large numbers of Roman Catholics and Jews among the new immigrants. Another strain of nativist thought argued that the inclusion of southern and eastern Europeans, many of whom had a dark complexion, would lead to race mixing and the “mongrelization” of the American people.

Proponents of eugenics, a pseudoscientific movement, defined immigrants, African Americans, and those with disabilities as biologically inferior. Eugenicists argued that immigration restriction was necessary to protect the White race in America from being polluted through mixing with supposedly inferior peoples.

In 1924 Congress responded to all these voices with the National Origins Act, which set permanent national quotas using the 1890 census and cut the overall number of foreign nationals allowed to enter the United States in any year to 164,667 and eventually to 150,000. Asian immigrants were denied entry altogether. The law did not apply to countries in the Western Hemisphere, so Canadians and Latin American citizens remained free to emigrate without restriction. Although some nativists expressed a desire to include Mexico in the quota system, agricultural interests reliant on inexpensive immigrant labor lobbied against the measure (Goldberg, 1999).

Although it sparked backlash and indignation from ethnic organizations in the United States and was formally protested by foreign governments, the National Origins Act remained, with just a few revisions, the nation’s primary immigration law until 1965.

A Second Ku Klux Klan

During the 1920s a revival of the Ku Klux Klan gained strength by appealing to anti-immigrant and especially anti-Catholic White Protestants, as well as to those who wanted stronger enforcement of Prohibition. In the Reconstruction era the Klan had been a southern-based terrorist group that suppressed African American civil and political rights. Wearing long white gowns and hoods, the late 19th-century Klan was a secretive organization that performed its work under cover of darkness using illegal methods.

The successor organization embraced 100% Americanism, the notion that dominant White culture and Protestant traditions formed the only acceptable American values, and styled itself as a fraternal organization on the order of the Elks, the Masons, or the Odd Fellows. In addition to the South, it also had prominent chapters in the North and particularly the Midwest (Chalmers, 1965).

The Klan of the 1920s gained inspiration from D. W. Griffith’s 1915 film, The Birth of a Nation, which depicted the triumph of the White supremacist organization over the forced imposition of racial equality during Reconstruction. Capitalizing on the film’s popularity, Methodist minister William J. Simmons, known as Colonel Simmons, and several other men created the organization that they proclaimed to be the successor to the original Klan. Membership was limited to White Protestants and remained small until wartime nationalism launched a backlash against immigrants, Catholics, and radicals.

The Klan represented the decade’s strongest pushback against the changes of modern American society. In the South, where it still proclaimed to be primarily a White supremacist organization, members whipped African Americans for voting, refusing to ride in segregated rail cars, or seeking a wage increase. In many cases they burned a fiery cross in the yard of offenders. In California the group targeted the Jewish influence in the growing motion picture industry. In northern cities, the Klan attacked Catholics and ethnic immigrants for clinging to their native culture. The group also railed against the slack morals of the younger generation and women who wore short skirts, bobbed hair, and engaged in public smoking or drinking (Goldberg, 1999).

The Klan’s membership swelled after Simmons contracted with publicists Edward Clarke and Elizabeth Tyler, who promoted the organization using modern subscription, or membership, techniques and clever advertising. Recruiters received a hefty percentage of every subscription fee they sold. New chapters and even women’s and children’s auxiliary groups spread widely. The organization portrayed itself as the defender of “pure womanhood” and touted its opposition to strong drink, wife beaters, and adulterers, in addition to its promotion of American values. In late 1922 a Dallas dentist named Hiram Wesley Evans ousted Simmons and the publicists, and the organization continued to grow (Goldberg, 1999).

Klan member lists were rarely made public, but scholars estimate that the organization had more than 5 million members between 1920 and 1925 (Dumenil, 1995). The Klan’s fraternalism offered support for local businesses and openly endorsed candidates for office. Many retail outlets advertised their connection by displaying signs that proclaimed “Trade with Klan,” whereas Jewish and Catholic businesses were often subjected to boycotts. The Klan also offered a platform for newly enfranchised women, who could combine nativist political views with support for Prohibition and women’s rights (Dumenil, 1995).

The Klan declined after 1925. The passage of the National Origins Act eliminated many supporters’ fears of a mongrelized America. In 1925 a sex scandal involving a prominent Indiana Klan leader led to a public trial and helped discredit the organization. A year earlier, at a rally in Niles, Ohio, organized Irish and Italian immigrants had clashed physically with Klan members, earning the organization more negative national press.

A public parade of Klansmen and women down Washington, D.C.’s Pennsylvania Avenue in August 1925 marked the fraternal organization’s last major public appearance. Although the Klan persisted with a much-reduced membership, its national influence waned by 1926 (Goldberg, 1999).

7.4 The Crash

Between 1925 and 1929, many Americans enjoyed the fruits of capitalism. Industrial growth meant record profits for corporations, and an increasing number of Americans invested in the soaring stock market. Steel production, retail sales, and auto manufacture led to a dramatic increase in stock prices, a sustained rise known as a bull market. The investing boom began in the mid-1920s and seemed to have no end. Stock prices rose almost 40% in the first half of 1929, netting fortunes for many investors.

But many industries that suffered from the postwar economic downturn, including farming and coal mining, did not experience a rebound. Wheat crops and coal lay in stockpiles as prices continued to spiral downward. Farmers experienced the economic downturn long before the rest of the nation and had little opportunity to participate in the consumer culture that swept other segments of the nation. When the stock market began a rapid decline in late 1929, those sick industries and the rest of the economy came crashing down.

Buying on the Margin

The rising prosperity of the 1920s had seemed to lift almost all boats. Although there were certainly numerous poor people throughout the country, never before had there been so many who were either wealthy or living much better than they imagined they ever would. Existing businesses were turning profits, and new businesses opened every day. The automobile industry exemplified this growth. In 1926 it produced 4.3 million cars, and just 3 years later, production increased to 5.3 million. It seemed that the only outlook for the future was an optimistic one, but the system contained some inherent flaws (Galbraith, 2009).

One of these flaws was the way in which many people purchased stocks and invested in the economy. In order to buy more stocks, people began purchasing on margin. They paid only a small portion of the stock’s actual cost, borrowing the rest from their broker or bank. After selling the shares at a higher price, investors repaid the loan and pocketed the remaining profit. Stock prices rose nearly every day, and there was a surplus of buyers, which drove up the price even more. Everyone had a tip on the next hot stock, and often they were right, which fueled the speculative frenzy. Between May and September 1929 the average stock value increased by 40%. Although relatively few Americans owned stock in the 1920s, many more paid close attention to the market’s movements.

The problem was what happened if stocks decreased in value. If a stock declined, it became very difficult for an investor to repay a margin loan. With the market booming, however, few people imagined this scenario, and brokerage firms encouraged everyone to enter the stock market on margin. Enticed by seemingly low interest rates on such loans, many investors took the brokers’ advice and borrowed money to invest. So long as the market remained strong, there was much money to be made.
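The arithmetic behind a margin purchase helps explain both the appeal and the danger. The sketch below is purely illustrative: the share prices, the 10% down payment, and the zero interest charge are hypothetical assumptions, not figures drawn from the sources cited in this chapter.

```python
# Illustrative sketch of a 1920s-style margin purchase (hypothetical numbers).
# An investor puts up only part of the purchase price in cash and borrows the
# rest from a broker; gains and losses fall entirely on the small cash stake.

def margin_outcome(shares, buy_price, sell_price, margin_rate=0.10, interest=0.0):
    """Return the investor's profit (or loss) after repaying the broker loan.

    margin_rate -- fraction of the purchase paid in cash (10% assumed here)
    interest    -- total interest owed on the loan (assumed zero for simplicity)
    """
    cost = shares * buy_price
    cash_down = cost * margin_rate      # the investor's own money
    loan = cost - cash_down             # borrowed from the broker
    proceeds = shares * sell_price
    return proceeds - loan - interest - cash_down

# If a $100 stock rises 40% (roughly the average gain the text cites for
# May through September 1929), a 10%-down buyer of 100 shares makes $4,000
# on a $1,000 cash stake.
print(margin_outcome(100, 100, 140))    # 4000.0

# If the same stock instead falls 20%, the loss is twice the original stake,
# and the broker's demand for repayment must be met with money the buyer may not have.
print(margin_outcome(100, 100, 80))     # -2000.0
```

Because gains and losses were measured against only the small down payment, even a modest decline in price could wipe out a margin buyer entirely.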

Herbert Hoover and the End of the Boom

Right in the midst of this boom, Calvin Coolidge, often known as “Silent Cal,” quietly announced his intention not to seek reelection by handing a note to that effect to a reporter in 1927. The 1928 election pitted Republican Herbert Hoover against Democratic New York governor Alfred E. Smith, the first Catholic nominated by a major party.

Trained as an engineer, Hoover appeared well qualified to meet the nation’s challenges, even though he had never held elective office. An advocate of economic modernization, Hoover headed a food relief effort in Europe during World War I and then served as secretary of commerce under Harding and Coolidge. He believed strongly in the tenets of voluntarism, the idea that the voluntary actions of individuals and capitalists, not government intervention, would create a socially responsible economic order.

Smith’s Catholicism became a major issue during the election, and although he won majorities in the major cities, thanks largely to the ethnic vote, Hoover claimed 58% of the overall popular vote. Hoover benefited from the fact that the Republicans were strongly associated with the nation’s economic prosperity. Soon after his inauguration in 1929, however, the booming economy of the 1920s came crashing down (Klein, 2001).

October 1929

The 1920s stock market hit its highest point on September 3, 1929. In the days immediately after, it began to drift erratically lower, but few people expressed concern. In mid-October Irving Fisher, a well-respected economics professor from Yale University, optimistically predicted that stocks would maintain their high plateau. Everything changed on Tuesday, October 29, 1929. Within hours, panic spread across all media outlets. Boys selling the morning newspapers shouted the dire warnings. Radio announcers speculated about a problem. In New York’s financial district, the concern was evident on everyone’s faces (Smith, 2003).

A massive stock sell-off unlike any in history began with major banks and rapidly spread to all investors. It happened so rapidly that the stock ticker, the Teletype device that conveyed the status of the market to brokerage houses, could not keep up with the news, and many people were unaware of the situation’s gravity. Officials at the New York Stock Exchange ushered people away from the viewing windows. Police arrived on the scene. By 11:30 a.m. it was clear that the market was disintegrating. Observing the hysteria on Wall Street, one reporter from the Saturday Evening Post said that the Wall Street traders looked like “dying men counting their own last pulse beats” (as cited in Thomas & Morgan, 1979, p. 348).

By November the Dow Jones Industrial Average, a measure of the top 30 stocks in the country, had declined by half. Brokerage houses began calling back loans from investors who had purchased on margin. Few people had the resources to cover their debts, and many went bankrupt. The market crash punctured the false sense of security that had settled over the nation’s economy and was one of several factors that pushed the United States toward a depression greater than any experienced before. The crash combined with the glut of stockpiled agricultural and mining commodities, and in the ensuing months companies and banks failed and jobs disappeared. By 1931 the downward-spiraling economy had left few Americans untouched.

Causes of the Great Depression

The stock market crash was not the cause but a symptom of the nation’s larger economic problems. Although it was the first sign of impending economic disaster, several factors contributed to the Great Depression. These were all linked to the two overarching problems of overproduction and underconsumption.

One factor in overproduction was a lack of economic diversification. Too much of the nation’s economy depended on too few industries, including steel production and automobiles. These industries, as well as agriculture and mining, continued to produce more than the market could absorb. In 1929 these industries lost profitability, and newer industries in areas like plastics and chemicals could not grow as fast as the others declined.

Another cause was weak consumer buying, or underconsumption. Mass consumption was one of the primary stimulants to the American economy in the 1920s, with people taking advantage of higher wages to purchase household goods. However, the lack of infrastructure development contributed greatly to underconsumption in many areas of the nation. Despite the fact that credit had allowed some Americans to increase their purchasing, many rural residents still had no access to electricity in their homes and could not take advantage of modern conveniences such as electric refrigerators, irons, and radios. Immigration restriction also reduced the number of new consumers entering the United States, forcing manufacturers to look elsewhere to market their products.

As the decade neared an end, business owners began taking more profit for themselves and expanding their production capabilities. As a result, these employers returned fewer profits to workers, diminishing their ability to purchase nonessential goods and services. In economic terms the supply side increased at the expense of the demand side because owners kept wages as low as possible. With wages not increasing fast enough to keep pace with the credit demands, many people defaulted on loans. Bankers, hoping to recoup their losses, often invested funds in the stock market.

After the market crash the banks also struggled to stay alive. Between 1930 and 1933, thousands of banks failed, essentially crippling the American financial system (Bernanke, 2000). Bankers, industrialists, and American consumers were all unprepared for an economic collapse, and all were partly to blame. Each group sought to maximize its position in relation to the modernizing economy, and each failed in its own way, leading the nation into economic collapse.

The collapse quickly became a global problem. The United States had the largest economy in the world at the time, and during the 1920s it spent a tremendous amount of money investing in the rebuilding of postwar Europe. But when the economy started to falter, the United States suspended its international investments, which weakened the economies of nations throughout the world. Compounding this issue was the imposition of the Hawley–Smoot Tariff in 1930, which increased the import tariff with the aim of protecting America’s struggling industries from foreign competition.

This law raised U.S. tariffs on roughly 20,000 imported goods, making it very difficult for manufacturers in Europe to sell goods to the United States. As noted historian Richard Hofstadter described it, this legislation “was a virtual declaration of economic war on the rest of the world” (as cited in Houck, 2001, p. 130). For example, the tariff raised duties on all Japanese imports by 23%, causing hundreds of small businesses in Japan to close (Brendon, 2000).

Furthermore, when Germany could not keep up its reparations payments to France and England in 1928, these nations in turn could not repay their debts to the United States, and a devastating depression swept through these European nations. In Germany a young Adolf Hitler took advantage of the unrest caused by the depression in his country to ascend to power, using his charismatic personality to promise a brighter future (Cravens, 2009).

The Depression Begins

President Hoover believed he lacked the authority to prevent banks from failing or to offer a safety net to those who were penniless and without jobs, but he nevertheless attempted to coordinate a federal response to help the nation. He reacted neither quickly nor creatively to the crisis. His first plan was to convene a conference of business leaders in Washington, D.C. There he tried to win their support for a voluntary plan to help the economy and restore confidence among Americans. He wanted business executives to retain their employees; at the same time, he tried to convince labor leaders not to strike for higher wages.

By 1931 it was clear that these voluntary measures were not working. Because Hoover refused to use the federal government to press a reform agenda, business leaders simply did whatever they thought would best help their own businesses survive. Other Hoover programs also backfired. He asked Congress for an unprecedented $432 million public works program, but he was so concerned about a federal deficit that he raised taxes in 1932 to pay for it. This was perhaps the worst year of the Depression, and Hoover’s taxes made struggling Americans’ lives even more difficult.

However, Hoover did ultimately make some innovative attempts to combat the Depression. One example was the Reconstruction Finance Corporation (RFC), which granted federal loans to struggling businesses such as banks, insurance companies, and railroads to save them from collapse. The program had potential, but it was widely regarded as a failure; despite its budget and authority to lend several billion dollars, RFC officials loaned money only to organizations they felt had enough collateral to repay the loans (Ippolito, 2003). This was too conservative an approach, and the RFC used only 20% of its funds to help businesses in need.

Hoover’s application of traditional voluntarism and other conservative philosophies to the problems facing the nation failed miserably and did not bode well for his prospects for reelection in 1932.

The Bonus Army

While Hoover struggled to apply voluntarism to the nation’s economic situation, ordinary Americans began to demand more action from the federal government. As their economic situations became dire in the spring of 1932, nearly 43,000 veterans and their supporters marched on Washington to demand a cash payment of bonuses owed them for World War I service.

Among those organizing a group of veterans to travel to Washington was Walter Waters of Portland, Oregon. Enlisting in the Oregon National Guard in 1917, Waters saw action in France as a part of the 41st Infantry Division and did not return to the United States until June 1919. Waters rallied a large contingent from his home state and soon became one of the movement’s leaders as they camped near downtown Washington from May through midsummer. He recalled insisting, “We will stay here until 1945 if necessary to get our bonus” (as cited in Dickson & Allen, 2004, p. 50).

That was the date the bonus certificates were scheduled to be redeemed, but the Bonus Army demanded that Congress act to pay them immediately. They camped and waited as Congress debated a new bill. The House of Representatives passed a measure to move forward the date for the bonuses to be paid and sent the measure to the Senate. The Bonus Army marched on the U.S. Capitol on June 17 only to learn that the Senate defeated the Bonus Bill by a vote of 62 to 18. Disappointed, Waters and the rest of the veterans returned to their camps to wait for President Hoover to act, but he refused.

Near the end of July, federal officials ordered police to remove the Bonus Marchers from their encampment, but the angry veterans refused to leave. In the struggle that ensued on July 28, two marchers were shot by police, one fatally.

Hoover reacted by ordering the camps cleared of marchers. Regular army personnel under the command of Gen. Douglas MacArthur, along with six battle tanks under Maj. George Patton, pushed the veterans out of the city. At first the veterans cheered the army troops, mistakenly believing they had come in support of their cause, but then they heard the order for the troops to charge. More than 50 veterans were injured and more than 100 arrested. The Bonus Army vacated the capital without winning its bonus, but Hoover’s actions were widely criticized, and his chances for reelection were severely harmed.

American Lives: Franklin Delano Roosevelt

The only American president elected to four consecutive terms, Franklin Delano Roosevelt (FDR) served three full terms, taking office in 1933 and leading the United States through most of the decade-long Depression and World War II. Although in increasingly frail health, FDR agreed to run for a fourth time in 1944. He died of a stroke on April 12, 1945, just months before the end of the war. More than any other 20th-century president, Roosevelt shaped the direction of the Democratic Party, redefined Americans’ relationship with their government, and established the United States as the world’s dominant international power (Brinkley, 2000).

Roosevelt began his career as a Wall Street lawyer, but he soon turned to politics as a staunch adherent of the Democratic Party. After first serving in the New York State Senate, he moved to Washington when Woodrow Wilson appointed him assistant secretary of the navy in 1913. Roosevelt gained national attention in 1920 when the Democrats nominated him for vice president alongside presidential candidate James M. Cox, though the Republican ticket headed by Warren G. Harding won the election that year. Soon after, Roosevelt contracted poliomyelitis (polio), a disease that can paralyze the limbs. Despite intense therapy, he never regained the use of his legs and could stand only with assistance.

Instead of retreating from public life, Roosevelt did his best to conceal his disability and returned to politics. He was elected governor of New York in 1928 and again in 1930, just as the Great Depression began to take its toll. He created a state agency to steer relief efforts in New York and developed the social philosophy that would guide his policies as president. Despite his privileged lifestyle, Roosevelt seemed to grasp the needs of average Americans.

As governor, Roosevelt tried many different strategies, combining government assistance with work programs, local relief, and state funding. But the Great Depression was larger and lasted longer than any previous economic crisis, and state and local aid programs were insufficient to provide relief. Although many, including President Herbert Hoover, believed that relief should rest in the hands of local and state officials, Roosevelt firmly believed that “the Federal Government has always had and still has a continuing responsibility for the broader public welfare” (as cited in Polenberg, 2000, p. 8). This belief guided his policies as president and ensured that millions of Americans maintained faith in him, even when those policies failed to bring an end to the suffering of the Great Depression (Polenberg, 2000).

8.1 Roosevelt and the First New Deal (1933–1934)

In 1932, during one of the worst periods of the Depression, the Democrats nominated Roosevelt to run for the presidency. He had a famous name and a fiery and charismatic persona. While he was giving his acceptance speech at the Democratic convention, the Bonus Army camped in Washington, D.C., demanding the early payment of the bonus promised as a reward for their World War I service. Americans seemed to be demanding a new type of leadership.

Everywhere, the nation struggled desperately in the throes of the Great Depression, the largest worldwide economic downturn in history. Louise Proctor Allen, who grew up in Chicago, remembered that her middle-class family struggled through the decade-long crisis with a single pair of shoes and home-sewn clothes (Allen, 2001). Others were even more desperate. Marvell Hunt, who was married in 1927 and raising a young family in Utah when the Depression hit, recalled, “It was hard to get a hold of enough to buy a sack of flour and we made our own breads, cooked our own vegetables, bottled our fruits, and raised our gardens” (Hunt, 1997). Her husband, Don, traveled all over in search of work each day.

Amid such desperate times Roosevelt promised a solution, telling Democratic convention attendees and the nation, “Republican leaders not only have failed in material things, they have failed in national vision, because in disaster they have held out no hope. . . . I pledge you, I pledge myself to a new deal for the American people” (as cited in Houck, 2001, p. 131). Roosevelt’s rhetoric inspired the nation at a time when hope was fading. William Ronci, an insurance underwriter from Youngstown, Ohio, recalled that with the election approaching, many believed Hoover had only the interests of the wealthy in mind. Roosevelt, Ronci said, was “probably the light tower in the White House” (Ronci, 1974).

At the beginning of 1932, Republicans firmly believed that Hoover’s voluntarism, combined with his fiscal policies—including the Reconstruction Finance Corporation (see Chapter 7)—would curb the economic depression. Although his rough treatment of the Bonus Army earned the ire of many Americans, the Republican convention unreservedly nominated him for a second term. This confidence was misplaced, however, as blame for the Depression gathered around the incumbent. Roosevelt easily won the 1932 election, and his New Deal became one of the best known political labels in American history. While their merits and results are still debated today, Roosevelt’s policies unquestionably changed the course of U.S. history.

During the campaign, the New Deal symbolized an undefined hope for much-needed change, but soon after Roosevelt’s election it came to embody a series of programs that fundamentally reshaped the nation as the federal government took on new responsibilities for the well-being of the American people. Through his policies the government also began to play a much more active role in creating rules for the nation’s businesses and industries. The programs of the New Deal redefined the relationship between the people and their government; in the process, Roosevelt became known as the founding father of American liberalism, in effect creating the modern liberal political tradition (Hamby, 1992).

Roosevelt’s sizable victory (see Table 8.1) had long coattails, as Democrats also gained solid control of Congress. A coalition of politicians formed largely in opposition to Hoover and his philosophy that the federal government should not provide direct aid to the people, even though he had bailed out major industries experiencing financial distress. The nation sought a new path to solving the escalating financial problems and was willing to give the Democrats full political control of both the executive and legislative branches of government.

Forging an agenda based on specific policies and actions rather than ideology, the New Deal coalition brought farmers, workers, women, African Americans, intellectuals, and immigrants together in the belief that an active federal government offered the best hope for positive change. This coalition lasted through the Depression, World War II, and even beyond Roosevelt’s death in 1945 (Landy, 2002).

Table 8.1: The election of 1932

Candidate | Electoral vote | Popular vote | Percentage of popular vote
Franklin D. Roosevelt (Democrat) | 472 | 22,821,857 | 57.4
Herbert Hoover (Republican) | 59 | 15,761,841 | 39.7
Norman Thomas (Socialist) | 0 | 881,851 | 2.2
William Z. Foster (Communist) | 0 | 102,991 | 0.26

The First 100 Days

Roosevelt’s first 100 days in office was a remarkable period that generated an “alphabet soup” of programs and agencies, so called because of their acronyms, such as the AAA, CCC, and FERA. These programs sought, sometimes in very controversial ways, to ease the economic strains facing the nation. The programs were new and experimental, and while some brought measurable relief, others failed or even caused harm. Neither FDR nor his programs brought an end to the Depression itself, which dragged on through the entire decade of the 1930s, only ending with the outbreak of World War II in the 1940s. Despite the longevity of the problems facing the nation, Roosevelt enacted innovative programs that helped bring aid to the American people, and his charismatic personality restored Americans’ confidence in government.

The Roosevelt administration hit the ground running. From March to June 1933, during his first 3 months in office, the president initiated legislation directed at immediate solutions for the ailing economy (see Table 8.2). The first place he directed his attention was the banks. As the Depression settled on the nation, thousands of the nation’s banks failed and billions in deposits were lost. Americans, fearful of losing all their money, flocked to withdraw their savings from their local banks. These so-called bank runs caused even more banks to fail.

By the time Roosevelt entered office, banks in dozens of states were already closed, and citizens had no access to their money on deposit. Many lost all of their savings. Roosevelt’s decisive actions were unprecedented. Following the lead of several states, he declared a national bank holiday. Closing the nation’s financial institutions between March 6 and March 13 prevented additional runs on the banks and gave the federal government time to figure out how to write legislation to strengthen them (Smith, 2008).

Table 8.2: Major legislation of the first 100 days

Date | Program | Description
March 9, 1933 | Emergency Banking Relief Act | Legalized the bank holiday and gave the federal government the ability to restrict a failing bank’s operations and to determine a bank’s strength.
March 20, 1933 | Economy Act | Gave the federal government the power to control salaries for federal workers and reorganize federal agencies in the interest of the economy.
March 31, 1933 | Civilian Conservation Corps | Established a public works program and provided vocational training and jobs in conservation.
April 19, 1933 | Gold Standard | Abandoned the gold standard (see the Gold Repeal Joint Resolution below).
May 12, 1933 | Federal Emergency Relief Act | Provided aid to states for distribution to the unemployed and their families.
May 12, 1933 | Agricultural Adjustment Act | Restricted agricultural production by paying farmers to produce fewer crops.
May 12, 1933 | Emergency Farm Mortgage Act | Provided for the refinancing of farm mortgages.
May 18, 1933 | Tennessee Valley Authority | Created to improve river navigation, flood control, and rural electrification programs.
June 5, 1933 | Gold Repeal Joint Resolution | Canceled the gold clause in federal and private debts so that all debts could be paid in legal tender; the final step in ending the gold standard.
June 13, 1933 | Homeowners Refinancing Act | Refinanced home mortgages at lower monthly payments.
June 16, 1933 | National Industrial Recovery Act | Promoted fair competition, guaranteed collective bargaining rights, regulated work standards, and regulated prices.
June 16, 1933 | Banking Act (Glass–Steagall Act) | Separated commercial and investment banking, increased Federal Reserve powers to oversee market operations, and created federal insurance for deposits up to $5,000.

The result was the Emergency Banking Relief Act, which Congress passed and the president signed into law on March 9. It gave the Treasury Department power to inspect the banks and restrict their operations if they were judged to be financially unstable. The act also provided federal funds to further shore up the solvency of banks. The Emergency Banking Relief Act quickly restored confidence, and as the banks reopened, $1 billion in currency moved from under people’s mattresses and out of coffee cans back into the system.

A resolution of Congress also took the final step to move the nation off the gold standard. Passed on June 5, the Gold Repeal Joint Resolution canceled any outstanding debts calling for repayment in gold and instead declared greenbacks (paper money) legal tender for all debts. This made the nation’s currency system more flexible at a time when many businesses and individuals were struggling to pay their debts.

Confidence was further enhanced by the Banking Act of 1933, passed in June, which created the Federal Deposit Insurance Corporation (FDIC) to insure bank deposits up to $5,000. If a bank failed, customers would get their money back from the government (today the FDIC guarantees deposits up to $250,000). The Banking Act also separated investment and commercial banking, establishing a firewall that further protected ordinary bank customers from risky investments and speculation.

Although some banks never reopened, those that did were stronger than ever, and for the first time consumer deposits were insured against future loss. During Roosevelt’s first two terms as president, fewer banks failed than under any previous administration. In 1936, for the first time in U.S. history, no American bank closed its doors (Badger, 1989).

In addition to these decisive actions, Roosevelt used radio addresses called fireside chats to instill confidence in the American public. His first of many informal speeches to the nation explained the banking legislation in clear and simple language and built a direct connection and more personal relationship with the working and middle classes.

Another popular message in his first chat was his plan to end Prohibition, an unpopular and largely unenforceable policy. Proposals for its repeal gained wide support, with advocates arguing that the organized crime and disrespect for the law that Prohibition fostered caused more problems than temperate drinking ever had. Roosevelt announced the Beer–Wine Revenue Act, which legalized the sale and consumption of 3.2% alcohol beers and light wines.

Within a few days of taking office, then, FDR managed to make Americans feel more secure with their local banks and assure them that alcohol would once again be legal (Kiewe, 2007). By December the states had ratified the 21st Amendment, repealing Prohibition altogether.

Relief and Recovery

The initial goal of the New Deal was to restore the economy and rebuild consumer confidence. Through fireside chats and other means of outreach, Roosevelt gained public support for his recovery agenda, but the nation was in a deep hole when he took office in 1933. In order to solve the twin problems of underconsumption and overproduction, the New Deal proposed major structural changes, some of which were radical and even eventually deemed unconstitutional.

Thanks to Roosevelt’s programs, the U.S. economy grew 10% annually beginning in 1933. Unemployment peaked in 1932 at nearly 25% and then shifted gradually downward. But even with such strong growth, unemployment never fell below 14% for the rest of the decade (Badger, 1989).

The Agricultural Adjustment Administration

Agriculture had limped along as a “sick” industry during the 1920s and was left especially vulnerable when the Depression struck and the economic downturn spread to international export markets. During the early years of the Depression, the prices paid for farm commodities fell by 40%, leaving many farmers awash in debt. Roosevelt sought a means to solve the long-standing farm problem; to create a long-lasting parity, or balance, between urban and rural earnings; and to distribute economic recovery evenly throughout economic sectors.

As a result, FDR tailored his first massive structural reform program to come to farmers’ aid through the Agricultural Adjustment Administration (AAA). The act creating the AAA passed Congress in May 1933 and had several goals. The first was to reduce the output from American farms, since overproduction caused a food surplus that drove down prices.

Representatives from the major commodities industries (corn, cotton, wheat, tobacco, and rice) agreed to an overall production limit. The government determined the production number, and the AAA directors officially told each farmer how much they could produce. The government secured funds through a tax on farm commodity processors to pay some farmers for leaving their land fallow, or out of production. This worked well in the short term and strengthened the agricultural sector, but it did little to ease the Depression because food processors passed the burden of the new tax onto consumers. The plan also provided little relief for small farmers because large industrial farms reaped most of the benefits.

In 1936 the Supreme Court struck down the AAA on the grounds that the Constitution only allows the federal government to regulate interstate commerce, whereas the processing tax applied within state boundaries. The justices ruled that it was unconstitutional to tax processors and then pay the funds to farmers (O’Sullivan & Keuchel, 1989). When farming overproduction continued unabated, a new Agricultural Adjustment Act passed in 1938, retaining the subsidy program but shifting the burden of funding it to the federal government, and the farm adjustment program resumed.

The National Industrial Recovery Act

While farmers were clearly struggling, so too was industry, and FDR sought to address the problem through the National Industrial Recovery Act (NIRA). The act funded a series of public works projects under the Public Works Administration to improve and beautify the nation’s infrastructure and put people to work. Congress allocated $3.3 billion to create new buildings, roads, and flood controls. Examples of these projects that are still in use today include Chicago’s subway system, Skyline Drive in Virginia, and New York’s Triborough Bridge.

Another part of the act established the National Recovery Administration (NRA), which initiated, with input from consumers, laborers, and industry leaders, fair practices or codes of conduct. In each industry, business leaders proposed an NRA code to cover factors such as production schedules, wages, and prices. The point of the codes was to curb the cutthroat competition that was driving many businesses into bankruptcy, which often led to layoffs and economic turmoil. The code for each industry was then reviewed by business leaders within that industry, who highlighted key areas where, from their perspective, the code was unfair. The NRA then stepped in to work through a compromise.

Section 7(a) of the NIRA also guaranteed workers the right to bargain collectively and was hailed by union advocates. This was a radical departure for the federal government. From the onset of industrialization in the Gilded Age, the government had typically sided with industry and management in labor disputes and recognized the right of corporations to keep workers from collective bargaining. In many labor disputes, federal judges granted employers injunctions, and on some occasions federal troops were deployed to end labor strikes. This provision represented a new era of possibilities for union organization.

The Supreme Court declared the NRA unconstitutional in 1935, and the program itself was significantly flawed: in many industries, large organizations dominated the code writing, tailoring the codes to their own benefit and essentially endorsing or creating monopolies. Nevertheless, the law generated a short-term boost to industry that, despite its abrupt ending, left an enduring mark (Brands, 2009). Many of its compromises appeared in future legislation and led to minimum wages, the elimination of child labor, collective bargaining for union workers, and limits on weekly working hours, all well-established features of the industrial workforce today.

Putting Americans Back to Work

For the average American out of work—1 out of 4 in 1932—these New Deal programs did not provide immediate relief. In truth, Roosevelt did not consider direct aid to the people his most important task, but it was important enough that he addressed it in his first 100 days. The New Deal included important programs that aimed to get Americans back to work, including the Civilian Conservation Corps (CCC), the Federal Emergency Relief Administration (FERA), the Civil Works Administration (CWA), and the Tennessee Valley Authority (TVA).

All of these works programs were new and unique experiments. Never before had the government created work programs to aid in times of high unemployment. The programs were also not meant to completely substitute for private sector employment or to encourage dependence on the government, and none paid as well as private sector employment.

The Civilian Conservation Corps targeted unemployed and unmarried men between ages 18 and 25, largely because projects often required travel and living away from home. Since these young men were future potential heads of household, the program also aimed to provide them vocational training useful in the private sector. John Lambert of rural Tazewell, Virginia, worked in a CCC camp in the Shenandoah National Park from 1934 to 1937. There he and other recruits lived in specially constructed barracks and had most of their needs taken care of by the agency. Joining the corps before his 20th birthday, Lambert recalled helping build hiking trails within the park and eventually advancing to drive trucks on the projects (Lambert, 1994).

By July 1933 more than 275,000 recruits were at work on projects and housed in camps near the work sites. The program lasted until 1942 and created new recreation areas, built wilderness roads, and planted 3 billion trees. During the CCC’s duration, more than 3 million men earned $30 per month working in parks, forests, and other recreational areas. They were required to send $25 of their monthly earnings home to their families. Approximately 250,000 African Americans were employed in all–African American CCC companies and eventually, at the insistence of First Lady Eleanor Roosevelt, the CCC employed a small number of women (Hiltzik, 2011).

The Federal Emergency Relief Administration, originally created by the Hoover administration, was retooled to provide money directly to states to help relief agencies. Under FERA the government also created the Civil Works Administration as a short-term relief measure that put 4 million people to work in a variety of municipal projects, including laying sewer pipe, constructing roads, and building airports. The CWA offered work instead of a handout for multiple segments of society, including Native Americans. Several thousand Native Americans were employed to repair reservation housing, excavate prehistoric Native American mounds for the Smithsonian Institution, and make Alaskan rivers more navigable for salmon breeding (Hiltzik, 2011).

The boldest program emanating from Roosevelt’s first 100 days was the Tennessee Valley Authority, which built dams along the Tennessee River and brought electricity to dozens of rural communities. Affecting multiple southern states, including Tennessee, Alabama, Virginia, North Carolina, and South Carolina, the work projects of the TVA brought jobs along with the lasting benefits of electric power, flood protection, and soil reclamation to rural residents in some of the nation’s poorest counties (Hiltzik, 2011). Within a decade, the projects of the TVA brought conveniences to rural Americans, bringing them up to speed with the features of modern life that city dwellers had enjoyed for years.

Looking back on these public welfare efforts in his second inaugural address, FDR said, “The test of our progress is not whether we add to the abundance of those who have much. It is whether we add to the abundance of those who have too little” (as cited in Alter, 2006, p. 332).

Reaction: Roosevelt’s New Deal and Congress

Roosevelt was a master at pressuring and compromising with opponents of his New Deal programs in Congress. He approached negotiations for each bill differently, depending on his goals, but the 1933 Farm Relief Act is illustrative of his interactions with Congress. The act sought to increase the number of agricultural products that the AAA could control.

At the outset, FDR sent a message to Congress asking for quick passage due to the timing of the crop cycle. It was March, and farmers needed immediate relief before the upcoming growing season. Nevertheless, there was significant delay in Congress. Joseph Martin, a Republican representative from Massachusetts (and later Speaker of the House), opposed the bill, arguing that the Farm Relief Act would push the United States toward Soviet-style communism. He also suggested that the increased taxes the bill required would unfairly burden the American consumer.

The act eventually became law with a key Roosevelt concession. He appointed George Peek, one of the most vocal opponents of crop restriction, to head the AAA, the agency charged with carrying out that very policy. This concession began the national planning of American agricultural production (Freidel, 1990), and Roosevelt made it a reality through an unusual and significant compromise.

Voices of Protest

As the Depression wore on, passionate critics of the New Deal emerged. Though stock prices climbed gradually by the end of 1933, there were still roughly 12 million Americans without a job. A rising stock market meant little to those who struggled to earn enough money to put food on the table. FDR remained widely popular, and the Democratic Party was victorious in the 1934 midterm congressional elections, winning 74% of the House and 72% of the Senate.

But there was also a growing base of New Deal critics—and a significant divide among Americans. Eighty percent of all business leaders, journalists, and professionals in medicine and law opposed Roosevelt’s policies. According to historian David Hackett Fischer, “In less than two years, President Roosevelt had made himself the best-loved and worst-hated man in the country” (Fischer, 2005, p. 488).

Business-Minded Opponents

The formation of the American Liberty League, led largely by probusiness members of the president’s own Democratic Party but also a few Republicans, exemplifies the polarizing reaction to the New Deal. The stated goal of this new organization was to uphold the Constitution and oppose all congressional legislation it deemed dangerous, especially challenges to private property. Jouett Shouse, a former head of the Democratic National Committee, became its first leader in 1934. In Shouse’s view, the danger began with Roosevelt himself (Shouse, 1935).

The league was ambivalent about the National Recovery Administration, but it openly opposed the Agricultural Adjustment Administration, calling its plan for controlling farm production fascist, a strong condemnation at a time when totalitarian dictators were rising to power in several European countries. The league was most concerned with New Deal incursions on individual rights, especially rights to the ownership and use of property. In the league’s view, for example, the NRA, which placed limits on production and set prices across individual segments of the economy, trampled on the rights of business owners to manage their operations according to the terms of the free market. The league lost ground and support after the 1936 election and was dissolved in 1940.

Radical Opponents

Another key political critic was Huey P. Long, a senator from Louisiana. Unlike the probusiness Liberty League, Long’s attack on the New Deal came from the radical side and included advocating reforms that were akin to socialism. He argued that Roosevelt’s programs did not go far enough to aid struggling Americans. Long’s Share Our Wealth program called for a radical redistribution of wealth that included free college, pensions for all Americans, and a shorter workweek.

Share Our Wealth clubs sprang up in multiple states, and Long’s popularity soared. FDR called Long “one of the two most dangerous men in the country” (Brinkley, 1982, p. 57), the other being Gen. Douglas MacArthur, whom Hoover sent to clear out the Bonus Army from Washington in 1932. Although he had initially supported Roosevelt, Long planned his own presidential bid but was assassinated in Louisiana in 1935 (White, 2005).

The popular broadcaster Father Charles E. Coughlin was another outspoken critic of Roosevelt. He used his Sunday radio sermons to describe the failures of the president and the New Deal. A radical Catholic priest, he was among the first religious leaders to use radio to reach a mass audience, totaling several million each week. Like Long, Coughlin argued that Roosevelt’s programs did not go far enough. His social justice agenda called for nationalization of industries and the railroads, as well as currency and banking reform that would eliminate private banks.

After 1936 Coughlin’s politics took a dark turn, and he began to spout anti-Semitic rhetoric, claiming that Jews were responsible for the popularity of communism and expressing support for the anti-Jewish movement rising in Adolf Hitler’s Germany. Nevertheless, he continued to be popular, drawing large radio audiences well into 1939, when his show was canceled after he vocally opposed the government’s position on the war in Europe.

Although Long and Coughlin never worked together, in the media and the minds of many Americans their programs represented similar populist remedies for the problems facing the nation during the Great Depression, with both advocating sweeping realignments of the economic system to aid those at the bottom of society (Brinkley, 1982).

The New Deal’s probusiness critics, as well as more radical opponents such as Coughlin and Long, had important concerns in common. Most notably, they believed that the president held “dictatorial” power and that the government should work toward ensuring prosperity for all, without exercising too much control over individuals and communities. They appealed to an earlier version of American life that was no longer available in the industrial age and perhaps had not even existed in preindustrial America. Roosevelt, in contrast, advanced programs and policies that would help the nation adapt to the realities of industrialization and modernity. It was for these reasons that FDR sought to move beyond their threats and more firmly establish his vision for the United States.

8.2 The Second New Deal (1935–1941)

The Supreme Court’s nullification of the NRA in 1935 left Roosevelt searching for a new strategy to ease the Depression, which showed no signs of abating. Despite the vocal opposition, the large Democratic majority in Congress and the president’s popularity with voters allowed him to forge a new plan that included more carefully thought-out legislation that had a lasting impact on the nation. This Second New Deal included five key pieces of legislation passed by Congress between July and August 1935 (see Table 8.3).

The Banking Act strengthened the Federal Reserve by centralizing monetary policy and assured businesspeople of a more stable economic environment; the Wheeler–Rayburn Public Utility Holding Company Act prevented the growth of holding companies in public utilities (which became as corrupt as railroad corporations had been in the late 19th century); and the Revenue Act or, as it was derisively known at the time, the “Soak the Rich Tax,” increased taxes on the wealthy. The National Labor Relations Act, or Wagner Act, offered federal protection to workers who wanted to engage in collective bargaining. The Social Security Act created a social safety net that provided old-age pensions, unemployment insurance, and direct aid to women and dependent children.

Table 8.3: Major legislation of the Second New Deal

Date | Program | Description
April 8, 1935 | Works Progress Administration | Public works program initially funded at $5 billion; employed millions of Americans in public works projects such as road and school building.
July 5, 1935 | National Labor Relations Act (Wagner Act) | Guaranteed the right of private sector employees to join unions; created the National Labor Relations Board.
August 14, 1935 | Social Security Act | Provided old-age pensions, funding to states for unemployment insurance, and aid to families with dependent children.
August 23, 1935 | Banking Act | Reorganized the Federal Reserve system to provide for uninterrupted operation of the banking system.
August 30, 1935 | Revenue Act | Raised the income tax rate on higher income levels; this progressive tax rate was as high as 75% on incomes over $5 million.

The Works Progress Administration

Among the key components of the Second New Deal was a program to get Americans back to work. The jobs programs of the NRA and the CCC paled in comparison to the massive Works Progress Administration (WPA) that Congress authorized in 1935 with an initial budget of nearly $5 billion. Throughout its duration (1935–1943), the WPA employed more than 8.5 million Americans in public works, arts, and humanities projects. Taking on more than a million individual projects, WPA workers built more than 650,000 miles of roads and highways, constructed or improved more than 124,000 bridges, and built or repaired 853 airports. The WPA also employed artists, historians, writers, and academics to paint murals, write children’s books, and collect oral histories as part of the Federal Writers’ Project. One enterprise collected and preserved the oral histories of more than 2,000 men and women who were enslaved before and during the Civil War (Watkins, 1993).

One of Roosevelt’s closest advisors, Harry Hopkins, headed the WPA, which quickly came to be the nation’s largest employer. Hopkins and Roosevelt were determined to get as many Americans to work as possible but gave programs for men priority. Since it was rare for women to be the head of a household, their employment needs were often considered secondary to those of men. Although for most working-class families a woman’s income was just as essential as that of a man, cultural bias kept women from enjoying the best work programs. Women were at first relegated to sewing projects and recreational work, and even in 1938, the WPA’s peak year, women accounted for only 13.5% of those employed. Later, women worked as nurse’s aides, day care workers, or in schools, but they were generally paid half as much as male WPA workers (Watkins, 1993).

Although New Deal administrators at the federal level did not sanction gender discrimination or racial bias, the programs were administered locally, resulting in variations in how they were executed. In the South especially, city and county Democratic Party administrators were often White men who perceived women and African Americans as taking jobs from “more deserving” White men.

The experience of Mae Gaddis of South Jacksonville, Florida, illustrates the ways in which working-class women struggled for entrance to the work programs. Writing to First Lady Eleanor Roosevelt in 1939, Gaddis related her difficulty in obtaining a WPA work card. Living alone with her 4-year-old disabled son after her husband walked out, she was denied local charity because she was deemed able to work. Even providing copies of her legal divorce papers did not help. She told the First Lady that she took the papers “to the W.P.A. office but they even refused to read it” (as cited in Green, 2007, p. 198).

African Americans faced even more discrimination, despite the pleadings of Eleanor Roosevelt to offer them equal opportunities. Although African American men found it somewhat easier than women to gain WPA employment, they were often relegated to menial jobs and were seldom able to obtain skilled or supervisory positions (Badger, 1989). Even in northern cities, African Americans without skills struggled to get a WPA appointment. Queens, New York, resident Jake Govan recalled, “The people who had the most trouble during the Depression were day laborers, ‘cause they didn’t have no skills” (as cited in Gregory, 1998, p. 42). While as a skilled tradesman he had no trouble getting employment with the WPA, many of his neighbors went without work (Gregory, 1998).

The Wagner Act and Labor Relations

The Wagner Act, or the National Labor Relations Act, was another significant part of the Second New Deal that continues to affect U.S. workers in the 21st century. Authored by New York senator Robert F. Wagner, the law guaranteed the right of workers to join unions and bargain collectively. The Wagner Act prevented businesses from interfering in union activities or from taking punitive measures against those who participated in them. Many business owners resented the act because in their view it infringed on their freedom of contract and property rights and inserted the federal government between employer and employee.

Over these objections, the Wagner Act established a National Labor Relations Board (NLRB) that was empowered to investigate and stop employers from interfering with workers’ rights to organize. One of the NLRB’s functions was to operate secret-ballot elections when workers sought a vote on union representation. Workers across many industries lost no time in organizing, and by 1941 the NLRB held nearly 6,000 elections across multiple segments of the economy and involving almost 2 million employees (Cohen, 1990).

One result of the organization drive was the formation of the Committee for Industrial Organizations in 1935, headed by United Mine Workers president John L. Lewis. The most prominent union organization in the United States, the American Federation of Labor, represented only workers in skilled trades and generally excluded women and African Americans. Most of those seeking to organize under the Wagner Act were unskilled factory and industrial workers, many of whom were immigrants. The CIO, which was renamed the Congress of Industrial Organizations, aimed to create a powerful labor coalition for these workers, especially in the automobile and steel industries (Zieger, 1995).

Through the use of sit-down strikes, where workers literally sat down on the job and refused to work, and other tactics, the CIO garnered attention and strength, as well as much opposition from industrial employers. In Michigan, General Motors staunchly refused to recognize workers’ desire to form a union, despite the Wagner Act’s guarantee. Beginning in December 1936 autoworkers took control of an assembly plant in Flint, remaining in the factory for 44 days until General Motors reluctantly agreed to recognize the United Automobile Workers (UAW) as their representative union (Zieger, 1995). The UAW was able to expand its organizing drive to other plants, and although the struggle remained violent and bitter, by 1941 workers across the entire auto industry enjoyed union protection.

Parts of the steel industry also strongly resisted unionization drives. The nation’s largest producer, U.S. Steel, bowed to worker organization, but multiple smaller firms refused union recognition. Beginning on May 26, 1937, the violent Little Steel Strike erupted in multiple states at Republic Steel, Youngstown Sheet and Tube, and other small steel producers. The prolonged strike lasted into August, and the tensions between labor and management turned deadly on Memorial Day 1937, when police called in by management killed 10 picketers near one of Republic Steel’s mills in Chicago.

Roosevelt expressed frustration at both sides, including Republic Steel’s leaders and union head John L. Lewis, exclaiming, “a plague on both of your houses” (as cited in Fried, 2001, p. 141). Despite the violence and determination of the steelworkers, the strike failed, and workers at most of the smaller steel corporations remained unorganized until the early 1940s. The legacy of the Wagner Act was not that it eased labor relations, but that it provided legal protection for laborers to make their opinions heard.

Social Security

The cornerstone of the Second New Deal was the Social Security Act of 1935. Though Progressive reformers and others had suggested that the nation needed a federal system of support to help those who were disabled or aged, opponents argued that providing direct relief, which they claimed undermined self-reliance, ran counter to the nation’s founding philosophy. After the shock and experience of the Great Depression, however, many realized that traditional forms of societal welfare support (local charity, family, and paid labor) were insufficient to meet the growing need.

As a result, Roosevelt worked tirelessly to guide the Social Security legislation through Congress. Enacted in 1935, the bill declared in its preamble that it was

an act to provide for the general welfare by establishing a system of Federal old-age benefits, and by enabling the several States to make more adequate provision for aged persons, blind persons, dependent and crippled children, maternal and child welfare, public health, and the administration of their unemployment compensation laws. (as cited in Jurkowski, 2008, p. 85)

The Social Security Act formed the basis of the New Deal’s welfare state. A permanent measure, the act aimed in part to provide a modest monthly income to the nation’s elderly that would keep them out of poverty. In addition to old-age pensions, the act provided grants to states to fund unemployment compensation. By 1937 all states and territories of the United States had passed unemployment insurance laws, making them eligible for federal money to offer temporary benefits to those who lost their jobs through no fault of their own. Another component of the Social Security Act similarly granted states federal funds for direct aid to women and dependent children.

Like many other New Deal measures, the grant programs were administered at the state level, sometimes by racially prejudiced local officials. As a result, African Americans in the South and Latinos in some western states often found it difficult to secure unemployment compensation and other benefits (Badger, 1989).

In 1937 workers began paying a new payroll tax to fund Social Security, and in 1940 retirees began drawing $41.30 per month from the fund as a supplement to their retirement. The law echoed the wealth redistribution ideas of New Deal critics Huey P. Long and Charles E. Coughlin, and it demonstrated that FDR empathized with the plight of the elderly, disabled, and unemployed.

Social Security’s critics called it a move toward socialism, arguing it would create a nation of dependents. New Jersey senator A. Harry Moore claimed, “It would take all of the romance out of life. We might as well take the child from the nursery, give him a nurse, and protect him from every experience that life affords” (as cited in Segal, 2010, p. 350). Despite the criticism, Social Security’s supporters countered that it was a moderate piece of legislation because workers paid into it through a regressive payroll tax, and they therefore earned its benefits. Whatever the opinions about it, one thing was clear: the Social Security Act of 1935 shifted Americans’ relationship with the federal government and fostered the expectation that the government was responsible for the welfare of its citizens.

The Supreme Court Battle

At the same time FDR shepherded the Social Security Act through Congress, he also began campaigning for the 1936 presidential election. Although some predicted a close race between Roosevelt and Republican challenger Alf Landon of Kansas because the Depression continued to plague the nation, the president’s charismatic personality and honest attempts at solving the nation’s economic crisis left many voters unwilling to embrace political change. Roosevelt won with 61% of the vote, one of the largest popular-vote margins in American history. In the Electoral College, Landon claimed a mere eight electors, carrying only Maine and Vermont.

Despite the solid support, a severe challenge to Roosevelt’s New Deal program loomed on the horizon in the form of a series of cases brought before the U.S. Supreme Court. The Supreme Court serves as one of the principal checks and balances in the American political system, ensuring that neither Congress nor the president oversteps its constitutionally limited authority. Because a number of opponents challenged New Deal legislation, the Supreme Court was compelled to review the cases brought before it and rule on the legislation’s constitutionality.

The Supreme Court was a deeply conservative body during much of the 1930s. By the end of FDR’s first term, some of the president’s most important pieces of New Deal legislation had been struck down, including the National Industrial Recovery Act. After that ruling, Roosevelt held a press conference in the Oval Office, declaring:

The implications of this decision are much more important than almost certainly any [Supreme Court] decision of my lifetime or yours. . . . The big issue is this: Does this decision mean that the United States Government has no control over any economic problem? (as cited in Shesol, 2011, p. 148)

Roosevelt was reluctant to do anything about Supreme Court opposition prior to the 1936 election; after his victory, however, he set in motion an unprecedented strategy to take control of the situation, which became known as his court-packing plan. Roosevelt argued that because justices were appointed for life, the president should be able to appoint an additional justice to share the workload for each sitting justice who reached age 70 and did not retire. Using this logic, he declared in February 1937 that the court’s size should increase from 9 to 15 members (thus conveniently allowing him to appoint 6 new justices sympathetic to the New Deal).

Suddenly, the Supreme Court changed direction on some of the New Deal measures. Prior to the court-packing scheme, the justices had opposed the Social Security Act. Three months later one of the justices announced his retirement, which enabled FDR to appoint Alabama senator Hugo Black in his place. Black favored New Deal legislation, so the president gained an important new ally. Furthermore, Justice Owen J. Roberts underwent one of the most dramatic changes in philosophy in Supreme Court history. Scholars still debate whether Roosevelt’s threat to add justices to the court caused Roberts to reverse course, but Roberts’s sudden support for Social Security resulted in the famous saying “a switch in time that saved nine” (as cited in Trachtman, 2009, p. 108). The “switch” was Roberts’s change of direction on Social Security, and the “nine” referred to the nine justices of the court.

Although this was a victory for Roosevelt, on the whole the court-packing scheme was a devastating defeat. It gave his critics an overwhelming reason to attack his policies. In March 1937 historian James Truslow Adams said the following in a radio address:

The question is of the freedom of that Court which in the last resort is the sole bulwark of our personal liberties. . . . If a President tries to take away our freedom of speech, if a Congress takes away our property unlawfully, if a State legislature . . . takes away the freedom of the press, who is to save us except the Courts? (as cited in Johnsen, 1937, p. 278)

If one man, even the president of the United States, could effectively change the makeup of the courts and shape the direction of their rulings, many wondered, did he not hold too much power? A coalition rallied against the court-packing scheme. The criticism came at a critical time: Roosevelt’s landslide reelection victory had effectively given his programs the mandate of the people, but in the scheme’s aftermath many voters rethought their support.

In July 1937 Congress rejected the court-packing plan. The president dismissed the loss, saying that he might have lost the battle with the court but ultimately won the war, because he was able to appoint a new justice and keep Social Security. It was clear to many, however, that this was a defeat with long-lasting consequences. Historian Alan Brinkley argued that the court battle eroded Roosevelt’s reputation of invulnerability (Brinkley, 1995). The result was that Roosevelt’s ability to get legislation through Congress was greatly diminished.

The “Roosevelt Recession”

A contributing factor to the end of New Deal reform was the blame Roosevelt began to shoulder for worsening economic conditions. The downturn was unexpected. Americans saw some economic relief between 1935 and 1936. Even in early 1937, continued positive production numbers across multiple industries gave South Carolina senator James F. Byrnes the confidence to proclaim that “the emergency has passed” (Golway, 2009, p. 47). Byrnes’s confidence proved to be greatly premature, and the brief economic improvement of the mid-1930s did not last.

Between August and October 1937, stocks measured by the Dow Jones Industrial Average dropped 40%, reminiscent of the losses suffered during the Great Crash of 1929. Even worse for the average American, 2 million people lost their jobs. The press began derisively calling the downturn the Roosevelt Recession, laying the blame for the economic woes squarely at the president’s feet.

The downturn was caused in part by a cut in government spending. In response to Republican calls for spending cuts, and agreeing that a balanced budget was important, Roosevelt supported a cut in spending for New Deal programs that had been boosting the economy. To further reduce the budget deficit, Roosevelt introduced new taxes, which weakened consumer power at a time when the economy needed people spending more and not less. Industrial production declined and unemployment approached 20%, a figure not seen since 1933.

The recession sparked a significant economic debate. Advocates of what is often called the Austrian School of Economics argued for keeping a balanced budget and reining in spending; Henry Morgenthau, the secretary of the treasury, promoted this plan. Proponents of the Progressive perspective, led by economist John Maynard Keynes, argued just the opposite: Government should spend more, not less, to prime the economic pump and get businesses moving again in the right direction.

Keynes first postulated his theory in his 1936 book, The General Theory of Employment, Interest, and Money, and his philosophy came to be known as Keynesian economics. Keynes discussed how economics is both a science and a way to conduct public policy (Minsky, 2008). The government could actively shape the future by spending its way out of the Depression instead of focusing on balancing budgets. Keynes called the strategy countercyclical, which meant that the government should spend money when times were bad and tax when the economic situation was more favorable. Although he was unsure about the new Keynesian ideas, Roosevelt ordered the restoration of federal funding to previously cut programs, and in 1938 he asked Congress to fund $5 billion in programs, part of which went to the Works Progress Administration and the newly reauthorized Agricultural Adjustment Administration. This spending did bring some relief, but it did not end the ongoing Depression.

Fair Labor Standards Act

Despite waning support for his policies and significant opposition in Congress, Roosevelt managed to pass one last significant piece of New Deal legislation. Signed into law in June 1938, the Fair Labor Standards Act (FLSA) reinforced the New Deal’s commitment to labor and the working class. Still in force today, the act established a pattern of federal guarantees labor advocates had long desired.

The legislation instituted a federal minimum wage, set initially at 25 cents an hour and scheduled to rise to 40 cents, and capped the standard workweek, first at 44 hours and eventually at 40. Those who worked more hours were to be paid time and a half as overtime. The act also defined and formally banned the use of child labor. Roosevelt took to the airwaves to drum up support from the American public through one of his fireside chats. The president told the country, “Do not let any calamity-howling executive with an income of $1,000 a day, . . . tell you . . . that a wage of $11 a week is going to have a disastrous effect on all American industry” (Figart, Mutari, & Power, 2005, p. 106).

Despite his success with the FLSA, the 1938 midterm elections dealt another blow to Roosevelt’s legislative agenda and political power. The powerful Democratic congressional majority that had given him almost universal support started to fall apart. The president unsuccessfully campaigned against Democrats who spoke out against the New Deal. Republicans also gained seats in both the House and Senate, making it possible for anti–New Deal Democrats to work across the aisle to oppose the president’s policies.

In 1939 this decidedly different Congress slashed funding for relief programs, and a faction within the House launched investigations against New Deal agencies, including the Works Progress Administration and the National Labor Relations Board. The New Deal opponents did not have enough clout to repeal New Deal legislation, but they did halt further forward momentum (Badger, 1989). As the decade neared a close, the New Deal had failed to bring recovery from the Great Depression.

8.3 Depression Era Society and Culture

The 1920s struggle between tradition and modernity quickly faded as the United States grew ever more mired in the Great Depression. Far more severe than any other economic downturn in history, the Depression that lasted from 1929 to 1941 left an indelible mark on American culture and society. The Depression was in some ways a unifying experience, and the cultural expressions of the decade reflect a common frame of reference that emerged from the widespread exposure to misery and suffering.

Popular Entertainment

Amid their suffering, Americans sought a release in popular culture. The 1930s produced an amazing array of popular movies, novels, music, radio programs, and iconic photographs. The film industry came of age in the decade, producing films tinged with Depression era messages, such as Frank Capra’s Mr. Smith Goes to Washington (1939), in which a man from small-town America unwittingly finds himself a U.S. senator and manages to expose considerable corruption in Congress. Many of the decade’s films offered moviegoers a brief escape from their troubles. Among the decade’s classic films are The Mummy (1932), King Kong (1933), Snow White and the Seven Dwarfs (1937), and The Wizard of Oz (1939).

Americans found that the Depression created a common experience that was reflected in the decade’s cultural expressions, and the evolution of the cultural forms that first emerged in the 1920s allowed popular culture to become more pervasive (Dickstein, 2009). Popular vaudeville stage shows were transformed into radio programs, bringing their skits and humor into homes across the nation. Radio shows ranged from comedies such as Fibber McGee and Molly, which premiered on the National Broadcasting Company (NBC) in 1935, to sinister mysteries represented by The Shadow, which originated as a pulp-fiction novel series. Other radio series included dramas, westerns, and the ever-popular musical variety show.

Photojournalism captured the Depression experiences of ordinary Americans, and new outlets such as Life magazine brought those images into the homes of thousands. Popular culture helped create a common frame of reference through mass media in new and engaging ways.

The literature of the decade likewise highlighted the concerns of Depression era Americans. Emphasizing social criticism, realistic portrayals of events and segments of the population offered rich details about American life. Erskine Caldwell’s Tobacco Road (1932) follows a family of poor White tenant farmers in Georgia through their economic and ethical anxieties. John Steinbeck’s Of Mice and Men (1937) traces the relationship between two displaced migrant farm laborers as they travel across California in search of work. In Richard Wright’s Native Son (1940), 20-year-old Bigger Thomas, a poor African American living on Chicago’s South Side, struggles to survive in White-dominated society.

More than any previous era, the cultural expressions of the Great Depression represented the anxieties of society and offered a realistic picture of ordinary life. Through film, photography, literature, and popular media, artists of the Depression identified strongly with ordinary people and their needs. These cultural expressions stand as witness to what it meant to live in the Depression. They offer what one historian has called “a richly subjective understanding of the mind and heart of the Depression” (Dickstein, 2009, p. xviii).

A New Deal for Some

Most New Deal programs and agencies aimed at solving the nation’s economic problems, but not all Americans benefited equally from the creation of a welfare state safety net. Workers who came to enjoy union protection during the 1930s reaped the most from the New Deal in terms of better wages, working conditions, and unemployment insurance. Others in society, including domestic workers, women, minorities, and the aged (Social Security did not begin paying retiree benefits until 1940), were often left behind.

Women and the New Deal

Work programs associated with the New Deal extended most benefits to men, and especially White men. When the Depression struck, only slightly more than a quarter of all women were in the paid labor force at any given time. Many women worked part time or intermittently due to their domestic responsibilities and did not qualify for work-related programs such as Social Security and unemployment compensation.

Additionally, the Roosevelt administration and the Congress excluded religious and nonprofit organizations from these benefits. This immediately disqualified thousands of women who worked as teachers, nurses, and social workers for nonprofit employers. Similar exclusions of domestic and farm workers from the programs also affected women in need of relief, especially African American and Latina women, who filled the bulk of jobs in those fields. Women were more likely to benefit from such programs as Aid to Dependent Children, made possible under the Social Security Act and not dependent on a person’s work history (Mettler, 2012).

A few women did gain important administrative posts within the Roosevelt administration, including Frances Perkins, who served as the secretary of labor during FDR’s entire presidency (1933–1945). She championed important works programs such as the CCC and the WPA and was a driving force behind the Fair Labor Standards Act. In fact, Roosevelt appointed both the first woman to hold a cabinet post (Perkins) and the first African American woman to head a federal agency.

African Americans Search for a New Deal

At the urging of Eleanor Roosevelt, FDR appointed civil rights leader Mary McLeod Bethune to direct the Division of Negro Affairs, part of the National Youth Administration (NYA). The NYA provided work and educational programs for youth ages 16 to 25. Herself a prominent educator, Bethune worked diligently to increase the participation of African American youth in the programs and to reduce the unemployment rate for young African Americans, which ran as high as 40% during the Depression (Badger, 1989).

Despite the efforts of Bethune and others, the New Deal failed to resolve the major concerns of African Americans and their allies. At the Depression’s onset most African Americans still lived in the rural South. The economic crisis hit the sharecropping, tenant farming, and unskilled industrial jobs that formed the core of their employment especially hard. The National Recovery Administration often exempted the occupations of African American men and women from its code benefits, and it excluded domestic work altogether at a time when 90% of domestic housekeepers were African American women (Badger, 1989). The Social Security Act similarly excluded domestic and agricultural workers, trades employing up to 65% of African Americans (Hiltzik, 2011).

The Agricultural Adjustment Administration actually worked against African Americans’ interests. By paying landowners to leave large segments of their cotton farms fallow, the AAA effectively pushed thousands of African American sharecroppers and tenant farmers off their lands because landowners could earn more in government subsidy than from their share of the tenant’s crop. Other relief programs administered by local officials, such as the Civilian Conservation Corps, tended to give positions to White men, and even when they were employed in relief jobs, African American men were paid less. In some rural southern counties, African Americans were paid only 30% of what Whites received. Among the New Deal programs, only the NYA consistently provided work and other relief to African Americans (Hiltzik, 2011).

Racial discrimination, violence, and intimidation continued to plague African Americans during the Depression. Roosevelt denounced racial violence and lynching as murder, but he failed to convince Congress to pass an antilynching measure. His need to keep the votes of White southerners, who formed a core of his Democratic coalition, left him walking a fine line on racial matters. Within the White House, only Eleanor Roosevelt spoke out against the discriminatory practices of her husband’s programs, but her influence did little to change New Deal policies.

Mexican Americans and the New Deal

Latinos faced a reality similar to that of African Americans during the Great Depression. Nearly a million Mexican Americans called the United States home in the 1930s, and most were first- or second-generation immigrants living in the Southwest. Latinos formed the majority of migrant farmworkers, who traveled from farm to farm picking strawberries, grapes, and other crops at harvest time. Like tenant farmers and sharecroppers, migrant farmworkers were excluded from the occupations that qualified for aid under government work programs.

The unskilled nature of their work also left them outside the labor organizing guarantees of the National Industrial Recovery Act and the Wagner Act. California farmworkers did manage to briefly find support in the Communist-led Cannery and Agricultural Workers Industrial Union. The union led some 37 strikes in 1933 alone and won a wage increase for thousands of farmworkers. Repression by powerful California growers and the arrest of union leaders, however, led to the union’s demise in 1934 (Badger, 1989).

The Native American New Deal

Native Americans were another rural group that struggled during the Depression. Since the passage of the Dawes Severalty Act (see Chapter 1) in 1887, the federal government had encouraged their assimilation into American culture, but Native Americans remained among the poorest of the nation’s residents. Bureau of Indian Affairs commissioner John Collier worked tirelessly to improve conditions and programs.

Under his leadership Congress passed the Indian Reorganization Act (IRA) of 1934, which partially reversed the Dawes Act and secured certain rights for Native Americans. The act ended the policy of individual land allotment and returned local self-government to recognized tribes. Although the IRA did little to relieve the economic distress of individuals, it did form an important foundation for the future. It particularly marked a shift from a drive to assimilate Native Americans to the preservation and celebration of their culture and the restoration of their rights.

As commissioner, Collier also replaced boarding schools with local day schools, encouraged the revival of ancient cultures and tribal dances, and improved health services. He was able to use the resources of the Civilian Conservation Corps, the Civil Works Administration, and the Works Progress Administration to build schools and hospitals and provide limited employment opportunities. Despite Collier’s efforts, little changed on reservations, and many Native Americans remained in poverty long after the New Deal ended (Badger, 1989).

8.4 Legacies of the New Deal

Scholars on the left and right continue to debate the legacy of the New Deal, but several conclusions can be drawn. First, despite frequent charges to the contrary, the New Deal was not an attempt to subvert the capitalist economic system and replace it with socialism. Instead of nationalizing the banks and industry, FDR saved and strengthened the private banking system, ensured the existence of family farms, and prevented the failure of many private industries.

Second, the New Deal strengthened the federal government at a time when it needed to exercise control over the economy. The debate will continue over how FDR chose to use the power of the federal government, but he left it unquestionably stronger.

Third, the New Deal initiated the concept of the welfare state in the United States and created agencies that continue to affect the lives of every American today. By ensuring that Americans have access to money to buy what they need in the private marketplace, Roosevelt completed the 20th century transformation of the United States into a consumer society.

Some have questioned Roosevelt’s record on civil rights, claiming that he did not have the will to advance the cause of African Americans. Recent scholarship suggests that this view is partially incorrect and that Roosevelt’s Supreme Court appointments played key roles in aligning the justices with more progressive views on racial equality (McMahon, 2004).

The final important legacy of the New Deal lies in where it fell short of its goals. The New Deal did not end the Great Depression, nor did it help some of the poorest people in America. Although some felt relief, many tenant farmers, African Americans, and migrant workers continued to struggle, and a few were in worse positions than before the Depression. The Roosevelt Recession highlighted the dire economic straits the nation continued to face. With millions of families already destroyed by the Depression, it was ironic that the global destruction caused by a world war would be the catalyst of America’s resurgence.

A New Relationship Between Americans and Their Government

Although scholars continue to debate the legacy of the New Deal, there can be little doubt that it substantially changed the relationship between Americans and the federal government. The government’s role in managing the economy greatly expanded, and it became the primary mediator in disputes between workers and employers. The New Deal programs to manage farm production continue well into the 21st century, with the government advising farmers what and when to plant and providing significant subsidies. Federal regulations oversee the banking industry, insure bank deposits, regulate the stock market, and subsidize home loans. Social Security pensions for retired Americans, disability payments, and unemployment compensation are now considered a right of citizenship and not a government handout.

Other transformations reshaped the physical United States. Hydroelectric dams and electrification programs brought electricity to rural Americans, conservation programs planted forests, and public works projects built schools, libraries, and other facilities still in use today.

Radicalism and the Rise of Organized Labor

The Great Depression also brought radical intellectuals, Communists, Socialists, labor leaders, and New Deal advocates together under a common canopy. In stark contrast to the probusiness backlash of the 1920s, during the Depression radical voices helped sway politics and public policies. Coinciding with the Second New Deal, the Communist Party, which never numbered more than 100,000 at any one time, sought to expand its influence through alliances with other organizations. In the mid-1930s the Popular Front became a term for the coalition consisting of Communist Party members, socialists, and New Deal advocates who urged reform, not revolution. The tempered language of the movement afforded the Communists some mainstream, albeit temporary, respect for the first time.

Although few American workers actually joined the growing ranks of the Communist or Socialist Parties, thousands were motivated by Communist or Socialist organizers to march in the streets to demand better pay and working conditions. Radicals such as the CIO’s John L. Lewis and Senator Huey P. Long created a climate in which struggles for industrial unionism, unemployment relief, and even African American civil rights seemed possible (Cohen, 1990).

Federal support for organized labor also formed another important legacy of the New Deal. The militant labor activism of the era redefined the meaning of civil liberties, including freedom of speech and assembly. As industrial workers flocked into the CIO and other industrial unions, they demanded government support for their collective voice. By the end of the decade, the federal government emerged as the protector of free speech for organizations as well as individuals (Kutulas, 2006).

In 1939 the Department of Justice instituted a Civil Liberties Unit aimed at protecting freedom of thought and expression, and organizations such as the American Civil Liberties Union expanded and gained respect. Support for civil liberties, and especially for criticism of the government, was not universal, however. The House of Representatives established an Un-American Activities Committee in 1938 to investigate potential disloyalty and subversive activities. Among the committee’s targets were communists, leftists, Democrats, and labor radicals. Shortly after the eruption of World War II, Congress followed with the Smith Act of 1940, which made it a crime to advocate or teach the overthrow of the government by force (Kutulas, 2006).

A World in Crisis

The federal support for civil liberties in the United States stood in stark contrast to the new governments and philosophies emerging in Europe. The Great Depression, unlike previous economic downturns, was a global event. International trade fell by more than 30% when multiple nations, including the United States, raised tariffs to protect their own manufactured goods. Industrialized nations also sharply reduced the purchase of raw materials, leading to a further collapse in mining, agriculture, and other extractive industries.

Some European nations’ economic relief policies proved more effective than the New Deal, although it is important to note that the economic downturn was greater in the United States than in most areas of Europe. At the decade’s end, the U.S. unemployment rate remained above 15% and did not significantly recover until 1941, when the United States entered World War II. It was mobilization for war that finally lifted the shadow of the Depression from the nation (Kindleberger, 1986).

American Lives: Rosie the Riveter: Margarita Salazar McSweyn

Margarita “Margie” Salazar was 25 years old when the United States entered World War II. The following year she left her traditionally female occupation for a job at a Lockheed assembly plant in Los Angeles, becoming one of more than 4 million women who left their occupations and homes to fill industrial jobs tied to wartime production. Women like Salazar were exemplified by Rosie the Riveter, a character popularized by a government advertising campaign. They worked in defense industries as welders, riveters, and aircraft assemblers, and in other male-dominated occupations. For the first time they enjoyed higher wages and the ability to step outside the occupations and experiences society had prescribed as women’s sphere. For the duration of the war, women were encouraged to take on new roles to aid the nation’s war effort.

Salazar was born in New Mexico on July 20, 1916, and her large family moved to Los Angeles when she was an infant. She spent most of her youth in a largely Mexican American neighborhood, where social activities revolved around the church and a Mexican social club. She attended Sullivan Beauty College and worked as a beauty operator (beautician) until the war broke out.

Seeing advertisements in English and in Spanish for women to work at the Lockheed airplane assembly plant, Salazar thought the job would allow her to do her part for the war effort, and she relished the higher wages the job offered. Aiming to protect the wage rates for returning veterans, labor unions advocated for women war workers to be paid the same rate as men. Salazar recalled, “I thought it’d be a whole new experience” (as cited in Gluck, 1987, p. 85). Wearing the required pants, sensible shoes, and hair net, Salazar at first filled in for a variety of assembly positions within the factory but eventually moved to the tool-dispensing shed because she found it difficult to stand for a full shift.

Salazar’s patriotism extended beyond her Lockheed job. She volunteered for the Civil Defense Corps, a federally organized volunteer force that worked to mobilize the civilian population in response to potential threats. The federal government strongly encouraged Salazar’s involvement in the volunteer group as well as her work at the Lockheed plant. As a part of the nation’s total war program, it was essential that American citizens do their part to contribute to the war effort, and with so many young men entering military service, it was vital that women step in to keep the nation’s war production running smoothly.

Before the war’s end, Salazar, along with many other young women, sought to leave her aircraft job for a less strenuous white-collar position. The Lockheed plant work was hard. Long days on her feet, dirty conditions, and heat made the job difficult to endure, and she wanted out. Because her labor was so vital to the war effort, she had to provide Lockheed with a doctor’s certificate before she could quit. She ultimately took a position as a clerk in a beauty supply store.

Had Salazar remained at the assembly plant, her job would have ended with the war. Women war workers were encouraged to return to their homes or traditionally female jobs to make room for the thousands of returning veterans in search of work. Even so, for Salazar and millions of other women, their World War II experiences permanently altered their social and work lives. Her world expanded beyond the “pink-collar ghetto” of beauty work and her Mexican American neighborhood. She married a veteran in 1945, returned to the workforce in the 1950s when her children reached school age, and filled a dual role as white-collar worker and homemaker. Although she maintained links to family and cultural tradition, Salazar’s war experiences showed her there were other choices, which she exercised to expand her world (Gluck, 1987). She died in 1989.

9.1 The Road to War

During the 1930s the United States focused inward to solve its mountain of economic problems. International relations were forced to the back burner, but President Franklin D. Roosevelt did reach out in hopes of securing new trading partners. His Republican predecessors refused to recognize the Soviet Union, or Union of Soviet Socialist Republics (USSR), the single-party Communist state that emerged from the Russian Revolution. But Roosevelt, seeing a potential market for American trade goods, exchanged ambassadors with the Soviets. It was the beginning of an uneasy alliance between the Communist Soviet Union and capitalistic United States, one that lasted only so long as both parties needed a partner and eventually eroded in the face of global conflict.

During the Depression years, much of Roosevelt’s foreign policy focused on interests in the Western Hemisphere. In another effort to shore up trade and end the nation’s economic malaise, Roosevelt promoted the Good Neighbor Policy, which ended the pattern of American intervention in Latin American affairs. The policy asserted that the United States recognized the sovereignty of Latin American countries. The longtime occupation of Haiti came to an end, and a new treaty with Cuba dissolved the Platt Amendment that had granted the United States the right to intervene in that nation’s affairs (Pike, 1995). The relaxation of concerns about Latin American affairs lasted until the onset of the Cold War, when protection of the region once again became a grave concern.

Fascism and New World Leaders

As the United States struggled to redefine foreign relations and establish new trading partners, European nations similarly reevaluated their place in the world. Two important European nations, Germany and Italy, witnessed the emergence of totalitarian or Fascist governments.

Fascism was the result of radical right-wing ideologies whose proponents saw it as a viable conservative response to threats to the economic and social order. Fascism thus became a violent middle-class attempt to suppress working-class aspirations. It proposed a social unity that eliminated political parties and trade organizations. If, from the perspective of national unity, an individual or group was considered counterproductive, it was eliminated (Curtis, 2003). The tenets of fascism were thus diametrically opposed to those of democracy, and this was one reason the United States came to see the Fascist movement as a threat.

Championing the Fascist state were new leaders who took control in Italy and Germany. Benito Mussolini organized the Fascist Party movement in Italy and came to power in 1922. His party promoted a national regime that promised to improve Italian culture and society, harkening back to its roots in ancient Rome. Mussolini’s charismatic personality led many to believe that he offered the best route to save the Italian nation in a time of struggle. In order to succeed, he urged Italians to strive for a “true Italian” ideal that required citizens to abandon individualism and to see themselves as a component of the state. Italians flocked to his movement, and his cult-like status helped to inspire other would-be Fascist leaders (Haugen, 2007).

In Japan the growth of ultranationalism in the early 20th century supported the rise of militarism, and the nation’s military gained important government influence. Officers of the army and navy occupied the nation’s highest offices, including prime minister. In 1931 conflict with neighboring China over economic and political treaties resulted in a Japanese invasion of that nation’s Manchuria region. Facing international criticism, Japanese militarists abandoned international cooperation and withdrew Japan from the League of Nations in 1933.

After years of conflict, by 1937 Japan was at war with China. Occupying the coastal region, Japanese military personnel wreaked havoc on the civilian population, outraging the United States and other nations by murdering and raping thousands during the capture of the Chinese capital at Nanking (Payne, 1995).

In 1933 another Fascist government appeared in Europe with the rise to power of Adolf Hitler as the leader of the National Socialists, or the Nazi Party, in Germany. His ascendency stemmed from a weakened democratic government that was unable to pay German war debts from World War I. At the conclusion of that conflict in 1919, under the Treaty of Versailles, Germany was ordered to pay more substantial reparations than other members of the Central Powers, including some $33 billion to cover damages to civilians and property, and it was forbidden from rebuilding its military. The economic consequences meant German citizens suffered severely, including through the years of the Great Depression.

Hitler presented a way to regain national pride and resurrect the struggling German economy, and he capitalized on these needs to attain political power. Drawing on ideas akin to Social Darwinism, he promoted Nazism, arguing that White people, especially Aryans from Europe, formed a master race. Under his leadership the Nazis employed a particularly radical interpretation of eugenics, the notion of improving the genetic quality of the human race, to degrade other races and unite Germans. His ideas sought to link White Germans of all classes together, creating a racialized nationalism.

Hitler was appointed chancellor in 1933 and effectively intimidated his opposition and the German Parliament into giving him absolute powers. As a dictator, he used his police force to persecute Jews, Roma, homosexuals, the physically and mentally disabled, and other minority groups because he believed that the German race was superior and he did not want to “dilute” or “weaken” his nation with outsiders he considered “inferior.” He blatantly violated the Versailles treaty and began a process of rearming the nation. Europe and the United States ignored his actions through the first several years of his dictatorship, believing that little would come of his reign. These leaders of Italy, Germany, and Japan pushed the world to war over the course of the 1930s (Carr, 1985).

Appeasement and the Road to War

Several key events pushed the world along the path to war. Territorial expansion and international aggression were early factors in the rising belligerency. In 1935 Mussolini’s army moved into Africa, conquering Ethiopia. One year later, Hitler remilitarized the Rhineland (western Germany), in direct violation of the Treaty of Versailles, which specified the region was to remain unfortified. In 1936, in the midst of the Spanish Civil War (1936–1939), Hitler and Mussolini sent support to Gen. Francisco Franco, who eventually emerged as the victor. Franco went on to head a new Fascist regime in Spain and closely allied with Germany and Italy.

Unrest also proliferated in Asia throughout the 1930s. China and Japan formally went to war with one another in 1937, six years after Japan’s invasion of Manchuria. Japan formed an alliance with Nazi Germany, and eventually Italy, initially aimed at guarding against attack from the Communist Soviet Union. The group became known as the Axis Powers. This alliance provided important support for each growing power as they invaded and controlled significant parts of Asia, Africa, and Europe (LaFeber, Polenberg, & Woloch, 2008).

In 1938 Hitler continued his expansion in Europe by uniting Germany with Austria; he then demanded the Sudetenland, the northern and western areas of Czechoslovakia, which were heavily populated with German speakers. These moves raised great concerns throughout Europe, most notably in France and Britain, that Germany was out to grab substantial amounts of European territory as it had during World War I.

Instead of resisting Germany’s demands, Britain and France adopted a policy of appeasement, making political or material concessions in order to avoid war or conflict. In the Munich Agreement, signed at the end of September 1938, they consented to the German occupation of the Sudetenland in exchange for Hitler’s formal promise that he would seek no additional territory. The agreement left Hitler angry and the Czechoslovakians dismayed, since they had not been party to the discussions, but British prime minister Neville Chamberlain was strongly in favor of this approach (Judt, 1998).

Between 1937 and 1939 Britain, with France in agreement, continued to support appeasement in order to avoid conflict with the Germans. The United States watched the international events from afar. The League of Nations, which was supposed to maintain collective security, failed to stop the buildup of a German empire and fighting force. Historians have long debated the effects of appeasement. Some argue that the wait-and-see attitude of the policy allowed Hitler and the other Axis Powers to increase their military strength and shore up national support. On the other hand, other scholars argue that there was little else Chamberlain could do, especially with the United States unwilling to get involved (MacDonald, 1981).

Just months after signing the Munich Agreement, in March 1939, Hitler invaded the rest of Czechoslovakia, while Mussolini attacked Albania. Both Fascist regimes were heading eastward across Europe. Hitler also signed a nonaggression pact with Soviet leader Joseph Stalin, previously his sworn enemy. The pact declared that the two nations would not attack each other for a period of 10 years, and it included a secret provision that outlined how the Soviets and Germans would divide Eastern Europe in the future and ensured that Hitler could invade Poland without opposition from Stalin.

Seeing no prospects for peace, Britain and France abandoned the policy of appeasement and promised Poland support if Germany launched a full attack. Germany invaded Poland on September 1, 1939, just days after signing the pact with the Soviets. Britain and France declared war on Germany just 2 days later, officially beginning World War II.

Over the following year, Germany waged blitzkrieg, or “lightning war,” throughout Europe. Employing mass numbers of troops, tanks, and armaments supported by air strikes, the German fighting force moved quickly and effectively to surprise and destroy unprepared or ill-equipped regions in Europe. It was remarkably effective, as Hitler’s armies overran Denmark, Norway, Belgium, and the Netherlands, all between April and May 1940. One month later, German soldiers raised a Nazi flag over Paris, and on June 22, 1940, France surrendered.

The blitzkrieg emerged as a German tactic only in fits and starts (Jackson, 2003). But given its success during the early stages of the war, it became part of the official German war strategy, and the country’s military commanders consciously used it for the first time in a campaign against the Soviet Union.

While the blitzkrieg concentrated on continental Europe, German forces also targeted Britain in a series of bombing raids known as the Battle of Britain. Aiming to gain superiority over the British Royal Air Force (RAF), the attacks began on July 10, 1940, and lasted 3½ months. Germans first targeted British ships, shipping installations, and airfields, but bombing raids eventually moved across the nation and included the city of London, which was substantially damaged.

The British fought back, with RAF pilots shooting down numerous German planes. Especially in cities, RAF radar and a successful air raid warning system allowed civilians to take cover, significantly reducing casualties. Nevertheless, as many as 40,000 civilians died. The full campaign persisted until the end of October, with the RAF inflicting heavy losses on the German air force, or Luftwaffe, but some raids continued into the next spring.

American Isolationism and Neutrality

Amidst the challenges of the Great Depression, most Americans, like their leaders, paid little attention to Japanese aggressions and the rise of fascism in Germany and Italy. At the close of World War I (see Chapter 6), the United States chose not to join the League of Nations, and in the decades since that war’s end, many Americans came to believe that U.S. involvement in the war had been a mistake.

One reason some questioned U.S. actions was the belief that arms manufacturers seeking profit had guided the nation into war. A Senate committee operating between 1934 and 1936 investigated these allegations and produced the Nye Report, uncovering the potential profit motives behind U.S. involvement. The public was also shocked to learn that during its neutral period from 1915 to 1917, the United States loaned Britain and its allies nearly $2.3 billion, thereby having a vested interest in ensuring a British victory. These revelations helped spark isolationist sentiment among the U.S. public.

Congressional actions during the 1930s reflected this desire to remain free of foreign disputes and conflict. A high tariff on imported goods, enacted in 1930 and remaining throughout the decade, insulated consumers from foreign markets. A series of Neutrality Acts issued in 1935 and 1937 forbade American travel on belligerents’ ships and outlawed the sale of arms or any war-related implements to countries at war (Doenecke & Stoler, 2005). Congress hoped that these measures would keep Americans out of harm’s way on the seas and help the nation avoid involvement in European conflicts.

For ethnic Americans, and especially recent immigrants, perspectives on world events varied. Some Italian Americans and German Americans celebrated the growing patterns of nationalist pride in their homelands but were concerned by the rise of Fascist dictators at the heads of those movements. Vilified for their ethnicity during World War I, German Americans had largely assimilated into the dominant U.S. culture by 1940, but they looked on with interest as events unfolded in Europe. Italian Americans, more recent arrivals, often found their loyalties divided. Irish Americans tended to maintain an anti-British stance (Jeffries, 1996).

For some Americans already obsessed with a fear of communism, the rise of Hitler in Germany offered a potential counterpoint to the Soviet Union. Few could have predicted the scope of the global conflict to come.

As war broke out in Asia and Europe, the United States struggled to remain isolated from it and renewed the Neutrality Acts in an attempt to avoid the escalating conflict. Although as many as 90% of Americans supported isolationism in 1937, that support declined each time Hitler took an aggressive step in Europe. Signaling a wavering of America’s neutral stance, in 1939 Congress approved the sale of arms to Britain on a “cash and carry” basis. Credit was not to be extended, and any war materials purchased had to be transported on British ships. Popular opinion changed even more abruptly following the surrender of France in 1940, which left Britain to wage war against Germany alone.

Amid the turmoil abroad, the U.S. presidential election of 1940 approached. Breaking with a tradition established by George Washington, Roosevelt became the first president in history to seek a third term in office. Besides citing the nation’s economic concerns, he argued that the international situation was too delicate to risk a leadership change.

The Republican Party nominated New York businessman Wendell Willkie to challenge Roosevelt. Willkie’s campaign pointed out that Roosevelt had failed to bring the nation out of the Depression and was walking the United States close to involvement in international conflict, but the nation was not ready for change during such tumultuous times. Although Willkie found some support in the Midwest, Roosevelt was easily reelected with nearly 55% of the popular vote and a landslide in the Electoral College, 449 to 82.

Following the election, Roosevelt asked Congress for $1 billion in additional defense funding. He had also pledged 50 decommissioned U.S. Navy destroyers to new British prime minister Winston Churchill. At Roosevelt’s urging, Congress passed the Lend–Lease Act, which authorized military aid to various countries on the assumption that they would somehow be able to repay the costs at a later date. Although his presidential campaign had promised to keep the United States out of the foreign conflict, this military buildup and support for the Allies stood in direct contradiction to the Neutrality Acts. The last of those acts, passed in 1939, had allowed the United States to provide arms to the Allies, but only on a cash and carry basis.

Under the Lend–Lease program, billions of dollars’ worth of arms were sent to Britain and China, and eventually to the Soviet Union after Hitler broke his nonaggression pact and invaded that nation. America, in Roosevelt’s words, had become the “great arsenal of democracy.” As late as January 1941 the president still hoped to keep America out of war. But in his annual address to Congress that month, he outlined Four Freedoms that people all around the world should enjoy: freedom of speech, freedom of worship, freedom from want, and freedom from fear (Jones, 2009). He argued that threats to other democracies created a threat to U.S. freedom and democracy, and he broke with isolationists by arguing that the United States needed to provide support for the Allies.

Among those allies under immediate threat were the Chinese. The Japanese invasion of Indochina in September 1940 cut off arms supplies to the Chinese army and made the Asian situation critical. Roosevelt responded by freezing Japanese financial assets in the United States and halting trade, including the all-important shipment of American oil. In order to protect its assets in the Pacific, including Hawaii and the Philippines, from Japanese encroachment, the United States began to manage the Asia situation carefully.

Animosity against the Japanese was linked to more than that nation’s aggressive actions against the Chinese. In 1937 a Japanese attack on a U.S. naval vessel, the Panay, while it was protecting American interests and property along the Yangtze River in China increased tensions between the nations. Although the Japanese claimed the incident was a mistake, it served to turn U.S. public opinion against Japan.

Pearl Harbor

On December 7, 1941, the Japanese responded to increasing tensions between the nations by bombing the U.S. naval base at Pearl Harbor in Hawaii, aiming to destroy the U.S. Pacific Fleet before the Americans could consider striking Japan. Early that Sunday morning, Mitsuo Fuchida, the flight commander of the Japanese attack force, approached Pearl Harbor and radioed “Tora! Tora! Tora!” (as cited in McNeese, 2010a, p. 11), which signaled that the Japanese air force had successfully approached the American island undetected (the word tora means “tiger”). Over the course of the next 2 hours, hundreds of Japanese planes, including torpedo bombers, dive bombers, fighters, and horizontal bombers, targeted the U.S. Pacific Fleet headquartered there.

The surprise attack sank or damaged 21 ships and killed more than 2,400 Americans. Japan lost just 29 aircraft. Within hours of the bombing of Pearl Harbor, the Japanese attacked U.S. holdings in the Philippines, marking a second assault on American military forces. Later that same day, Japan continued its assault with attacks on Guam, Midway, and Hong Kong (Lord, 2001).

With only a single dissenting vote, Congress approved a declaration of war on Japan on December 8, 1941, formally entering World War II. The lone no vote came from pacifist Jeannette Rankin, the first woman elected to Congress. Three days later, on December 11, the other two Axis Powers, Germany and Italy, declared war against the United States (James & Wells, 1995). U.S. isolation was laid to rest, and the nation put its energies and its military might alongside the Allied Powers of Great Britain, France, and the Soviet Union.

9.2 The Home Front

In the days following the attack on Pearl Harbor, many Americans feared the Axis Powers would attack the U.S. mainland. After declaring war on the United States, Hitler sent German submarines, known as U-boats, to patrol along the Atlantic coast. For months U.S. pilots tried to destroy them, but the submarines sank a number of American ships, threatening to disrupt the transport of war materials to Europe. By mid-1943 the U.S. Navy had ended the submarine threat, but many still worried that the war besieging Europe would soon overrun the United States. Fears that the Japanese might attack the Pacific coast also persisted.

Another fear was sabotage from within. Military forces were placed on high alert, and government buildings, defense factories, and even important bridges were placed under guard. Machine guns were mounted on the White House roof and on other prominent buildings in New York and other cities. The nation turned toward mobilizing both the military and the civilian forces needed to participate in the largest war in human history, though, to widespread relief, domestic threats failed to materialize (James & Wells, 1995).

Mobilizing for War

World War II was not just a military effort. The war also changed the lives of every American at home. The United States had no organized civil defense system, and although some industries had converted to producing weapons and other materials for the Allies, the scope of mobilization required to participate in the growing global conflict was staggering. Full cooperation from the nation’s citizens was critical to the mobilization of industry and to the reorganization of many aspects of the economy and society.

Americans were urged to do their part by rationing products and commodities essential to the war effort or made scarce due to the conflict. Gasoline and rubber tires were among the first products to be rationed, with a 35-mile-per-hour speed limit imposed to conserve fuel. By mid-1942 food staples and especially sugar were subject to government ration, and government-issued ration books tracked consumers’ purchases of important commodities. Coffee came under ration after German U-boat attacks disturbed shipments from Brazil, and other foods such as butter, oils, cheese, and meat came under ration plans to reserve supplies to feed military personnel. Silk, used in crafting parachutes, became almost impossible to obtain, as did nylon for women’s stockings, and new leather shoes came under ration. Local ration boards distributed the ration books, which contained stamps exchangeable for a certain commodity. For many commodities, such as coffee and sugar, consumers received equal ration coupons. For others, such as tires and automobiles, consumers had to make application and prove their need to make the purchase.

Dollar-a-Year Men

Mobilizing the economy for wartime production effectively ended the Great Depression by providing needed jobs. Roosevelt created a new agency, the War Production Board, in January 1942 to coordinate retooling and production across multiple industries. Former Sears, Roebuck & Company executive Donald Nelson was tapped to head the agency, whose tasks included converting automobile factories to tank manufacturing and convincing industrialists such as Henry Ford to build more than 1,000 B-24 bombers (Eiler, 1997).

Businesses were offered incentives to participate in the war effort; the federal government funded development and production costs, and industries received a guaranteed profit on the tanks, airplanes, and arms they produced. The war greatly enhanced the power of the big businesses that drove wartime production and also swelled the government’s involvement in the economy. Most government spending went to war production industries. The number of federal employees also grew from 1 million to 4 million over the course of the war.

Much of the mobilization effort concentrated on securing essential wartime materials, often from the American people. Scrap metal drives collected cans and razor blades to be forged into war equipment, and women’s silk and nylon hosiery was gathered to make parachutes and rope. One of the most important materials, however, was rubber. Used primarily for tires and tank treads, rubber was in short supply during the war because the Japanese had cut off supplies from the Dutch East Indies.

A typical rubber drive occurred in Dayton, Ohio, in May 1942. Local gas and service stations supplied 75,000 pounds of old tires, and Dayton mayor Frank M. Krebs contributed his garden hose. Several schoolchildren scoured the county collecting old automobile belts and car floor mats. A local shoe repair shop turned in its supply of rubber heels for the soles of shoes (Dayton History Books Online, 2000).

Nelson drew on other experienced executives to head various segments of war production. Called dollar-a-year men because they agreed to run various agencies or industries for this token salary, many remained on the payroll of their prewar companies but oversaw conversion to war production. The number of these executives ranged from 310 in 1942 to more than 800 by the war’s end. The expert technical and business knowledge of the executives was essential to the smooth operations of wartime industries, but some questioned their motives, claiming that their real interest lay in making a personal profit (Klein, 2013).

Labor and the War

If the war was good for business, it was equally good for workers. During the conflict incomes soared, especially for those engaged in work related to war production. In some cases the boost in earnings was sufficient to pull families into the middle class. Another important innovation in the workplace was the introduction of employer-paid health insurance plans, although in some cases those benefits were provided in lieu of monetary raises.

The surge in work and the return to employment brought a massive influx of new members into labor unions, including the AFL and the CIO. In 1942 the Roosevelt administration established the National War Labor Board (NWLB), a resurrection of the organization that had managed the nation’s labor force in World War I. Composed of business executives and labor leaders, the NWLB was authorized to mediate labor disputes and establish labor policies for the duration of the conflict.

In exchange for a no-strike pledge by employees, the NWLB negotiated settlements with employers who continued to fight against unionization drives in industries such as steel and auto production. This government settlement finally brought union protection to workers in resistant segments of those industries. During the war, union membership surged to its highest level in history, with more than 15 million American workers protected by collective bargaining in 1945 (Lichtenstein, 2010).

In spite of the no-strike pledge, industries still faced a number of work stoppages. Wage stagnation in the face of expanding profits for war industries led some workers to walk off the job. Eventually, Roosevelt empowered the NWLB to control wages and prices, making it a powerful part of the wartime administration and ensuring continued production (Kersten, 2006).

An important feature of the wartime workforce was the migration of men and women in search of employment. Millions of people moved for work opportunities that developed during the war. War industries concentrated production in urban areas, such as Detroit, where auto plants retooled to manufacture vehicles, airplanes, and armaments.

Among those migrating for work were African Americans from southern states. An estimated 60,000 African Americans moved to Detroit between 1940 and 1946, approximately doubling the number of African Americans in that city’s workforce. In Chicago, another city with important wartime industries, a similar influx of 60,000 African Americans swelled the workforce between the attack on Pearl Harbor and mid-1944 (Atleson, 1998). Rural White men also funneled into wartime industries, but in fewer numbers because many were drafted or joined the military and because increasing demands on the farm economy allowed many to be exempted from the draft so long as they worked in farming.

Women at Work and War

Women also participated in the migration. Massive numbers of women worked in industries from which they had previously been excluded because of their gender. Rosie the Riveter symbolized this new, hardworking, industrial American woman. She was fictional but represented the government’s ideal war worker: loyal, efficient, and patriotic. Though she displayed toughness, the image of a feminine, pretty woman taking on industrial work inspired many young women who were eager to help out in the war effort.

The reality of wartime work was anything but glamorous. Ethel Jerred of Ottumwa, Iowa, applied for a job in a local meatpacking plant while her husband was at war. The plant offered her a choice: a traditional women’s position that paid 59 cents an hour, or 72 or more cents for a job in the men’s departments of kill and cut, fresh meat packing, or meat wrapping. She took a job on the men’s floor and recalled, “My first check was sixty-two dollars, and I thought I was wealthy. That was the most money I’d ever made in one week” (as cited in Stromquist, 1993, p. 127).

Most women war workers were like Margarita Salazar McSweyn and Ethel Jerred. They were patriotic Americans who sought both to improve their economic status and serve their country. Many resented the loss of their higher wages at the conflict’s conclusion, when they were expected to return to lower paying “women’s work.”

Those who continued to work after the war usually did so out of economic need. Many sought positions using their wartime training, but women were almost universally excluded from skilled industrial trades after 1946. The postwar jobs available to women tended to be in clerical or sales work and paid on average 43% to 52% less than industrial work (Kesselman, 1990).

Women also served in the military as nurses and pilots and in other noncombat positions. For the duration of the conflict, as many as 400,000 women served in military or associated positions. Among the women’s units were the nurse corps of the army and navy, the Women’s Army Corps (WAC), and the navy corps known as WAVES. The marines and coast guard also had women’s reserve units. Another group, the Women Airforce Service Pilots, flew important noncombat missions but was not formally enlisted in the military (Cole, 1995). Outside the military, women also held important positions with the American Red Cross and Civil Air Patrol.

African American women were among those enlisting in the WACs, and more than 6,200 served. They received separate training and lived in segregated housing but served in many of the same roles as White women. Female African American officers trained alongside their White counterparts, and by late 1943 training programs for other specialist positions were also integrated, but housing remained segregated. The army nurse corps also saw more than 500 African American women serve in both the U.S. and European theater. The navy retained a ban on African American women’s enlistment until late 1944, but by the end of the conflict African American women also served in the WAVES and the navy nurse corps (Honey, 1999).

The involvement of women in the military during World War II formed a major turning point in female military service. Their enlistments were “for the duration” plus 6 months to help ease the transition of returning veterans at the war’s end. Mary Hamilton of Mannington, West Virginia, enlisted as a nurse in the WAC soon after finishing her nurse training in 1945. Sent to the European theater, in Germany she tended to the needs of servicemen and servicewomen returning from the field. She remained in service in Europe for a year beyond the war’s conclusion.

Although some were hesitant to accept women in military roles, women’s willingness to volunteer helped ease pressure on the dwindling numbers of men available for the draft. Women served bravely in almost every noncombat role by the end of the war.

The Draft

In September 1940, more than a year before the United States entered the war, Congress enacted the first peacetime draft. German aggression and growing victories in Europe and the Luftwaffe’s continued air bombing of Great Britain made preparations for the nation’s defense wise, even if the United States managed to remain neutral. Within a month of its enactment, 16 million men aged 21 to 35 registered for the Selective Service. Seeking only 900,000 recruits in the first round of drafts, Selective Service officials imposed qualifications on military service. African Americans were initially excluded from the marines and army air corps. On the advice of psychiatrists, homosexuals were also disqualified. Eventually 2.5 million African Americans did register for the draft and were subject to conscription into segregated units.

Once the United States entered the war, the military became less concerned with disqualifying large groups of Americans. Between 1939 and 1945 more than 17 million men and women served in the armed services. Of the men serving, 61% were draftees. White men formed the largest number of service members, but other groups made important contributions. More than 901,000 African Americans and significant numbers of Mexican Americans, women, Chinese Americans, Japanese Americans, and Native Americans also served (Berube, 2010).

Patriotic Dilemmas and Military Service

Americans of various ethnic groups contributed importantly to the Allied war effort, even when that service came in racially segregated units. Some ethnic minorities were drafted, but many volunteered for service in the military or in programs that aided the war effort. Most faced further discrimination and were initially assigned to menial tasks instead of combat roles. Some, such as Japanese Americans, faced a true patriotic dilemma when thousands of their fellow citizens as well as recent immigrants were interned for fear they were disloyal.

African American Military Service

African American military participation was capped at 10% of military enrollment, though eventual enlistments of just over 900,000 fell somewhat short of that quota. The African Americans who served in the war were segregated into units led by White officers. African American soldiers were also more likely to be assigned to service branches, such as the quartermaster, engineer, and transportation corps.

African Americans joined for patriotic reasons, but also used the war to press for equal rights. These included military rights equal to those afforded to White soldiers and access to jobs that had formerly been for “Whites only.” Roy Wilkins, the editor of the NAACP’s Crisis magazine, explained the issue like this: “This is no fight merely to wear a uniform. This is a struggle for status, a struggle to take democracy off of parchment [the Constitution] and give it life” (Wilkins, 1940, p. 375).

African American officers called their goal the Double V—or double victory against fascism abroad and racism at home (James, 2013). For example, on April 12, 1945, the same day that Roosevelt died, the U.S. Army took 101 African American officers into custody for directly refusing an order from a superior officer. This was a serious charge because, if convicted, they could face the death penalty, but a compromise was eventually reached and the charges were dropped. The violation stemmed from their refusal to sign orders to accept segregated housing and recreational facilities. Their protest was one of the final events that pushed toward the desegregation of the U.S. military, although that did not formally occur until 1948.

Like American women in the war, African Americans also took advantage of new opportunities. One of the best examples was the African American pilots of the 332nd Fighter Group, the famed Tuskegee Airmen, and the same officers arrested in the segregated housing protest. In total, Tuskegee Airmen flew 15,000 sorties and shot down more than 200 German aircraft (Moye, 2010).

However, while these men were willing to die for their country, they were not eligible for many military honors for their service. Though many deserved it, no African American received the Medal of Honor, the highest military award for bravery. President Bill Clinton corrected this error 50 years later, bestowing the medal on seven African Americans who served in the war, but just one, Vernon Baker, was still alive (Latty & Tarver, 2004). The most highly decorated African American at the time of the conflict was Doris “Dorie” Miller, a Navy cook aboard the USS West Virginia on the morning of the attack at Pearl Harbor. For brave actions during and after the attack, he was awarded the Navy Cross in 1942.

Native Americans in Service

Native Americans contributed much to the American war effort, with some 20,000 serving in the military, many as volunteers. Several hundred Native American women served in WAC and WAVES units. Most famously, Navajo code talkers serving with the Marine Corps employed their unique language to send military messages that the Japanese could not decode. Others, both men and women, left the relative poverty of the reservation for high-paying industrial jobs in war industries. Most who left the reservation never returned.

Unlike African Americans, Native Americans were not drafted into segregated units. The war provided a chance for them to mingle with Whites of varying backgrounds and to learn job skills that would be important in the postwar era. Service also made them eligible for veterans’ benefits, including funding for school tuition and government-backed mortgage loans.

The Bracero Program

As millions of men and some women entered the military, the remaining workforce was not enough to meet the growing demand for labor. The push for women to leave their homes or current jobs to work in war industries was only one way to meet the need. In June 1942 the United States entered into an agreement with Mexico to enable temporary laborers, known as braceros, to work in America. The first arrived in Stockton, California, where they worked in beet fields, but the program soon grew to include multiple states.

Braceros worked largely in agriculture and in the southwestern states, but at harvest time they traveled to the Northeast and Midwest. During the course of the war more than 200,000 Mexican workers labored on farms in 24 states. Their labor became so integral to the harvest season in many states that the United States extended the program beyond the war years and established a pattern of Latino workers coming northward to labor for part of the year while retaining ethnic and economic ties with their homeland (Calavita, 1992).

The bracero program was fraught with problems, and many migrant workers suffered at the hands of American employers. They worked exceedingly long days in the hot sun and were provided substandard housing. Under the program a portion of their pay was also deducted and sent to the Mexican government, to be returned to the workers once they were back in Mexico. In some cases workers were never fully paid for their labor. Finally, after two lawsuits, a settlement approved by American courts in 2008 granted surviving braceros or their heirs $3,500 each to compensate for the wages withheld during the war.

Despite the problems with the program, the temporary legal status granted to migrant workers gave them a taste of American freedom and instilled a desire for the better life that seemed possible north of the border. Thousands of braceros remained beyond the expiration of their legal contracts and became undocumented immigrants within a growing Latino community in the Southwest. Growing patterns of chain migration created networks of communication about higher wages and better jobs available in the United States. Many were encouraged to migrate northward even without the benefit of documentation, contributing to a rise in undocumented immigration across the southern border of the United States (Barkan, 1996).

Japanese American Internment

The ethnic group most affected by the nation’s involvement in World War II was Japanese Americans. The American response to the attack on Pearl Harbor raised questions about the nation’s commitment to freedom for its citizens, especially those of Japanese descent. On February 19, 1942, Roosevelt signed Executive Order 9066, which provided for the internment of Japanese immigrants and Japanese American citizens. The government removed more than 100,000 people of Japanese descent (two thirds of them American citizens) from their homes on the West Coast and incarcerated them in internment camps with very poor living conditions. Their businesses and personal property were confiscated.

The sad irony was that an American president committed to humanitarian ideals and a war for democracy imprisoned Americans simply because of their family heritage. Yet the large concentration of Japanese on the West Coast fostered fears that in the event of a Japanese invasion, these American citizens and immigrants would aid the enemy. Some of Roosevelt’s closest advisors, including Lt. Gen. John DeWitt, who was in charge of West Coast defenses, pushed him to detain the Japanese, arguing that even though many were second- and even third-generation Americans, racial and ethnic ties could run deeper than their patriotic commitment to the United States.

Those incarcerated were allowed to take only whatever possessions they could carry and were given no legal recourse to protest their removal. At the various camps, armed guards and barbed wire prevented their escape (Robinson, 2001). Some refused to comply with the removal orders, including Fred Korematsu, a Japanese American who worked as a welder in a war production plant in Northern California. Korematsu resisted the order to report for detention but was eventually arrested. He and several other Japanese Americans brought legal suit against the government, with conflicting results. Two important cases illustrate the conflict between support for civil liberties and the nation’s perceived need to defend against potential enemy aliens.

Represented by the ACLU, Korematsu’s case eventually reached the U.S. Supreme Court. In December 1944, in Korematsu v. United States, the Court upheld the constitutionality of Japanese internment, reasoning that the need to protect against espionage outweighed the rights of individual citizens and that internment of a group designated as such a threat was therefore legal.

On the same day, however, in the case of Ex parte Endo, the Supreme Court held that the government could not continue to detain a loyal American citizen, whether native born or naturalized, without due process, undercutting the legal basis for holding an entire group of people en masse. In January 1945 those imprisoned were allowed to leave the camps. In the 1960s Japanese Americans began a redress movement seeking an official apology and reparations for their internment during the war. Legal and civil actions spanned several decades, but finally in 1992 congressional appropriations provided $20,000 to each of the 82,210 internees or their heirs.

Ironically, Japanese Americans played a heroic part in the war, especially the 18,000 who served in the all-Japanese-American 442nd Regimental Combat Team and the 100th Infantry Battalion. The participation of Japanese Americans in the American fighting force further illustrates the conflicting ways that Japanese Americans were treated during the war. They fought in seven major campaigns in North Africa and Europe and were among the first to liberate Jewish prisoners from the concentration camp at Dachau in southern Germany.

9.3 Over There

The United States entered the war on two fronts simultaneously. In Europe, where the Soviet Union had shouldered most of the fighting since 1941 and faced heavy casualties, the Allies were glad to see American ground troops and air fighters. In the Pacific, Japanese incursions into Indochina, Myanmar (Burma), Thailand (Siam), and Indonesia (Dutch East Indies) proved a challenge to U.S. military might. Although the United States was the world’s dominant industrial power and the manufacturer of the latest and best military equipment and armaments, World War II raged on for 3 1/2 years after America entered the conflict.

War in Europe

The Allied Powers included America and her strong partner Great Britain, a weaker France (most of which had fallen to the Germans in 1940), and the Soviet Union, with which the United States had a strained relationship at best. Joseph Stalin had wielded dictatorial power ever since he became the Communist leader following Vladimir Lenin’s death in 1924, and the fundamental divide between democracy and communism made Roosevelt and Stalin—and the United States and USSR—an unlikely pairing.

However, a common enemy in Hitler bound the nations together. It was clear that both the American and British leaders saw the Soviet Union as an undesirable ally, but Roosevelt and Churchill saw Hitler as something far worse. As British prime minister Winston Churchill said, “If Hitler invaded Hell I would make at least a favorable reference to the Devil” (as cited in Miscamble, 2007, p. 51).

There were important disagreements between the allies over the way the war should be fought. In early 1942, shortly after the United States entered the war, Stalin agitated for American forces to open a second front in the West. One observer said that Soviet foreign minister V. M. Molotov knew only four English words: “yes,” “no,” and “second front” (LaFeber, 1997). Though the location and time to open a second front became a major source of disagreement between the American and British leaders, Churchill and Roosevelt eventually agreed on a plan for American troops to attack first in northern Africa and southern Europe.

Meanwhile, the German army invaded the Soviet Union, and Soviet troops held out in Stalingrad, in the southwestern region of the Soviet Union, through the winter of 1942–1943. The Soviets, fighting without much additional Allied support, held on to this vital city during the protracted Battle of Stalingrad, which began in August 1942 and raged for more than 5 months. If Stalingrad were lost, the entire Soviet Union might have toppled.

After months of heavy fighting, the Axis forces finally exhausted their supplies and available supply lines and surrendered to the Soviets’ Red Army on February 2, 1943. It was a battle won at great cost; Stalin lost a half million soldiers and Germany lost 300,000. Some felt that Britain and the United States were letting their two enemies weaken each other, and this discontent and distrust would be a contributing factor in the coming Cold War. As early as 1941, then senator Harry S. Truman had warned, “If we see that Germany is winning the war, we ought to help Russia; and if Russia is winning, we ought to help Germany, and in that way let them kill as many as possible” (as cited in Patterson, 1988, p. 8). Such sentiments lent credibility to Soviet mistrust and to the suspicion that the United States and Britain were purposely delaying the opening of a second European front.

In North Africa, U.S. general Dwight D. Eisenhower launched a pincer strategy, with the British army pushing from the east and the Americans from the west, trapping the German forces between the two giant pincers. In May 1943 Germany surrendered more than a quarter million troops and all of North Africa to the Allies. After this campaign, U.S. general George S. Patton helped lead a successful amphibious assault on Sicily, and Mussolini was driven from power. German troops descended on Italy to keep it out of Allied hands, but the Allies managed to liberate the nation on April 24, 1945. However, World War II was still far from over.

D-Day

Throughout the war, the U.S. Army Air Forces and the British Royal Air Force flew missions, known as sorties, to conduct strategic bombing against Germany. The first notable Allied bombing attack came in May 1942, when the RAF sent 1,046 planes to target factories, homes, and stores in Cologne, Germany. This Thousand Bomber raid killed several hundred people and left 45,000 homeless. The bombing of Cologne and the similar operations that followed were horrific attacks that, even though purported to be strategic, left thousands of civilians dead.

Meanwhile, military commanders planned a daring attack across the English Channel. D-Day, on June 6, 1944, was the largest amphibious assault ever attempted in the history of warfare. German leaders were prepared for the Allies to strike at the narrowest portion of the Channel, but the Allies surprised them by instead selecting the beaches farther south at Normandy, France.

Coordinating the efforts of more than 150,000 soldiers was a massive undertaking. Eisenhower, who commanded the mission, was not certain such a large, synchronized effort would succeed; estimating his chances of success at roughly 50%, he prepared a press release to be issued should the mission fail: “My decision to attack at this time and place was based upon the best information available. If any blame or fault attaches to the attempt, it is mine alone” (as cited in O’Neill, 1997, p. 346).

Allied forces landed and established five beachheads along the Normandy coast, with the Americans landing at positions known as Omaha and Utah beaches. It was a difficult battle with massive casualties. The Allies suffered at least 10,000 casualties, and German forces suffered between 4,000 and 9,000.

V-E Day

In succeeding weeks more than a million Allied troops followed the initial invaders ashore. Once these forces had amassed, the Allies began their slow march toward the German capital at Berlin. Having opened a crucial second front in the West, the Allied troops began taking substantial numbers of Axis prisoners. Two months later, in late August, Allied forces liberated Paris and most of France from Nazi control.

Alongside the ground campaign, the Allies launched new bombing attacks targeting civilian populations in cities such as Dresden, Germany, in February 1945. They argued that the military advantage of crippling the Axis war effort and destroying transportation lines outweighed humanitarian objections to killing as many as 25,000 people in a series of raids so aggressive that much of the city center erupted into a firestorm. To supporters, such acts achieved the desired effect: On April 30, 1945, roughly 2 months after the Dresden attack, Hitler committed suicide in a Berlin bunker. That same month an Army Air Forces historian visited the Nazi concentration camp at Buchenwald. After seeing the bones from the crematorium and the Jewish inmates suffering from typhoid, he said, “Here is the antidote for qualms about strategic bombing” (as cited in Schaffer, 1985, p. xiii). Germany surrendered unconditionally on May 8, 1945, on what is known as V-E Day, or Victory in Europe.

The Holocaust

Celebrations for the Allied victory in Europe were tempered by the horrific images and reports of Nazi atrocities that emerged following the liberation of multiple concentration camps. During World War II Hitler ordered the systematic murder of 11 million men, women, and children. Six million were Jewish, but Roma (Gypsies), religious and political dissenters, homosexuals, and others also fell victim to the Holocaust.

Persecution of the Jews began when Hitler came to power in 1933. Preaching about the superiority of Aryan Germans, Hitler used the legal system, the press, and even force to attack Jewish communities in Germany. Jews were important members of German society, with many holding posts in business, popular culture, and the intellectual community. Hitler and many Germans came to believe that Jews contributed to a modern society that challenged the nation’s traditional culture (Abzug, 1999).

Although such anti-Semitism was not new, when combined with the extreme nationalism of Hitler’s Fascist regime, it proved deadly. During the 1930s the Nazis systematically began to strip German Jews of their civil and political rights. Jews were forbidden to serve in the military or own land, barred from professions such as law, medicine, dentistry, and accounting, and eventually prohibited from owning businesses. Jews were first required to obtain special identification cards and then to wear a yellow Star of David on their coats to identify them.

In November 1938, on what became known as Kristallnacht (the night of broken glass), the Nazis stepped up the anti-Semitic campaign, attacking Jews, their homes, and their businesses. The Nazis then fined the Jewish community for the property damage that occurred on Kristallnacht. They confiscated property and began transporting thousands of Jews to labor camps or concentration camps. Many Jews sought to emigrate to the United States, but strict immigration quotas as well as suspicion and anti-Semitism prevented the entry of most.

Once the war was underway, the Nazis began mass deportation of Jews and others considered undesirable to concentration camps. Jews in Germany and those in all of the German-occupied territories were subject to removal. In some camps the able-bodied were put to work in slave-like conditions, but in others deportees were murdered soon after arrival.

Although reports of the atrocities reached the United States and the Allies early in the war, Roosevelt and other leaders believed the reports of mass killings to be exaggerated and did nothing. While some activists in America sought to aid Jewish refugees, U.S. immigration policies prevented their entrance. Although some pressed for a relaxation of quotas, many others, fearing foreign influences, supported even stricter immigration policies. The World Jewish Congress appealed to the Allies to conduct targeted bombing of the camps and the railroad lines that transported victims to certain death but was told that resources could not be diverted (Breitman, Goda, Naftali, & Wolfe, 2005).

Soviet soldiers arrived at the death camp at Auschwitz, Poland, in January 1945 to discover that all the rumored horrors were in fact true. Unlike some concentration camps built to contain non-Aryan populations, camps such as Auschwitz were part of Hitler’s plan to systematically murder undesirable populations. Soldiers found emaciated men and women, gas chambers, and pits filled with the ashes of the murdered. The liberation of this and other camps opened the world’s eyes to Hitler’s “final solution,” his plan to kill every Jew in Europe. Journalists and Allied soldiers documented the horrors with cameras and written accounts. The truth about the genocide was finally known, but not before 2 out of every 3 Jews in Europe had perished (Bergen, 2003).

War in the Pacific

While the United States and the Allies were engaged in a bitter struggle in Europe, U.S. forces waged an equally significant conflict against Japan in the Pacific. Immediately after Pearl Harbor, the Japanese fleet ranged across the Indian Ocean and the waters off the East Indies, prepared to engage American naval forces. Beginning in the spring of 1942, engagements between the United States and Japan erupted in the central Pacific. Adm. Chester W. Nimitz led the naval forces, while Gen. Douglas MacArthur, still smarting from the loss of the Philippines in the war’s opening months, led an island-hopping strategy aimed at returning to the islands he had been forced to abandon during the initial Japanese invasion.

In May 1942, exactly 5 months after the attack on Pearl Harbor, American and Japanese aircraft carriers engaged each other for the first time in the Battle of the Coral Sea, where the Americans stopped a Japanese fleet that threatened Australia. One month later, the Battle of Midway (June 4–7, 1942) further damaged the Japanese navy. The Japanese lost 4 aircraft carriers, 1 cruiser, 332 aircraft, and 3,500 men. The Americans lost 1 aircraft carrier, 1 destroyer, and 307 men (Gilbert, 2004).

In the wake of Midway, the United States pushed forward to win a string of impressive victories in the Pacific. In June 1944 a massive American naval force attacked the Mariana Islands, and in some of the costliest fighting of the war, American forces captured Guam, Tinian, and Saipan. These strategically important islands were just over 1,300 miles from Tokyo. In October 1944 General MacArthur returned to the Philippines, where intense fighting continued until the islands were finally liberated at the war’s end in August 1945.

Despite the victories, as the United States and the Allies pressed closer to Japan, the resistance intensified. The Marine Corps engaged in the deadliest battle in its history at Iwo Jima, just 750 miles from Tokyo. The battle started on February 19, 1945, as 800 Allied vessels waited offshore with 70,000 marines. Their task was to take the 8 square miles of the island that 22,000 Japanese soldiers defended from an intricate network of caves and tunnels. The orders from the Japanese general Tadamichi Kuribayashi were to fight to the death. The battle lasted 36 days; in total, 28,000 soldiers died, 6,821 of them American. Though the Americans ultimately prevailed, the victory came at a high price.

Even after Iwo Jima, as American forces pressed closer to the Japanese mainland, the resistance increased. From early April until mid-June 1945, the battle for Okinawa, 370 miles from Tokyo, presented yet another example of Japanese strength and resiliency. Deadly kamikaze, or suicide, attacks sent planes crashing into American warships; Japan sacrificed 3,500 planes and their pilots with this strategy. American napalm attacks killed thousands more Japanese.

When the battle for Okinawa ended in late June, the Allies had suffered more than 50,000 casualties and Japan more than 100,000. American military leaders struggled to imagine the death and devastation that an invasion of the Japanese mainland would bring. It was largely because of this fear that the United States turned to other means of ending the war, and Okinawa became the final major battle of World War II (Leckie, 1996).

Hiroshima and Nagasaki

In an effort to bring the conflict in the Pacific to a close, the United States began conducting strategic bombing raids on Japan in June 1944 and intensified the attacks in the spring of 1945. The raids targeted industrial sites but also hit urban areas where manufacturing facilities were located. The bombings killed significant numbers of civilians, with estimates ranging from a quarter million to as many as 900,000.

When the air raids failed to force Japan to surrender, the United States decided to use the newest and most devastating weapon on the planet. A massive partnership of American government, industry, and academia known as the Manhattan Project had operated for 3 years to create an atomic weapon. The project’s initial goal was to develop the bomb before German scientists, who appeared close to success, could do so, but even after the Allies defeated Germany, the Manhattan Project continued. Scientists tested the first atomic bomb near Alamogordo, New Mexico, on July 16, 1945. The blinding light and mushroom cloud marked a remarkable scientific achievement that ushered in the dawn of the nuclear age and threatened a war more deadly than any known before.

The decision to use atomic weapons on Japan rested with the nation’s new president. At the end of the day on April 12, 1945, Vice President Harry S. Truman had been summoned to the White House. He expected to be greeted by FDR, but instead he was taken to Eleanor Roosevelt’s study. She simply said, “Harry . . . the President is dead” (as cited in Miscamble, 2007, p. ix). Roosevelt’s health had been in decline for more than a year, and his opponents had tried to make the most of it during the 1944 presidential election. Among other concerns, he suffered from arteriosclerosis, a narrowing of the arteries. While visiting relatives at his cottage in Warm Springs, Georgia, he suffered a massive cerebral hemorrhage and collapsed.

Two hours after arriving at the White House, Truman took the oath of office. Addressing his cabinet, he told them that it was his intention to “continue both the foreign and domestic policies of the Roosevelt administration” (Miscamble, 2007, p. ix). Within 4 months, Truman authorized the dropping of atomic bombs on Japan.

Truman knew very little about the bomb prior to becoming president. He had, in fact, been vice president for less than 3 months when Roosevelt died. In that short time Roosevelt had apparently excluded him from all discussions about the Manhattan Project and from executive branch conferences on foreign policy. With limited knowledge at hand, Truman made the crucial decision to use the new weapon, arguing that doing so would prevent the heavy casualties that would likely occur in an invasion and land war (Donovan, 1996). The first attack hit the city of Hiroshima on August 6, and the second targeted Nagasaki 3 days later.

The acute effects of the bombing of Hiroshima killed between 90,000 and 166,000 people, and thousands more died later from the long-term effects of radiation. At Nagasaki, between 60,000 and 80,000 perished. A few days later, on August 15, in the wake of the destruction of Nagasaki, the emperor of Japan announced his nation’s surrender. World War II had ended.

Historians and the public have debated whether Truman should have dropped the bombs. Historian Wilson Miscamble (2007) wrote:

Those who rush to “judge” Truman’s decision to use the atomic bombs must hesitate a little so as to appreciate that had he not authorized the attacks on Hiroshima and Nagasaki, thousands of American soldiers, sailors, marines and airmen might have been added to the lists of those killed in World War II. And, added to their number would have been the thousands of allied prisoners of war whom the Japanese planned to execute. Could an American president have survived politically and personally knowing that he might have used a weapon that could have saved their slaughter? (p. 242)

Another historian, Ronald Takaki, argued that Truman’s decision was partly motivated by anti-Japanese racism. Japanese people were vilified in American media and government propaganda. Newspaper accounts referred to “Japs” and portrayed Japanese men as rats. Takaki (1995) claimed that Americans were more willing to accept the use of the atomic bomb in Asia because of those racist beliefs. The use of the atomic bomb also aimed to intimidate the Soviet Union, reflecting the emerging tensions of the coming Cold War. The U.S. use of the weapon demonstrated the nation’s superior military might and served to escalate the military and ideological divisions between the United States and the Soviet Union.

9.4 Toward a New World Order

At the close of World War II, the world experienced a radical change in the way that international power and influence were distributed. Japan and Germany, once dominant powers, were utterly defeated. Great Britain retained its status, but physical devastation from wartime bombing raids left that nation weakened. France also emerged from the conflict with an urgent need to rebuild its weakened infrastructure and industrial base. Only the United States and the Soviet Union stood strong at the end of the war. It soon became clear, however, that the United States was now the dominant world power.

Planning the Postwar World

In several important ways World War II shaped the postwar world, beginning when the Big Three Allied leaders—Roosevelt, Churchill, and Stalin—met in 1943 in Tehran, Iran, the first time the three had ever met in person. They appeared to develop a positive relationship. Roosevelt broke the ice by teasing Churchill about his “Britishness” and his cigars. Churchill expressed feigned irritation, which seemed to please Stalin, and Roosevelt recalled, “Stalin broke out into a deep, hearty guffaw, and for the first time in three days I saw light” (as cited in Meacham, 2004, p. 265). Roosevelt eventually even referred to Stalin as “Uncle Joe.” When the meeting ended, the three had agreed to the D-Day plan that would help carry them to victory in World War II. However, Tehran represented the high point of their relationship.

The Big Three at Yalta

Roosevelt, Churchill, and Stalin met again in February 1945 at Yalta, on the Soviet Union’s Black Sea coast. At the conference they discussed the future of Germany, Eastern Europe, and the creation of a new international coalition that eventually became the United Nations (UN). The stakes were high, and the atmosphere was far more strained than at Tehran. As one historian wrote, Roosevelt and Churchill called themselves the “Argonauts,” an allusion to the ancient warriors who tried to steal the Golden Fleece away from a dragon that never slept. For Churchill and Roosevelt, the prize was a favorable settlement to World War II, and the dragon was the powerful dictator Stalin. With resources dwindling, Churchill and Roosevelt realized that without additional fighting, the Soviet Union was unlikely to relinquish the territories in Eastern Europe its military had liberated from the Nazis.

Roosevelt’s health at Yalta was not good, but he was determined to participate in the important meeting. Churchill and many of Roosevelt’s advisors did not trust the Soviets or their intentions after the war ended. Among the issues discussed were the argument that Germany should be made to pay significant reparations, the postwar governing of Germany, and the disposition of the Eastern European nations. Churchill and Roosevelt reluctantly agreed that the Eastern European nations bordering the Soviet Union ought to look toward the Soviets for aid and alliance, and the Soviets agreed to allow democratic elections in those nations. Given that Churchill and Roosevelt were unwilling, and likely politically unable, to force the Soviets into greater democratic concessions, securing even an agreement to hold elections, ones unlikely to be fully free or fair, was a major accomplishment of the meeting.

Poland proved to be a thorny problem, foreshadowing Cold War divisions between the United States and Soviet Union. With Soviet troops already occupying the country, a provisional pro-Communist government had been established. Roosevelt and Churchill believed that the Polish government officials exiled at the beginning of the war should return as the rightful governing body. In the end the Soviets agreed to allow free elections in Poland as well, but Roosevelt and many Americans believed the Communist influence would pervade Poland, especially so long as the Soviet troops remained.

Ultimately, many in the West viewed Yalta as a lost opportunity to shape the postwar world in a positive way. Some thought that Stalin had manipulated Churchill and Roosevelt and that they were too willing to appease his Communist ambitions. However, the Soviets were no happier with the outcome. Though Yalta by itself did not cause the Cold War, it contributed to the mutual distrust that divided the world between communism and democracy (Plokhy, 2010).

A New Financial Order at Bretton Woods

A new financial order was as important as the division of territory as the war neared its end. For 3 weeks in July 1944, representatives from 44 nations met at a resort in Bretton Woods, New Hampshire, to consider the best way to regulate the international monetary and financial systems following the war.

The outcome of this important conference established rules for international commercial and financial relations among industrial states. Representatives laid the groundwork for what became the General Agreement on Tariffs and Trade, which regulated international trade, and charted a path for the creation of an international banking system. This eventually included the World Bank and the International Monetary Fund (IMF). The World Bank loaned money to developing nations and for the rebuilding of Europe. The IMF regulated the value of currencies on the international market and worked to prevent countries from intentionally devaluing their currency (Hoopes & Brinkley, 1997).

The United Nations and the Nuremberg Trials

As the war came to a close, it also became clear that a successor to the League of Nations would be essential for future peace. The framework for a new multination body, the United Nations, was drafted at a 1944 conference at Dumbarton Oaks, near Washington, D.C. It consisted of two main bodies: a General Assembly and a Security Council. The General Assembly was to act as a deliberative body and include representatives from all member nations. The Security Council was to be responsible for keeping international peace and ensuring security, and it included five permanent members: the United States, Britain, France, China, and the Soviet Union.

Representatives of 51 countries met in San Francisco in June 1945 to formally adopt the UN Charter. The organization outlawed force or threat of force as a means to settle disputes among nations.

Another result of World War II was a series of trials before the International Military Tribunal held at Nuremberg, Germany. Even before the war’s conclusion, many began to clamor for the prosecution of the Nazis for committing war atrocities, especially actions related to the Holocaust. Between November 1945 and October 1946, 22 Nazi military and political leaders faced indictment for war crimes, crimes against humanity, and waging aggressive war. Roughly 100 additional Nazi defendants later faced trial before American military tribunals in subsequent proceedings.

Three of the defendants were found not guilty, 7 were sentenced to lengthy jail terms, and 12 were sentenced to death by hanging (Mettraux, 2008). The lasting legacy of the trials came with the United Nations’ adoption of the Nuremberg Principles, a body of international law relating to war crimes and crimes against humanity.

Toward an Atomic Age

More than 400,000 American lives were lost in World War II. The Allies crushed the Axis Powers and stopped the march of fascism and militarism in Europe and Asia. The Nazi genocidal campaign against the Jews was halted, and a new international organization emerged to mediate future disputes. Government spending on the military and the infusion of cash from Allied purchases finally ended the Great Depression and brought full employment to Americans who sought work. Although brighter days seemed ahead for the United States, wartime events also left clouds on the horizon.

The devastation caused by the atomic bombs ushered the world into a new and fearful era. The fires of world war were extinguished, but a new conflict was brewing as other nations, especially the Soviet Union, sought to build their own nuclear arsenals. The alliance between the United States and the Soviet Union fractured, and the two most powerful nations in the world stood on the edge of a “cold war” that pitted Soviet communism against American capitalism and democracy. In order to contain the spread of communism, the United States threw off any pretense of isolation and became the police force of the free world.