
Anxiety, according to The Random House Dictionary, denotes “distress or uneasiness of mind caused by apprehension of danger or misfortune.” By this definition, the twentieth century qualifies as an age of anxiety for Americans.

There is irony in this condition, because in many respects we twentieth-century Americans have enjoyed much more security than our forebears. Our life expectancy has been longer, our work easier and more remunerative, our style of life more comfortable, stimulating, and unconstrained. Yet notwithstanding all objective indications that our lives are better than those of our ancestors, we have become incessant worriers.

Our predecessors dealt with their worries by relying on religious faith. For tangible assistance, they turned to kinfolk, neighbors, friends, co-religionists, and comrades in lodges, mutual benefit societies, ethnic associations, labor unions, and a vast assortment of other voluntary groups. Those who fell between the cracks of the voluntary societies received assistance from cities and counties, but governmentally supplied assistance was kept meager and its recipients stigmatized.

In the twentieth century, especially during the past sixty years, Americans have placed their faith in government, increasingly in the federal government. Since Franklin Delano Roosevelt assumed the presidency in 1933, voluntary relief has taken a back seat to government assistance. Eventually, hardly any source of distress remained unattended by a government program. Old age, unemployment, illness, poverty, physical disability, loss of spousal support, child-rearing need, workplace injury, consumer misfortune, foolish investment, borrowing blunder, traffic accident, environmental hazard, loss from flood, fire, or hurricane—all became subject to government succor.

Our ancestors relied on themselves; we rely on the welfare state. But the “safety net” that governments have stretched beneath us seems more and more to be a spider’s web in which we are entangled and from which we must extricate ourselves if we are to preserve a prosperous and free society.

Bismarck, Soldiers, and Mothers

The modern welfare state is often viewed as originating in Imperial Germany in the 1880s, when the Iron Chancellor, Prince Otto von Bismarck, established compulsory accident, sickness, and old-age insurance for workers. Bismarck was no altruist. He intended his social programs to divert workingmen from revolutionary socialism and purchase their loyalty to the Kaiser’s regime, and to a large extent he seems to have achieved his objectives.

In the late nineteenth century, no aspiring American social scientist regarded his education as complete without a sojourn in a German university, and the impressionable young men brought back to the United States a favorable view of Bismarckian social policies absorbed from the teachings of Deutschland’s state-worshiping professoriate.[1] Men such as Richard T. Ely, Edward A. Ross, Henry Carter Adams, and Simon Patten transported ideas and outlooks that persisted through several generations. Consider, as only one example, that Edwin Witte, the chief architect of the Social Security Act of 1935, was a student of John R. Commons, who was a student of Ely (described by Joseph Schumpeter as “that excellent German professor in an American skin”[2]).

While Ely and the others were preaching their Germanic doctrines, an incipient welfare state was emerging quite independently in the United States through a far-reaching expansion of the pensions provided to Union veterans of the Civil War. Originally the pensions went only to men with proven service-related disabilities and their dependent survivors. But politicians, especially the Republicans, recognized that they could buy votes by dispensing the pensions more liberally. Eligibility rules were stretched farther and farther. Eventually no service-related disability needed to be proved, no combat experience was required, and old age alone was sufficient for a veteran to qualify. Some Congressmen even went so far as to change the official military records of deserters in order to award them pensions through special acts of Congress.[3]

Between 1880 and 1910 the federal government devoted about a quarter of its spending to veterans’ pensions. By the latter date more than half a million men, about 28 percent of all those aged 65 or more, were receiving pensions, as were more than 300,000 dependent survivors of veterans. Moreover, thousands of old soldiers lived in homes maintained by the federal government or the states.[4]

That politicians turned the legitimate pension system for injured veterans and their survivors into a political patronage machine should hardly have come as a surprise. Buying votes and dispensing patronage are what elected politicians normally do unless rigidly constrained. The doleful experience might well have served as a warning, and for a while it did, but eventually the lesson was forgotten.

During the first three decades of the twentieth century, when middle-class political movements generally refused to support proposals for comprehensive social spending programs on the grounds that elected politicians would abuse them, women’s organizations, including the General Federation of Women’s Clubs and the National Congress of Mothers, lobbied successfully for the establishment of state mothers’ pensions.[5] These small, locally administered stipends went to “respectable impoverished widows” to allow them to care for children at home. Between 1911 and 1928 forty-four states authorized such payments.[6] In 1935, with passage of the Social Security Act, the federal government joined forces with the states in financing an extension of the mothers’ pensions, Aid to Dependent Children (ADC)—later called Aid to Families with Dependent Children (AFDC), which ultimately became nearly synonymous with “welfare.”

Also, during the second decade of the twentieth century, all but six states enacted workmen’s compensation laws, which removed workplace injury claims from the courts and required that employers carry insurance to pay compensation for various types of injury under a system of strict liability.[7]

The First Cluster, 1933-1938

Between 1929 and 1933 the great economic contraction left millions of Americans destitute. State and local governments, straining to provide unprecedented amounts of relief while their own revenues were shrinking, called on the federal government for help. President Herbert Hoover opposed federal involvement in relief efforts, but he reluctantly signed the Emergency Relief and Construction Act of 1932, which transferred federal funds to the states for relief of the unemployed (under the fiction that the transfers were loans).

After Roosevelt took office the federal government immediately launched into vast relief activities. The Federal Emergency Relief Administration (FERA), directed by welfare czar Harry Hopkins, channeled funds to the states—half in matching grants ($1 for $3) and half in discretionary grants. The money went to work-relief projects for construction of roads, sewers, and public buildings; to white-collar beneficiaries such as teachers, writers, and musicians; and to unemployable persons including the blind, crippled, elderly, and mothers with young children.[8]

Hopkins’s discretionary allocations and his oversight of the federal money embroiled the FERA in political controversy. Politicians fought fiercely for control of the patronage inherent in determining who would get the relief money and jobs and fill the 150,000 administrative positions. “Governor Martin Davey of Ohio had an arrest warrant sworn out for Hopkins should he set foot in the state, and a number of politicians, the most notable being Governor William Langer of North Dakota, were convicted of misusing funds and served time in jail.”[9]

Also in 1933, Congress created the Civilian Conservation Corps, to put young men to work in outdoor projects under quasi-military discipline; the Public Works Administration, to employ people in building public works such as dams, hospitals, and bridges; and the Civil Works Administration, to operate hastily contrived federal make-work projects for more than 4 million of the unemployed during the winter of 1933-1934.

In 1935, with 7.5 million workers (more than 14 percent of the labor force) still unemployed and another 3 million in emergency relief jobs,[10] Congress passed the Emergency Relief Appropriation Act, under authority of which FDR created the Works Progress Administration (WPA) to hire the unemployed. The President appointed Hopkins as administrator. By the time it was terminated eight years later, the WPA had paid out more than $10 billion for 13.7 million person-years of employment, mostly in construction projects but also in a wide range of white-collar jobs including controversial support for actors, artists, musicians, and writers.[11]

Like the FERA, the WPA engaged the ambitions of state and local politicians in a “cooperatively administered” arrangement that set a pattern for many subsequent welfare programs. Under federally issued guidelines and with mostly federal funding, state and local officials got substantial control of the patronage. Local governments usually designed the projects, selecting workers from their relief rolls and bearing a small portion of the costs. Republicans correctly viewed the WPA as a massive Democratic vote-buying scheme. WPA projects were frequently ridiculed, as in the following stanzas of a contemporary song:

We’re not plain every day boys,
Oh, no, not we.
We are the leisurely playboys
Of industry,
Those famous little WPA boys
Of Franklin D.
Here we stand asleep all day
While F. D. shooes the flies away
We just wake up to get our pay
What for? For leaning on a shovel.[12]

The spirit of this song persisted ever afterward, as many tax-paying private employees have resented those employed in government make-work projects (often described in later days as “training” programs).

During the first two years of his presidency, Roosevelt came under growing pressure from more radical politicians. Louisiana Senator Huey Long touted his Share Our Wealth Plan for a sweeping redistribution of income and gained a national following in 1934 and 1935. Simultaneously, California physician Francis Townsend recruited millions of supporters for his Townsend Plan, under which people over sixty years of age would retire and receive from the government a monthly stipend of $200 on the condition that all the money be spent within thirty days. To head off the mass appeal of such outlandish proposals, FDR formed in 1934 a Committee on Economic Security, whose Executive Director was Edwin Witte, to formulate a plan for a national social security system.

This planning bore fruit in 1935 when Congress passed the Social Security Act, the foundation of America’s welfare state. The act gave federal matching funds to the states for assistance to the aged poor, the blind, and dependent children. It levied a payroll tax, 90 percent of which would be refunded to states that established acceptable unemployment insurance systems. (All of them did.) And it created a national old-age pension program disguised as insurance but actually, especially after amendments in 1939 added surviving dependents as recipients, a scheme for transferring current income from working to nonworking people.

From that time forward, defenders of the pension system denied that it was a “welfare” program for redistributing income. “It was portrayed instead as a huge set of public piggy banks into which individual prospective ‘beneficiaries’ put away ‘contributions’ for their own eventual retirements.”[13] In the 1950s, 1960s, and 1970s, congressional incumbents made the pension system a fabulous vote-buying machine, as they repeatedly extended its coverage, added Disability Insurance in 1956, raised the benefits, and even, in 1972, indexed the pensions to protect them from inflation. Only in the 1990s did a substantial portion of the public begin to recognize that the piggy-bank depiction was a myth and that the system faced bankruptcy as the ratio of taxpayers to recipients slipped ever lower because of demographic changes.[14]

As the New Deal was breathing its last in 1938, it brought forth the Fair Labor Standards Act. This established a national minimum wage (originally 25 cents per hour for covered employees but scheduled to rise to 40 cents over seven years), fixed a maximum work week (originally 44 hours but scheduled to fall to 40 by 1940), set a 50 percent premium for overtime work, prohibited the employment of children under sixteen years of age in most jobs, and authorized the Department of Labor to enforce the law.[15] Afterward, Congress raised the minimum wage repeatedly. It is now $4.25 per hour. This pseudo-welfare measure has proven to be an effective means of increasing the unemployment rate of low-productivity workers (those who are young, ill-educated, or inexperienced), but continuing support by leftist politicians and labor unions has prevented its repeal.

The GI Bill

In the spring of 1944, with elections looming and 11.5 million men—most of them draftees—in the armed forces, FDR and Congress saw the wisdom of accepting the American Legion’s proposals to create unprecedented benefits for veterans: hence the Servicemen’s Readjustment Act, popularly known as the GI Bill of Rights. Besides guaranteeing medical care in special veterans’ hospitals, the law provided for pensions and vocational rehabilitation for disabled veterans, occupational guidance, unemployment benefits for up to 52 weeks, guaranteed loans for the purchase of homes, farms, or businesses, and stipends and living allowances for up to four years for veterans continuing their education.[16] Most of the 16 million veterans of World War II took advantage of the unemployment and educational benefits. And by 1962 the Veterans’ Administration had insured more than $50 billion in loans.[17]

Even though the veterans’ program applied to only a minority of the population, it helped to retain the momentum of the burgeoning welfare state. “When the steam appeared to have escaped from the engine of the New Deal by 1945, the World War II nondisabled veterans’ benefits—by design and chance—provided new sources of energy.”[18] The GI Bill set an irresistible precedent, and later legislation provided similar benefits for veterans of the Korean War and, in 1966, even for those who served in the armed forces in peacetime.[19]

The Second Cluster, 1964-1972

With the succession of the ambitious New Dealer Lyndon B. Johnson to the presidency, the drive to build the welfare state became ascendant again. The election of 1964 brought into office a large, extraordinarily statist Democratic majority in Congress. Keynesian economists were assuring the public that they could fine-tune the economy, taking for granted a high rate of economic growth from which the government could reap a perpetual “fiscal dividend” to fund new programs. John Kenneth Galbraith, Michael Harrington, and other popular social critics condemned the failures of the market system and ridiculed its defenders. The public seemed prepared to support new measures to fight a “War on Poverty,” establish “social justice,” and end racial discrimination. Hence the Great Society.[20]

Congress loosed a legislative flood by passing the Civil Rights Act of 1964. Among other things, this landmark statute set aside private property rights and private rights of free association in an attempt to quash racial discrimination. But the ideal of a color-blind society died an early death, succeeded within a few years by “affirmative action”—an array of racial preferences enforced by an energetic Equal Employment Opportunity Commission and activist federal judges.[21]

Congress proceeded to pass a variety of laws injecting the federal government more deeply into education, job training, housing, and urban redevelopment. The Food Stamp Act of 1964 gave rise to one of the government’s most rapidly growing benefit programs: in 1969 fewer than 3 million persons received stamps, and federal outlays totaled $250 million; in 1981, 22 million persons received stamps, and federal outlays totaled $11 billion.[22] The Community Action Program aimed to mobilize the poor and raise their incomes. When Congress appropriated $300 million to create community action agencies, a wild scramble to get the money ensued, led by local politicians and, in some cities, criminal gangs—as vividly portrayed in Tom Wolfe’s tragicomic tale Mau-Mauing the Flak Catchers (1970).

In 1965 Medicare was added to the Social Security system, insuring medical care for everyone over 65 years of age. Medicaid, a cooperatively administered and financed (state and federal) program, assured medical care for welfare recipients and the medically indigent. As usual, these programs were not exactly what they were represented to be. “Most of the government’s medical payments on behalf of the poor compensated doctors and hospitals for services once rendered free of charge or at reduced prices,” historian Allen Matusow has observed. “Medicare-Medicaid, then, primarily transferred income from middle-class taxpayers to middle-class health-care professionals.”[23]

The federal government’s health programs also turned out to be fiscal time bombs. Between 1970 and 1994, in constant (1987) dollars, Medicare outlays increased from $16.4 billion to $109.3 billion; the federal portion of Medicaid from $7.7 billion to $63.5 billion.[24] Like the old-age pensions, these programs achieved rates of growth that could not be sustained indefinitely.

Other Great Society measures to protect people from their own incompetence or folly included the Traffic Safety Act (1966), the Flammable Fabrics Act (1967), and the Consumer Credit Protection Act (1968).

After Richard Nixon became President, highly significant measures continued to pour forth from Congress—the National Environmental Policy Act (1969), the Clean Air Act Amendments (1970), the Occupational Safety and Health Act (1970), the Consumer Product Safety Act (1972), the Water Pollution Control Act (1972), and the Equal Employment Opportunity Act (1972), to name but a few. Nixon also wielded his congressionally authorized power to impose comprehensive wage and price controls between 1971 and 1974, thereby (spuriously) protecting the public from the inflation created by the monetary policies of the Federal Reserve System.

The Welfare State Marches On

Although the growth of the welfare state has slowed during the past twenty years, it has scarcely stopped. Such recent measures as the Clean Air Act Amendments (1990), the Nutrition Labeling and Education Act (1990), the Safe Medical Devices Act (1990), the Americans with Disabilities Act (1990), the Civil Rights Act (1991), and the relentless power-grabs of the Food and Drug Administration show that our rulers remain as determined as ever to protect us from ourselves—to treat us as a shepherd treats his flock, and with similar regard for our intelligence and our rights.

If we cared nothing for our own freedom, we might be inclined to accept the ministrations of the welfare state with gratitude. But even then our contentment would be disturbed by the large extent to which the government fails to deliver what it promises. To be blunt, the government’s protection is largely fraudulent. Officials pretend to protect citizens and promote social harmony while actually accomplishing the opposite. Thus, the government’s affirmative action programs have actually fostered racial acrimony and conflict rather than racial harmony.[25] The environmental laws have caused many billions of dollars to be squandered in mandated actions for which costs vastly exceeded benefits.[26] And the Food and Drug Administration, far from improving public health, has caused (at least) hundreds of thousands of excess deaths and untold human suffering.[27] It is bad enough that citizens are viewed as sheep; it is worse that they are sheared and slaughtered.

Fifty years ago Bertrand de Jouvenel wrote, “The essential psychological characteristic of our age is the predominance of fear over self-confidence. . . . Everyone of every class tries to rest his individual existence on the bosom of the state and tends to regard the state as the universal provider.” But this protection costs the public far more than the high taxes that fund its provision: “if the state is to guarantee to a man what the consequences of his actions shall be, it must take control of his activities . . . to keep him out of the way of risks.”[28] In the interval since Jouvenel was writing, the demand for government protection has risen to new heights, and the corresponding loss of individual liberties has proceeded apace.

If we are to regain our liberties, we must reassert our responsibilities for ourselves, accepting the consequences of our own actions without appealing to the government for salvation. To continue on the road we Americans have traveled for the past century is ultimately to deliver ourselves completely into the hands of an unlimited government. It will not matter if democratic processes lead us to this destination. As noted above, the making of the welfare state has been from the very beginning a matter of corrupt vote-buying and patronage-dispensing by politicians—democracy in action.

And one sad servitude alike denotes
The slave that labours and the slave that votes.[29]

We can have a free society or a welfare state. We cannot have both.

1. Dorothy Ross, The Origins of American Social Science (Cambridge: Cambridge University Press, 1991), pp. 104-106.

2. Joseph A. Schumpeter, History of Economic Analysis (New York: Oxford University Press, 1954), p. 874.

3. Theda Skocpol, “America’s First Social Security System: The Expansion of Benefits for Civil War Veterans,” in Theda Skocpol, Social Policy in the United States: Future Possibilities in Historical Perspective (Princeton: Princeton University Press, 1995), p. 63.

4. Ibid., p. 37.

5. Theda Skocpol, “Gender and the Origins of Modern Social Policies in Britain and the United States,” in Skocpol, Social Policy, pp. 114-129.

6. Ibid., pp. 74, 76.

7. Herman M. Somers, “Workmen’s Compensation,” International Encyclopedia of the Social Sciences 16 (New York: Macmillan and The Free Press, 1968), pp. 572-576.

8. Searle F. Charles, “Federal Emergency Relief Administration,” in Franklin D. Roosevelt: His Life and Times, ed. Otis L. Graham, Jr., and Meghan Robinson Wander (Boston: G. K. Hall, 1985), pp. 132-133.

9. Jeremy Atack and Peter Passell, A New Economic View of American History from Colonial Times to 1940, 2nd ed. (New York: Norton, 1994), p. 670.

10. Michael R. Darby, “Three-and-a-Half Million U.S. Employees Have Been Mislaid: Or, an Explanation of Unemployment, 1934-1941,” Journal of Political Economy 84 (February 1976): 7.

11. Lester V. Chandler, America’s Greatest Depression, 1929-1941 (New York: Harper & Row, 1970), pp. 203-205.

12. “Leaning on a Shovel,” by John LaTouche, reprinted in Richard D. McKinzie, “Works Progress Administration,” in Franklin D. Roosevelt, ed. Graham and Wander, p. 462. Econometric historians have tried to determine how much of the New Deal’s relief spending reflects playing politics. See John Joseph Wallis, “Employment, Politics, and Economic Recovery during the Great Depression,” Review of Economics and Statistics 69 (August 1987): 516-520.

13. Theda Skocpol with G. John Ikenberry, “The Road to Social Security,” in Skocpol, Social Policy, p. 162.

14. Bipartisan Commission on Entitlement and Tax Reform, Interim Report to the President (August 1994), pp. 14-15, 18-19.

15. Chandler, America’s Greatest Depression, p. 237; Gregory King, “Wages and Hours Legislation,” in Franklin D. Roosevelt, ed. Graham and Wander, p. 438.

16. Jack Stokes Ballard, The Shock of Peace: Military and Economic Demobilization after World War II (Washington: University Press of America, 1983), pp. 48-49.

17. “G.I. Bill,” in The Reader’s Companion to American History, ed. Eric Foner and John A. Garraty (Boston: Houghton Mifflin, 1991), p. 449.

18. Davis R. B. Ross, Preparing for Ulysses: Politics and Veterans During World War II (New York: Columbia University Press, 1969), p. 290.

19. Ibid.

20. Robert Higgs, Crisis and Leviathan: Critical Episodes in the Growth of American Government (New York: Oxford University Press, 1987), pp. 246-251.

21. Nathan Glazer, Affirmative Discrimination (New York: Basic Books, 1975); Thomas Sowell, Civil Rights: Rhetoric or Reality? (New York: William Morrow, 1984), pp. 38-42.

22. Edgar K. Browning and Jacquelene M. Browning, Public Finance and the Price System, 2nd ed. (New York: Macmillan, 1983), p. 128.

23. Allen J. Matusow, The Unraveling of America: A History of Liberalism in the 1960s (New York: Harper & Row, 1984), pp. 231-232.

24. Office of Management and Budget, Budget of the United States Government: Historical Tables, Fiscal Year 1996 (Washington: Superintendent of Documents), pp. 105, 108.

25. Thomas Sowell, Preferential Policies: An International Perspective (New York: William Morrow, 1990).

26. William C. Mitchell and Randy T. Simmons, Beyond Politics: Markets, Welfare, and the Failure of Bureaucracy (Boulder: Westview Press, 1994), pp. 146-162.

27. Dale H. Gieringer, “The Safety and Efficacy of New Drug Approval,” Cato Journal 5 (Spring/Summer 1985): 177-201; Robert M. Goldberg, “Breaking Up the FDA’s Medical Information Monopoly,” Regulation, No. 2 (1995): 40-52; Hazardous To Our Health? FDA Regulation of Health Care Products, ed. Robert Higgs (Oakland, Calif.: The Independent Institute, 1995), passim.

28. Bertrand de Jouvenel, On Power: The Natural History of Its Growth (Indianapolis: Liberty Fund, 1993; original French edition 1945), pp. 388-389.

29. Peter Pindar’s lines as quoted by Jouvenel, On Power, p. 353.

Robert Higgs

Robert Higgs is a Senior Fellow in Political Economy at The Independent Institute. He is also the Editor at Large for The Independent Review, the Institute's quarterly journal. He is a member of the FEE Faculty Network.