
The Urban Mortality Transition in the United States, 1800-1940

Abstract

In the United States in the 19th and early 20th centuries, there was a substantial mortality 'penalty' to living in urban places, a circumstance shared with other nations. By around 1940, this penalty had been largely eliminated, and in many cases it was healthier to reside in the city than in the countryside. Despite the lack of systematic national data before 1933, it is possible to describe the phenomenon of the urban mortality transition. Early in the 19th century, the United States was not particularly urban (only 6.1% in 1800), which contributed to a relatively favorable mortality situation; a national crude death rate of 20-25 per thousand per year would have been likely. Some early data indicate that mortality was substantially higher in cities than in the countryside, higher in larger cities than in smaller ones, and higher in the South than in the North. By 1900, the nation had become about 40% urban (and 56% urban by 1940). It appears that death rates, especially in urban areas, actually rose (or at least did not decline) over the middle of the 19th century. Increased urbanization, developments in transport and commercialization, and greater movement of people into and throughout the nation all contributed to this rise. Rapid urban growth and an inadequate scientific understanding of disease processes produced the mortality crisis of the early and middle 19th century in American cities. The sustained mortality transition began only about the 1870s. Thereafter, urban mortality declined faster than rural mortality, assisted by significant public works improvements and advances in public health and, eventually, medical science. Much of the process had been completed by the 1940s: the urban penalty had been largely eliminated, and mortality continued to decline despite the continued growth in the urban share of the population.