Between the years 1750 and 2000, healthcare in the United States evolved from a simple system of home remedies and itinerant doctors with little training into a complex, scientific, technological, and bureaucratic system often called the "medical industrial complex." The complex is built on medical science and technology and on the authority of medical professionals. Its evolution includes the acceptance of "germ theory" as the cause of disease, the professionalization of doctors, technological advances in treating disease, the rise of great institutions of medical training and healing, and the advent of medical insurance. Governmental institutions, controls, healthcare programs, drug regulations, and medical insurance also evolved during this period. Most recently, the healthcare system has seen the growth of corporations whose business is making a profit from healthcare.
Background
Prior to 1800, medicine in the United States was a "family affair." Women were expected to take care of illnesses within the family; doctors were summoned only for very serious, life-threatening illnesses. Called "domestic medicine," early American medical practice was a combination of home remedies and a few scientifically based procedures carried out by doctors who, lacking the kind of credentials they must hold today, traveled extensively as they practiced medicine.
The practice of midwifery—attending women in childbirth and delivering babies—was a common profession for women, since most births took place at home. Until the mid-eighteenth century, Western medicine was based on the ancient Greek principle of the "four humors"—blood, phlegm, black bile, and yellow bile. Balance among the humors was the key to health; disease was thought to be caused by too much or too little of these fluids. The healing power of hot, cold, dry, and wet preparations, and of a variety of plants and herbs, was also highly regarded. When needed, people called on "bone-setters" and surgeons, most of whom had no formal training.
Physicians with medical degrees and scientific training began appearing on the American landscape in the late colonial period. The University of Pennsylvania opened the first medical college in 1765, and the Massachusetts Medical Society (publisher of today's New England Journal of Medicine), incorporated in 1781, sought to license physicians. Medical schools were often opened by physicians who wanted to improve American medicine and raise the profession to the high status it enjoyed in England and continental Europe. With scientific training, doctors became more authoritative and practiced medicine as small entrepreneurs, charging a fee for their services.
In the early 1800s, both in Europe and in the United States, physicians with formal medical training began to stress the idea that germs and social conditions might cause and spread disease, especially in cities. Many municipalities created "dispensaries" that dispensed medicines to the poor and offered free physician services. Epidemics of cholera, diphtheria, tuberculosis, and yellow fever, along with concerns about sanitation and hygiene, led many city governments to create departments of health. New advances in studying bacteria were put to practical use as "germ theory" became the accepted explanation of illness. It was in the face of epidemics and poor sanitation that government-sponsored public health and private healthcare began to systematically diverge.
Impact
As America became increasingly urbanized in the mid-1800s, hospitals, first built by city governments to treat the poor, began treating the not-so-poor. Doctors, with increased authority and power, stopped traveling to their sickest patients and began treating them all under one roof. Unlike patients in European hospitals, who were treated in large wards, American patients who could pay were treated in smaller, often private, rooms.
In
the years following the Civil War (1865), hospitals became either public or
private.
More medical schools and institutions devoted to medical research emerged. A trend toward requiring more training of physicians led Johns Hopkins University's medical school to stipulate in 1893 that all medical students arrive with a four-year degree and spend another four years becoming physicians.
The earliest efforts of doctors to create a unified professional organization began in the mid-1800s, and in 1847 the American Medical Association (AMA) was established. Although it had little early impact on American medicine, by the next century the AMA wielded great influence over the politics and practice of medicine. An early AMA victory was the regulation of drugs.
Just after the Civil War, nursing was professionalized with the establishment of three training schools for nurses. While nursing began as a gender-based, stereotypically female "nurturing" occupation, over the next 100 years it would become steadily more professional. By the late twentieth century, more nurses were receiving advanced degrees and playing a greater role in the administration of health care. Although women were rarely trained as doctors even in the early twentieth century, by the 1980s they comprised up to half of medical school admissions.
As the nineteenth century ended, advances in biology, chemistry, and related medical sciences meant that the great diseases—tuberculosis, yellow fever, diphtheria, cholera, and others—were practically eliminated with the development of diagnostic tests and vaccines. Extensive public health projects, aimed at fighting the causes of disease or preventing its spread, raised the level of public health. Healthcare extended into the schools through school nurses.
By the early part of the twentieth century, doctors had more authority and were better paid than ever before. Associations such as the AMA and the American Hospital Association (AHA), founded in 1899, became stronger. Employers and labor unions began to offer a range of benefits to workers, including paid medical care. National health insurance, such as that provided by many European nations, became associated with socialism, and the concept grew unpopular in the United States, opening the door for private health insurance to cover the rising costs of medical care.
While private health insurance emerged prior to World War I, it was not until well after the war, toward the end of the 1920s, that the first large medical insurance company, Blue Cross, was established.
The 1930s saw rising healthcare costs and an increasing number of health insurance plans. At this time, doctors were paid by a system called "fee-for-service." New insurance plans, such as Blue Cross and Blue Shield, allowed their members to pay for both hospitalization and treatment by physicians. The AHA in the 1930s took an active role in supporting group hospitalization plans. During World War II, a medical plan started by Henry J. Kaiser for his employees featured a pre-paid program that paved the way for Health Maintenance Organizations (HMOs) 40 years later.
The post-World War II era saw great expansions in the workforce, advances in medical science and medical care, and increasing healthcare costs. The Baby Boom generation, the name given to the large number of children born just after World War II, received ever-higher levels of medical and preventive care during the 1950s. Advances in diagnostic techniques such as x rays, life-saving drugs such as penicillin, and inoculations against diseases such as polio created an ever-deepening scientific culture that included laboratory technicians, therapists, widening roles for nurses, and increasing specialization among physicians.
These post-World War II technological advances professionalized the roles of non-physician therapists and technicians, including respiratory therapists, physical therapists, x-ray technicians, and laboratory technicians. Improved technology and increasingly sophisticated treatments and therapies also pushed up the cost of health care during the same period. U.S. government research and health institutions and programs, such as the National Institutes of Health and the Centers for Disease Control, were established. The 1960s saw the initiation of social programs to aid in the medical care of the aged (Medicare) and the poor (Medicaid). Prior to these programs, the U.S. government had created other health programs and institutions, such as the Indian Health Service, the U.S. Public Health Service, and the Food and Drug Administration, and had established an executive cabinet-level agency, the Department of Health, Education, and Welfare (later renamed the Department of Health and Human Services).
Between the end of World War II and the late 1980s, most doctors were still independent and compensated through fee-for-service. Through the powerful AMA and other organizations, doctors had fought off political attempts to create a nationalized, universal-coverage medical system, such as those in Canada, the United Kingdom, and continental Europe.
Doctors apparently did not notice, however, the growth of Health Maintenance Organizations (HMOs). By the mid-1980s, HMOs began to dominate both the organization of health care and reimbursement to physicians. In the 1990s, HMOs and their variants would revolutionize the organization of health care in the United States and provoke controversy among recipients of healthcare as well as doctors, who found themselves with less control over their practices. Fee-for-service began to fade as doctors increasingly worked for corporations that profited from pre-paid healthcare by reducing costs, carefully restricting services, and focusing on preventive care. Fee-for-service was slowly being replaced by "capitation," a system that paid doctors a set fee from which they had to care for all of their patients, the sick and the well. Called "managed care," this system also changed the consumer's role in healthcare, as greater emphasis was placed on preventive medicine, consumer choice, and personal accountability for one's own health and healthcare.
Communications advancements such as the Internet and the World Wide Web in the 1990s added to the health information available to consumers. Also at this time, consumer interest grew in "alternative medicine," such as acupuncture, herbal preparations, and vitamin therapies. These interests could be seen as a reaction against the medical industrial complex. Computer and communications advancements also allowed for practices such as "telemedicine," a system utilizing the Internet by which patients could be diagnosed and often treated by physicians at a distance.
Twenty-first-century technology promises to continue changing the nature, complexity, and costs of healthcare. As knowledge increases about the genetic bases of disease, the healthcare system will make greater use of gene therapies, developing ways to prevent genetically caused diseases. Just as the impact of new technologies such as x rays, antibiotics, vaccines, and surgical advances changed early and mid-twentieth-century medicine socially and scientifically, scientific and medical innovations, as well as social movements and economic realities, will continue to shape twenty-first-century medicine and health care.
RANDOLPH FILLMORE