
It’s time to free public health from health care

For too long, a bias toward medicine has limited public health's potential.
By Eric Coles | Published May 22, 2024 | 7 min read

In March 2021, I helped organize a mass COVID vaccination day on the Tule River Indian Reservation in Central California. The tribe’s concert venue was the only building with enough indoor space, but where would our equipment go? How would patients flow through the interior? And what information would convince the community to come?

The CDC website had detailed information on our clinical concerns: the temperature at which to store the vials, the types of syringes needed for jabs, and the paperwork to give patients. But I could not find solutions to our nonmedical questions. Fortunately, we had a staff member with emergency management experience design the event layout, but our signage and publicity campaign lagged, resulting in far fewer attendees than expected.

This clinical, biomedical bent of public health practice was visible in other ways during the COVID response: National risk levels were based on hospital bed capacity, and agency recommendations focused on helping individuals avoid getting sick rather than on people's need to work and go to school—a problem public health officials acknowledged earlier this year. This bias is not unique to the pandemic. Medicine has dominated our public health system since its inception, constraining our health outcomes and marginalizing the social and communal aspects of health.

As policymakers debate funding public health capacity, emphasizing prevention, and building a team-based approach, we can learn from three key moments in history—all of which show why de-medicalizing public health would improve its practice.


A medical school bias dominates public health education

The medicalization of public health starts with our political leaders. Laws in 21 states and the District of Columbia require the lead state health officer to have a medical degree, aligning with a 1959 position statement of the American Public Health Association. Training and education from schools of public health should be required instead. But public health education itself became entwined with medicine, thanks to two influential men: William Welch, the first dean of the Johns Hopkins School of Medicine, and Abraham Flexner, a famed medical education reformer who had backing from the Rockefeller Foundation to fund the first school of public health.

Both men viewed public health largely through a medical lens, focused on the prevention and management of disease. But some of their contemporaries disagreed. Edwin Seligman, a professor of political science at Columbia University, proposed a school based on viewing public health as a social science with political economy at its core. Seligman’s vision was a precursor to social epidemiology and the social determinants of health, the nonmedical factors that determine so much of our health.

It should be no surprise, then, that a Flexner-led committee chose Johns Hopkins, with its biomedically focused curriculum, as the home of the first school of public health. For decades after, nearly half of Hopkins public health graduates were physicians, shaping the early public health workforce. The choice reverberated to other schools, including Columbia, which centered biomedicine in its eventual school and marginalized education in social and political factors for decades.

Congress rejects a national health program—twice

Practitioners in the early- and mid-twentieth century also debated how to reconcile individual medicine, which was seen as curative, with public health services, which were considered preventative. A key figure in this debate was Thomas Parran, the U.S. Surgeon General from 1936 to 1948. Parran was a confidant of President Franklin Roosevelt and helped get public health funding included in the final legislation for the New Deal. (His record as a public health champion is, however, marred by his approval of Public Health Service research that infected more than 1,300 Guatemalans with syphilis during his last two years as surgeon general.) But his proposal for health insurance splintered political support and failed. When Truman became president, Parran convinced him to back a congressional proposal for a comprehensive national health program. But Parran, Truman, and their congressional allies were outmaneuvered by the American Medical Association, which argued the bill gave too much control to the federal government.

Public health and health care had a tumultuous relationship throughout the next few decades. The two systems bifurcated—a split accentuated by medicine moving into large hospitals and by the expansion of health insurance under Medicare and Medicaid bypassing public health departments. By 1988, national leaders were describing the public health system as falling into “disarray.” It has never fully recovered.

Health centers innovate—but can’t fund—social services

Efforts to merge social services with health care resurfaced in the 1960s, when a federal agency focused on ending poverty created what we would now call federally qualified health centers (FQHCs). These were inspired by Jack Geiger, an American medical student who had worked at a health center in South Africa that also offered school meal programs, community vegetable gardens, health education, and more. During President Lyndon B. Johnson’s “War on Poverty,” the government provided grants for two demonstration health centers that offered medical care, water supply protection in rural areas, a 600-acre vegetable farm, home repair, and even a bus system for patients who lacked transportation. The belief was that the grants would be slowly phased out as the centers relied on billing Medicare and Medicaid to support all their services. However, this billing never took off due to state-level restrictions that limited the spread of Medicaid. By Nixon’s second term, financial concerns forced the centers to focus on billable medical services and cut back on other social services.

Today, there are nearly 1,400 federally funded FQHCs, delivering health care to more than 30 million people in underserved areas. But they still lack the holistic support that Geiger envisioned.

A new chance to learn from history

Today, Seligman, Geiger, and Parran’s ideas are being revived. At schools of public health, accreditation requirements are deemphasizing medical skills and focusing more on advocacy, leadership, and policymaking. The CDC, meanwhile, aims to create a nationwide technology infrastructure connecting public health agencies and health care providers. It will hopefully lay the groundwork for more coordination between federal, tribal, territorial, state, and local public health agencies and health care organizations. And the Centers for Medicare & Medicaid Services is approving demonstration projects that use Medicaid funds to pay for social services—already under way in 20 states, with more expected.

But we need to do more to de-medicalize public health. We can start by changing laws and policies that give preference to physicians as public health leaders. For example, although I am a tribal public health officer in California, I am legally excluded from the California Conference of Local Health Officers because the state requires public health leaders to have an M.D., and I do not. Participating would help me build relationships with county and state colleagues as well as learn more about state plans for public health.

We should also recalibrate funding for public health research and services to focus on the social factors that largely determine our health. In but one glaring example of the neglect of these factors, the National Institutes of Health spent only around nine percent of its 2022 budget studying them. And we should expand federal funding for general public health services so that staff like me can decide where our community needs it most, rather than earmarking it for prevention of specific diseases.

It took decades to get the public health system we have today. Because of its failures during COVID, we have an opportunity to reform. We can learn from the past to expand the vision and practice of public health beyond medicine. Health is a team sport, and neither health care nor public health can work without the other.


Eric Coles
Eric Coles is the public health officer at the Tule River Indian Health Center in Porterville, California.
