To a certain extent, the national conversation about the importance of a robust public health system has been happening for the last couple of years in the context of dealing with the COVID-19 pandemic. And yet, rather than discussing how the system can be strengthened and expanded, we’ve instead largely limited it to what the government can and cannot ask the public to do, as though the necessity of a public health system is still in doubt.
For those who work in the arena, this is not an open question; a functioning, just society cannot sustain itself without public health. But the unfortunate product of a society with a modest commitment to the social contract and a jaundiced view of massive government budgets is the underfunding and mistrust of public health.
Even so, public health professionals endure because they know they must. And within this reality, one might ask what could enhance public health without requiring comparatively huge budgets or much active participation by the public.
The answer, as with every corner of modern society, is technology. Not just any technology, however, as the digital health tools available to large medical centers are prohibitively expensive for public health and not necessarily what public health needs to do an effective job.
So, what would empower public health agencies and professionals?
The Basics of Public Health
At its most foundational, public health is about preventing the rapid spread of highly communicable diseases, and a primary tool in that prevention effort is vaccines. Without vaccines, diseases we mostly think of as relics of a less sophisticated age—smallpox, measles, mumps, polio—would continue to decimate populations and shorten life spans.
Of course, especially in the age of mRNA, vaccines are considered a technological solution. Less so are masks, social distancing, hand washing, and all other efforts to prevent disease spread. Because germ theory has come to dominate the response to disease, there is a tendency to focus on vaccines and ignore simpler, cheaper measures that can make a tremendous difference, not to mention the fact that poverty and education also play a role in who gets sick and who gets robust treatment.
“Both the Trump and Biden administrations have described the pandemic in military metaphors,” says Ed Yong in the Atlantic. “Politicians, physicians, and the public still prioritize biomedical solutions over social ones. Medicine still overpowers public health, which never recovered from being ‘relegated to a secondary status: less prestigious than clinical medicine [and] less amply financed,’ wrote the sociologist Paul Starr.”
And even with that focus on “biomedical solutions,” much of the American population still refuses vaccines, as well as masks and distancing.
“In theory, the answer to the question as to how to prevent future outbreaks in Upper Silesia is quite simple: education, together with its daughters, freedom and welfare,” wrote Rudolf Virchow regarding an 1848 outbreak of typhus. “However, in practice, it is more difficult to see how this social problem is to be solved … We have often referred to ‘the scientific method’. We now find that through applying it, we have moved from medicine into the social field, and in so doing we have had to consider some of the fundamental issues of our times.”
Nearly 175 years later, these fundamental social issues remain, even as the communication dynamics and greater numbers of people make everything more complex. With that complexity as a given, modern technology enables us to move beyond the benefits of vaccines and personal action to tools that enable tracking and data compilation without the public’s active participation.
The Benefits of Digital Health
In the explosion of digital health technology over recent years, the electronic health record (EHR) does a lot of heavy lifting. While the internet of things (IoT) and wearable devices probably account for many more individual innovations, the EHR has served as the flagship for health IT because it represents healthcare’s long-overdue transition to electronic systems and because EHRs can be staggeringly expensive and complex to implement.
That latter characterization, which is often accurate, illustrates why EHRs are a key component in the arsenal of public health technology tools but still primarily the tool of hospitals and medical systems, not local public health authorities.
The pandemic has also demonstrated the value of telehealth technology, but that same technology is perhaps more valuable to rural hospitals, clinics, and providers in the long term because it enables immediate connectivity with more sophisticated medical centers without the costs of travel. As telehealth solutions evolve, they are frequently integrated or interfaced with EHRs, which adds costs to the already expensive platform.
A better example of an affordable public health technology might be Twitter, which costs users nothing but is a highly effective means of disseminating key public health information. After all, public health in the United States went from spending 3 cents of every healthcare dollar in 1930 to just 2.5 cents 90 years later, so we have to value every penny. Viva la progress.
And while Twitter, Facebook, YouTube, and Instagram can cost users nothing more than their attention, a valuable commodity indeed, many wearable devices, such as the Apple Watch and Fitbit, carry relatively modest costs compared with systems like EHRs.
Though often considered a public health emergency in their own right, social media channels can be tools through which public health officials affordably disseminate information, track disease outbreaks, educate the populace, and receive feedback on initiatives and programs. Properly integrated with data-gathering technology, wearable devices can warn providers when a patient is close to a negative health event or when a virus outbreak is taking off within a community.
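To make the wearable scenario concrete, here is a minimal sketch of how such an alert might work, assuming nothing more than a stream of daily resting heart-rate readings. The function name, the data, and the thresholds are all illustrative assumptions, not any vendor’s actual API; real devices and platforms use far more sophisticated models.

```python
# Hypothetical sketch: flagging anomalous readings from a wearable feed.
# The data, thresholds, and function names are invented for illustration.

from statistics import mean, stdev

def flag_anomalies(readings, window=7, z_threshold=2.0):
    """Flag readings more than z_threshold standard deviations above
    the mean of the preceding `window` readings."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (readings[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# A stable resting heart-rate series with a sudden spike on the last day.
resting_hr = [62, 61, 63, 60, 62, 61, 63, 62, 61, 78]
print(flag_anomalies(resting_hr))  # → [9]: only the spike is flagged
```

The same idea scales up: aggregated across a community, a surge in flagged individuals could serve as an early, passive signal of an outbreak, no active participation by the public required.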
Which brings us back to EHRs. Rolling up data and spitting out reports is something most EHRs can do effectively, but unless that data is shared with health information exchanges and local public health officials, it remains in a silo and is of limited value, which is at the heart of an argument for networks, interfaces, and integrated systems.
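The roll-up step itself is conceptually simple, which underscores that the obstacle is sharing, not computation. As a sketch, assume each facility’s EHR can export a diagnosis-count summary; merging those silos into one regional report a health department could consume is then a few lines. The facility names and record layout here are invented for illustration.

```python
# Hypothetical sketch: merging per-facility case-count exports into one
# regional report. Facility names and layout are illustrative assumptions.

from collections import Counter

def aggregate_case_counts(facility_reports):
    """Merge per-facility {diagnosis: count} dicts into regional totals."""
    totals = Counter()
    for report in facility_reports.values():
        totals.update(report)
    return dict(totals)

silos = {
    "County General":   {"influenza": 12, "covid-19": 4},
    "Riverside Clinic": {"influenza": 3, "covid-19": 9, "measles": 1},
}
print(aggregate_case_counts(silos))
# → {'influenza': 15, 'covid-19': 13, 'measles': 1}
```

In practice, of course, the hard parts are everything this sketch assumes away: common data standards, interfaces between incompatible EHRs, and the legal and commercial will to share at all.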
The challenge is that we’ll wait forever if we expect the private sector to create integrated systems and comprehensive public health data. The federal government has already mandated an end to information blocking, the practice of withholding patient and health data within health IT systems, and yet effective data sharing remains a challenge.
“The delegation of public-health decision making to the private sector might also accelerate the consolidation of market power,” writes Wendy Parmet. “Large employers and big corporations will be better able to assess and implement health policies that are good for their business. A vaccinated workplace may become a competitive edge, and large employers will then be better positioned to purchase vaccines and mandate them. Smaller employers and their workers will suffer as their stores, restaurants, and workplaces are left unvaccinated.”
While it’s hard to argue against the downsides of outsourcing public health to the private sector, we must also acknowledge that public health finds itself in a unique situation, with confidence in expertise and government agencies having reached a nadir.
Still, the government has a crucial role to play in making useful all the data that private-sector technological innovations have generated and will continue to generate. Importantly, government and nonprofit public organizations have a primary mandate that focuses on public benefit, not private revenue.
And while we are at it, we may as well devote a bit more of that healthcare dollar to public health if we want to get serious about future pandemics, opioid crises, deaths of despair and the like. We can be certain of future challenges; we should be equally certain of the availability of resources to meet them.
About Irv Lichtenwald