Review: IT in Health Care Has Produced Modest Changes — So Far
Safety First
The paper itself is a broad-ranging review of 975 academic research papers on technology and health care services; Doyle is a leading health care economist whose own quasi-experimental studies have quantified, among other things, the returns to increased health care spending. This literature review was developed as part of MIT’s Work of the Future project, which aims to better understand the effects of innovation on jobs. Given that health care spending accounted for 18 percent of U.S. GDP in 2020, grasping the effects of high-tech tools on the sector is an important component of this effort.
One facet of health care that has seen massive IT-based change is the use of electronic health records. In 2009, fewer than 10 percent of hospitals were using such records; by 2014, about 97 percent of hospitals had them. In turn, these records allow for easier flow of information within provider organizations and help with the use of clinical decision-support tools — software that helps inform doctors’ decisions.
However, a review of the evidence shows the health care industry has not followed up to the same extent regarding other kinds of applications, like decision-support tools. One reason for that may be patient-safety concerns.
“There is risk aversion when it comes to people’s health,” Doyle observes. “You [medical providers] don’t want to make a mistake. As you go to a new system, you have to make sure you’re doing it very, very well, in order to not let anything fall through the cracks as you make that transition. So, I can see why IT adoption would take longer in health care, as organizations make that transition.”
Multiple studies do show a boost in overall productivity stemming from IT applications in health care, but not by an eye-catching amount — the total effect appears to range from roughly 1 percent to 3 percent.
Complements to the Job, Not Substitutes, So Far
Patient outcomes also seem to be helped by IT, but with effects that vary. Examining other literature reviews of specific studies, the authors note that a 2011 survey found 60 percent of studies showed better patient outcomes associated with greater IT use, no effect in 30 percent of studies, and a negative association in 10 percent of studies. A 2018 review of 37 studies found positive effects from IT in 30 cases, 7 studies with no clear effect, and none with negative effects.
The more positive effects in more recent studies “may reflect a learning curve” by the industry, Bronsoler, Doyle, and Van Reenen write in their paper.
Their analysis also suggests that despite periodic claims that technology will wipe out health care jobs — through imaging, robots, and more — IT tools themselves have not reduced the medical labor force. In 1990, there were 8 million health care workers in the U.S., accounting for 7 percent of jobs; today there are 16 million health care workers in the U.S., accounting for 11 percent of jobs. Over that time, the share of medical clerical workers has dropped slightly, from 16 percent to 13 percent of the health care workforce, likely due to automation of some routine tasks. But hands-on jobs have proven robust: The share of nurses among health care jobs has edged up since 1990, for example, from 15.5 percent to 17.1 percent.
“We don’t see a major shock to the labor markets yet,” Doyle says. “These digital tools are mostly supportive [for workers], as opposed to replacements. We say in economics that they’re complements and not substitutes, at least so far.”
Will Tech Lower Our Bills, or Not?
As the authors note in the paper, past trends are no guarantee of future outcomes. In some industries, adoption of IT tools in recent decades has been halting at first and more influential later. And in the history of technology, many important inventions, like electricity, produce their greatest effects decades after their introduction.
It is thus possible that the U.S. health care industry could be headed toward some more substantial IT-based shifts in the future.
“We can see the pandemic speeding up telemedicine, for example,” Doyle says. To be sure, he notes, that trend depends in part on what patients want outside of the acute stages of a pandemic: “People have started to get used to interacting with their physicians [on video] for routine things. Other things, you need to go in and be seen … But this adoption-diffusion curve has had a discontinuity [a sudden increase] during the pandemic.”
Still, the adoption of telemedicine also depends on its costs, Doyle notes.
“Every phone call now becomes a [virtual] visit,” he says. “Figuring out how we pay for that in a way that still encourages the adoption, but doesn’t break the bank, is something payers [insurers] and providers are negotiating as we speak.”
Regarding all IT changes in medicine, Doyle adds, “Even though already we spend one in every five dollars that we have on health care, having more access to health care could increase the amount we spend. It could also improve health in ways that subsequently prevent escalation of major health care expenses.” In this sense, he adds, IT could “add to our health care bills or moderate our health care bills.”
For their part, Bronsoler, Doyle, and Van Reenen are working on a study that tracks variation in U.S. state privacy laws to see how those policies affect information sharing and the use of electronic health records. In all areas of health care, Doyle adds, continued study of technology’s impact is welcome.
“There is a lot more research to be done,” Doyle says.
Peter Dizikes is the social sciences, business, and humanities writer at the MIT News Office. The article is reprinted with permission of MIT News.