Berkowitz is a primary care physician and health researcher.
When I was a medical resident, one of my patients was brought into the emergency room with a change in his mental status. Unlike the medical mysteries I love to read about in magazines, his diagnosis was clear from the first test: a fingerstick glucose test. He was taking his diabetes medication regularly, but he had food anxiety and ate irregularly. This led to hypoglycemia, which caused his symptoms.
Fifteen years later, that patient's story stays with me. I'm still a family physician, but I'm also a health researcher focused on interventions to improve health by addressing food insecurity and other health-related societal needs. I'm amazed every day by biomedical scientific advances, yet frustrated by how something as basic as access to nutritious food remains out of reach for many.
In this context, the United States Preventive Services Task Force (USPSTF) is reviewing the evidence on screening for food insecurity in primary care. A draft statement recently released for public comment reveals that after careful and thorough review, the USPSTF plans to issue an “I” recommendation, meaning that “current evidence is insufficient to assess the balance of benefits and harms of screening for food insecurity in primary care settings on health outcomes.”
Many will find this disappointing. How can this be, when the harms of food insecurity are so clear and the solution (food or money for food) so straightforward?
It is important to note that recommendations like this are by no means specific to food insecurity, and that the USPSTF's judgment involves much more than simply determining whether something poses a threat to health. Many serious diseases, such as ovarian and pancreatic cancer, do not have screening recommendations.
For screening to improve health outcomes, many things need to come together: not only must the condition be harmful, but the test must also find cases that would otherwise be missed (or find them at an earlier, more treatable stage), and an effective treatment must be available that would not otherwise be provided.
Given all this, I agree with the USPSTF that the evidence needed to recommend screening for food insecurity in primary care does not currently exist. In this case, it’s not a matter of not knowing that food insecurity is harmful (it is) or not knowing how to identify it (simple two-item screeners are very effective), but rather whether a specific medical intervention can make a difference in health outcomes. That’s where the evidence is lacking.
So what does that leave us?
First, we need more and better quality research (as you might expect researchers to say). The good news is that it’s happening: the NIH has issued a Notice of Special Interest on food insecurity research, the American Heart Association has launched the Food as Medicine initiative, and Tufts University has founded the Food as Medicine Institute. These are just a few recent developments. Five years from now, the research landscape will likely look very different.
Second, even if it's not yet clear that we should be screening for food insecurity, there are still things health care providers can do when food insecurity is identified. In particular, they can contextualize care by learning from individuals how food insecurity is affecting their health and tailoring care plans to better fit their circumstances. For example, a patient with diabetes who experiences food insecurity could be offered a medication that minimizes the risk of hypoglycemia.
Third, while the narrow question of screening for food insecurity in primary care settings may need more evidence, the broader question of whether we should, as a society, take action to reduce food insecurity does not need more evidence. We know that food insecurity is an injustice that no one should have to face and that it can be reduced by public policy.
The United States has higher rates of food insecurity than many comparable countries, and it stands out as significantly weaker in several key policy areas: child benefits, old-age pensions, income support for people with work-limiting disabilities, and unemployment insurance.
And food insecurity has fallen significantly during periods when the U.S. has implemented stronger policies for people in these situations, such as the expanded child tax credit and pandemic-related expansion of unemployment insurance in 2021. Indeed, the expiration of these policies is likely the reason for the large increase in food insecurity in 2022.
Putting all this together, while I am disappointed that the state of the evidence for food insecurity screening in primary care is not stronger at present, I am very optimistic about the evidence pipeline and know there is something we can do to help people who experience food insecurity in the meantime. Perhaps most importantly, we should see this draft report as an opportunity to recognize that addressing food insecurity outside of healthcare is even more important than addressing it inside of healthcare.
Building a public policy system that guarantees the conditions everyone needs to be healthy is the real path to improving the health of our population. Indeed, if we only try to mitigate the health effects of food insecurity from within the health system once it has occurred, we will miss our best opportunity to achieve a healthier nation.
Seth A. Berkowitz, MD, MPH, is an associate professor in the departments of general medicine and clinical epidemiology at the University of North Carolina at Chapel Hill and author of the recently published Equal Care: Health Equity, Social Democracy, and the Egalitarian State.