Lucinda Myles McCray
For the last 40 years, I have been studying sufferers’ experiences of ill-health, health care, and disease prevention — issues associated with what I call health culture. The COVID-19 pandemic has been a kind of participant observation in the most significant health-related development of my lifetime.
We can’t predict the future with any certainty. But history can help.
For the past two years, we have been thinking about the pandemic as a crisis that will end. Both the foolish wishful thinking at its beginning (e.g., this bug will never amount to much, and will disappear on its own), and the hubris that made us assume we could beat it (e.g., if 80% of the population gets vaccinated, the virus will completely go away) have tailed off in the light of experience, increasing scientific knowledge and changes in the disease itself.
Our crisis mentality has affected our policy decisions: complete shutdowns of social and economic activities, for example, and setting up temporary hospital wards in conference facilities and on cruise ships. It has also affected our expectations regarding services, causing us to demand heroic labor and selflessness from health care workers.
The general population seems to be saying to these folks: “Well, this is the job you signed up for. Anyone can stand or do anything for a short period of time. Just suck it up and get over yourself. Things will get back to normal soon.”
In the long term, this approach will not work well for us. We’re already hemorrhaging health care workers. It seems to me that what we need is a moment to draw breath and consider what the future is likely to look like, then come up with sensible ways to manage it.
It is noteworthy that the omicron strain is following a pattern humanity has seen before. New contagious diseases can hit very hard indeed, and then transition to become routine elements of individual, family, and community expectation and experience.
For example, bubonic plague, which arrived in Europe in 1347, killed between 30% and 60% of the population by 1353. Then it became endemic and less deadly, flaring up regularly until its last major European outbreak in 1665. Now plague can be cured with antibiotics.
Similarly, diseases such as measles, whooping cough, scarlet fever and diphtheria, although serious and occasionally deadly, were so common and expected by the 19th century that they were considered ordinary “childhood illnesses.”
We can’t know yet, but one possible silver lining of the omicron variant, which is highly infectious but perhaps less deadly than its delta predecessor, may be that we can get used to living with it and controlling it with annual immunizations as we do with influenza.
But our success in dealing with COVID-19 will depend on calibrating our expectations to our capabilities. We have advantages people have never had before. We need to use them wisely and well.
Public health historians talk about the “Mortality Transition,” which was the point at which most deaths ceased to be caused by contagious diseases and began to be caused by chronic disorders, such as heart disease and cancers. This transition arrived at different times in different places, depending on factors including enhanced understanding of how contagion is transmitted and activating that understanding at the community level with measures ranging from public water and sewer systems to isolation of the sick.
In the United States, the mortality transition happened at the end of the 19th century. By the time we got to the middle of the 20th century, we began to think we’d beaten infection. We had a proliferating number of immunizations: first, for smallpox (which, in 1980, became the first contagious disease to be eliminated globally), then for diphtheria, tetanus, whooping cough, measles, mumps, polio and other nasties.
We started using antibiotics against a range of infections in the 1940s. By the 1960s, clinical medicine had become so powerful that public health — the effort to prevent disease in populations — seemed somewhat quaint and old-fashioned.
Our new self-confidence was challenged in the 1980s with the advent of HIV/AIDS; but even that global scourge did not change the perception of many folks that AIDS only threatened other people — not themselves.
SARS, Ebola and MERS were further shots across the bow, but again they did not undermine the consensus that modern medicine and American know-how could keep deadly pandemics from our doors.
Then came COVID, which was the third leading cause of death in the United States in 2020. This has been a game-changer. We may be heading into a post-Mortality Transition world, where infection lurches back to the front rank of the health threats we fear, experience and must manage.
This change will affect our health culture enormously.
The good news is that we have much better tools to manage COVID than our 19th-century forebears had to deal with the contagious diseases that stalked them. Because we understand how COVID is transmitted, we have a number of practices we can use to limit its spread, including immunization, testing, masking, social distancing and upgraded ventilation systems. We also have improving ways to treat COVID and its sequelae; during the 1918-1919 flu pandemic, there were no effective treatments for the devastating pneumonias that killed millions. We have interventions that make it possible for a growing number of patients to survive.
The bad news is that we seem unwilling to recalibrate our expectations and behaviors to deal with a post-pandemic world. This must change. The expectation that COVID will be with us for the foreseeable future will help us generate realistic policies and systems. We need to get over the silly politicization of immunization. We need a beefed-up, highly professional and politically independent public health system operating at national, state and local levels to manage disease prevention, communication, services and practices. This system should ideally be integrated with a unified single-payer clinical medical system that prioritizes patient care and staff well-being over profits.
There are huge barriers to accomplishing these goals. But we need to get over them to achieve a new normal that will give us a healthier world and stop the confusion, waste and misery we are suffering now.
(Lucinda Myles McCray, of Minneapolis, is a retired history professor.)