As the pandemic progresses, the data around infections, hospitalisations and deaths is becoming less reliable – and it takes multiple perspectives to truly understand it. Siouxsie Wiles and Jin Russell explain.
As scientists and clinicians, part of our job is reading published studies and incorporating the findings with what we already know about the topic, either from our own studies or from others we’ve read, to move our understanding forward. An important feature of this system is that as the evidence changes, so does our understanding. That’s why, over the course of the pandemic, you’ll have seen Siouxsie change her mind on things like how the virus spreads and the need for masks.
One of the problems with relying only on published studies though is the time it takes for them to be carried out, written up, peer-reviewed and published. Because of the speed at which new variants of the Covid-19 virus are evolving, this time lag is making things really difficult. When a new study lands, the first thing we have to do is look at when it was carried out. Which variant was prevalent at the time? Was it done before or after the introduction of vaccines? If after, what vaccines were in use where the study was done? The widespread release of new studies as preprints, the version before peer review, has definitely helped speed up our access to new information. But that comes with its own disadvantage. Might the results be wrong, perhaps because the studies were badly designed or done by people without the relevant expertise? This is one of the reasons why we stay on social media despite all the harassment. Platforms like Twitter are full of experts carrying out real-time peer review of both preprints and published studies.
As countries have taken different approaches to deal with the pandemic, another thing that’s proved really useful has been the many Covid dashboards that have sprung up showing near real-time data for things like cases, hospitalisations, deaths and, over the last year or so, vaccinations. One of the first dashboards to get up and running in January 2020 was one by the Center for Systems Science and Engineering (CSSE) at Johns Hopkins University. It pulls data from a range of official sources – you can see the list for yourself here.
Another website that’s proved enormously valuable is Our World in Data. The site was started more than a decade ago by Dr Max Roser, programme director of the Oxford Martin Programme on Global Development at the University of Oxford. Our World in Data covers everything from health and education to human rights and democracy. When the pandemic struck, the team soon started gathering and displaying data related to Covid-19, and the site is now one of the go-to places to compare how different countries are tracking. We use this one a lot as a quick and dirty way of seeing how different countries’ policy choices around public health interventions are impacting case numbers.
When looking at sites like the CSSE dashboard and Our World in Data and thinking about what the data is showing, it’s important to remember the saying “rubbish in, rubbish out”. If the data going in is no good, then it will limit the conclusions we can draw from it. So how reliable is the data feeding into these sites? It’s hard to say. From the very beginning of the pandemic, we’ve known that plenty of countries have been under-reporting cases. Some of this may well be deliberate. Who can forget the time the former United States president asked to “slow the testing down” as testing was identifying cases and making the US “look bad”?!
Much of the under-reporting of cases though has been because of a lack of critical testing reagents and/or infrastructure, because people haven’t wanted to/been able to get tested, or because of the use of “at-home” testing. Now that we’re using rapid antigen tests for the majority of our testing, the only way we know people’s results is if they voluntarily report them. That means that, like many places in the world, New Zealand’s current daily cases are an underestimate, though exactly by how much is hard to say.
The official global death toll is also an underestimate. Currently, the count is a little over 6.1 million deaths. However, the Economist estimates that the likely toll is currently somewhere between 14.1 and 24.7 million when accounting for all the excess deaths that have happened during the pandemic. And yes, not all of those will be people dying after being infected with the Covid-19 virus. Some will be because, at times, hospitals and healthcare systems have been so overwhelmed that people with other unrelated health issues weren’t able to get the care they needed.
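The gap between the official toll and the Economist’s excess-death estimates implies a substantial undercount factor. A minimal sketch of that arithmetic, using only the figures quoted above (the variable names are ours, for illustration):

```python
# Figures quoted above, in millions of deaths.
official_toll = 6.1
excess_low, excess_high = 14.1, 24.7  # the Economist's estimated range

# Implied undercount: how many true deaths per officially counted death.
factor_low = excess_low / official_toll
factor_high = excess_high / official_toll

print(f"Official toll may understate deaths by "
      f"{factor_low:.1f}x to {factor_high:.1f}x")
```

In other words, for every officially recorded Covid death there may have been roughly two to four excess deaths during the pandemic, with the caveat from the text that not all excess deaths are directly caused by infection.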
What worries us is that as more and more countries transition to the new Covid normal, the data is going to get more and more difficult to interpret. For example, on April Fool’s Day, the UK government ended free Covid testing for most people in England and removed the requirement for infected people to isolate. And no, it wasn’t a joke. Now if people want to know if they have Covid-19, they will have to pay for a test, even if they are symptomatic. Yes, there are some exceptions, but limiting access to testing is not only going to impact the course of the disease in England, it will also show up in the quality of the data recorded on sites like Our World in Data. See how France’s cases are currently increasing again while the UK’s are now dropping? It’s no surprise that has coincided with the removal of free testing.
Hospitalisation data is another really tricky one. Covid-19 hospitalisations can broadly be broken down into several groups. Firstly, there are hospitalisations where Covid-19 is the main reason the person needs to be in hospital. Secondly, there are hospitalisations of people who happen to have Covid but that’s not the reason why they’re in hospital. These are also called “incidental” Covid-19 hospitalisations. Think of people who are pregnant, or who have had an injury of some sort. Then there are hospitalisations where Covid-19 infection is a contributor to their need to be in hospital. Maybe the person has a chronic illness that has been made worse, or is revealed, by them catching Covid. If it weren’t for their Covid-19 infection, these people may not need to be in hospital.
In young children, “precautionary” hospitalisations are quite common. This is where children, typically infants, are admitted to hospital as a precautionary measure without giving them any specific treatment, to observe them and make sure that they do not deteriorate. Precautionary hospitalisations among very young children can also happen because medical guidelines routinely recommend an admission while doctors wait for the results of lab tests, for instance blood cultures, to come back. Finally, infected children can be hospitalised because their parents or caregivers are unwell with Covid-19 and are too ill to look after them, not because the children themselves need medical care. These are referred to as “social admissions”. This was a very big issue with delta, but hasn’t been such a problem with omicron. The raw numbers of hospitalisations that are often reported don’t distinguish between these different types of admissions.
Not all hospitalisations due to Covid-19 are the same either. A person with severe Covid-19 pneumonia may take several days to weeks to be well enough to go home, while someone with a bad bout of vomiting and diarrhoea due to Covid-19 may recover swiftly within 48 hours after intravenous fluids and anti-nausea medication. These all get lumped together in the reporting too.
As we said earlier, the majority of our Covid-19 testing is now being done using rapid antigen testing. Coupled with the increasing transmissibility of the virus making contact tracing every close contact pretty much impossible, this means we don’t know the true number of people who have or have had Covid. And that means working out the proportion of Covid-19 infections that lead to hospitalisation, or death, has become very difficult. The “denominator” – the number of cases – is no longer accurate.
For instance, at the time we wrote this article, the Ministry of Health reports that there have been 281 hospital admissions among 136,393 infections among young people aged 10-19 years since the August delta outbreak. If we try to use the raw numbers to estimate the hospitalisation ratio, we’d end up calculating that Covid-19 put 0.2% of infected people in this age group in hospital. But this would definitely be an overestimate because those 281 hospitalisations include incidental Covid-19 admissions, and we can be pretty certain that there have been many more infections than the 136,393 reported to the ministry.
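The denominator problem above is easy to see with a little arithmetic. This sketch uses the Ministry of Health numbers quoted in the text; the “true infections” multipliers are hypothetical, purely to show how sensitive the ratio is to the undercounted denominator:

```python
# Ministry of Health figures quoted above (10-19 year olds,
# since the August delta outbreak).
hospitalisations = 281          # includes "incidental" Covid admissions
reported_infections = 136_393   # an undercount of true infections

# Naive ratio from the raw numbers: roughly 0.2%, as in the text.
naive_ratio = hospitalisations / reported_infections
print(f"Naive hospitalisation ratio: {naive_ratio:.2%}")

# If true infections were 2x or 4x the reported count (assumed
# multipliers, for illustration only), the ratio shrinks accordingly:
for multiplier in (2, 4):
    ratio = hospitalisations / (reported_infections * multiplier)
    print(f"With {multiplier}x true infections: {ratio:.2%}")
```

Because the numerator is also inflated by incidental admissions while the denominator is deflated by unreported infections, the naive 0.2% figure is an overestimate on both counts.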
To sum up, the real-time global data on Covid is getting harder and harder to interpret. Looking at the graphs on sites like Our World in Data, it can be easy to misinterpret what’s happening in a given country without knowing the local context. And to really understand it takes multiple perspectives. It all very much reminds us of this version of a cartoon by David Somerville.
As our borders open and we adjust to a world with one more highly infectious disease in it, we need good-quality data more than ever. Here in New Zealand we desperately need our own version of the UK’s ONS prevalence survey, which regularly gets nose and throat swabs as well as blood from a representative sample of people in the UK. Without this, how will we know what the true rate of Covid-19 infection is in New Zealand, and what variants are circulating? We also need better data to understand how the pandemic may influence people’s health and wellbeing in the longer term. To do that, we need to think about the things we aren’t directly measuring. The pandemic has upended everyone’s lives and the true impact won’t be fully understood for many years.