The new Transparency Index (T-index) shows that official pandemic fatality figures are deeply flawed in many countries, starting with Russia
Once a war starts, propaganda becomes a chief weapon, so it is impossible to know the number of casualties from a single source. But even before the Russian-Ukrainian war, truth seems to have been among the first casualties of the fight against the COVID-19 pandemic. A real information war has been waged not only over the source of the virus – which, after all, remains controversial after one hundred years even for the ‘Spanish’ flu – and over its label (‘Chinese virus’ versus corona versus COVID-19), but also over the death count, which has become the object of many controversies. Officially, Russia had a moderate number of casualties. By other estimates, however, it leads the world in the number of COVID-19-related deaths. While waiting for historians to shed light on the real number of war casualties, could we at least try to shed some light on the pandemic casualty figures? Yes, we can!
For instance, by the spring of 2022, the United States’ death count from the COVID-19 pandemic had reached one million. In a population of 330 million, this represents 0.3%. Scary as the figure is, it is far less than we might expect: the Spanish flu had a crude mortality rate of 2.7% of the world population at the time, killing 50 million people. But probably not everybody caught the Spanish flu, just as not everybody was infected with COVID-19. To really compare the killing potential of the two viruses, we have to calculate how many people died out of the total infected: the case fatality ratio (CFR). That seems simple: one just divides confirmed COVID-19-related deaths by confirmed COVID-19 cases. In the absence of hard evidence that some populations are genetically more vulnerable, the differences in CFR across countries would then allow us to map the effectiveness of government responses. Under a null hypothesis in which all governments perform similarly in dealing with the infection, we would encounter similar CFRs across countries, controlling for the number of months the pandemic has lasted in each case and for the age structure of the population.
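The naive calculation described above is just reported deaths divided by confirmed cases. A minimal sketch, with illustrative placeholder counts rather than real surveillance data:

```python
# Naive case fatality ratio (CFR): confirmed deaths / confirmed cases.
# The counts below are illustrative placeholders, not real surveillance data.
def case_fatality_ratio(confirmed_deaths: int, confirmed_cases: int) -> float:
    """Return the CFR as a percentage of confirmed cases."""
    if confirmed_cases <= 0:
        raise ValueError("confirmed_cases must be positive")
    return 100.0 * confirmed_deaths / confirmed_cases

# Example: 1,000,000 deaths out of 80,000,000 confirmed cases -> 1.25%
print(case_fatality_ratio(1_000_000, 80_000_000))
```

As the following paragraphs argue, both the numerator and the denominator of this ratio are unreliable, which is the core problem with cross-country CFR comparisons.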
But it is not so simple. In fact, we know neither the total number of confirmed cases (the number of tests varies wildly across countries, by a margin we cannot measure accurately) nor the number of confirmed deaths. Belgium, for instance, topped the ranking of COVID-19 deaths per one million population for quite a while, and defended itself from the accusation of losing more people than any other country by pointing to the comprehensiveness and transparency of its reporting. Distinguishing dying with COVID-19 from dying from COVID-19 requires a thorough investigation into the cause of death, beyond citing a positive SARS-CoV-2 test, which many countries do not undertake for lack of resources or for fear of panicking their populations. The simple CFR is thus not really useful for understanding either the toll that countries paid to the pandemic or the performance of health systems in controlling the disease. The BBC, together with researchers from the UN’s Economic Commission for Africa (UNECA), surveyed death reporting on the occasion of this pandemic and found that, except for Albania and Monaco, all European countries have functioning, compulsory and universal civil registration and vital statistics (CRVS) systems, but just over half of Asian countries and only a minority of African countries do.
Research explaining the fatality rate of COVID-19 has so far focused on three key groups of determinants and the social and political factors that shape them: the baseline characteristics of the population and the communities they live in; governments’ response policies; and the capacity of health care systems. We argue in this paper that any inference drawn from official fatality figures, whether the debate concerns the sickness or the response to it, needs controls for the transparency of data and the quality of governance. Indeed, given the wide variation in the quality of government around the world, it is naïve to presume that one can absolutely trust figures reported by governments (such as Russia or China, on one hand, or low-capacity states in Africa, on the other) rather than collected and observed by an independent source. Trackers of government performance, such as Oxford’s, cover a variety of government responses (regulations, testing numbers, contact tracing), but the contextual differences within which such measures operate, such as the rule of law or the degree of informality that affects hygiene, may be strong intervening variables. Such context, if known, would allow us to discount Belgium’s casualty figures relative to a country that reports far fewer infections and deaths despite having, in general, a far worse life expectancy. And, of course, that would lead to a totally different assessment of the effectiveness of Belgium’s response to the pandemic.
Specific governance measures do exist that could and should be used to control national fatality reports, allowing better cross-country comparison and a better understanding of the evolution of the pandemic. The recent fact-based Index of Public Integrity (IPI) and Transparency Index (T-index), developed by the European Research Centre for Anti-corruption and State-building (ERCAS) at the Hertie School in Berlin, measure transparency on the basis of objective indicators, not perceptions, and could be used for this purpose.
Attempts to measure real transparency have generally come by sector: statistical data made available to the World Bank by countries, pharmaceutical sector data, party finance data, procurement data. Such measurements have the advantage of specificity and actionability, but they are hard to come by for many countries. A global measurement has so far only been available through proxies: the UN E-Government Survey and measures based on freedom of information (FOI) acts are the most widely used. They differ at first sight but, in fact, belong to the same category of legal, de jure transparency measurements (the existence of laws instituting freedom of information, or of specific obligations and provisions to this effect). The UN survey, which measures e-government (looking at the government’s ICT infrastructure as well as human capital) and ‘e-participation’ (transparency and openness), consists of self-reports by governments through the METEP questionnaire, based exclusively on regulation and organization (laws and decisions). A measurement of real transparency would allow establishing benchmarks of transparency, and thus inform a very specific reform agenda, offer an international ranking based on facts rather than perceptions, and enable policy-relevant research. It would allow relating transparency to different policy areas, such as public health.
The T-index, created by ERCAS in 2021 for 129 countries (after dropping Afghanistan), is precisely that. Similar to the World Bank statistics paper by Hollyer et al., the T-index is based on the direct observation of public data, its accessibility and coverage: the practice of transparency rather than just its legal provision. The specific information monitored was selected from the universe of unclassified information on the basis of the United Nations Convention against Corruption (UNCAC) and Sustainable Development Goal 16, which both make transparency a crucial point of government accountability. The T-index directly observes fourteen web pages that any country should offer, covering all national data and free of access, from public expenditures and procurement to online data repositories on ownership of businesses and land estates. Additionally, the older Index of Public Integrity (IPI) measures a society’s capacity to enforce public integrity (or its corruption risk) using six different factors shown by previous literature to serve as either enabling or disabling circumstances of corruption. The IPI and the T-index both have very high internal validity and correlate at 70%, as they measure overlapping but different concepts.
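To fix ideas, an aggregation of this kind can be sketched as a checklist score over directly observed web pages. This is a minimal sketch under our own simplifying assumptions (equal weights, binary pass/fail, and a hypothetical subset of the fourteen monitored categories); it is not ERCAS’s exact coding scheme:

```python
# Illustrative T-index-style aggregation: each monitored web page that is
# found online, with national coverage and free access, counts equally.
# The equal-weight rule and category names are our assumptions, not the
# actual ERCAS methodology.
def t_score(observations: dict[str, bool]) -> float:
    """Share of monitored pages that pass all checks, as a 0-100 score."""
    return 100.0 * sum(observations.values()) / len(observations)

# Hypothetical subset of the fourteen monitored categories:
obs = {
    "public_expenditure": True,
    "public_procurement": True,
    "business_ownership_register": False,
    "land_registry": False,
}
print(t_score(obs))  # -> 50.0
```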
Using the newly released T-index and the official fatality data reported by countries and published by the WHO (where Belgium tops the world), we indeed find a strong, significant association between higher casualties and transparency (a 60% correlation coefficient, with 40% of the variation explained in a bivariate regression). Belgium, with high transparency and high fatality, appears worse than Uzbekistan, where both fatality and transparency are very low, for instance. Countries in the upper right corner have high fatalities and high transparency, including Belgium, Estonia, France, Spain and many Central European countries, while those close to the line on the left (like Yemen or Venezuela) appear to have fewer deaths and low transparency. Unless we presume that government transparency kills people, the obvious explanation is that reported deaths depend to a very large extent on how transparent a government is and how well it reports its fatalities.
Figure 1. The association between transparency and official death figures
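The bivariate check behind Figure 1 can be sketched as a simple Pearson correlation between a transparency score and reported deaths; the values below are made-up illustrations, not the actual T-index or WHO data:

```python
# Pearson correlation between transparency scores and reported deaths
# per 100k. All values are hypothetical illustrations.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

transparency = [20, 35, 50, 65, 80, 95]          # hypothetical T-index scores
deaths_per_100k = [30, 60, 110, 160, 230, 280]   # hypothetical reported deaths

r = pearson_r(transparency, deaths_per_100k)
# r is the correlation coefficient; r**2 is the share of variation explained
# in a bivariate regression (the paper reports ~60% and ~40%, respectively).
print(round(r, 2), round(r ** 2, 2))
```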
This striking finding might still be a coincidence, despite the magnitude of the association. We next introduce some minimal control variables to explain deaths: the age structure of the population, since COVID-19, unlike the Spanish flu, targeted older and more vulnerable people, and health expenditure, the obvious control for the capacity of a health system (which largely explains the number of intensive care hospital beds, diagnostic capacity, etc.). All determinants are highly significant and remain so in the multivariate model, with transparency scores alone explaining at least 32% of the variation in reported official deaths per 100,000 across the 127 countries for which we have data, slightly more than the second most powerful predictor, the share of people over 65 years of age in the total population. In other words, even when countries are similar in age structure and health expenditure, more transparent countries report a higher number of casualties due to COVID-19, which we interpret as indicating the higher accuracy and transparency of their death reporting. This also indicates that using these figures without controlling for data accuracy is flawed and may lead to misinterpretations and erroneous conclusions by doctors and public health officials who rely on such data.
Table 1. Multivariate regression explaining the official death count as of February 2022
Legend: OLS regression with total deaths per million (log) as dependent variable.
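A multivariate check of this kind can be sketched with ordinary least squares. This is a minimal illustration on synthetic data: the coefficient values, variable ranges and randomly generated outcome below are placeholders, not the paper’s dataset:

```python
# OLS sketch: regress (log) reported deaths on transparency plus the two
# controls from Table 1. All numbers are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 127                                   # countries with complete data
transparency = rng.uniform(0, 100, n)     # hypothetical T-index scores
share_over_65 = rng.uniform(2, 25, n)     # % of population aged 65+
health_exp = rng.uniform(50, 5000, n)     # health expenditure per capita

# Synthetic outcome in which transparency drives reported deaths,
# mimicking the direction (not the size) of the paper's estimates.
log_deaths = (0.02 * transparency + 0.03 * share_over_65
              + 0.0001 * health_exp + rng.normal(0, 0.3, n))

X = np.column_stack([np.ones(n), transparency, share_over_65, health_exp])
coef, *_ = np.linalg.lstsq(X, log_deaths, rcond=None)
print(dict(zip(["intercept", "transparency", "age65", "health_exp"],
               coef.round(3))))
```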
So, the official fatality data cannot be trusted. But what about the alternative? Researchers developed the concept of excess mortality, a metric that compares all recorded deaths with those expected to occur, reviewed by the leading scientific journal Nature. That too has limitations, as statistics on expected or actual deaths are either not accessible or not reliable for more than 100 countries, so the number of covered countries declines. Such alternative figures (based on door-to-door surveys, machine-learning models and even satellite images of new graves) are computed as deaths per 100,000 or as the percentage of estimated deaths relative to expected deaths (the p-score). The end results are figures such as the widely cited excess deaths dashboard of The Economist, which are two to four times higher than the official ones. Testing the T-index again against these available measurements (per 100,000 as well as the p-score), we find it insignificant this time. Excess deaths do not correlate with transparency, although they do correlate with other governance indicators (for instance, public integrity or corruption), which is normal, since they are in fact meant to adjust for the inaccuracy of death reports. The association between fatalities (excess deaths) and governance now looks more normal, with an inverse correlation (more corrupt countries have higher casualties): Tajikistan sits in the upper left-hand corner, with more deaths and low transparency, and New Zealand in the lower right-hand corner. Alongside a handful of other countries (Singapore, Taiwan, Australia, Liechtenstein…), New Zealand’s excess deaths figure is negative – that is how accurate its reporting and how good its management of the pandemic were. The association between governance quality, proxied by corruption, and excess deaths holds when controlling for health expenditure, vaccination rate, HDI and religion.
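The p-score mentioned above is a simple ratio: percent excess deaths relative to the expected baseline. A minimal sketch with invented numbers (note how the New Zealand-type case comes out negative):

```python
# Excess mortality p-score: percent deviation of observed all-cause deaths
# from the expected (pre-pandemic baseline) deaths. Numbers are invented.
def p_score(observed_deaths: float, expected_deaths: float) -> float:
    """Percent excess deaths relative to the expected baseline."""
    return 100.0 * (observed_deaths - expected_deaths) / expected_deaths

print(p_score(130_000, 100_000))   # 30.0: 30% more deaths than expected
print(p_score(95_000, 100_000))    # -5.0: fewer deaths than expected
```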
This is worth further in-depth study, as this pandemic has made the issue of governance quality more salient than ever.
Figure 2. The association between public integrity and real (‘excess’) deaths figure
The governance context is, of course, grounded in a development context, and disentangling the two needs further, more in-depth research at the level of each country. However, our point in this short paper was to show that any research on COVID-19 fatalities, and any assessment of the pandemic’s toll and of the impact of government responses, can only start from accurate fatality figures. Transparency and accuracy of public health figures are indispensable for an effective response, not only during a pandemic but also before and after.
Kontis, V., Bennett, J. E., Rashid, T., Parks, R. M., Pearson-Stuttard, J., Guillot, M., … & Ezzati, M. (2020). Magnitude, demographics and dynamics of the effect of the first wave of the COVID-19 pandemic on all-cause mortality in 21 industrialized countries. Nature Medicine, 26(12), 1919-1928.
Hollyer, J. R., Rosendorff, B. P., & Vreeland, J. R. (2014). Measuring transparency. Political Analysis, 413-434.
Mugellini, G., Villeneuve, J. P., & Heide, M. (2021). Monitoring sustainable development goals and the quest for high-quality indicators: Learning from a practical evaluation of data on corruption. Sustainable Development, 29(6), 1257-1275.
See Mungiu-Pippidi, A., & Dadašov, R. (2016). Measuring control of corruption by a new index of public integrity. European Journal on Criminal Policy and Research, 22(3), 415-438. Database accessible at Public Integrity Index.xlsx (live.com), last accessed 30.03.2022.
Karlinsky, A., & Kobak, D. (2021). Tracking excess mortality across countries during the COVID-19 pandemic with the World Mortality Dataset. eLife; ERCAS database available at Transparency Index.xlsx (live.com), both last accessed 20.03.2022.
Our World in Data COVID-19 dataset, accessible at https://github.com/owid/covid-19-data/tree/master/public/data; T-index dataset, accessible at Transparency Index.xlsx (live.com); health expenditure per capita from the World Bank, for the year 2018, accessible at https://datos.bancomundial.org/indicador/SH.XPD.CHEX.GD.ZS. All data sources last accessed 20.02.2022.
Viglione, G. (2020). How many people has the coronavirus killed? Nature, 585, 3 September 2020, available at How many people has the coronavirus killed? (nature.com), last accessed 20.03.2022.
COVID-19 Excess Deaths Tracker, https://www.economist.com/graphic-detail/coronavirus-excess-deaths-tracker, last accessed 20.03.2022.
Ritchie, H., Mathieu, E., Rodés-Guirao, L., Appel, C., Giattino, C., Ortiz-Ospina, E., Hasell, J., Macdonald, B., Beltekian, D., & Roser, M. (2020). “Coronavirus Pandemic (COVID-19)”. Published online at OurWorldInData.org. Retrieved from: https://ourworldindata.org/coronavirus, last accessed 20.03.2022.
See Mungiu-Pippidi, A., & Dadašov, R. (2016). Measuring control of corruption by a new index of public integrity.