What the GenderSci Lab learned from 8 months of tracking state reporting of socially relevant variables in COVID-19 outcomes

By Mimi Tarrant


This post is the second in a three-part series:

Blog 1: Introduction to the GenderSci Lab US State Covid-19 Report Card and The Final Report Card: The “State” of Affairs as of February 2021

Blog 2: What the GenderSci Lab learned from 8 months of tracking state reporting of socially relevant variables in COVID-19 outcomes

Blog 3: Introducing a new state “Report Card” and The First New Report Card: Results from April 2021


In the first blog post in this series, we described the creation of the Report Card and outlined our findings for February 2021. Here, we summarize our general reflections from 8 months of tracking data collection and reporting through the current iteration of our Report Card. First, we describe the challenges of data collection; then we explore the changes we observed in each state's reporting of socially relevant variables. What we learned, in short, is that the variables a state reported at the beginning of our tracking period were generally the variables it continued to report throughout the pandemic. With little change in reporting observed in our latest Report Cards, it is clearly time to revise our methods and tracking as we look to the next phase of the pandemic in the U.S.

Challenges of data collection

Collecting data for our Report Cards involved many challenges. These challenges expose poor data practices by state health departments and limit the interpretability of our Report Cards.

First, the accessibility of the data available on state websites varied dramatically from state to state, as there is no uniformity in how states report data. Some states, such as Arkansas, use interactive dashboards to present their data. Others, such as Minnesota, only provide static PDF uploads, which can be downloaded from their state health department websites. Some states, such as Alaska, report certain socially relevant variables on interactive dashboards and other variables only within downloadable CSV or Excel files. A final example is New Mexico, which reports certain socially relevant variables in its dashboard and others under its ‘Epidemiological Reports’, found through a hyperlink on its main webpage.
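To illustrate why this collection could not be automated through a single pipeline, below is a minimal sketch of a per-state source registry in Python. The access types mirror the state examples above, but the URLs, field names, and code are hypothetical stand-ins for illustration, not the Lab's actual tooling.

```python
# Illustrative sketch only: all URLs and fields are hypothetical stand-ins.
# The point is that each state needs its own access method, so collection
# cannot be automated with a single scraper.
from dataclasses import dataclass

@dataclass
class StateSource:
    state: str
    access: str        # "dashboard", "pdf", "csv", or "mixed"
    url: str           # landing page where data was located
    notes: str = ""    # collector guidance for hard-to-find variables

SOURCES = [
    StateSource("Arkansas", "dashboard", "https://example.gov/ar-dashboard"),
    StateSource("Minnesota", "pdf", "https://example.gov/mn-weekly.pdf"),
    StateSource("Alaska", "mixed", "https://example.gov/ak",
                notes="Some variables only in downloadable CSV/Excel files."),
    StateSource("New Mexico", "mixed", "https://example.gov/nm",
                notes="Check 'Epidemiological Reports' link on main page."),
]

# Only machine-readable sources could even in principle be fetched
# programmatically; dashboards and PDFs required manual review by a
# collector familiar with each site.
machine_readable = [s.state for s in SOURCES if s.access == "csv"]
```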

A second limitation of the Report Card is the variation in the frequency with which states update their data. Certain states, such as North Carolina, update data on socially relevant variables (such as age) on a weekly basis, while others, such as Illinois, update their data portals live as data is entered. For these states, it is relatively easy to discern which socially relevant variables are and are not being reported. For other states, the matter is less straightforward. For example, our December Report Card recorded the presence of interactional data for cases in Montana, based on a report brief released by the state on October 1, 2020. Although the brief was 2 months old, we recorded it as evidence that this type of socially relevant variable had been reported. The brief has since been removed from the website, and Montana no longer receives a point for reporting interactional data for cases.
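To make the point system concrete, here is a minimal sketch of how a snapshot-based score of this kind can be computed, assuming one point per socially relevant variable reported for each of cases and fatalities, consistent with the point totals described in this series; the function and example data are hypothetical illustrations, not the Lab's actual scoring code.

```python
# A minimal sketch of snapshot-based scoring, not the Lab's actual code.
# We assume one point per socially relevant variable reported, for each
# of cases and fatalities.
VARIABLES = ["age", "gender/sex", "race/ethnicity", "comorbidities", "interactions"]
OUTCOMES = ["cases", "fatalities"]

def score(reported):
    """Count variable/outcome pairs evidenced *at the time of collection*.
    A report brief that has since been taken down earns no point, which is
    how a state like Montana can lose a point between Report Cards."""
    return sum(
        1
        for outcome in OUTCOMES
        for variable in VARIABLES
        if variable in reported.get(outcome, set())
    )

# Hypothetical snapshot: 3 variables reported for cases and 2 for
# fatalities yields a score of 5.
example = {
    "cases": {"age", "gender/sex", "race/ethnicity"},
    "fatalities": {"age", "gender/sex"},
}
assert score(example) == 5
```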

Each analysis was therefore unique to the state and website in question, and it proved essential for the data collector to understand the layout of each state health department page in order to determine whether socially relevant variables were being reported. This difficulty in locating data means our Report Card carries an inevitable risk of human error: it is possible that some states were reporting socially relevant variables that we were unable to find, in which case we recorded those states as not reporting them.

Changes in reporting of socially relevant variables across 8 months of the pandemic

Variables that increased

Across 8 months of the pandemic, reporting practices for our socially relevant variables improved only minimally. As of June 2020, 49 states and the District of Columbia reported both age and gender/sex for cases, while 47 states reported race/ethnicity for cases. As of February 2021, only one additional state had begun reporting gender/sex for cases, taking the total to 50, while the number of states reporting race/ethnicity for cases increased to 50. The number of states reporting comorbidities for cases also increased, from 5 in June 2020 to 7 by February 2021.

Fatality reporting practices saw improvements across all socially relevant variables over the tracking period. The number of states reporting age for fatalities increased from 48 to 50, while those reporting gender/sex for fatalities increased from 45 to 49. Race/ethnicity reporting had the largest increase, rising from 43 states in June 2020 to 50 by February 2021. Comorbidity reporting also increased, from 12 to 17 states.

States that improved

At the state level, Hawaii and West Virginia both increased their scores by 4 points, while New Mexico saw the largest increase, reporting an additional 6 variables between June 2020 and February 2021. While these improvements in reporting practices should be lauded, it is important to contextualize them: all three of these states were reporting fewer than 4 variables at the beginning of our Report Card collection. They therefore had significant room for improvement, and their gains reflect states meeting an already low data reporting standard rather than exceeding it.

Variables that saw the least improvement or regressed

Two variables stand out for their lack of improvement from June 2020 to February 2021: interactions between socially relevant variables, for both cases and fatalities. In June 2020, 16 states reported interactions between variables for cases, yet by February 2021 this had decreased to only 13 states. Improvement for fatalities was similarly minimal: the number of states reporting interactional data for fatalities rose from 11 to just 12 over the same period.

The lack of rapid improvement, and in some cases regression, in data reporting practices for variables and the interactions between them is particularly worrying given the importance of interactional data for effective policy responses. As argued in our Health Affairs blog post, data on interactions between socially relevant variables “allows for interpreting socially relevant variables across contexts, improving our understanding of how variables mediate one another”. For example, work by the GenderSci Lab has demonstrated the importance of examining intersectional data for understanding an individual’s risk from a disease. Investigating the variable of gender/sex in isolation for COVID-19 outcomes suggests that men have higher COVID-19 mortality rates than women, yet research shows that this sex disparity does not hold across racial groups: the COVID-19 mortality rate for Black women is higher than that for white men, white women, and Asian/Pacific Islander men and women (Rushovich et al. 2021). This highlights the importance of understanding how race and gender/sex intersect to influence COVID-19 outcomes.

Four states saw a one-point decrease in score over the tracking period: Florida, Minnesota, Rhode Island, and South Dakota. Notably, all four lost their point because they no longer reported interactional data for cases, a worrying trend given the aforementioned importance of intersecting different socially relevant variables.

 

Figure 1: Points scored by each state, in our first (June 2020) and last (February 2021) Report Card. States in green have seen an increase in the number of points scored by increasing the number of socially relevant variables being reported; states in red have reduced the number of variables being reported across the 8-month period.

 

The need for a more sensitive, comprehensive Report Card

Over the tracking period, from June 2020 to February 2021, very little changed. As shown in Figure 1, only 11 states increased the number of socially relevant variables reported across the 8-month period, while 5 states saw decreases. In the first Report Card, 16 states reported interactions for cases, 11 states reported interactions for fatalities, and the average grade was 6.41, or a D. As of February 2021, only 13 states report interactions for cases, 12 states report interactions between variables for fatalities, and the average grade was 6.84. The average grade thus remained a D throughout the 8-month tracking period, although the standard deviation of scores decreased from 1.51 in June 2020 to 1.25 in February 2021, meaning the spread of scores narrowed slightly.
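For readers who want to reproduce these summary statistics, the snippet below shows the calculation using Python's standard library. The score lists are hypothetical placeholders rather than our actual per-state scores, and we use the population standard deviation here; the sample standard deviation would differ slightly.

```python
# Sketch of the summary statistics reported above; the per-state score
# lists are hypothetical placeholders (real scores appear in Figure 1).
from statistics import mean, pstdev

scores_june_2020 = [5, 6, 6, 7, 7, 8, 6, 5, 9, 7]
scores_feb_2021  = [6, 7, 7, 7, 7, 8, 6, 6, 9, 7]

for label, scores in [("June 2020", scores_june_2020),
                      ("February 2021", scores_feb_2021)]:
    # pstdev treats the scored states as the full population of interest;
    # statistics.stdev would compute the sample standard deviation instead.
    print(f"{label}: mean = {mean(scores):.2f}, SD = {pstdev(scores):.2f}")
```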


The lack of improvement in the reporting of socially relevant variables over the observation period, as shown in Figure 1, is disappointing. However, we acknowledge that the COVID-19 pandemic has placed tremendous strain on public health resources, with state public health departments facing unprecedented and unpredictable challenges in the last year. Our Report Card has called attention to the inflexibility that state public health departments currently operate under. The fact that very few states saw a dramatic change in reporting practices over the course of our Report Card analysis suggests that states experienced significant barriers to reporting new variables for cases and fatalities. 

It appears that the variables a state initially recorded and reported were the variables it continued to report throughout the pandemic. This underscores the importance of having sufficient and comprehensive reporting practices in place proactively, in preparation for unexpected epidemiological events such as COVID-19. It is clear that the U.S. was ill-equipped to deal with a large-scale public health disaster going into 2020; indeed, Trump ended a government-funded early warning program for future pandemics in September 2019, delivering a clear message that pre-emptive, proactive public health policies were not a federal priority. The COVID-19 pandemic has highlighted the need to center comprehensive, transparent, and accessible epidemiological data collection practices within future federal and state initiatives. We hope that our Report Card can serve as an archive of the lack of comprehensive reporting practices during this pandemic, and as a future reference to help direct such planning.


Recommended Citation

Tarrant, M. “What the GenderSci Lab learned from 8 months of tracking state reporting of socially relevant variables in COVID-19 outcomes.” GenderSci Blog. 2021 June 21, genderscilab.org/blog/what-we-learned-from-8-months-of-covid-report-card-tracking.

Statement of Intellectual Labor

Tarrant drafted the initial blog post, led the writing process and contributed to data collection and validation. Capri D’Souza and Kai Jillson collected and validated data, and provided edits. Kelsey Ichikawa and Sarah Richardson provided feedback and edits.