If you asked someone what a positivity rate was back in 2019, you’d probably get a blank stare and some response like…
“Um…, you mean how positive am I?”
Now, in 2021, unless the person you ask has been living under a rock, you’ll almost certainly hear a semi-informed answer, one that public health professionals have had on the tips of their tongues for a long time.
How about type 2 errors? Predictive analytics? Data dashboards?
Get real. The average Joe didn’t have a lot of exposure to these things in 2019.
But here we are in 2021, and I’d bet you money that you, like most people, have been exposed to all three of those, even if you didn’t recognize it.
The COVID Data Goldrush & The Drive To “Semi” Nerd-dome
TL;DR — If there’s a need, there’s a will to become a “semi” data nerd.
Back in the initial stages of the pandemic, there was an absolutely insatiable desire to understand the dangers of COVID-19. With that rush of attention came a rush to find data, and the words to start talking about it.
The first term was quite easy: case numbers. Everyone could understand it, but as cases started to increase alongside testing, vocabularies had to be updated:
- Hospitalization Rates / Counts
- Death Rates / Death Count
- R0 — “R naught” and Exponential Growth Factors
- Herd Immunity Rates
To understand these terms, we needed numbers. To have numbers, we needed reliable data, not just for data scientists and policymakers, but for everyone.
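To make a couple of the terms above concrete, here is a minimal Python sketch (the function names and toy numbers are my own, and the growth model is a deliberately naive illustration, not an epidemiological one):

```python
def positivity_rate(positive_tests: int, total_tests: int) -> float:
    """Share of administered tests that came back positive."""
    return positive_tests / total_tests


def projected_cases(current_cases: float, r0: float, generations: int) -> float:
    """Naive exponential growth: each case infects r0 others per generation."""
    return current_cases * r0 ** generations


# 500 positives out of 10,000 tests -> 5% positivity
rate = positivity_rate(500, 10_000)

# 100 cases with R0 = 2, after 5 generations of spread -> 3,200 cases
cases = projected_cases(100, 2.0, 5)

print(f"positivity rate: {rate:.1%}")   # 5.0%
print(f"projected cases: {cases:.0f}")  # 3200
```

Even this back-of-the-envelope version shows why R0 near 2 alarmed people: the case count doubles with every generation of infections.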
The Data “Nerdification” Begins
In the beginning, everyday people needed data to make difficult and sometimes life-altering decisions:
- Should I shut down my business if my community has X number of cases?
- What’s the risk for me if I go out in my community?
- How effective are masks, and should I wear one?
- How likely am I to die if I get infected?
People needed reliable data, but every initial data source came with HUNDREDS of caveats. Researchers scrambled to find any accurate data sources for COVID-19. It was the wild, wild west, with many unreliable sources popping up.
One tool that “helped” a lot of business owners was the “Should I shut down my business?” tool. The idea was simple: provide a Google Sheet that entrepreneurs and business owners across the US could use to gauge risk and decide whether to close. The intentions were good: look up your locality and see the risk of interacting with someone COVID-19 positive on a given date. The problem was that no one knew how to use such a tool, and its UI was quite complex.
Although it didn’t really help much mostly due to its UI issues, it did show one thing: people were hungry for data.
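For what it’s worth, the core arithmetic a tool like that rests on is simple, even if the UI wasn’t. Here is a hypothetical sketch (not the actual sheet’s formula; the prevalence and contact numbers are invented) of estimating the chance of crossing paths with at least one infectious person:

```python
def encounter_risk(prevalence: float, n_contacts: int) -> float:
    """Probability that at least one of n_contacts is currently infectious,
    assuming independent encounters at the given local prevalence."""
    return 1 - (1 - prevalence) ** n_contacts


# Made-up example: 1 in 200 people in your county is infectious,
# and your shop sees 50 customers a day.
risk = encounter_risk(1 / 200, 50)
print(f"daily chance of at least one infectious visitor: {risk:.0%}")  # ~22%
```

The surprising part for many business owners was exactly this: even a low local prevalence compounds quickly across dozens of daily contacts.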
To fill what could be argued was a market gap, companies and data scientists started throwing together COVID-19 dashboards and news briefs. With each new exposure to these forms of data and analysis, people ever so slowly became data “nerdified”.
Fast forward 14 months, and it is now common to hear people discussing statistics out in the open and Googling for sources to confirm the data. It’s commonplace. Overall, people have become more accustomed to using data to inform their everyday decision-making, even if it is just to avoid dying from a highly infectious disease. This is good for everyone, and I argue that this opportunity shouldn’t be wasted.
Can We Continue Being “Semi” Data-Driven Nerds?
Data has been informing everyone’s decisions during the pandemic, and as a society, we should hope this slight nerdification continues into the future.
We should hope that people can continue to use data to…
- make more-informed health decisions.
- reduce personal and business risks.
- make more money for their businesses.
Being forced to dip our collective toes into the fields of public health and data science this past year has been a silver lining of the pandemic. My hope is that everyone can continue to improve their data literacy, and their knowledge in other subjects as well, and that the data “nerdification” can continue.