January 24, 2021
It won't be easy finding the relevant and precise data needed to end COVID, but that's part of the work we're doing with the Pandemic Response Challenge.
As an advisory board member for the Pandemic Response Challenge launched by Cognizant and XPRIZE, I’m all too aware of the complexities involved in accurately predicting how policy interventions will change the trajectory of COVID-19. A great deal of data must be considered to make these predictions accurately, and that data can be hard to find – if it exists at all.
The fact is, government policy is just the tip of the iceberg when it comes to understanding the pandemic. We already know that similar policies can have vastly different outcomes in different countries. Compare Singapore and France, two countries with similar overall responses as measured by the Oxford COVID-19 Government Response Tracker (OxCGRT), but very different case trajectories.
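The point can be made concrete with a toy comparison. The sketch below uses invented numbers – not real OxCGRT or case data – to show how two countries with near-identical stringency scores (OxCGRT's 0–100 index of policy strictness) can still diverge sharply in outcomes:

```python
# Illustrative sketch only: all numbers below are invented, not real
# OxCGRT stringency values or reported case counts.

def mean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

# Hypothetical weekly stringency-index values (0 = no measures, 100 = strictest)
stringency = {
    "Country A": [78, 80, 81, 79],
    "Country B": [77, 79, 82, 80],
}

# Hypothetical new cases per 100k people over the same weeks
cases_per_100k = {
    "Country A": [4, 3, 2, 2],
    "Country B": [150, 180, 210, 190],
}

for country in stringency:
    print(f"{country}: mean stringency {mean(stringency[country]):.1f}, "
          f"mean weekly cases/100k {mean(cases_per_100k[country]):.1f}")
```

Policy stringency alone leaves the gap between the two trajectories unexplained; that is exactly the kind of missing context additional data would need to capture.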
Further, even if a certain policy worked well in the past, that doesn’t mean it will work again in future contexts. Predicting the impact of today’s policy choices will only grow more complicated as vaccines slowly roll out in the next 12 months. In short, context matters, which makes it crucial to get the data on that context.
It’s all about the data
This is why the Pandemic Response Challenge is so important: we’ll learn from participants which data best strengthen their models. There are a few areas where I believe additional data will make a big difference in producing highly accurate prediction models of daily COVID-19 cases and the best prescription models for intervention plans.
Finding the data
None of these ideas are easy. If they were, someone else would already be doing it.
With the exception of aggregate-level mobility data (such as that published by Google and Apple) and testing statistics (such as those collected by Our World in Data), I don't know of any good data sources on these issues that cover every country in the world. That is part of the challenge, and any contribution of additional data would be meaningful.
But I am optimistic that teams will rise to the challenge. I could imagine, for instance, creative teams scraping social media posts to analyze public sentiment and willingness to comply with stricter measures. Or perhaps using satellite imagery to track patterns of activity (cars on the road; night-time lights) or creating maps of population density over time. Air traffic and import records might show how international flows correlate with transmission. Financial transaction microdata might indicate the extent to which people in a village will comply with a lockdown. Global survey responses might tell us how values differ around the world, or how much the general population actually comprehends public health campaigns.
The Pandemic Response Challenge isn't about figuring out what worked in the past. It’s much harder than that. It's about trying to predict what will work in the future. I’m convinced that the best entrants won't just have the most efficient machine-learning algorithm to crunch through our OxCGRT policy data – they will also find the most novel, relevant and precise additional data to strengthen their models.
This blog was adapted from a post that originally appeared on the XPRIZE website.
Learn more about our Pandemic Response Challenge with XPRIZE, a $550K, four-month challenge that focuses on the development of data-driven AI systems to predict COVID-19 infection rates and to prescribe intervention plans.
Or listen to our podcast, in which author and entrepreneur Jason Stoughton speaks to blog author Toby Phillips and Bret Greenstein, Senior Vice President of AI and Analytics at Cognizant, about the XPRIZE Pandemic Response Challenge.