COVID-19 has turned our professional and personal lives upside down. But it’s also giving us a crash course in using artificial intelligence (AI) to predict and navigate an uncertain future. While there’s still a lot to learn, we believe so strongly in the potential for AI to lessen the social and economic impacts of COVID-19 that we’ve partnered with the XPRIZE Foundation to launch the Pandemic Response Challenge.

During this four-month challenge, participants will use the latest in AI to develop models that accurately predict local outbreaks and produce prescriptive intervention and mitigation approaches that minimize both infections and economic costs.

The competition will produce evidence-based, actionable plans to be shared as a public resource, so that communities around the world can learn from each other to safely reopen their societies and economies. The two winning teams will be those whose prediction models estimate future daily COVID-19 case counts most accurately and whose prescription models generate the best intervention plans.

Keeping Up with Continuous Change

A key challenge for these types of endeavors is the unrelenting change in everything from infection rates to data quality. What we’re learning from both our clients and our own work is the need to more frequently and carefully update both our data and the models used to analyze it.

Our recent study of 600 North American executives reinforces this. Roughly three-quarters of respondents changed their data management processes and analytic models in the wake of the pandemic. The most common issues were analytics and models that were inflexible, out of date, or overly weighted toward pre-COVID data.

As a result, about two-thirds of respondents said they are developing new analytics and models, refreshing their models and data, and integrating new data types such as geo-location and social media more often than before.

These efforts are paying off. Almost 60% of respondents said they were more confident than before in the accuracy of the projections from their analytics and models, and 68% said their companies were relying on those projections more than before.

Fine-Tuning Data and Analytics

In our ongoing work to leverage AI to navigate not just the pandemic but also any complex business challenge, we’ve learned five valuable lessons about using data and AI models: 

  • Even inexact data can be useful, as long as it’s consistently inexact. For instance, if you suspect a country or community is underreporting COVID-19 infections but its reported numbers rise over time, you can still use that data to infer the underlying trend (see the first sketch following this list). The same holds, for example, when assessing quality trends from a plant manager who consistently underreports poor production yields. Drawing on data from multiple sources also helps you sense the underlying trends beneath the noise of poor-quality data.
  • Running your models with more varied data makes them more robust. It may seem like a good idea to train a model to specialize in a particular geography or time period; however, it’s often better to expose the model to more varied situations, i.e., data from other geographies and periods as well. That way, rather than just repeating specific past experiences, the model can develop a better understanding of the underlying causes and effects, which allows it to perform more robustly in the situation of interest (see the second sketch following this list). We run our COVID models continuously across all geographies, with all the data available for analysis each day.
  • The biggest hurdle to effectively using AI is the lack of ongoing budget and commitment to data and model updates. Even when our customers see compelling value from an AI demo, they often don’t follow up because they’re used to thinking of analytics as a one-off activity rather than an ongoing process. Success requires formalizing the processes, staff, skills and budget to make such updates routine.
  • Managing the granularity of your analysis is as much an art as a science. It’s all about balance: Focus your data and models too narrowly on one geography or market, and you risk being misled by poor data or local, transient events. Expand them too broadly, and you risk losing the specific insights you need to, for example, understand disease spread in countries with different levels of compliance or the needs of customers in different geographies or cultures.
  • Match the frequency of your data and model updates to your needs. Sometimes, as with COVID, you will need to make updates every day as new data arrives, old data is revised or conditions change, such as a new outbreak. Tracking customer preferences in clothing, by contrast, might require updates only when the seasons change, and in consumer electronics only when a new product is developed or introduced. In each case, you will need to ensure the quality of the data and, in some cases, transform it from legacy formats into a form that’s manageable for both modern databases and AI algorithms (see the third sketch following this list).
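
To make the first lesson concrete, here is a minimal sketch in Python, using made-up numbers rather than data from our models: a constant underreporting factor only shifts the intercept of a log-linear fit, so the estimated growth rate comes through intact.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical daily case counts growing at ~5% per day.
    days = np.arange(60)
    true_cases = 1000 * np.exp(0.05 * days)

    # A region that consistently reports only ~30% of true cases,
    # plus some day-to-day reporting noise.
    reported = 0.3 * true_cases * np.exp(rng.normal(0, 0.1, size=days.size))

    # Fit a log-linear trend to the *reported* series.
    slope, intercept = np.polyfit(days, np.log(reported), 1)
    print(f"estimated growth rate: {slope:.3f}")  # close to the true 0.05

    # The constant 0.3 factor changes only the intercept, not the slope,
    # which is why consistently inexact data still reveals the trend.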
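The second lesson can be illustrated the same way. In this hypothetical sketch (scikit-learn on synthetic data; the feature names in the comments are illustrative), several small regional datasets share one underlying cause-and-effect relationship, and a model trained on the pooled data generalizes to an unseen region better than a model trained on a single region. Exact scores will vary with the random seed.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(42)

    # A cause-and-effect relationship shared by all regions (unknown to the model).
    true_effect = np.array([2.0, -1.5, 0.5])

    def make_region(n=30):
        """Hypothetical (features, target) pairs; in a real pipeline the
        features might be mobility, mandates, weather and so on."""
        X = rng.normal(size=(n, 3))
        y = X @ true_effect + rng.normal(scale=3.0, size=n)
        return X, y

    regions = [make_region() for _ in range(6)]
    X_new, y_new = make_region(n=500)  # a geography neither model has seen

    # A model specialized to one region vs. one trained on pooled data.
    narrow = Ridge().fit(*regions[0])
    pooled = Ridge().fit(np.vstack([X for X, _ in regions]),
                         np.concatenate([y for _, y in regions]))

    print("single-region model, R^2 on new region:", narrow.score(X_new, y_new))
    print("pooled model, R^2 on new region:", pooled.score(X_new, y_new))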
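And as a small illustration of the last lesson’s data-transformation point, here is a sketch (pandas; the table contents are invented) of reshaping a legacy wide-format export, one column per date, into the tidy one-observation-per-row form that modern databases and model code expect.

    import pandas as pd

    # Hypothetical legacy export: one row per region, one column per date
    # (the wide format many public COVID data feeds used).
    legacy = pd.DataFrame({
        "region": ["A", "B"],
        "2020-03-01": [10, 4],
        "2020-03-02": [13, 6],
        "2020-03-03": [17, 9],
    })

    # Reshape to one observation per row and parse the dates, a form
    # that's far easier for databases and AI algorithms to consume.
    tidy = legacy.melt(id_vars="region", var_name="date", value_name="cases")
    tidy["date"] = pd.to_datetime(tidy["date"])
    print(tidy.sort_values(["region", "date"]))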

For the last century, analytics activities have relied on static mathematical formulas sifting through static data. Today, the foundation of analytics is data, analyzed by ever-improving and evolving models. In this new world, organizations can make the most complex and important decisions of our day, such as how to balance reopening economies while protecting human health. Agile analytics enabled by AI can tell us how to reach those goals – but only if we give the underlying data and models the care they need.    

Learn more about our XPRIZE Foundation challenge.

Risto Miikkulainen

Risto Miikkulainen is Associate VP of Evolutionary AI at Cognizant and a Professor of Computer Science at the University of Texas at...