It’s no secret that in today’s dynamic digital age, data-driven enterprises that effectively apply advances in artificial intelligence (AI) and analytics technology can increase profitability, improve customer interactions, identify inefficiencies, and spur the development, manufacturing and distribution of high-value, in-demand products and services. 

However, translating these notions into reality is rife with complexity and challenges.

For starters, C-level business leaders must set clear, outcome-oriented goals and define specific success criteria to measure the value created by their data and analytics initiatives. To gain true enterprise-level adoption and reap maximum value, organizations must understand, at a strategic level, their ultimate goals, and then identify the data necessary to achieve them. It’s also critical to develop a focused, outcome-oriented business use case that guides which data to accumulate and what insights to generate from it.

Whether your organization is in the media and entertainment business seeking data-informed insights to help improve box office revenues, or a retailer looking to expand its “share of wallet” by offering smart recommendations, it’s vital to clearly define and socialize your goals, as well as establish a well-thought-out business use case.

From there, businesses need to acquire, store, integrate, prepare and manage that data to ensure that a particular AI and/or analytics initiative delivers true business value. Trying to analyze data without clearly understanding the desired outcome (i.e., the insights sought) is like chasing unicorns in a fantasy data management game.

Moving Quickly But In the Right Direction

Over the years, I’ve seen far too many organizations try to run before they can walk, jumping feet-first into the latest and greatest technology solutions that promised better control over their data and more actionable organizational insights. While this can succeed at a departmental, regional or even business-unit level, undertaking large-scale, enterprise-wide initiatives without first defining, socializing and deploying the proper foundational elements is a recipe for failure.

One such example is a global financial services client that spent nearly 18 months implementing an enterprise data lake to fuse massive volumes of structured and unstructured data, with the goal of gaining better insights to run and grow the business. Unfortunately, while the data is now (largely) unified in a common repository, it has delivered little to no analytic value to date due to a lack of upfront planning. In this case, the old insurance business adage applies: “People do not plan to fail; they fail to plan.”

Establishing the Right Foundation

Distilling value from large volumes of disparate data is a complex and ever-changing challenge, starting with tool selection. For insights to be meaningful and actionable, enterprises need the right data, prepared and governed properly within a scalable, cost-effective and integrated technology architecture, enabled by the right tools, accelerators and platforms.

The underlying data foundation (from architecture to governance) must also be well-thought-out, scalable and fully integrated, spanning legacy data from traditional sources and new forms of data from sensors, smart devices and the like. Sadly, there is no one-size-fits-all model. Some companies prefer to supplement their existing technology investments with emerging solutions in a retrofit approach, while I’ve seen others move in a complete rip-and-replace direction.

The chosen approach is often dictated by cost, political pressure from leadership or cultural obstacles within the organization. I’m reminded of a leading U.S.-based energy company that long struggled to produce even fairly basic reports, too often at an unacceptable pace and missing the deadlines of early-morning leadership meetings. We discovered that the underlying issues centered on the data: its volume, lack of governance and poor integration. We migrated terabytes of historical data from the enterprise data warehouse to a lower-cost, higher-performance, open-source solution that was more tightly integrated with the existing warehouse, and we put in place a robust governance framework to ensure go-forward data integrity.

By addressing the underlying data foundational issues, this client was able to decrease data storage costs, improve data processing speed and, most importantly, benefit from real data insights to help run the business. 

Shifting into a Higher Gear

Many organizations are now shifting from small, regional, divisional or use-case-specific data and analytics “experiments” – framed as pilots or limited prototypes – to more extensive, enterprise initiatives. 

In fact, more midsize and large global companies are undertaking large-scale data and analytics initiatives that utilize the massive volumes of data they generate internally and across their extended value chains (structured and unstructured, internally and externally sourced). They are coupling these initiatives with advanced, emerging analytics and AI tools to extract maximum value, enabling their businesses to move from data-driven historical insights to knowledge-based foresight that sees around corners and past market obstacles into customer desires and market needs.

A good example comes from the energy sector – specifically, oil field services. While we hope that another oil field disaster doesn’t occur, hope is not an effective strategy. That’s why, at one company we’ve worked with, drillers are leveraging sensors, large volumes of data, sophisticated algorithms and emerging technology solutions to enable better predictive asset maintenance, drawing on the IoT, advanced data acquisition, storage and management capabilities, AI and machine learning, and advanced analytics. The goal: better predict catastrophic failures that could result in millions (if not billions) in lost revenue, fines and ecological damage, not to mention bad publicity.

As the digital data that surrounds us grows exponentially, it will increasingly power AI and machine learning systems that, over time, will augment human capabilities to make us smarter, more productive, safer and more effective in our personal and professional lives. But for that to happen, businesses need to build a data management capability that takes full advantage of today’s advanced technologies. Doing so is not only good for business – it will improve our lives.

Scott Schlesinger

Scott H. Schlesinger is Chief Analytics Officer at Cognizant Digital Business. He is a recognized thought leader with more than two decades...