Our personal data has become currency in the ever-expanding digital economy. This shift is most apparent among tech companies whose business models pivot around the monetization of customer data.

The ensuing uproar over the use and abuse of personal data has academics, industry leaders, politicians, the media and policy experts calling for a spectrum of changes, ranging from more transparency and accountability to an outright breakup of “Big Tech” platform companies.

Many of the largest and most successful internet and platform companies are the focal point of the debate over data privacy. A recent Churchill Club open forum on privacy drew sharp distinctions about what the tech industry must do to maintain customer trust while preserving its legacy and livelihood.

Updating Data Privacy Mindsets

Here are key takeaways from the forum and what they mean to companies inside the tech space – and beyond.

  • Transparency is important. But it’s not enough. Transparency is about identifying each individual’s expectations and then matching those expectations with a certain level of protection. Yet there’s enormous variation in how we experience privacy. Fair use of data for one person may feel like a violation to another. Companies get into trouble when there’s a mismatch: when you turn off tracking on your phone, you have a reasonable expectation that it’s off, but is it? Transparency begins the privacy journey. It’s not the end.

    What it means for businesses: It’s critical to understand the long-term risks that the loss of consumer trust poses to the brand and platform. Short-term gains in revenue or customer growth can quickly be destroyed if consumer trust is willfully or unintentionally violated.

    Rather than investing in algorithms to define customers, we recommend a human-centered approach, which involves speaking and working directly with the people who use your products and services to understand their expressed and unexpressed needs. This approach helps put humans back at the center of decision making.

  • Privacy is inherently personal and contextual. The outlook of a seasoned baby boomer is likely different from that of a youthful member of Gen Z. Policies for handling customer data should be equally fluid. How long should data be retained? Which data should be collected? How should permissions be granted? What mechanisms can be put in place for consumers to change their minds? Personalizing privacy options is key to accommodating individuals’ varying privacy thresholds. Without a specific plan, any promise to ensure data privacy can ring hollow.

    What it means for businesses: It’s essential to include your customers in the development and evolution of your privacy protection policies, and to continually educate them on what these policies mean for them. While consumers are savvy enough to make their own choices, they appreciate a full understanding of what they’re agreeing to through explicit, clear and forthright communications. Don’t trade long-term trust and relationship-building with customers for short-term user growth, profits or sales. Exploiting your brand for fast gains won’t pay off.

  • Build in privacy from the start. Companies are not only responsible for being clear and upfront about how their app or platform uses data; they also need to build in privacy from the start by incorporating it into the app or platform design. Just because data can be put to a particular use doesn’t mean it should. What is the platform or app’s intent, and how will data fulfill those goals? Reasonable limits on data use should be set from the beginning.

    What it means for businesses: Businesses have become accustomed to focusing on many aspects of the customer experience. The privacy experience – which is barely touched upon today – needs to become a key aspect of the platform or app design. The litmus test for any intended use of data is whether it delivers value to the customer. Determining the nature of that value is not for the business to decide; it warrants debate and discussion with customers themselves. By considering consumer preferences for privacy and data use from the start, businesses can avoid future backlash.

  • The consumer stance on privacy is still unclear. Does privacy matter to consumers? The answer is: It’s complicated. As much as consumers voice concern over the privacy implications of using digital platforms and social media, they’re reluctant to stop using them, according to a Pew Research report. And even as consumers consistently say they want privacy, they’re actually pretty forgiving. At the same time, Pew says, consumers are also open to various mechanisms for restricting access to their data, adjusting privacy settings and otherwise shielding their information. Even in the face of major violations like the Facebook-Cambridge Analytica debacle, consumers are still climbing the learning curve and adjusting their opinions when it comes to privacy.

    What it means for businesses: It’s time to stop asking whether privacy matters; there are compelling arguments on both sides. It’s far more essential to continue exploring – in collaboration with customers – this incredibly complex issue that spans ages, geographies and cultures. Open, honest dialog and public debate are the right path forward.

A Continuous Evolution Is Needed

Privacy issues will continue to evolve at a rapid pace, and there’s no doubt the continuum is pointing toward greater protections. To strike the balance among “too much,” “too little,” “too slow” and “too fast,” businesses will need to adapt to changing cultural, societal and regulatory needs and combine early-stage planning with the ability to make real-time course corrections.

Andrew Worzella

Andrew Worzella is Chief Digital Officer and Global Consulting Lead for the technology industry at Cognizant. He brings more than 19 years...