Why Privacy Matters

Greg Cohen

May 22, 2014

In the Declaration of Independence, the founding fathers outlined what they saw as unalienable rights to “life, liberty, and the pursuit of happiness.” Privacy, however, was not mentioned.

In fact, privacy is not guaranteed by the Declaration or the Constitution. It was never established as a right by the documents that brought our nation into being. Instead, privacy has been a moving target, one that means different things to different people. Libertarians, for example, believe that maintaining privacy is fundamental to our liberty and actively advocate for increased privacy in both the private and public sectors. Millennials, on the other hand, are famously blasé about sharing personal information online (though recent studies contradict this perception). Regardless of your personal or political stance, invasions of privacy can lead to real harm.

Yet few people today seek a life of complete privacy. Doing so requires going “off the grid” and disconnecting from society entirely -- not something most of us are willing to consider. But we each have expectations about the degree to which information about ourselves should be shared with other people and organizations.

The Internet, of course, has made issues of privacy more complex than ever before. By some estimates, 90 percent of the world’s data was created in the last two years alone, and that rapid accumulation of data -- about ourselves and about the world around us -- is only accelerating. Let’s face it: privacy issues are only going to become more important, not less.

Most people recognize that they are giving up a certain amount of privacy when they use the Internet, join social networks, download apps and otherwise engage with connected technologies. And they are willing to trade some of that privacy for the value they receive in return -- benefits like entertainment, information, convenience and connection.

For example, in joining Facebook and creating a profile, you agree to share your data with both Facebook and its wide network of advertisers. In exchange for allowing companies to market to you based on information you share on Facebook, you are able to nurture connections with friends, family and colleagues. Many people deem that to be a fair trade.

Similarly, in exchange for accepting the terms and conditions of an app, users can download and use it for free. Often those terms involve sharing users’ data with advertisers and agreeing to view those targeted ads within the app. Retailers also collect and store their customers’ personal data in exchange for offers, discounts, birthday gifts and a variety of other personalized marketing.

However, many people do not realize the extent to which their data is being captured, analyzed and put to use. In fact, most don’t have much visibility into this process. It’s why many find themselves creeped out by technologies that use our data to reach us in ways we may not fully understand, from ads “retargeted” to users based on their online activity to facial recognition that identifies customers the moment they walk into a store.

It’s often hard to fully articulate why these technologies creep us out -- but they do all the same. Of course, that doesn’t mean we’ll stop using them. It just means we won’t fully trust the people and businesses who employ them. This uneasy truce is not actually good for anyone.

Privacy is vital for consumers. Having privacy allows us to protect our interests, financial and otherwise. Ultimately, we should be able to decide what information we share with whom, how and when. Our personal data should be seen as our property, and privacy should be viewed as a core aspect of the inalienable human right to liberty. 

Of course, the above primarily refers to privacy issues that result from intentional, sanctioned activity. A whole other can of worms is the people and groups out there interested in exploiting consumers’ information for nefarious purposes. That’s what happened last December, when Target’s data breach compromised 40 million customers’ credit and debit card numbers and the personal information of 70 million customers. Similarly, in January, the data of 4.6 million Snapchat users was leaked by hackers. These are obvious and egregious breaches of privacy. Though the Target breach was primarily an attempt to swipe credit card numbers, plenty of personal information can be skimmed along the way, and Snapchat is an app whose entire interface is built on the notion of privacy -- and impermanence. Both companies need users’ trust in order to succeed. Plain and simple.

So while it’s clear that we can’t achieve absolute privacy, it’s equally clear that we need some checks and balances built into the system of exchanging personal information for value. 

In early 2012, the White House introduced the Consumer Privacy Bill of Rights, a basic framework for protecting consumers in the “global digital economy.” In it, Obama wrote that “we must reject the conclusion that privacy is an outmoded value. It has been at the heart of our democracy from its inception, and we need it now more than ever.” However, Congress all but ignored the manifesto, and two years later, privacy is still far from guaranteed. The NSA revelations made that painfully clear. The basic groundwork the Obama Administration put in place is a step in the right direction, but it’s clearly toothless, and today’s consumers need more.

We can’t wait around for a government that writes ineffectual manifestos with no legal muscle. And some argue that we can’t expect businesses to police themselves, either; self-regulation rarely pans out. Still, privacy is becoming an increasingly important issue for consumers, and a growing number are rethinking their purchasing behaviors and online activities in light of the NSA revelations, the Target breach and the Snapchat leak, among other disconcerting privacy-related news.

Businesses can’t afford to wait for legislators to catch up with technology. It’s time for them to come together and assert their users’ right to privacy. Instead of continuing down a dangerous path that pushes the limits of privacy until consumers revolt, companies need to decide what value they can offer in exchange for personal data, recognize when that value does not justify the cost -- and be transparent with consumers about both.

That means legalese-ridden privacy notices aren’t going to cut it. Companies need to communicate directly, clearly and on the level with consumers about what information they are collecting, how they are storing it, what security measures they are employing and what they plan to do with that data.

So what do businesses gain from these self-imposed restrictions on data use? Exactly what all companies need most to survive and thrive: customer trust. 

By listening to and acting upon the privacy concerns of their customers, businesses will be able to build far stronger relationships at a time when trust is a rare and valuable commodity. Companies that do this proactively will likely find it gives them a competitive edge. Most importantly, it will set a precedent that respect for privacy and liberty is fundamental to building a healthy economy -- one where consumers can expect to be treated like people.