Why Privacy is a Corporate Responsibility Issue

Many organizations have Corporate Responsibility programs that focus on social issues and philanthropy. Especially in today's Big Data era, why is privacy not part of the program?

Today's companies are promising to lower their carbon footprints and save endangered species. They're donating to people in developing countries who have far less than we do, which is also noble. But what about the fact that American citizens are a product whose information is bought, sold, and obtained without consent? In light of recent events, perhaps privacy policies deserve more consideration than two linked words at the bottom of a website's home page.

"Privacy is a big issue for a host of reasons -- legal, ethical, brand protection and moral," said Mark Cohen, Chief Strategy Officer at consultancy and technology service provider Elevate (https://elevateservices.com/). "[Privacy] is an element of corporate culture [so what goes into a privacy policy depends on] your values and priorities."

Problems with Privacy Policies

There are three big problems with privacy policies, at least in the US: what's in them, how they're written, and how they're ignored.

One might think that privacy policies are tailored to a particular company and its audience. However, such documents are not necessarily original. Rather than penning a privacy policy from scratch, some companies literally cut and paste another firm's entire policy regardless of its contents. In fact, those who grab another company's privacy policy might not even bother to read it before using it.

The boilerplate language is also a problem. In-house counsel often uses freely available forms to put together a privacy policy. They may use one form or a combination of forms available to lawyers, but again, they're not thinking about what should be in the document.

In addition, the documents are written in legalese, which is difficult for the average person to read. Businesses are counting on that: if you don't know what's in a privacy policy, what you're giving away, and what the company intends to do with your information, you'll probably just hope for the best. Worse, you'll click an "I agree" button without knowing what clicking that button actually means. It's a common practice, so you're not alone if that's the case.

Oh, and what's stated in the documents may or may not be true, either because the company changed the policy since you last read it or they're ignoring the document itself.

"After May 2018 when the new GDPR [General Data Protection Regulation] goes into effect, it's going to force many companies to look at their privacy policies, their privacy statements and consents, and make them more transparent," said Sheila Fitzpatrick, chief privacy officer and data governance and privacy counsel at hybrid cloud data services company NetApp. "They're going to have to be easily understandable and readable."

Businesses Confuse Privacy with Security

Privacy and security go hand-in-hand, but they're not the same thing. However, many businesses assume that if they're encrypting data, they're protecting privacy.

"Every company focuses on risk, export control trade compliance, security, but rarely [do] you find companies focused on privacy," said Fitzpatrick. "That's changing with GDPR because it's extraterritorial. It's forcing companies to start really addressing areas around privacy."

It's entirely possible to have all kinds of security and still not address privacy issues. OK, so the data is being locked down, but are you legally allowed to have it in the first place? Perhaps not.

"Before you lock down that data, you need the legal right to have it," said Fitzpatrick. "That's the part that organizations still aren't comprehending because they think they need the data to manage the relationship. In the past organizations thought they need the data to manage employment, customer or prospect relationships, but they were never really transparent about what they're doing with that data, and they haven't obtained the consent from the individual."

In the US, the default is opt-out: companies can collect and use your data unless you tell them not to. In countries that have restrictive privacy laws, the default is opt-in.

(Image: TheDigitalArtist/Pixabay)

The Data Lake Mentality Problem

We hear a lot about data lakes and data swamps. In a lot of cases, companies are just throwing every piece of data into a data lake, hoping it will have value in the future. After all, cloud storage is dirt cheap.

"Companies need to think about the data they absolutely need to support a relationship. If they're an organization that designs technology, what problem are they trying to solve and what data do they need to solve the problem?" said Fitzpatrick.

Instead of collecting massive amounts of irrelevant information, companies should consider data minimization if they want to lower privacy-related risks and comply with the EU's GDPR.

"Companies also need to think about how long they're maintaining this data because they have a tendency to want to keep data forever even if it has no value," said Fitzpatrick. "Under data protection laws, not just the GDPR, data should only be maintained for the purpose it was given and only for the time period for which it was relevant."
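The retention principle Fitzpatrick describes can be sketched in code. This is a minimal, hypothetical example (the purposes, periods, and record shape are illustrative, not from any real compliance tool): each record carries the purpose it was collected for, and records are flagged for purging once the retention period tied to that purpose lapses.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods, one per collection purpose.
# Real periods would come from legal review, not code constants.
RETENTION = {
    "billing": timedelta(days=365 * 7),   # e.g. a statutory bookkeeping period
    "marketing": timedelta(days=365),     # consent-bound, kept short
}

def records_to_purge(records, now=None):
    """Return records whose purpose-specific retention period has lapsed.

    Each record is a dict with 'purpose' and 'collected_at'
    (a timezone-aware datetime).
    """
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        # Unknown purposes get a zero retention period, so data collected
        # without a documented purpose is purged -- the minimization default.
        if now - r["collected_at"] > RETENTION.get(r["purpose"], timedelta(0))
    ]
```

Note the design choice in the last comment: defaulting unknown purposes to a zero retention period errs on the side of deletion, which matches the data-minimization mindset the article advocates.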

The Effect of GDPR

Under the GDPR, consent has to be freely given, not forced or implied. That means companies can't pre-check an opt-in box or force people to trade personal data for the use or continued use of a service.
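The "no pre-checked boxes" rule boils down to a simple invariant: consent defaults to not given, and granting it must be an affirmative, recorded act. A minimal sketch, assuming a hypothetical per-purpose consent record (not any real library's API):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Tracks a user's consent per purpose. Consent starts absent,
    mirroring the GDPR-style rule that it cannot be pre-checked or implied."""
    user_id: str
    purposes: dict = field(default_factory=dict)  # purpose -> grant timestamp

    def grant(self, purpose: str):
        # Consent is an affirmative act; record when it was given
        # so it can later be demonstrated (or withdrawn).
        self.purposes[purpose] = datetime.now(timezone.utc)

    def withdraw(self, purpose: str):
        self.purposes.pop(purpose, None)

    def has_consent(self, purpose: str) -> bool:
        # No record means no consent -- the opt-in default.
        return purpose in self.purposes
```

The key point is in `has_consent`: silence is a "no". A system built this way cannot accidentally treat an un-clicked checkbox as agreement.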

"Some data is needed. If you're buying a new car they need financial information, but they'd only be using it for the purpose of the purchase, not 19 other things they want to use it for including sales and marketing purposes," said Fitzpatrick.

Privacy may well become the new competitive advantage as people become more aware of privacy policies and what they mean and don't mean.

"Especially Europeans, Canadians, and those who live in Asia-Pacific countries that have restrictive privacy laws, part of their vetting process will be looking at your privacy program," said Fitzpatrick. "If you have a strong privacy program and can answer a privacy question with a privacy answer as opposed to answering a privacy question with a security answer, [you'll have an advantage]."

On the flip side, sanctions from foreign regulators can destroy a company from reputational, brand, and financial points of view. Fines under the new GDPR can be as high as 4% of a company's annual global turnover.

Does Your Company's Corporate Responsibility Program Include Privacy?

Perhaps or perhaps not. You tell us. What's your take on privacy policies? What do you think needs to happen? We'd love to continue the discussion with you in the comments section.

Lisa Morgan, Freelance Writer

Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit. Frequent areas of coverage include big data, mobility, enterprise software, the cloud, software development, and emerging cultural issues affecting the C-suite.

Re: Data is money
  • 11/30/2017 11:58:29 PM

Tom, agreed, and that's probably the intention of the Ts and Cs.

Re: Data is money
  • 11/30/2017 10:38:25 PM

SaneIT, we were headed in that direction in terms of more and more punishment for guilty firms. But given the swing in US politics, expect the cyber pendulum to swing the other way for a while.

Re: Data is money
  • 11/30/2017 7:44:15 PM

Most of us know this is a need. I am afraid the average consumer still doesn't understand how he is at risk.

Re: Data is money
  • 11/30/2017 6:24:50 PM

Sometimes, sadly, consumers and businesses don't have a choice: they are doing business with these companies through another company they are working with, and they become victims of the breaches. When companies are held responsible through legislation, the T and C issues will not get them out of breaches.

Re: Data is money
  • 11/30/2017 6:17:34 PM

Those consumers right now are too concerned about losing their healthcare and their tax deductions so it might be a while

Re: Data is money
  • 11/30/2017 11:22:57 AM

I am afraid this is not an agenda item with the current administration. It will take a lot of noise from consumers to get attention.

Re: Data is money
  • 11/30/2017 11:11:38 AM

@Tomsg, that's exactly the issue. Until there are stiff penalties applied to companies for not securing client information, these breaches will continue. Congress needs to start protecting consumers!

Re: Data is money
  • 11/30/2017 8:59:31 AM

There really were no implications or any punishment for the lax security. Most people who had information breached never knowingly used Equifax; their credit providers did. Consumers need to have the right to know who has their information and how it will be used. Providers should have a responsibility to protect this information, and pay the price if they don't.

Re: Data is money
  • 11/30/2017 8:37:14 AM

Until it hits them in the wallet, nothing will change. It can't be a little bump, either; it has to be a significant amount, and it has to go to the victims of the data loss, not into a pool of government money that will come right back to them. We need to start holding the data companies accountable for what they are doing with data.

Re: Data is money
  • 11/29/2017 11:43:53 PM

All three credit rating agencies have tried to market their ability to protect consumers and I wouldn't trust any of them.
