As the era of big data dawns, so too must a new debate on privacy.
That's the mandate coming out of the Future of Privacy Forum, a Washington think tank that encourages responsible data practices and advances projects aimed at increasing transparency and control of online data, mobile data, apps, and social media in a "business practical manner."
Granted, the availability of abundant data provides the possibilities for great social and economic benefits, said Jules Polonetsky, co-chair and director of the Future of Privacy Forum, and Omer Tene, associate professor at the College of Management Haim Striks School of Law, in a recent article on the Stanford Law Review site.
As one example of the benefits associated with the ability to analyze big data, Polonetsky and Tene point to the discovery of the link between Vioxx and 27,000 cardiac arrest deaths and the subsequent recall of the arthritis medicine. Another, they said, is Google Flu Trends, a tool using aggregate search queries to identify flu outbreaks by region. (AllAnalytics.com blogger Ariella Brown recently wrote about Google Flu Trends in her post, A Dose of Google Data for Doctors & Hospitals.)
I'd also add in HealthMap, which I discussed in my post, Tracking Disease at the Speed of Social Media. A project of Children’s Hospital Boston, HealthMap extracts disease and location data from thousands of information sources to deliver real-time disease tracking across the globe.
But this sort of big data project and the availability of limitless information fuel the fears of privacy advocates. They worry this unprecedented data access threatens an individual's ability to control, track, and understand what becomes of his or her personal data. Recent revelations that social networking giant Facebook collects data on the destinations of users leaving its pages further agitate those calling for greater regulations governing what data companies can collect about users.
As Polonetsky discusses in this video, many companies using data collected from customers are walking a tightrope. Failure to think through privacy policies could present a variety of difficulties.
The Future of Privacy Forum advocates a vigorous debate over the balance between individual rights and the benefits of data use by businesses and researchers to power innovation. Tene and Polonetsky argue that failure to have this debate could lead to a regulatory backlash. Potential dangers include a regulatory environment in which protection of all data as potentially private could stifle innovation. In such an environment, we might also see “perverse incentives,” with companies abandoning current privacy protections and increasing the risk of privacy and security issues.
How can companies strike a balance between big data access and privacy issues? Leave your comments on the board below.
Shawn, I've raised privacy concerns all along in our discussions about many of the big data- and analytics-related projects that we've seen, as cool as they might be. So, I agree wholeheartedly with the Future of Privacy Forum's call for vigorous debate about how to balance individual rights against the pursuit of innovation or improvements for mankind.
For all its support of individual privacy, it must be noted that the Future of Privacy Forum is an industry-supported organization with a distinctly "business-practical" outlook, so it does not favor curtailing the use of big data for practical business purposes. What it strives for are ways of rationally balancing business and research needs against the desire for privacy, to avoid a privacy backlash that might make some industry use of data unworkable.
Of course they have to do something - it's almost scary how easily our actions can be tracked and then merged with other information. I don't mind being aggregated - but being individualized is different.
I think the problem is that advertisers use data to ANNOY me but don't use it to stop annoying me. For instance, I wish Company X would stop sending me offers for Blah service; truth is, I never want it. Nothing will ever change my mind about it. In fact, if Blah service were free, I would pay double for the alternative service I am currently using.
Please invade my privacy and use your big data to figure this out. Use your big data to save yourself some advertising dollars - no way it's free to send me a postcard every month reminding me you have the service.
On the other hand, it would be really cool if Yelp (or a similar service) knew that I loved fancy French food and alerted me if I was driving within a half-mile of a new eatery with 4+ stars and a $10-off coupon. I can see where that service could get out of hand quickly. However, if I could opt in, or even pay for the service, it would not feel invasive.
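A rule like the one described above could be sketched as a simple opt-in filter. This is a minimal sketch, not Yelp's actual API; every name, threshold, and data field here is a hypothetical stand-in:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Eatery:
    """Hypothetical record for a restaurant listing."""
    name: str
    cuisine: str
    rating: float      # 0-5 stars
    has_coupon: bool
    lat: float
    lon: float

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * asin(sqrt(a))  # Earth radius ~3959 miles

def should_alert(user_opted_in, user_lat, user_lon, liked_cuisines, eatery,
                 max_miles=0.5, min_rating=4.0):
    """Alert only when the user opted in AND every preference matches."""
    return (user_opted_in                       # consent is checked first
            and eatery.cuisine in liked_cuisines
            and eatery.rating >= min_rating
            and eatery.has_coupon
            and miles_between(user_lat, user_lon,
                              eatery.lat, eatery.lon) <= max_miles)
```

The design choice that keeps this from feeling invasive is that the opt-in flag short-circuits the whole check: for a user who never consented, no location comparison is ever performed.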
@TAanderud You bring up an interesting aspect of privacy - the unwanted email ads. I am like you in that there are some products I have absolutely no interest in buying ever and yet we are bombarded by these companies thinking they can somehow change our minds.
This is an invasion of privacy as well and thank you for pointing it out!
I think the number one privacy concern isn't that big companies are taking and using our data, but rather that we are just giving it away.
For all my concerns, I'm one of the most guilty people I know. I've allowed several websites to link into my Facebook or LinkedIn account for the sake of convenience and cool applications. If you asked me point blank whether I would give my information away, my answer would be "No, of course not" - but save me five minutes and I'm there.
Seth, you raise a good point. For so many people, it's about convenience and the "cool apps." I'm guilty of it too, especially on Facebook. Because it IS easier to allow a given application access to your info just to save some time, many people do it without even thinking of the potential consequences. From that perspective, the companies that take advantage of it cannot be held accountable for people freely giving their information away.

The only way I can see to get a handle on information sharing is to put some sort of warning up before the user installs the application or anything else. At least that way, no one can claim "well, I didn't know" and refuse to hold themselves accountable for giving away their own information. Companies know that they can't steal the data, but if it's given away freely, then they will use it to gain the upper hand in any way they can. I'm not at all saying that it is right, but it is what it is, and people can't expect the corporate world to change. Instead, it is up to the people to realize what they are doing and what information they are simply giving away for free.
Shawn, as you mentioned, there should be some balance or boundary line between privacy and personal issues. Without data and analysis, we may not be able to derive any conclusions or trends. So we have to collect data through either public or private sources, within the limits of privacy regulations. Also, data can be differentiated as vital and non-vital, where vital data can be collected and sampled only with due acknowledgment or permission.
While 97% of insurers say that insurance fraud has increased or remained the same in the past two years, most of those companies report benefits from anti-fraud technology in limiting the impact of fraud, including higher-quality referrals, the ability to uncover organized fraud, and improved efficiency for investigators.