Buying and Selling Privacy: Big Data's Different Burdens and Benefits


66 STAN. L. REV. ONLINE 47
September 3, 2013
BUYING AND SELLING PRIVACY:
BIG DATA’S DIFFERENT BURDENS AND BENEFITS
Joseph W. Jerome*
Big data is transforming individual privacy—and not in equal ways for all. We are increasingly dependent upon technologies, which in turn need our personal information in order to function. This reciprocal relationship has made it incredibly difficult for individuals to make informed decisions about what to keep private. Perhaps more important, the privacy considerations at stake will not be the same for everyone: they will vary depending upon one’s socioeconomic status. It is essential for society, and particularly for policymakers, to recognize the different burdens placed on individuals to protect their data.
I. THE VALUE OF PRIVACY
Privacy norms can play an important role in defining social and individual life for rich and poor. In his essay on the social foundations of privacy law, the dean of Yale Law School, Robert Post, argued that privacy upholds social “rules of civility” that create “a certain kind of human dignity and autonomy which can exist only within the embrace of community norms.”1 He cautioned that these benefits would be threatened when social and communal relationships were replaced by individual interactions with “large scale surveillance organizations.”2
Today, privacy has become a commodity that can be bought and sold. While many would view privacy as a constitutional right or even a fundamental human right,3 our age of big data has reduced privacy to a dollar figure. There have been efforts—both serious and silly—to quantify the value of privacy. Browser add-ons such as Privacyfix try to show users their value to companies,4 and a recent study suggested that free Internet services offer $2,600 in value to users in exchange for their data.5 Curiously, this number tracks closely with a claim by Chief Judge Alex Kozinski that he would be willing to pay up to $2,400 per year to protect his family’s online privacy.6 In an interesting Kickstarter campaign, Federico Zannier decided to mine his own data to see how much he was worth. He recorded all of his online activity, including the position of his mouse pointer and a webcam image of where he was looking, along with his GPS location data, for $2 a day and raised over $2,700.7

* Legal and Policy Fellow, Future of Privacy Forum.
1. Robert C. Post, The Social Foundations of Privacy: Community and Self in the Common Law Tort, 77 CALIF. L. REV. 957, 959 (1989).
2. See id. at 1009 (suggesting that the relationships between individuals and large organizations are “not sufficiently textured or dense to sustain vital rules of civility” and instead emphasize raw efficiency in data collection).
“Monetizing privacy” has become something of a holy grail in today’s data economy. We have seen efforts to establish social networks where users join for a fee and the rise of reputation vendors that protect users’ privacy online, but these services are luxuries. And when it comes to our privacy, price sensitivity often dictates individual privacy choices. Because the “price” an individual assigns to protect a piece of information is very different from the price she assigns to sell that same piece of information, individuals may have a difficult time protecting their privacy.8 Privacy clearly has financial value, but in the end there are fewer people in a position to pay to secure their privacy than there are individuals willing to sell it for anything it’s worth.
A recent study by the European Network and Information Security Agency found that most consumers will buy from a more privacy-invasive provider if that provider charges a lower price.9 The study also noted that when two companies offered a product for the same price, the more privacy-friendly provider won out. This was hailed as evidence that a pro-privacy business model could succeed, but it also anticipates that, all things being equal, one company would choose not to collect as much information as a competitor just to be seen as “privacy friendly.” This defeats much of the benefit that a big data economy promises.

3. See, e.g., Griswold v. Connecticut, 381 U.S. 479, 485-86 (1965) (suggesting that constitutional guarantees create zones of privacy); Convention for the Protection of Human Rights and Fundamental Freedoms, art. 8, Nov. 4, 1950, 213 U.N.T.S. 222.
4. Joe Mullin, How Much Do Google and Facebook Profit from Your Data?, ARS TECHNICA (Oct. 9, 2012, 6:38 AM PDT), /tech-policy/2012/10/how-much-do-google-and-facebook-profit-from-your-data.
5. Net Benefits: How to Quantify the Gains that the Internet Has Brought to Consumers, ECONOMIST (Mar. 9, 2013), /news/finance-and-economics/21573091-how-quantify-gains-internet-has-brought-consumers-net-benefits.
6. Matt Sledge, Alex Kozinski, Federal Judge, Would Pay a Maximum of $2,400 a Year for Privacy, HUFFINGTON POST (Mar. 4, 2013, 5:51 PM EST), /2013/03/04/alex-kozinski-privacy_n_2807608.html.
7. Federico Zannier, A Bite of Me, KICKSTARTER, /projects/1461902402/a-bit-e-of-me (last visited Aug. 29, 2013).
8. See, e.g., Alessandro Acquisti et al., What Is Privacy Worth? 27-28 (2010) (unpublished manuscript), available at u.edu/~acquisti/papers/acquisti-ISR-worth.pdf.
9. NICOLA JENTZSCH ET AL., EUR. NETWORK & INFO. SEC. AGENCY, STUDY ON MONETISING PRIVACY: AN ECONOMIC MODEL FOR PRICING PERSONAL INFORMATION 1 (2012), available at enisa.europa.eu/activities/identity-and-trust/library/deliverables/monetising-privacy/at_download/fullReport.
II. THE BIG DATA CHALLENGE
The foundations of big data rest on collecting as much raw information as possible before we even begin to understand what insight can be deduced from the data. As a result, long-standing Fair Information Practices like collection limits and purpose limitations are increasingly viewed as anachronistic,10 and a number of organizations and business associations have called for privacy protections to focus more on how data might be used rather than limit which data can be collected.11 The conversation has moved away from structural limitations toward how organizations and businesses can build “trust” with users by offering transparency.12 Another suggestion is to develop business models that will share the benefits of data more directly with individuals. Online data vaults are one potential example, while the Harvard Berkman Center’s “Project VRM” proposes to rethink how to empower users to harness their data and control access to it.13 In the meantime, this change in how we understand individual privacy may be inevitable—it may be beneficial—but we need to be clear about how it will impact average individuals.

10. Since their inception three decades ago, the Fair Information Practices, which include principles such as user notice and consent, data integrity, and use limitations, have become the foundation of data protection law. For a thorough discussion and a critique, see Fred H. Cate, The Failure of the Fair Information Practice Principles, in CONSUMER PROTECTION IN THE AGE OF THE “INFORMATION ECONOMY” 343 (2006).
11. See, e.g., WORLD ECON. F., UNLOCKING THE VALUE OF PERSONAL DATA: FROM COLLECTION TO USAGE 4 (2013), available at www3.weforum/docs/WEF_IT_UnlockingValuePersonalData_CollectionUsage_Report_2013.pdf. In the lead-up to the National Telecommunications and Information Administration’s privacy multistakeholder process, the Telecommunications Industry Association demanded that the group’s “focus should be on regulating how personal information is used, rather than how it is collected.” Press Release, Telecomms. Indus. Ass’n, Telecommunications Industry Association Says NTIA Privacy Code Should Focus on Data Use, Not Collection Method (July 12, 2012), www.tiaonline/news-media/press-releases/telecommunications-industry-association-says-ntia-privacy-code-should.
12. Michael Fertik, Big Data, Privacy, and the Huge Opportunity in the Monetization of Trust, WORLD ECON. F. BLOG (Jan. 25, 2012, 2:13 AM), forumblog/2012/01/davos-daily-big-data-privacy-and-the-huge-opportunity-in-the-monetization-of-trust.
13. VRM stands for “Vendor Relationship Management.” According to the Harvard Berkman Center, the goal of the project is to “provide customers with both independence from vendors and better ways of engaging with vendors.” ProjectVRM, HARV. UNIV. BERKMAN CTR. FOR INTERNET & SOC’Y, cyber.law.harvard.edu/projectvrm/Main_Page (last updated Mar. 27, 2013, 07:07 PM). It hopes Project VRM can improve individuals’ relationships with not just businesses, but schools, churches, and government agencies. Id.
A recent piece in the Harvard Business Review posits that individuals should only “sell [their] privacy when the value is clear,” explaining that “[t]his is where the homework needs to be done. You need to understand the motives of the party you’re trading with and what [he] ha[s] to gain. These need to align with your expectations and the degree to which you feel comfortable giving up your privacy.”14 It could be possible to better align the interests of data holders and their customers, processing and monetizing data both for business and individual ends. However, the big challenge presented by big data is that the value may not be clear, the motives, let alone the identity, of the data collector may be hidden, and individual expectations may be confused. Moreover, even basic reputation-management and data-privacy tools require either users’ time or money, which may price out average consumers and the poor.
III. BIG DATA AND CLASS
Ever-increasing data collection and analysis have the potential to exacerbate class disparities. They will improve market efficiency, and market efficiency favors the wealthy, established class. While the benefits of the data economy will accrue across society, the wealthy and better educated are in a better position to become the type of sophisticated consumer that can take advantage of big data.15 They possess the excellent credit and ideal consumer profile to ensure that any invasion of their privacy will be to their benefit; thus, they have much less to hide and no reason to fear the intentions of data collectors. And should the well-to-do desire to maintain a sphere of privacy, they will also be in the best position to harness privacy-protection tools and reputation-management services that will cater to their needs. As a practical matter, a monthly privacy-protection fee will be easier for the wealthy to pay as a matter of course. Judge Kozinski may be willing and able to pay $200 a month to protect his privacy, but the average consumer might have little understanding of what this surcharge is getting him.
The lower classes are likely to feel the biggest negative impact from big data. Historically, the poor have had little expectation of privacy—castles and high walls were for the elite, after all. Even today, however, the poor are the first to be stripped of fundamental privacy protections. Professor Christopher Slobogin has noted what he calls a “poverty exception” to the Fourth Amendment, suggesting that our expectations of privacy have been defined in ways that make the less well-off more susceptible to warrantless government intrusions into their privacy and autonomy.16 Big data worsens this problem. Most of the biggest concerns we have about big data—discrimination, profiling, tracking, exclusion—threaten the self-determination and personal autonomy of the poor more than any other class. Even assuming they can be informed about the value of their privacy, the poor are not in a position to pay for their privacy or to value it over a pricing discount, even if this places them into an ill-favored category.

14. Chris Taylor & Ron Webb, A Penny for Your Privacy?, HBR BLOG NETWORK (Oct. 11, 2012, 11:00 AM), blogs.hbr/cs/2012/10/a_penny_for_your_privacy.html.
15. For a discussion of the “winners and losers” of big data, see Lior Jacob Strahilevitz, Toward a Positive Theory of Privacy Law, 126 HARV. L. REV. 2010, 2021-33 (2013); Omer Tene, Privacy: For the Rich or for the Poor?, CONCURRING OPINIONS (July 26, 2012, 2:05 AM), /archives/2012/07/privacy-for-the-rich-or-for-the-poor.html (discussing the argument that the pervasive collection of personal information allows companies “to make the poor subsidize luxury goods for the rich”).
And big data is all about categorization. Any given individual’s data only becomes useful when it is aggregated to be exploited for good or ill. Data analytics harness vast pools of data in order to develop elaborate mechanisms to categorize and organize. In the end, the worry may not be so much about having information gathered about us, but rather about being sorted into the wrong or disfavored bucket.17 Take the example of an Atlanta man who returned from his honeymoon to find his credit limit slashed from $10,800 to $3,800 simply because he had used his credit card at places where other people were likely to have a poor repayment history.18 Once everyone is categorized into granular socioeconomic buckets, we are on our way to a transparent society. Social rules of civility are replaced by information efficiencies. While this dynamic may produce a number of very significant societal and communal benefits, these benefits will not fall evenly on all people. As Helen Nissenbaum has explained, “the needs of wealthy government actors and business enterprises are far more salient drivers of their information offerings, resulting in a playing field that is far from even.”19 Big data could effectuate a democratization of information but, generally, information is a more potent tool in the hands of the powerful.
Thus, categorization and classification threaten to place a privacy squeeze on the middle class as well as the poor. Increasingly large swaths of people have little recourse or ability to manage how their data is used. Encouraging people to contemplate how their information can be used—and how best to protect their privacy—is a positive step, but a public educa-
16. Christopher Slobogin, The Poverty Exception to the Fourth Amendment, 55 FLA. L. REV. 391, 392, 406 (2003).
17. See Tene, supra note 15.
18. See Lori Andrews, Facebook Is Using You, N.Y. TIMES (Feb. 4, 2012), /2012/02/05/opinion/sunday/facebook-is-using-you.html. Tech analyst Alistair Croll discusses this example, arguing that big data will become a difficult civil rights issue. Alistair Croll, Big Data Is Our Generation’s Civil Rights Issue, and We Don’t Know It: What the Data Is Must Be Linked to How It Can Be Used, O’REILLY RADAR (Aug. 2, 2012), /2012/08/big-data-is-our-generations-civil-rights-issue-and-we-dont-know-it.html.
19. HELEN NISSENBAUM, PRIVACY IN CONTEXT 211 (2010).
