
Author: Greene, Travis (葛陵偉)
Title: Situating Data Science in the Wake of the GDPR: Practical Implications for Data Scientists, Their Managers, and Academic Researchers (Chinese title: GDPR時代的數據科學:數據科學家、經理人及學術研究人員的實務啟示)
Advisor: Shmueli, Galit (徐茉莉)
Committee members: Ray, Soumya (雷松亞); Lin, Ching-Fu (林勤富)
Degree: Master
Department: International Master of Business Administration (IMBA), College of Technology Management
Year of publication: 2019
Graduating academic year: 107 (2018-2019)
Language: English
Pages: 127
Keywords (Chinese): 數據科學 (data science); 數據科學家 (data scientist)
Keywords (English): GDPR, data regulation, privacy law, personal data, behavioral big data, industry-academic collaboration
Abstract (Chinese):

歐盟(EU)數據保護條例(GDPR)於2018年5月生效。此項全球性的新法規將改變企業和研究人員對於個人數據的收集、處理和分析。到目前為止,很少有學術研究單位探討GDPR如何影響專門處理行為大數據的數據科學家和研究人員。因此透過更深入了解GDPR的核心概念、定義和原則,數據科學家和行為研究人員將大大受益,特別是應用於數據科學的工作流程中。我們透過Kenett和Shmueli (2014)的資訊品質架構 (Information Quality) 來分析GDPR核心概念和原則,並說明它們如何影響數據科學工作。由於GDPR對全球帶來前所未有的影響,我們將跨國企業與研究合作之個人數據傳輸的影響納入研究。由於許多數據科學家具有STEM背景,我們特別強調將GDPR置於更廣泛的社會、法律、政治和經濟背景之下。在這個新的數據隱私監管時代,數據科學家和研究人員不僅得知道他們在GDPR下的法律義務,還要了解他們工作內容對社會和政治的潛在影響。


Abstract (English):

In May 2018, the European Union's (EU) General Data Protection Regulation (GDPR) went into effect. The new Regulation is global in scope and will require a shift in the way companies and researchers collect, process, and analyze personal data. To date, however, little academic work has focused on how the GDPR will impact data scientists and researchers whose work relies on processing behavioral big data. Data scientists and behavioral researchers would therefore benefit from a deeper understanding of the GDPR's key concepts, definitions, and principles, especially as they apply to the data science workflow. We use the Information Quality framework by Kenett and Shmueli (2014) to identify key GDPR concepts and principles and describe how they affect typical data science work. Because of the unprecedented global reach of the GDPR, we also consider its impact on personal data transfers for international corporations and research collaborations. As many data scientists come from a STEM background, we place special emphasis on situating the GDPR within a broader social, legal, political, and economic context. In this new era of data privacy regulation, data scientists and researchers must not only know their legal obligations under the GDPR but also be aware of the potential social and political implications of their work.
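To make one of these workflow implications concrete, the sketch below (not taken from the thesis) illustrates pseudonymization of a direct identifier before analysis, a step discussed in Section 7.6.2 of the table of contents. The column names, the secret key, and the keyed-hash approach are illustrative assumptions only; the GDPR does not prescribe a particular technique, and pseudonymized data generally remains personal data under the Regulation, so this step reduces but does not remove its obligations.

    # Illustrative sketch only (hypothetical column names and key): replace a direct
    # identifier with a keyed hash before analysis, keeping the key separate from the data.
    import hashlib
    import hmac

    import pandas as pd

    # Assumption: the key is held by the data controller, apart from the dataset.
    SECRET_KEY = b"keep-this-key-separate-from-the-data"

    def pseudonymize(identifier: str) -> str:
        """Replace a direct identifier with an HMAC-SHA256 pseudonym."""
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    # Hypothetical behavioral dataset containing a direct identifier.
    df = pd.DataFrame({
        "email": ["alice@example.com", "bob@example.com"],
        "sessions": [12, 7],
    })

    # Analysts work only with the pseudonym; the identifier column is dropped.
    df["user_pseudonym"] = df["email"].map(pseudonymize)
    df = df.drop(columns=["email"])
    print(df)

Keeping the key apart from the dataset mirrors the GDPR's definition of pseudonymization, which requires that the additional information needed for re-identification be stored separately and protected.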

Table of Contents

1 Introduction
2 A Brief Overview of the GDPR
I The GDPR: Justifications and Responses
3 Justifications for the GDPR
3.1 The Importance of Personal Data in the Global Economy
3.2 Regulatory Certainty & Reduced Bureaucracy
3.3 Legal Coherence and Precedent
3.4 GDPR as a Global Gold Standard: Advancing European Soft Power
3.4.1 European vs. Chinese Approaches to Data
3.4.2 The 2014 Market Abuse Regulation
3.5 GDPR as Increasing Consumer Trust
3.6 GDPR as Slowing "Privacy Lurch"
3.7 General Criticisms of European Data Regulation
3.7.1 Will the GDPR Help Create a Global "Splinternet?"
3.7.2 The EU Legal-Regulatory Framework is Relatively Ineffective
4 Responses to the GDPR
4.1 Economic and Business Implications
4.1.1 Creation of Data-Backed Securities Markets
4.1.2 GDPR Insurance
4.1.3 Deepening Data Divides among Firms
4.2 Political and Legal Implications
4.2.1 The GDPR & Environmental Protection Law
4.2.2 Zarsky's Three Prognoses
4.3 Technical Implications
4.3.1 Issues with the Purpose Limitation Principle
4.3.2 Issues with the Data Minimization Principle
4.3.3 Issues with "Special Categories" of Personal Data
4.3.4 Issues with Automated Profiling
4.3.5 Algorithmic Explanation, Bias, and Transparency
II Implications for Managers of Multinational Corporations
5 International Personal Data Transfers under the GDPR
5.1 Personal Data Regulation in the US: The FTC's Role
5.2 EU Personal Data Regulation 1995-Present
5.2.1 The 1995 EU Data Protection Directive
5.2.2 The GDPR
5.3 Safe Harbor and Key EU Court Rulings
5.3.1 The Basics of Safe Harbor
5.3.2 The Schrems Case
5.3.3 The Digital Rights Ireland Case
5.3.4 The Google-Spain Case
5.3.5 Safe Harbor is Reborn as Privacy Shield
5.4 Comparing Standard Contractual Clauses with Binding Corporate Rules for Corporate Data Transfers
5.4.1 Shortcomings of Standard Contractual Clauses
5.4.2 Binding Corporate Rules and the Role of Adequacy
5.5 Three Domain Analysis of Binding Corporate Rules
5.5.1 Legal Effects
5.5.2 Economic Effects
5.5.3 Ethical Effects
III Implications for Data Scientists and Behavioral Researchers
6 Information Quality: A Framework for Analyzing the Effects of GDPR
6.1 Introduction to the InfoQ Framework
6.1.1 Data Resolution
6.1.2 Data Structure
6.1.3 Data Integration
6.1.4 Temporal Relevance
6.1.5 Chronology of Data and Goal
6.1.6 Generalizability
6.1.7 Operationalization
6.1.8 Communication
6.1.9 Assessing InfoQ
7 The Objective of the GDPR: Important Terms and Concepts for Data Scientists
7.1 Goal
7.2 Data
7.3 Analysis
7.4 Utility
7.5 The Impact of the GDPR on Data Scientists: Analyzing a Typical Workflow
7.6 Collecting Data: Pre and Post
7.6.1 Pre-collection: Data minimization and purpose limitation
7.6.2 Post-collection: Pseudonymization
7.6.3 The data environment
7.7 Using Data
7.7.1 Reconsent of pre-GDPR data
7.7.2 Data availability
7.7.3 Data storage and duration limits
7.7.4 Data subject heterogeneity
7.7.5 Choice of algorithms and models
7.8 Sharing Data
7.8.1 Legal liability under GDPR
7.8.2 Data access divides
7.9 Generalization
7.9.1 GDPR and consent bias
7.9.2 Concerns of scientific reproducibility
7.10 Communication
7.10.1 Communication with data subjects
7.10.2 Communication with data protection authorities
8 Conclusion & Future Work
Bibliography
Appendices
Appendix A: Doing Academic Research under the GDPR
Appendix B: Checklist for Corporate GDPR Compliance
Appendix C: Industry-Academic Collaboration under the GDPR
Appendix D: Glossary of GDPR Terms and Their Definitions

References

    Albrecht, J. (2016). How the GDPR will change the world. European Data Protection
    Law Review, 2(3):287–289.
    Alexander, L., Das, S. R., Ives, Z., Jagadish, H., and Monteleoni, C. (2017). Research
    challenges in financial data modeling and analysis. Big data, 5(3):177–188.
    Allen, D. W., Berg, A., Berg, C., Markey-Towler, B., and Potts, J. (2019). Some
    economic consequences of the GDPR. Available at SSRN 3160404.
    Allen & Overy (2016). Binding corporate rules. (white paper).
    Barocas, S. and Selbst, A. D. (2016). Big data’s disparate impact. Calif. L. Rev, 104.
    Bird & Bird (2017). Guide to the General Data Protection Regulation.
    Borgesius, F. J. Z. (2015). Personal data processing for behavioural targeting: Which
    legal basis? International Data Privacy Law, 5:163.
    Buttarelli, G. (2016). The EU GDPR as a clarion call for a new global digital gold
    standard.
    Calder, A. (2016). EU GDPR: A Pocket Guide. IT Governance Publishing.
    Carroll, A. B. (1979). A three-dimensional conceptual model of corporate perfor-
    mance. Academy of management review, 4(4):497–505.
    Chen, D., Fraiberger, S. P., Moakler, R., and Provost, F. (2017). Enhancing trans-
    parency and control when drawing data-driven inferences about individuals. Big
    data, 5(3):197–212.
    Chen, Y.-J., Lin, C.-F., and Liu, H.-W. (2018). ‘rule of trust’: The power and perils
    of China’s social credit megaproject. Columbia Journal of Asian Law, 32:1.
    Danziger, S., Levav, J., and Avnaim-Pesso, L. (2011). Extraneous factors in judicial
    decisions. Proceedings of the National Academy of Sciences, 108(17):6889–6892.
    de Leeuw, K. and Bergstra, J. (2007). The History of Information Security: A Com-
    prehensive Handbook. Amsterdam: Elsevier.
    Dedman, M. (2006). The origins and development of the European Union 1945-1995:
    a history of European integration. Routledge.
    Determann, L. (2016). Adequacy of data protection in the USA: Myths and facts.
    International Data Privacy Law, 6(3):244–250.
    DLA Piper & AON (2018). The price of data security: A guide to the insurability of
    GDPR fines across Europe.
    Drumond, I. (2009). Bank capital requirements, business cycle fluctuations and the
    Basel Accords: A synthesis. Journal of Economic Surveys, 23(5):798–830.
    Dwork, C. and Roth, A. (2014). The algorithmic foundations of differential privacy.
    Foundations and Trends in Theoretical Computer Science, 9 (3-4):211–407.
    Federal Trade Commission (2012). Protecting consumer privacy in an era of rapid
    change. FTC report, March 2012. Technical report.
    Friedman, B. and Nissenbaum, H. (1996). Bias in computer systems. ACM Transac-
    tions on Information Systems (TOIS), 14(3):330–347.
    Future of Privacy Forum (2017). White paper: Understanding corporate data sharing
    decisions: Practices, challenges, and opportunities for sharing corporate data with
    researchers. Technical report.
    Garcia, D., Kassa, Y. M., Cuevas, A., Cebrian, M., Moro, E., Rahwan, I., and Cuevas,
    R. (2018). Analyzing gender inequality through large-scale facebook advertising
    data. Proceedings of the National Academy of Sciences, page 201717781.
    Georgiadou, Y., de By, R. A., and Kounadi, O. (2019). Location privacy in the wake
    of the GDPR. ISPRS International Journal of Geo-Information, 8(3):157.
    Granger, M.-P. and Irion, K. (2018). The right to protection of personal data: the
    new posterchild of European Union citizenship? In Civil Rights and EU Citizenship.
    Edward Elgar Publishing.
    Greenleaf, G. (2009). Five years of the APEC privacy framework: Failure or promise?
    Computer Law & Security Review, 25(1):28–43.
    Hand, D. J. (2018). Aspects of data ethics in a changing world: Where are we now?
    Big Data, 6 (3):176–190.
    Harari, Y. N. (2018). 21 Lessons for the 21st Century. Random House.
    Hettne, B. and Soderbaum, F. (2005). Civilian power or soft imperialism-the EU as a
    global actor and the role of interregionalism. Eur. Foreign Aff. Rev, 10.
    Hintze, M. and LaFever, G. (2017). Meeting upcoming GDPR requirements while max-
    imizing the full value of data analytics. Technical report.
    Hosanagar, K. (2019). A Human’s Guide to Machine Intelligence: How Algorithms
    Are Shaping Our Lives and How We Can Stay in Control. Viking.
    Iltis, A. S. (2006). Research ethics. Routledge.
    Jia, J., Jin, G. Z., and Wagman, L. (2018). The short-run effects of GDPR on tech-
    nology venture investment (No. w25248). National Bureau of Economic Research.
    Junghans, C. and Jones, M. (2007). Consent bias in research: how to avoid it.
    Kenett, R. S. and Shmueli, G. (2014). On information quality. Journal of the Royal
    Statistical Society, Series A, 177 (1):3–38.
    Kenett, R. S. and Shmueli, G. (2015). Clarifying the terminology that describes
    scientific reproducibility. Nature methods, 12(8):699.
    Kenett, R. S. and Shmueli, G. (2016). Information Quality: The Potential of Data
    and Analytics to Generate Knowledge. John Wiley & Sons.
    King, G. and Persily, N. (2018). A new model for industry-academic partnerships.
    Korff, D. (2016). Practical implications of the new EU General Data Protection
    Regulation for EU and non-EU companies.
    Kosinski, M., Stillwell, D., and Graepel, T. (2013). Private traits and attributes are
    predictable from digital records of human behavior. Proceedings of the National
    Academy of Sciences, 110(15):5802–5805.
    Koszegi, S. T. (2019). High-Level Expert Group on Artificial Intelligence.
    Kramer, A. D. I., Guillory, J. E., and Hancock, J. T. (2014). Experimental evidence
    of massive-scale emotional contagion through social networks. Proceedings of the
    National Academy of Sciences, 111(24):8788–8790.
    Kulesza, J. (2011). Walled gardens of privacy or binding corporate rules: A critical
    look at international protection of online privacy. UALR L. Rev, 34:747.
    Lanier, J. (2010). You are not a gadget: A manifesto. Vintage.
    Loidean, N. N. (2016). The end of Safe Harbor: Implications for EU digital privacy
    and data protection law. Journal of Internet Law, 19(8):1–12.
    Lowthian, P. and Ritchie, F. (2017). Ensuring the confidentiality of statistical outputs
    from the ADRN. Technical report, Administrative Data Research Network.
    Mansfield-Devine, S. (2013). Biometrics in retail. Biometric Technology Today.
    Martens, D., Provost, F., Clark, J., and de Fortuny, E. J. (2016). Mining massive
    fine-grained behavior data to improve predictive analytics. MIS quarterly, 40(4).
    Metzl, J. (2019). Hacking Darwin: Genetic Engineering and the Future of Humanity.
    Sourcebooks, Inc.
    Michalski, A. (2005). The eu as a soft power: the force of persuasion. In Melissen,
    J., editor, The New Public Diplomacy: Studies in Diplomacy and International
    Relations. Palgrave Macmillan, London.
    Moerel, L. (2012). Binding Corporate Rules: Corporate Self-Regulation of Global Data
    Transfers. OUP, Oxford.
    Mourby, M., Mackey, E., Elliot, M., Gowans, H., Wallace, S. E., and Bell, J. (2018a).
    Are ‘pseudonymised’ data always personal data? Implications of the GDPR for
    administrative data research in the UK. Computer Law & Security Review, 34(2):222–233.
    Mourby, M., Mackey, E., Elliot, M., Gowans, H., Wallace, S. E., Bell, J., Smith, H.,
    Aidinlis, S., and Kaye, J. (2018b). Are ‘pseudonymised’ data always personal data?
    Implications of the GDPR for administrative data research in the UK. Computer
    Law & Security Review, 34(2):222–233.
    O’Connor, B., Balasubramanyan, R., Routledge, B. R., Smith, N. A., et al. (2010).
    From tweets to polls: Linking text sentiment to public opinion time series. ICWSM,
    11(122-129):1–2.
    Ohm, P. (2012). Branding privacy. Minn. L. Rev., 97.
    Olshannikova, E., Olsson, T., Huhtamäki, J., and Kärkkäinen, H. (2017). Conceptu-
    alizing big social data. Journal of Big Data, 4(1):3.
    O’Neil, C. (2016). Weapons of Math Destruction: how big data increases inequality
    and threatens democracy. Crown Publishers, New York.
    Orlitzky, M., Schmidt, F. L., and Rynes, S. L. (2003). Corporate social and financial
    performance: A meta-analysis. Organization Studies, 24(3):403–441.
    Parasuraman, R. and Manzey, D. H. (2010). Complacency and bias in human use of
    automation: An attentional integration. Human factors, 52(3):381–410.
    Politou, E., Alepis, E., and Patsakis, C. (2018). Forgetting personal data and re-
    voking consent under the GDPR: Challenges and proposed solutions. Journal of
    Cybersecurity, 4:1.
    Quinn, M. J. (2006). On teaching computer ethics within a computer science depart-
    ment. Science and Engineering Ethics, 12(2):335–343.
    Rahwan, I., Cebrian, M., Obradovich, N., Bongard, J., Bonnefon, J.-F., Breazeal, C.,
    Crandall, J. W., Christakis, N. A., Couzin, I. D., Jackson, M. O., et al. (2019).
    Machine behaviour. Nature, 568(7753):477.
    Reding, V. (2011). Binding corporate rules: unleashing the potential of the digital sin-
    gle market and cloud computing. Technical report, IAPP Europe Data Protection
    Congress.
    Reding, V. (2012). The european data protection framework for the twenty-first
    century. International Data Privacy Law, 2(3):119–129.
    Rhoen, M. (2017). Rear view mirror, crystal ball: Predictions for the future of data
    protection law based on the history of environmental protection law. Computer Law
    & Security Review, 33(5):603–617.
    Robbins, S. P. and Judge, T. (2017). Organizational Behavior. Pearson, seventeenth
    edition.
    Russell, B. (2001). The problems of philosophy. OUP Oxford.
    Siddiqi, N. (2017). Intelligent Credit Scoring: Building and Implementing Better
    Credit Risk Scorecards. Hoboken, New Jersey: Wiley, 2nd edition.
    Safari, B. A. (2016). Intangible privacy rights: How Europe’s GDPR will set a new
    global standard for personal data protection. Seton Hall L. Rev., 47:809.
    Sauro, J. and Lewis, J. R. (2012). Quantifying the User Experience: Practical Statis-
    tics for User Research. Elsevier, 1st edition.
    Scholz, T. and Schneider, N. (2017). Ours to hack and to own: The rise of platform
    cooperativism, a new vision for the future of work and a fairer internet. OR books.
    Schwartz, M. S. and Carroll, A. B. (2003). Corporate social responsibility: A three-
    domain approach. Business ethics quarterly, 13(4):503–530.
    Shmueli, G. (2017). Analyzing behavioral big data: Methodological, practical, ethical
    and moral issues. Quality Engineering, 29(1):57–74.
    Shneiderman, B. (2016). Opinion: The dangers of faulty, biased, or malicious al-
    gorithms requires independent oversight. Proceedings of the National Academy of
    Sciences, 113(48):13538–13540.
    Spradling, C., Soh, L.-K., and Ansorge, C. (2008). Ethics training and decision-
    making: do computer science programs need help? ACM SIGCSE Bulletin,
    40(1):153–157.
    Steppe, R. (2017). Online price discrimination and personal data: A general data
    protection regulation perspective. Computer Law & Security Review, 33(6):768–
    785.
    Tene, O. and Polonetsky, J. (2013). A theory of creepy: technology, privacy and
    shifting social norms. Yale JL & Tech, 16.
    Tene, O. and Polonetsky, J. (2016). Beyond IRBs: Ethical guidelines for data research.
    Washington and Lee Law Review Online, 72(3):458.
    Tesfay, W. B., Hofmann, P., Nakamura, T., Kiyomoto, S., and Serna, J. (2018).
    PrivacyGuide: Towards an implementation of the EU GDPR on Internet privacy policy
    evaluation. In Proceedings of the Fourth ACM International Workshop on Security
    and Privacy Analytics, pages 15–21. ACM.
    Tikkinen-Piri, C., Rohunen, A., and Markkula, J. (2018). EU General Data Protection
    Regulation: Changes and implications for personal data collecting companies.
    Computer Law & Security Review, 34(1):134–153.
    Turow, J. (2017). The Aisles Have Eyes: How Retailers Track Your Shopping, Strip
    Your Privacy, and Define Your Power. New Haven : Yale University Press.
    Turow, J., Hennessy, M., Draper, N., Akanbi, O., and Virgilio, D. (2018). Divided
    we feel: Partisan politics drive Americans’ emotions regarding surveillance of low-
    income populations.
    Tversky, A. and Kahneman, D. (1974). Judgment under uncertainty: Heuristics and
    biases. Science, 185(4157):1124–1131.
    US Chamber of Commerce (2014). Business without borders.
    Van Den Hoven, J. (2012). Fact sheet - Ethics Subgroup IoT - Version 4.0. Technical
    report.
    Veale, M., Binns, R., and Ausloos, J. (2018a). When data protection by design and
    data subject rights clash. International Data Privacy Law, 8(2):105–123.
    Veale, M., Binns, R., and Van Kleek, M. (2018b). Some HCI priorities for GDPR-
    compliant machine learning. arXiv preprint arXiv:1803.06174.
    Veale, M. and Edwards, L. (2018). Clarity, surprises, and further questions in the
    Article 29 Working Party draft guidance on automated decision-making and profiling.
    Computer Law & Security Review, 34(2):398–404.
    Voigt, P. and von dem Bussche, A. (2017). The EU General Data Protection Regulation
    (GDPR): A Practical Guide. Springer International Publishing, 1st edition.
    Wachter, S. (2018). Normative challenges of identification in the internet of things:
    Privacy, profiling, discrimination, and the GDPR. Computer Law & Security Review,
    34(3):436–449.
    Wachter, S., Mittelstadt, B., and Russell, C. (2018). Counterfactual explanations
    without opening the black box: Automated decisions and the GDPR. Harvard Journal
    of Law & Technology, 31(2):2017.
    Weiss, M. and Archick, K. (2016). US-EU data privacy: From Safe Harbor to Privacy
    Shield.
    West, S. M., Whittaker, M., and Crawford, K. (2019). Discriminating systems: Gen-
    der, race and power in AI. Technical report, AI Now Institute.
    Wugmeister, M., Retzer, K., and Rich, C. (2006). Global solution for cross-border
    data transfers: Making the case for corporate privacy rules. Geo. J. Int’l L., 38:449.
    Zarsky, T. Z. (2016). Incompatible: The GDPR in the age of big data. Seton Hall L.
    Rev., 47:995.
    Zook, M., Barocas, S., Crawford, K., Keller, E., Gangadharan, S. P., Goodman, A.,
    Hollander, R., Koenig, B. A., Metcalf, J., Narayanan, A., et al. (2017). Ten simple
    rules for responsible big data research.
    Zuboff, S. (2019). The age of surveillance capitalism: the fight for the future at the
    new frontier of power. Profile Books.
