TechReg hardcopies
TechReg is also available in full-colour print for those who prefer hardcopies. Volumes 2019 (1), 2020 (2), 2021 (3) and the special issue 'Should Data Drive Private Law' are up for sale.
This article examines how self-regulatory and then competition authority-imposed requirements for data portability (Midata) then interoperability (Open Banking) in the UK’s retail banking markets were used to increase competition and create better-functioning, more innovative and diverse markets for personal accounts and small business banking, and related services. These requirements went further than the EU’s second payment services directive, including a co-regulatory obligation for the nine largest retail and small business banks to agree a common technical interface (API) and standards for security, user experience, and other areas identified as important to customers, overseen by a trustee appointed by the Competition & Markets Authority. This case study explores how these requirements evolved from ineffective portability requirements to in-depth interoperability obligations, which have enabled hundreds of firms to create a thriving UK “fintech” market of complementary financial services, although so far having less impact on direct competition with incumbent banks.
Given the continuing fascination with “magic computers” and “self-executing code,” it is necessary to re-examine the promises, and premises, of technology-driven improvements to transacting practices purportedly introduced by smart contracts. Contrary to the popular narrative, smart contracts do not eliminate the need for trust and are technically incapable of guaranteeing performance. The fascination with clear and unbreakable rules that are executed by code obfuscates the fact that such rules may be suboptimal and may incorrectly represent what was agreed. It also obscures the fact that it is impossible to write perfect code. Being in plain view and impossible to modify changes nothing in this regard. Trust and certainty do not magically emerge from immutability or transparency. Regulatory efforts in this area must be based on facts, not fairy tales.
We present a research agenda for secure and intelligent regulatory technology (‘regtech’). This encompasses an overview of the conceptual, theoretical, and practical challenges that arise when using digital technologies to comply with regulatory regimes. Intelligent regtech ‘solutions’ are often tailor-made to achieve better oversight and compliance outcomes. Such tools can make regulation and compliance easier and more efficient. Their use poses security challenges in respect of data, cybersecurity, and the consumer. Regtech also raises competition and antitrust issues, as well as commercial and operational ones. We explore these through a targeted review of the literature. In doing so, we deliver new insights and highlight considerations for scholars. We articulate the concepts requiring further investigation. Our contribution is in defining regtech and establishing an interdisciplinary roadmap for further scholarly study.
The article addresses human rights requirements for person-based predictive policing. It looks into human rights standards, as elaborated in selected European Court of Human Rights case law on creating police databases, watchlists and registries, and the police’s use of new technologies. The article argues that in the case of new technologies deployed by law enforcement, the availability of evidence on the effectiveness and accuracy of a given method should be essential to assessing whether an interference with a human right using this technology is ‘necessary in a democratic society’. The article notes that the Court’s unwillingness to assess claims about the utility of technology critically might suggest that its evaluation of the human rights compliance of person-based predictive policing and other experimental technologies would suffer from a severe blind spot.
EU regulatory initiatives on technology-related topics have spiked over the past few years. On the basis of its Priorities Programme 2019-2024, while creating a “Europe fit for the Digital Age”, the EU Commission has been busy releasing new texts aimed at regulating a number of technology topics, including, among others, data uses, online platforms, cybersecurity, and artificial intelligence. This paper identifies three basic phenomena common to all, or most, new EU technology-relevant regulatory initiatives, namely (a) “act-ification”, (b) “GDPR mimesis”, and (c) “regulatory brutality”. These phenomena reveal new-found confidence on the part of the EU technology legislator, which has by now asserted for itself the right to form policy options and create new rules in the field for all of Europe. These three phenomena serve as indicators or early signs of a new European technology law-making paradigm that by now seems ready to emerge.
Wastewater analysis and surveillance are well-established practices whose use has dramatically expanded during the COVID-19 pandemic. In this article, we argue that the extraction of diverse types of data from wastewater is part of the larger phenomenon of ‘datafication’. We explore the evolving technologies and uses of wastewater data and argue that there are insufficient legal and ethical frameworks in place to properly govern them. We begin with an overview of the different purposes for wastewater data analyses as well as the location and scale of collection. We then consider legal and ethical principles and oversight frameworks that shape current approaches to wastewater collection. After situating wastewater collection within its particular civic context, we argue in favour of greater engagement with legal and ethical issues and propose doing so through a civic perspective. Our paper concludes with a discussion of the normative shifts that are needed and how we might achieve these.
On 2 February 2022, the Belgian Data Protection Authority handed down a decision concerning IAB Europe and its Transparency and Consent Framework (TCF), a system designed to facilitate compliance of real-time bidding (RTB), a widespread online advertising approach, with the GDPR. Here, we summarise and analyse this large, complex case. We argue that by characterising IAB Europe as a joint controller with RTB actors, this important decision gives DPAs an agreed-upon blueprint to deal with a structurally difficult enforcement challenge. Furthermore, underlying the DPA’s simple-looking remedial orders are deep technical and organisational tensions. We analyse these “impossible asks”, concluding that absent a fundamental change to RTB, IAB Europe will be unable to adapt the TCF to bring RTB into compliance with the decision.
A growing body of literature discusses the impact of machine-learning algorithms on regulatory processes. This paper contributes to the predominantly legal and technological literature by using a sociological-institutional perspective to identify nine organisational challenges for using algorithms in regulatory practice. Firstly, this paper identifies three forms of algorithms and regulation: regulation of algorithms, regulation through algorithms, and regulation of algorithms through algorithms. Secondly, we identify nine organisational challenges for regulation of and through algorithms based on literature analysis and empirical examples from Dutch regulatory agencies. Finally, we indicate what kind of institutional work regulatory agencies need to carry out to overcome the challenges and to develop an algorithmic regulatory practice, which calls for future empirical research.
This paper investigates the practice of algorithmic price discrimination with a view to determining its impact on markets and society and making a possible plea for regulation. Online market players are gradually gaining the capacity to adapt prices dynamically based on knowledge generated through vast amounts of data, so that, theoretically, every individual consumer can be charged the maximum price he or she is willing to pay. The article discusses the downsides of data-driven price discrimination. It considers the extent to which such downsides are mitigated by European Union law, and what role remains for national provisions and consumer-empowering technologies. We find that the existing EU provisions address price discrimination only marginally and that full harmonisation, a goal pursued through many acts regulating consumer markets, restricts the Member States’ margin for independent legislation. Accordingly, consumer protection against algorithmic pricing may rely, in practice, on consumer-empowering technologies and initiatives. We investigate the implications of this state of affairs, arguing that an unbalanced “digital arms race” between the use of algorithms as market devices on the one hand, and their use as consumer protection tools on the other, does not ensure consumer protection. Based on these findings, we advance a claim for regulation which pursues two main goals: first, to make the race more balanced by strengthening the digital tools available to consumer protection actors and, second, to limit the battlefield by clarifying and refining the applicable rules and defining clearer categories of impermissible behaviours.
This special issue tackles the question of whether and how data shapes private law. The development of new technologies enabled the generation, collection and processing of both personal and non-personal data on an unprecedented scale. The implications of this phenomenon for private law are threefold. One, how does data affect our understanding of technology regulation in private law relationships? Two, how does data affect the way in which private law is applied? Three, what is the role of data in the design of law from a public policy perspective that transcends doctrinal considerations relating to private law?
Earlier this year, the European Commission presented its highly anticipated proposal for the European Health Data Space, with a view to harnessing “the power of health data for people, patients and innovation”. Considering this objective, the present special issue addresses the governance of health data for research uses by discussing the balancing approach among the diverse aspirations of this ambitious proposal and ways to ultimately preserve trust in such endeavours.
In recent years, the importance of secondary uses of health data for clinical, research and policy-making purposes has been further stressed in view of the availability of health-related data collected in traditional and non-traditional settings. However, processing health data, which are a sensitive type of personal data, requires adopting adequate legal and ethical protections to ensure that the rights of data subjects are respected, while also facilitating responsible access to data. In this paper we aim to shed light on the interplay between the existing and emerging relevant European regulatory frameworks related to data processing, including the General Data Protection Regulation (GDPR), the upcoming Data Governance Act and the legislative proposal for the European Health Data Space. In doing so, we will focus mainly on the legal bases for secondary uses of data in view of the overarching principles of data protection.
The proposed European Health Data Space regulation (the proposed EHDS regulation) intends to facilitate the secondary use of electronic health data for scientific research purposes. The mechanism that the proposal contains will co-exist with the scientific research regime that has been established under the General Data Protection Regulation. This article examines how the proposed EHDS regulation promises to transform the EU scientific research regime and the protection of data subjects in scientific research. This paper shows that a scientific research regime 2.0 is put forward, where sharing is enhanced at the cost of the self-determination of the individual and without resolving many of the challenges that emerged under the GDPR.
Technology and Regulation (TechReg) is a new interdisciplinary journal of law, technology and society. TechReg provides an open-access platform for disseminating original research on the legal and regulatory challenges posed by existing and emerging technologies.
The Editor-in-Chief is Professor Ronald Leenes of the Tilburg Law School. Our Editorial Board Committee comprises a distinguished panel of international experts in law, regulation, technology and society across different disciplines and domains.
TechReg aspires to become the leading outlet for scholarly research on technology and regulation topics, and has been conceived to be as accessible as possible for both authors and readers.
© 2022 Technology and Regulation • DOI: 10.26116/techreg • ISSN: 2666-139X • Privacy Policy • Responsible Disclosure Policy • Published by Open Press TiU • Supported by Openjournals