In the absence of proper constraints, datafication-enabled advertising and other datafication practices will deepen the perils of datafication. The set of cross-border competition disciplines proposed in Chapter 5 may well be an effective instrument for addressing the problems associated with platform monopolies and data capitalism. It would also reduce the need for ex post competition law enforcement in developing countries and LDCs, where only limited resources can be devoted to combating digital cartels and data monopolization. In this context, algorithmic transparency can serve as a starting point for global platform governance. The case study in Chapter 5 investigates the key dimensions of platform transparency requirements in a comparative context and demonstrates that the fragmentation of platform regulation is growing. The proliferation of platform regulations and algorithmic disciplines may place SMEs in an even more difficult position vis-à-vis big tech companies, which have the resources needed to manage different legal requirements in different countries. Despite the inherent complexity of the political economy surrounding digital capitalism, Chapter 5 concludes that there are reasons to be optimistic about better governance through international trade agreements.
It is well known that the financial technology (Fintech) industry has great potential not only to transform the financial system but also to build a more equitable and sustainable society. Indeed, if this technology is applied in the right way, it could help overcome the social and economic gaps that exist worldwide.
Justification:
However, the specific legal regimes (RegTech) established so far for Fintech, together with a general lack of confidence in new technologies, have made its implementation more difficult. To consolidate Fintech, it is therefore necessary to design suitable regulation that turns these new technologies into ordinary instruments of the financial system.
Objective:
Therefore, in order to promote an appropriate RegTech that allows Fintech to progress, it is necessary to analyse the legal problems that restrict its expansion, using an analytical methodology and a bibliographic compilation of legal rulings.
Main conclusion:
The legal protection of personal data is the main obstacle to be overcome, and it must be addressed while respecting the guarantees inherent in this fundamental right. If the legal system is to be ready for the Digital Revolution, society must not have to worry about either a loss of rights or an increase in inequality.
Transparency has been in the crosshairs of recent writing about accountable algorithms. Its critics argue that releasing data can be harmful, and releasing source code won’t be useful.1 They claim individualized explanations of artificial intelligence (AI) decisions don’t empower people, and instead distract from more effective ways of governing.2 While criticizing transparency’s efficacy with one breath, with the next they defang it, claiming corporate secrecy exceptions will prevent useful information from getting out.3
The standard response to concerns about “black box” algorithms is to make those algorithms transparent or explainable. Such approaches, however, involve significant limitations, especially in professional contexts such as medicine, law, or financial advice. Instead, systems should be designed to be contestable, meaning that those subject to algorithmic decisions can engage with and challenge them. Both laws and norms can encourage contestability of automated decisions, but systems designers still must take explicit steps to promote effective questioning and challenges.