The article focuses on new forms of consumer manipulation made possible by technologies that stand between the consumer and the seller. Specifically, it concerns the phenomenon of dark patterns: the manipulation of consumers through the deliberate design of the web interface of an online marketplace service provider. This method of manipulation has taken on a new dimension recently, as artificial intelligence-based tools have begun to be developed for it. Dark patterns are, however, neither a new phenomenon nor one enabled exclusively by artificial intelligence; a comparable effect can also be achieved by “simply” manipulating the UX of the web interface in question. This duality of technological approaches to eliciting the desired consumer response in a subliminal way has led to fragmented regulation of dark patterns, in particular between the two European regulations on which the article focuses: the Digital Services Act and the Artificial Intelligence Act. The article first introduces the phenomenon of dark patterns and the ways in which they are created on web interfaces. It then presents the relevant provisions of both regulations, describes how they regulate dark patterns and under what conditions they permit their use, and finally compares their effectiveness in achieving the stated objective, i.e. the protection of the consumer from manipulation.
The growth of the digital economy has resulted in an unprecedented increase in the amount of data generated by mankind. This vast volume and variety of data is a significant source of knowledge that allows researchers to open new research fields and to analyse existing problems more precisely. It is particularly important for scientists employing machine learning techniques in their studies. From a legal standpoint, however, the data typically belongs to the entity that collected it. In practice, these entities include the owners of social networks (e.g. Instagram – Meta), online services (e.g. YouTube – Google), or Internet of Things devices such as fitness bands. Because this data is protected by private law, the rules on the re-use of public data cannot be applied to it, and access therefore depends on the will of the data holder. This has several adverse effects on the development of science and society. Attempts have been made at the European Union level to create legal instruments that facilitate access to privately held data for academic purposes. One such instrument is the Digital Services Act. This article presents an analysis of the regulation in terms of balancing the interests of scientists and data holders, as well as the practical problems that may arise from its application.