
Results found: 2


Search results

The article focuses on new forms of consumer manipulation made possible by technologies that stand between the consumer and the seller. Specifically, it concerns the phenomenon of dark patterns: manipulating the consumer through the deliberate design of the web interface of an online marketplace service provider. This method of manipulation has recently taken on a new dimension, as tools based on artificial intelligence have begun to be developed for it. This does not mean, however, that dark patterns are a new phenomenon or one enabled exclusively by artificial intelligence; a similar effect can also be achieved by “simply” manipulating the UX of the web interface in question. This duality of technological approaches to eliciting the desired consumer response in a subliminal way has led to fragmented regulation of the use of dark patterns, in particular between the two European regulations on which, among others, the article focuses: the Digital Services Act and the Artificial Intelligence Act. The article first introduces the phenomenon of dark patterns and the ways in which they are created on web interfaces. It then presents the relevant provisions of both regulations, describes how they regulate dark patterns and under what conditions they permit their use, and compares their effectiveness in achieving the stated objective, i.e. protecting the consumer from manipulation.
The Lawyer Quarterly | 2024 | vol. 14 | issue 2 | 236-251
Despite the growing number of processes being automated in our daily lives, not much attention has been directed at the regulation of automation as such, that is, regulation not tied expressly to the technology used for the automation. This holds true for automated decision-making, which, in its public form, can have a great impact on an individual’s life. So far, automated decision-making has been regulated only as part of privacy-oriented legal instruments, which naturally raises the question of whether the right not to be subject to automated decision-making is in fact a privacy-related right. The article attempts to answer this question by identifying the place of the right not to be subject to automated decision-making within one of the privacy types identified in the extensive typology of Koops et al. It further posits several other legal values, distinct from privacy, that could warrant the placement of this right within the existing legal instruments.