
Results found: 4


Search results

Search: in the keywords: social robots
This paper presents an overview of empirical studies of human attitudes towards robots. We begin by explaining what attitudes towards robots are, and then review the studies, organized by the factors related to those attitudes: human-related factors (sex, age, education, nationality, culture, belief in the uniqueness of human nature, religiousness), robot-related factors (external appearance, purpose), and factors arising from human-robot interaction (earlier experiences with robots, interactions, robot design).
Ethics in Progress | 2019 | vol. 10 | issue 2 | 8-26
There are many issues surrounding the introduction of social robots into society, including concerns that they may be used to replace true social interaction in personal life, dehumanise formerly social occupations such as elderly care, and be perceived as more human than they actually are. This paper presents a psychological perspective on the human reception of social robots and applies this information to address these concerns.
Ethics in Progress | 2021 | vol. 12 | issue 1 | 134-151
The main objective of this paper is to discuss people’s expectations towards social robots’ moral attitudes. Conclusions are based on the results of three selected empirical studies which used stories of robots (and humans) acting in hypothetical scenarios to assess the moral acceptability of their attitudes. The analysis indicates both differences and similarities in expectations towards robot and human attitudes. Decisions to remove someone’s autonomy are less acceptable from robots than from humans. In certain circumstances, the protection of a human’s life is considered more morally right than the preservation of a robot’s existence. Robots are also more strongly expected to make utilitarian choices than human agents. However, there are situations in which people make consequentialist moral judgements when evaluating both human and robot decisions. Robots and humans receive a similar overall amount of blame. Furthermore, it can be concluded that robots should protect their existence and obey people, but in some situations they should be able to hurt a human being. Differences in results can be partially explained by the character of the experimental tasks. The present findings might be of considerable use in implementing morality in robots and in the legal evaluation of their behaviours and attitudes.
Machines have always been tools or technical instruments that allow human beings to facilitate and accelerate processes through mechanical power. The same applies to robots today – the next step in the evolution of machines. Over the course of the last few years, the use of robots in society has expanded enormously, and they now carry out a remarkable number of tasks for us. It seems we are on the eve of a historic revolution that will change everything we know. But robots are not the only technology having an impact on our lives: it is digitization in its entirety, including smart applications and games, that confronts us with new spaces. This special volume of Ethics in Progress aims to broaden our understanding of a philosophical field – robots and digitization – that is still in its infancy in terms of research and literature.