2013 | 2 | 2 | 146-158

Article title

CHATBOTS FOR CUSTOMER SERVICE ON HOTELS’ WEBSITES

Languages of publication

EN

Abstracts

EN
In this article we present an analysis of implementations of a chatbot - a program which simulates an intelligent conversation with webpage visitors - dedicated to hotels and guesthouses (hotel chatbot, HC for short). We obtained unique data from five different webpages with various configurations, containing a total of 17,413 user statements in 4,165 conversations. The informative function of the HC was confirmed for more than 56% of the conversations. Moreover, 63% of users prefer to interact with the HC if it suggests at least one clickable option to choose as an alternative to typing. The results indicate that implementing speech synthesis increases the percentage of users who decide to start a conversation with the HC, and it may have a positive impact on the percentage of users who book rooms online.
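The conversation-level metrics reported above (total user statements, share of conversations with a confirmed informative function) can be illustrated with a short sketch. This is not the authors' code; the log layout below is a hypothetical assumption for illustration only.

```python
# Illustrative sketch (hypothetical data layout, not the study's actual logs):
# each conversation holds the list of user statements and a flag marking
# whether the chatbot fulfilled an informative function in it.

conversations = [
    {"statements": ["Do you have free rooms?", "What is the price?"], "informative": True},
    {"statements": ["hello"], "informative": False},
    {"statements": ["Where is the hotel located?"], "informative": True},
]

# Total number of user statements across all conversations.
total_statements = sum(len(c["statements"]) for c in conversations)

# Fraction of conversations in which the informative function was confirmed.
informative_share = sum(c["informative"] for c in conversations) / len(conversations)

print(total_statements)
print(round(informative_share, 2))
```

On the study's real data these two quantities would correspond to the 17,413 statements and the 56% informative share reported in the abstract.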

Year

2013

Volume

2

Issue

2

Pages

146-158

Dates

published
2013

Contributors

  • Department of Information Systems and Economic Analysis, Faculty of Economic Sciences, University of Warsaw
author
  • Denise Systems sp. z o.o.

References

  • Abu Shawar B., Atwell E. (2007) Chatbots: Are they Really Useful?, LDV-Forum Journal for Computational Linguistics and Language Technology, 22 (1), pp. 29-49.
  • Abu Shawar B., Atwell E. (2004) Evaluation of Chatbot Information System, in Proceedings of the Eighth Maghrebian Conference on Software Engineering and Artificial Intelligence.
  • Banach A., Dąbkowski J. (2010) Duży ruch to nie wszystko. Żeby gość nas polecał, Hotelarz, 11(574) November, pp. 36-39.
  • Bartneck C., Rosalia C., Menges R., Deckers I. (2005) Robot Abuse - A Limitation of the Media Equation, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy.
  • Brahnam S. (2006) Gendered bods and bot abuse, in Proceedings of the CHI 2006 workshop on Misuse and abuse of interactive technologies, Montreal, Quebec, Canada.
  • Brahnam S. (2005) Strategies for handling customer abuse of ECAs, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy, pp. 62-67.
  • Bogdanovych A., Simoff S., Sierra C., Berger H. (2005) Implicit training of virtual shopping assistants in 3D electronic institutions, in Proceedings of the IADIS International Conference: e-Commerce 2005, IADIS Press, Portugal, pp. 50-57.
  • Chai J., Budzikowska M., Horvath V., Nicolov N., Kambhatla N., Zadrozny W. (2001) Natural Language Sales Assistant - A Web-Based Dialog System for Online Sales, in Proceedings of the 13th Innovative Applications of Artificial Intelligence Conference, IAAI’01, Seattle, WA, pp. 19-26.
  • De Angeli A. (2006) On Verbal Abuse Towards Chatterbots, in Proceedings of the CHI 2006 workshop on Misuse and Abuse of Interactive Technologies, Montreal, Quebec, Canada.
  • De Angeli A., Brahnam S. (2008) I hate you! Disinhibition with virtual partners, Interacting with Computers, 20(3), pp. 302-310.
  • De Angeli A., Carpenter R. (2005) Stupid computer! Abuse and social identities, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy.
  • De Angeli A., Johnson G. I., Coventry L. (2001) The unfriendly user: exploring social reactions to chatterbots, in Proceedings of the International Conference on Affective Human Factor Design, London, pp. 467-474.
  • Hughes L. (2006) The Eliza Effect: Conversational Agents and Cognition, available at: http://www.laurahughes.com/art/elizaeffect.pdf (accessed 27 September 2011).
  • Jessa Sz. (2004) Czy chatterboty nas rozumieją?, Software 2.0 Extra, 10/2004, pp. 16-20, available at: http://sdjournal.org/magazine/1251-sztuczna-inteligencja.
  • Jessa Sz., Jędruch W. (2010) Przetwarzanie wyrażeń języka naturalnego w wyrażenia logiczne - system Denise, in Przedsięwzięcia i usługi informacyjne. Praca zbiorowa Katedry Architektury Systemów Komputerowych KASKBOOK, (Ontologie w opisie scenariuszy usług), Gdańsk, pp. 75-87.
  • Kuligowska K. (2010) Koszty i korzyści implementacji wirtualnych asystentów w przedsiębiorstwach oraz ich znaczenie dla rozwoju gospodarki elektronicznej, rozprawa doktorska, Wydział Nauk Ekonomicznych Uniwersytetu Warszawskiego, Warszawa.
  • Kuligowska K., Lasek M. (2011) Virtual assistants support customer relations and business processes, The 10th International Conference on Information Management, Gdańsk.
  • Kopp S., Gesellensetter L., Krämer N., Wachsmuth I. (2004) A conversational agent as museum guide - design and evaluation of a real-world application, in Proceedings of Intelligent Virtual Agents (IVA 2005), Berlin, Germany, Volume 3661, pp. 329-343.
  • Loebner Prize, (2011), available at: http://www.loebner.net/Prizef/loebner-prize.html (accessed 27 September 2011).
  • Mewes D., Heloir A. (2009) The Uncanny Valley, available at: http://embots.dfki.de/doc/seminar_ss09/writeup%20uncanny%20valley.pdf (accessed 27 September 2011).
  • Nass C., Steuer J., Tauber E. (1994) Computers are social actors. Human Factors in Computing Systems, in CHI '94 Conference Proceedings, New York, pp. 72-78.
  • Pfeiffer T., Liguda C., Wachsmuth I., Stein S. (2011) Living with a Virtual Agent: Seven Years with an Embodied Conversational Agent at the Heinz Nixdorf MuseumsForum, in: S. Barbieri, K. Scott, & L. Ciolfi (eds.), Proceedings of the Re-Thinking Technology in Museums 2011 - Emerging Experiences. Limerick: think creative & the University of Limerick, pp. 121-131.
  • Reeves B. (2004) The Benefits of Interactive Online Characters, available at: http://www.sitepal.com/pdf/casestudy/Stanford_University_avatar_case_study.pdf (accessed 27 September 2011).
  • Robinson S., Traum D., Ittycheriah M., Henderer J. (2008) What would you ask a conversational agent? Observations of human-agent dialogues in a museum setting, in Proceedings of the 5th International Conference on Language Resources and Evaluation.
  • Saarine L. (2001) Chatterbots: Crash Test Dummies of Communication. Master's Thesis, UIAH Helsinki, available at: http://mlab.uiah.fi/~lsaarine/bots/ (accessed 27 September 2011).
  • Shieber S. (1993) Lessons from a Restricted Turing Test, available at: http://www.eecs.harvard.edu/shieber/Biblio/Papers/loebner-rev-html/loebner-rev-html.html (accessed 27 September 2011).
  • Wallis P. (2005) Believable conversational agents: introducing the intention map, available at: http://nlp.shef.ac.uk/dqa/wallis05-3.pdf (accessed 27 September 2011).
  • Wallis P. (2005b) Robust normative systems: what happens when a normative system fails?, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy.
  • Weizenbaum J. (1976) Computer power and human reason: from judgment to calculation, W. H. Freeman & Co., NY, USA.
  • Weizenbaum J. (1966) ELIZA – A computer program for the study of natural language communication between man and machine, Communications of the ACM, 9(1), pp. 36–45.

Identifiers

ISSN
2084-5537

YADDA identifier

bwmeta1.element.desklight-c9f46380-f824-4f05-97b5-5cf021d306be