Deriving Design Principles for Educational Chatbots from Empirical Studies on Human–Chatbot Interaction
Copyright ⓒ 2020 The Digital Contents Society
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract
This study derives design principles for educational chatbots according to their role, through a systematic review of empirical studies on human–chatbot interaction. We propose design principles that should be considered depending on the role of the chatbot. When designing a chatbot that plays the role of a tutor, it is necessary to consider the Live emotion principle, Modality principle, and Extraneous principle. When designing a chatbot that acts as an evaluator, the Bot effect principle should be considered. When developing a chatbot that acts as a respondent, the Gender principle and Modality principle should be considered. In the case of a chatbot that plays the role of a moderator, it is necessary to consider the Neutral emotion principle, and in the case of a chatbot that plays the role of peer learner, the Modality principle (voice), the Imitation principle, and the Neutral emotion principle should be considered. Future research should examine how educational chatbots present content and the differentiated roles they can play.
Abstract (Korean)
This study derived principles to consider in chatbot design through a systematic review of research on educational chatbots. To this end, we analyzed prior studies on educational chatbots and, based on the results, proposed design principles that take the chatbot's role into account. From the prior research, the roles of educational chatbots could be broadly classified into tutor, evaluator, respondent, moderator, and peer learner. Exploring the design principles to consider for each role showed that a tutor chatbot should take into account the Live emotion principle, the Modality principle, and the Extraneous principle. A chatbot in the evaluator role should consider the Bot effect principle, and a respondent chatbot the Gender principle and the Modality principle. A moderator chatbot should consider the Neutral emotion principle, and a peer-learner chatbot the Modality principle together with the Imitation principle and the Neutral emotion principle. Further in-depth research is needed on how chatbots should present content according to their role and on the differentiated roles of educational chatbots.
Keywords:
Chatbot, Chatbot-Mediated Learning (CML), Design Principles
Ⅰ. Introduction
Chatbots are computer programs that help humans communicate with computers through text or voice interactions. With the proliferation of Massive Open Online Courses (MOOCs) and the widespread use of messaging apps, the need for chatbots in education is increasing. Three main reasons are usually given for introducing chatbots. First, customer management costs can be lowered [11]. Second, chatbots can shorten response times, support a service 24 hours a day, and improve user satisfaction through customized consultation. Third, products or services can be improved by collecting information about customers' needs during their conversations with the chatbot. The same possibilities can be expected in education. When chatbot technology is used for educational purposes, feedback can be provided to learners more efficiently and at any time, increasing learner satisfaction, and learning support can be optimized by collecting a variety of information about learners. However, while chatbot technology is evolving, its integration into education has been rather sluggish [1], and there is a lack of research on the design principles to consider when developing an educational chatbot. This study aims to promote the development of educational chatbots by setting out, on the basis of a systematic analysis, the principles to be considered in their design.
RQ1: How have chatbots been incorporated into empirical studies on human–chatbot interaction?
RQ2: What implications for educational chatbots can be derived from the studies?
Ⅱ. Theoretical Background
2-1 Expectations and Roles of Chatbots
A chatbot is a computer program that simulates human conversation via text or voice interaction [19]. Other terms for chatbots include talkbots, chatterbots, conversational agents, artificial conversational entities, and conversational systems. Efforts have been made to introduce chatbots or similar technologies into education, and related terms include pedagogical agent, intelligent pedagogical agent (IPA), intelligent tutoring system (ITS), and Artificial Intelligence Markup Language (AIML)-based chatbot. In the context of technology-mediated learning [2], chatbot-mediated learning (CML) contributes to motivation, self-directed learning, and individualized learning by providing learners with an individual learning environment that enhances the learning process and its outcomes. More specifically, first, chatbots can influence the learner's learning process, that is, the way in which information is found and communicated: rather than passively receiving content, learners can ask questions and lead the way themselves. Second, chatbots can effectively support the learning process in large classrooms or in large online courses such as MOOCs, which may help lower learner dissatisfaction and dropout rates. Third, chatbots can help learners make sound judgments by providing optimal information at the right time, and can provide continuous feedback to learners and teachers.
In general, chatbots provide guidance, answer questions, or facilitate specific actions as coaches or colleagues (Table 1). In the educational context, the role of the chatbot can be set in various ways, which can be divided into five roles (see Table 2): tutors who guide and support the learning process of individual learners; evaluators who check learners' progress and diagnose their performance; respondents who answer learners' questions; moderators who mediate between instructors and learners through interaction with learners; and peer learners who engage in everyday conversation with learners.
2-2 Principles of Chatbot Design
Table 3 lists the points to consider when designing chatbots, derived from the design and development principles of Facebook [8], Intercom [12], and Microsoft [17].
The principles in Table 3 provide guidelines on how a chatbot should interact with users from a UI or UX standpoint, but they do not address the purposes for which a chatbot should be used. In order to use chatbots actively in an educational context, design and development guidelines need to be prepared from the viewpoint of teaching and learning.
Hints for deriving chatbot design principles can be found in research on conversational agents (CAs). Previous research has mainly addressed agent support, voice, and appearance (see Table 4). What is needed in the future is empirical and qualitative study of the changes brought about by an agent's participation, together with research into the agent's role.
Ⅲ. Methodology
In order to establish an empirical ground from which to derive design principles for educational chatbots, we first explored previous chatbot studies and summarized their findings, and from there extracted implications for chatbot designs suitable for an educational context. The review process began by identifying relevant research papers in Social Science Citation Index (SSCI) and Science Citation Index Expanded (SCIE) journals, which are of high quality and impact. Conference proceedings and conceptual papers were excluded from the search. Research papers published since 2005 were collected using the keywords “conversational agent”, “chatbot”, “pedagogical agent”, “conversational system”, “dialog system”, “chatterbot”, “chat bot”, “chat-bot”, and “intelligent pedagogical agent”. After the search, we screened the articles to retain empirical studies that focused on interactions between humans and chatbots. A total of seven studies from six articles were reviewed.
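To make the inclusion criteria concrete, the following is a minimal illustrative sketch in Python (not the authors' actual tooling) that expresses the search and screening rules described above as a filter over bibliographic records; the record field names ("title", "abstract", "year", "source_type", "index") are assumptions for illustration only.

# Hypothetical screening filter reflecting the criteria stated in the text.
SEARCH_TERMS = [
    "conversational agent", "chatbot", "pedagogical agent",
    "conversational system", "dialog system", "chatterbot",
    "chat bot", "chat-bot", "intelligent pedagogical agent",
]

def passes_screening(record: dict) -> bool:
    # Keep SSCI/SCIE journal articles published since 2005 whose title or
    # abstract mentions any of the search keywords; conference proceedings
    # and conceptual papers are excluded via the source type.
    text = (record.get("title", "") + " " + record.get("abstract", "")).lower()
    return (
        record.get("year", 0) >= 2005
        and record.get("source_type") == "journal article"
        and record.get("index") in {"SSCI", "SCIE"}
        and any(term in text for term in SEARCH_TERMS)
    )

# Example (hypothetical record):
example = {"title": "A chatbot tutor for physics", "abstract": "",
           "year": 2016, "source_type": "journal article", "index": "SSCI"}
print(passes_screening(example))  # True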
Ⅳ. Findings
4-1 Research question 1: How have chatbots been incorporated into empirical studies on human-chatbot interaction?
To answer the research question, we organized the review findings into two sets: one sorted by chatbot feature and the other by research variables and results. Basic information on each study is included in the first set (see Table 5). Of the seven studies reviewed, all were conducted in a higher education setting except for those of Corti and Gillespie (2016) [6], which was in an open setting, and of van der Meij, van der Meij, and Harmsen (2015) [21], which was at a secondary school. The articles covered target knowledge in a varied range of disciplines, such as healthy eating behavior [3], the circulatory system [9], instructional planning [14], and kinematics [21]. The chatbots used in the studies also differed from each other.
The chatbot features examined in the studies were mostly variations of delivery types (or representation types). They included expressions made by chatbots (e.g., facial expression, emotional expression, empathetic expression), the gender of the chatbots (i.e., male and female), modality (e.g., voice, text), and other representation types (e.g., head movement). A few studies incorporated instructional features into chatbots by providing prompts and feedback [9] and motivational scaffolding [21].
The major findings of the studies are listed in Table 6. Overall, the results showed a tendency for participants to project their human-to-human interaction practices onto their human-to-chatbot interaction, especially when the chatbot was designed to be more human-like. In detail, participants reported more positive outcomes when the chatbots expressed or represented emotion than when they interacted with chatbots designed to exhibit neutral emotion [3, 14, 18]. They also exhibited social stereotyping towards a gendered chatbot [14]. In the case of modality, though the results were not perfectly consistent, participants seemed to understand a text-based chatbot better than a speaking chatbot [3], while they showed more human-like interaction with the latter [6, 18].
4-2 Research question 2: What implications for educational chatbots can be derived from the studies?
From the review, we grouped findings with similar attributes and characteristics and then elaborated an explanation of each attribute in the learning context. The implications are as follows.
• Live emotion – chatbots are better when designed to display consistent facial expressions or positive emotional expressions.
• Neutral emotion – a chatbot with a neutral emotional expression is more acceptable for persuasion.
• Modality – written text is better for delivering information or guiding a process; spoken text is better for affective support.
• Extraneous – too many animated or visual graphics have a detrimental effect on performance.
• Gender – people project social gender stereotyping according to the chatbot’s gender; people value information from a chatbot differently, depending on its gender representation.
• Bot effect – a chatbot can perform repetitive tasks that require accuracy better than a human can.
• Imitation – more human-like chatbots drive more human-like interactions and establish a trusting relationship when giving information.
After extracting the implications, they were matched with each role of the educational chatbot (i.e., tutor, evaluator, respondent, moderator, peer learner); see Table 7.
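As a rough illustration of how this role-to-principle matching could be operationalized, the sketch below (Python, hypothetical identifiers) encodes the mapping as a simple lookup that a designer could consult when configuring an educational chatbot; it merely restates the matching in Table 7 and is not part of the reviewed studies.

# Role-to-principle mapping derived from the review (see Table 7).
ROLE_PRINCIPLES = {
    "tutor": ["live emotion", "modality", "extraneous"],
    "evaluator": ["bot effect"],
    "respondent": ["gender", "modality"],
    "moderator": ["neutral emotion"],
    "peer learner": ["modality (voice)", "imitation", "neutral emotion"],
}

def design_checklist(role: str) -> list:
    # Return the design principles to review for a given chatbot role.
    if role not in ROLE_PRINCIPLES:
        raise ValueError(f"Unknown chatbot role: {role}")
    return ROLE_PRINCIPLES[role]

print(design_checklist("tutor"))  # ['live emotion', 'modality', 'extraneous']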
Ⅴ. Discussion
This study derives design principles according to the role of a chatbot through a systematic review of recently published literature on educational chatbots. This approach is expected to help in the design and development of educational chatbots at a time when chatbot development and related research in educational contexts remain insufficient. The findings of this study can be summarized as follows.
5-1 Key result
To derive design principles for educational chatbots, we reviewed seven studies that examined how appearance characteristics of chatbots, such as facial expression, gender, and modality, affect the learning process and performance. Overall, interaction with chatbots that express emotion rather than remaining neutral, and text-based rather than speech-based interaction, contributed more to learning. The design principles derived from these findings are the Live emotion principle, Neutral emotion principle, Modality principle, Extraneous principle, Gender principle, Bot effect principle, and Imitation principle. In addition, this study matched the design principles to be considered with the role of the chatbot. When designing a chatbot that plays the role of a tutor, it is necessary to consider the Live emotion principle, Modality principle, and Extraneous principle. When designing a chatbot that acts as an evaluator, the Bot effect principle should be considered. When developing a chatbot that acts as a respondent, the Gender principle and Modality principle should be considered. In the case of a chatbot that plays the role of a moderator, it is necessary to consider the Neutral emotion principle, and in the case of a chatbot that plays the role of peer learner, the Modality principle (voice), the Imitation principle, and the Neutral emotion principle should be considered. The principles explored in this study, however, mostly concern the appearance characteristics of chatbots. Future research is needed on how chatbots present content and on their differentiated roles.
5-2 Areas for further study
As mentioned above, there are relatively few studies on the principles to be considered in designing educational chatbots, or on which design principles suit each chatbot role. Further research is needed on design principles appropriate to the purpose and role of the chatbot.
Prior studies show little consensus on the characteristics of educationally effective chatbots, but they do suggest that learners want to learn with more human-like and emotional chatbots. Although this may be beneficial in terms of motivation, further research is needed to determine whether it has significant effects on learning outcomes. In addition, the differences between education through chatbots and through other educational methods, in both short- and long-term settings, need to be studied.
It is also necessary to study how the instructor's role and the interaction between instructor and learner change with the educational use of chatbots. Research is also required on the side effects of using chatbots and on the degree of acceptance according to learners' characteristics, for example, how a chatbot's effectiveness varies with a learner's computer skills, propensity to cooperate, learning style, and learning level. There is also a need for research on the cost-effectiveness of educational use: in which educational contexts and for which purposes a chatbot is most effective, and whether a cost-effectiveness analysis justifies introducing one.
5-3 Limitation
This study has some limitations. First, although some of the reviewed papers involve educational contexts, others concern cases that are not educational, so the derived principles cannot be said to apply exclusively to educational chatbots. Second, since this study did not examine the gray literature, such as theses, ongoing research, non-indexed academic journals, and research reports, there is a possibility of publication bias. Language bias is also difficult to avoid, because only English-language papers were included. Nevertheless, this study addresses educational chatbots, an area that has not been sufficiently examined, and has value in attempting to derive differentiated design principles. Developing chatbots with various purposes and roles for education will require sustained collaboration with experts from diverse fields.
Acknowledgments
This work was supported by a National Research Foundation of Korea grant funded by the Korean Government (KRF-2019-S1A5A8-036708).
References
- 6 Ways Artificial Intelligence and Chatbots Are Changing Education, Chatbots Magazine. Available: https://chatbotsmagazine.com/six-ways-a-i-and-chatbots-are-changing-education-c22e2d319bbf
- Alavi, M., Leidner, D. E, “Research commentary: Technology-mediated learning—A call for greater depth and breadth of research”, Information systems research, Vol. 12, No. 1, pp. 1-10, 2001. [https://doi.org/10.1287/isre.12.1.1.9720]
- Berry, D., Butler, L., de Rosis, F, “Evaluating a realistic agent in an advice-giving task”, International Journal of Human-Computer Studies, Vol. 63, No. 3, pp. 304-327, 2005. [https://doi.org/10.1016/j.ijhcs.2005.03.006]
- Bodemer, D., Ploetzner, R., Feuerlein, I., Spada, H, “The active integration of information during learning with dynamic and interactive visualisations”, Learning and Instruction, Vol. 14, No. 3, pp. 325-341, 2004. [https://doi.org/10.1016/j.learninstruc.2004.06.006]
- Chatbots Infographic - Key Statistics 2017. Available: https://www.bevytechnologies.com/infographic-chatbots-key-statistics-2017
- Corti, K., Gillespie, A, “Co-constructing intersubjectivity with artificial conversational agents: People are more likely to initiate repairs of misunderstandings with agents represented as human”, Computers in Human Behavior, Vol. 58, pp. 431-442, 2016. [https://doi.org/10.1016/j.chb.2015.12.039]
- Eisman, E., López, V., Castro, J, “A framework for designing closed domain virtual assistants”, Expert Systems with Applications, Vol. 39, No. 3, pp. 3135-3144, 2012. [https://doi.org/10.1016/j.eswa.2011.08.177]
- Facebook for developers: Design Principles. Available: https://developers.facebook.com/docs/messenger-platform/introduction/general-best-practices/
- Harley, J., Carter, C., Papaioannou, N., Bouchet, F., Landis, R., Azevedo, R., Karabachian, L, “Examining the predictive relationship between personality and emotion traits and students’ agent-directed emotions: towards emotionally-adaptive agent-based learning environments”, User Modeling and User-Adapted Interaction, Vol. 26, No. 2-3, pp. 177-219, 2016. [https://doi.org/10.1007/s11257-016-9169-7]
- Hasler, B., Tuchman, P., Friedman, D, “Virtual research assistants: Replacing human interviewers by automated avatars in virtual worlds”, Computers in Human Behavior, Vol. 29, No. 4, pp. 1608-1616, 2013. [https://doi.org/10.1016/j.chb.2013.01.004]
- Hayashi, Y., Ono, K, In 2013 IEEE International Workshop on Robot and Human Interactive Communication, pp. 120-125. IEEE Press, Roman, 2013.
- Intercom: Principles of Bot design: Inside Intercom. Available: https://blog.intercom.io/principles-bot-design/
- Kester, L., Kirschner, P., van Merriënboer, J, “The management of cognitive load during complex cognitive skill acquisition by means of computer-simulated problem solving”, British Journal of Educational Psychology, Vol. 75, No. 1, pp. 71-85, 2005. [https://doi.org/10.1348/000709904X19254]
- Kim, Y., Baylor, A., Shen, E, “Pedagogical agents as learning companions: the impact of agent emotion and gender”, Journal of Computer Assisted Learning, Vol. 23, No. 3, pp. 220-234, 2007. [https://doi.org/10.1111/j.1365-2729.2006.00210.x]
- Louvet, J., Duplessis, G., Chaignaud, N., Vercouter, L., Kotowicz, J, “Modeling a collaborative task with social commitments”, Procedia Computer Science, Vol. 112, pp. 377-386, 2017. [https://doi.org/10.1016/j.procs.2017.08.218]
- Mell, J., Lucas, G., Gratch, J, “An effective conversation tactic for creating value over repeated negotiations”, In: the 2015 International Conference on Autonomous Agents and Multiagent Systems, pp. 1567-1576. International Foundation for Autonomous Agents and Multiagent Systems, Istanbul, 2015.
- Microsoft: Principles of bot design. Available: https://docs.microsoft.com/en-us/azure/bot-service/bot-service-design-principles?view=azure-bot-service-4.0
- Novielli, N., de Rosis, F., Mazzotta, I, “User attitude towards an embodied conversational agent: Effects of the interaction mode”, Journal of Pragmatics, Vol. 42, No. 9, pp. 2385-2397, 2010. [https://doi.org/10.1016/j.pragma.2009.12.016]
- Rouse, M, “What is chatbot?” [Online]. Available: https://searchcustomerexperience.techtarget.com/definition/chatbot
- Turunen, M., Hakulinen, J., Ståhl, O., Gambäck, B., Hansen, P., Rodríguez Gancedo, M., de la Cámara, R., Smith, C., Charlton, D., Cavazza, M, “Multimodal and mobile conversational Health and Fitness Companions”, Computer Speech & Language, Vol. 25, No. 2, pp. 192-209, 2011. [https://doi.org/10.1016/j.csl.2010.04.004]
- van der Meij, H., van der Meij, J., Harmsen, R, “Animated pedagogical agents effects on enhancing student motivation and learning in a science inquiry learning environment”, Educational Technology Research and Development, Vol. 63, No. 3, pp. 381-403, 2015. [https://doi.org/10.1007/s11423-015-9378-5]
- Xu, K., Lombard, M, “Persuasive computing: Feeling peer pressure from multiple computer agents”, Computers in Human Behavior, Vol. 74, pp. 152-162, 2017. [https://doi.org/10.1016/j.chb.2017.04.043]
2006: M.A. in Education, Graduate School, Hanyang University
2010: Ph.D. in Education (Instructional Design and E-Learning), Graduate School, Hanyang University
2013–present: Assistant Professor, College of Liberal Arts, Dankook University
※Research interests: E-learning, MOOC (Massive Open Online Course), Adult Learning
2018: M.A. in Education, Graduate School, Hanyang University
2018–present: Ph.D. student, Department of Educational Technology, Graduate School, Hanyang University
※Research interests: Productive Failure, Problem-Solving, ITS (Intelligent Tutoring System)
2018: B.A. in Educational Technology, Hanyang University
2018–present: M.A. student, Department of Educational Technology, Graduate School, Hanyang University
※Research interests: Eye Movement Modeling Examples, Problem-Solving, Multimedia Learning