Related Literature Group 4

From Control Systems Technology Group

Revision as of 09:35, 24 April 2021 by 20182838



Problem Statement

  • In the paper by Xiao et al., the impact of working from home on social, behavioural and physical well-being during COVID-19 was researched. They distributed a questionnaire from which 988 valid responses were gathered; the sample had an average age of 40.9. They found that working from home full-time can contribute to mental issues, such as isolation and depression, for people who live alone, because these people do not have face-to-face interactions and do not receive social support from people living in the same home. [1]
  • The article also found that working from home can cause work-family conflict, meaning that it is hard for people to separate work and family because the boundaries between them become blurred. Most participants had a hard time keeping to their work schedules because they could, for example, be interrupted by family members. Emotional exhaustion is a possible result of this ongoing work-family conflict. [1]
  • Additionally, the researchers found that physical health problems can arise from working from home. These problems can arise, for example, because employees do not have the ability to walk around in an office space, or outside in between meetings. Additionally, the high exposure to computer screens can result in fatigue, headaches and eye-related symptoms. [1]
  • The goal of the following study was to investigate the prevalence of unhealthy behaviour before and during the COVID-19 quarantine amongst Brazilian adults. In total, data of 38,535 adults were gathered. Participants had to report the frequency of certain feelings, such as sadness and happiness. Additionally, they were asked to report the frequency and duration of their physical activities and their TV and computer/tablet use from before and during the COVID-19 pandemic. They found that, because of quarantine, people showed an increase in sedentary behaviour (more than 8 hours per day) and a decrease in physical activity. This can affect cardiovascular, metabolic and mental health, which all increases the risk of mortality. Furthermore, they found that the unhealthy behaviour was also associated with feelings of loneliness, sadness and anxiety. [2]
  • The paper researched the constraints of mandatory home education due to the COVID-19 pandemic, from the perspective of the parents of first to ninth graders. It was found that parents especially face time, expertise and technical restrictions. To alleviate these, parents suggested more interaction with the teacher, for both the children and the parents. [3]

State of the Art

  • There already exists a platform, called “X5Learn”, that helps its users (both students and teachers) with online education. It can help students, for instance, by providing personal recommendations and adapting to their individual learning preferences. For teachers, it enables collaboration on learning resources. What makes this platform special is that it combines human-centered design, AI, and software tools. In this way, it makes sure that the service is easy, intuitive, and transparent for its users. [4]
  • The goal of the study by Kessens et al. [5] was to develop a personal computer assistant that helps children adhere to performing daily activities and living healthily. This personal computer assistant had three roles, namely a companion, educator and motivator role. As a companion, it gives emotional support and allows the children to play with it. In the educator role, it can teach and explain. In the motivator role, it can encourage the children to adhere to a healthy lifestyle and teach them that adherence is important. The participants of the experiment with the assistant were children of 8 and 9 years old. In total there were 18 participants, of which 8 were female. The study showed that the more human-like the interaction with the computer assistant was (for example, using different emotional expressions), the more persuasive, engaging and fun the interaction between the computer and the child was. The computer assistant showed potential to increase motivation and self-performance management amongst the children compared to when they did not use the assistant. Also, the assistant was able to reduce the BMI of the users. Both children and adults enjoyed the computer assistant.
  • The paper by Chou, Chan, & Lin [6] discusses the history of learning agents and their potential (both positive and negative) for the future. Educational agents could help to improve a social learning environment; however, their complexity makes development expensive and difficult. The paper also addresses the classification of educational agents, the larger group of software that supports social learning through a human approach. A learning companion falls within this branch and is defined as a “computer-simulated character, which has human characteristics and plays a non-authoritative role in a social learning environment”.
  • In the paper by Cambo, Avrahami and Lee [7], it was researched why work breaks are important and how to motivate users to increase physical activity. They aim to achieve this by wrapping the break in a playful interaction, for which they came up with an application called BreakSense. They decided that it is not necessary for the device to interrupt the user, since it is better for the user to self-interrupt and initiate a break themselves. Using the device helped participants incorporate physical activity into their daily routine.

  • In a meta-analysis of 83 papers, the kinds of chatbots that exist are described, as well as the history and evaluation of chatbots and how humans perceive and experience their interaction with them. The authors also mention areas of research that future studies can focus on. [8]
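To make the text-based chatbots surveyed above concrete, a minimal rule-based chatbot could be sketched as follows. This is an illustrative sketch only; the keyword patterns and replies are invented and do not come from any of the cited papers:

```python
import re

# Minimal rule-based chatbot: each rule maps a keyword pattern to a reply.
# All patterns and replies are invented for illustration only.
RULES = [
    (re.compile(r"\b(hello|hi)\b", re.I), "Hi! How is your study session going?"),
    (re.compile(r"\b(tired|fatigued)\b", re.I), "Maybe it is time for a short break?"),
    (re.compile(r"\b(lonely|alone)\b", re.I), "I am here to keep you company."),
]
DEFAULT_REPLY = "Tell me more about that."

def reply(message: str) -> str:
    """Return the reply of the first rule whose pattern matches the message."""
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return DEFAULT_REPLY
```

Modern chatbots in the literature are of course far more sophisticated (retrieval- or model-based), but the rule-based form above is the historical starting point the meta-analysis traces.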


  • It is possible to program a chatbot in such a way that it interacts with you via a moving and talking avatar. The paper also suggests that displaying facial expressions is beneficial in the interaction with such an agent, since displaying the avatar’s emotion can enhance its perceived emotional intelligence. [9]
  • Looije et al. [10] researched the guidelines that are needed when developing a personal assistant. These guidelines were derived from interviews, persuasive technologies and existing guidelines for personal assistants. In their research they found that the guidelines were best expressed by an iCat (a personal assistant) that was able to show socially intelligent behaviour, compared to a non-social or text-interface-based iCat.
  • A study by Go and Sundar [11] states that revealing the identity of a chatbot as non-human can have a positive effect: users will have lower expectations about the conversation, and will be impressed when an agent shows human-like behaviour. Furthermore, they emphasize the importance of the conversational style between a human and a computer. When the dialogue resembles that of an actual human, perceived feelings of social presence and homophily will increase, leading to more positive attitudes towards the agent (and in turn potentially desired behavioural consequences).
  • Higher feelings of social presence can be achieved when an agent’s language usage shows a consistent personality, which can be either introverted or extraverted. [12]
  • Virtual agents that communicate in a personalized way (using “I” and “you”) behave more human-like and therefore gain more social fidelity. This also leads to increased feelings of social presence and better learning performance and motivation. [12][13]
  • The previous statement is supported by Araujo [14]. In his paper, he showed that social presence increased when the machine showed a more intelligent interaction style.
  • Social fidelity is related to certain human-like behaviours and cues. This includes, for example, speech content (personalized language, feedback, politeness, social memory, personality) and visual cues (facial expressions, gestures, gaze, emotions). [12]
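The speech-content cues above (personalized language using “I” and “you”, and a personality that stays consistent across utterances) can be sketched as a small message generator. The phrase templates and class design below are invented for illustration and are not taken from the cited papers:

```python
# Sketch of two speech-content cues for social fidelity:
# 1) personalized language ("I"/"you"/the user's name), and
# 2) a fixed introverted or extraverted personality, so wording stays
#    consistent across the whole interaction.
# All phrasings are illustrative placeholders.
PHRASES = {
    # An extraverted agent uses enthusiastic, talkative wording.
    "extravert": {
        "greet": "Hey {name}! I am really glad to see you!",
        "encourage": "You are doing great, {name}! I knew you could do it!",
    },
    # An introverted agent uses calmer, more reserved wording.
    "introvert": {
        "greet": "Hello {name}. I am glad you are here.",
        "encourage": "You are making steady progress, {name}.",
    },
}

class Agent:
    def __init__(self, personality: str, user_name: str):
        assert personality in PHRASES
        self.personality = personality  # fixed, so the language stays consistent
        self.user_name = user_name

    def say(self, intent: str) -> str:
        """Fill the personality-specific template with the user's name."""
        return PHRASES[self.personality][intent].format(name=self.user_name)
```

Keeping the personality a fixed attribute, rather than varying it per message, mirrors the finding that a *consistent* personality is what raises social presence.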


  • In the study of Niemi and Kousa [15], students (16-18 years old) and teachers of an upper secondary school were asked about their perceptions of the distance teaching due to COVID-19. This was done with the use of four questionnaires (56 to 72 students and 9 to 15 teachers). Overall, the distance teaching was considered to be done well. Students did, however, feel a heavier workload and more fatigued, and some also indicated a loss of motivation. Teachers did not recognize these problems.
  • Apparently, the mere presence of a lifelike character in an online learning environment can have a strong positive influence on the perceived learning experience of students (around the age of 12). Adding such an interactive agent to the learning process can make it more fun, and the agent is perceived as helpful and credible. [16]
  • One has to be aware that there are both a lower and an upper bound to the proactiveness of a virtual agent. Agents that are too present can quickly be perceived as irritating and intrusive. [16]
  • Sharma et al. [17] explain that, in combination with eye-tracking, AI can also be used to give individual feedback. In addition, it can help to predict the learning outcomes of individual students.
  • The following study was aimed at measuring the effectiveness of learning chatbot systems on student performance. In total, 72 students of a university in Tandojam participated in the study. These students were divided into two groups: one group was able to use the Google search engine, and the other group was able to use the chatbot system to find solutions to their problems. The study found that learning through the chatbot had a significant impact on memory retention and learning outcomes. [18]
  • A paper by Grover et al. [19] described an experiment in which two chatbots were compared, one of which had a face and appeared to be more emotionally intelligent. Results suggested that the emotionally expressive agent can lead participants (average age 33) to be more productive and focused during working hours. The participants also reported feeling more satisfied with their achievements. Furthermore, the participants suggested valuable improvements, like intelligent task scheduling and a distraction monitoring system.
  • The study of Baethge and Rigotti [20], performed with 133 nurses, investigates how interruptions during work influence the perception of performance and irritation. It shows that interruptions are negatively correlated with satisfaction with one’s performance, and positively correlated with forgetting what one was doing and feeling irritated.
  • The study of Borst et al. [21] tries to make a first step towards an integrated theory of task interruptions, since in earlier research several factors had been shown to influence the effect of an interruption: the duration of the interruption, the complexity of the interrupting task and the moment of interruption. Their study confirmed that problem state requirements influence disruptiveness. Interfaces should therefore interrupt at low problem-state moments, and should also maintain the problem state for the user when they are interrupted by a secondary task.
  • Henning et al. [22] studied the influence of short breaks on computer operators. At larger work sites no improvements were found, but at smaller work sites well-being and productivity improved when exercises were included in the short breaks. Extra 3-minute breaks from computer work were preferred over 30-second breaks each hour.
  • Artificial agents have been used to provide social support. However, as mentioned by Ta et al. [23], most research in this area has not focused on everyday life; instead, experiments are performed, for instance, under high levels of stress, which might give different results than an educational or work setting. They therefore investigated the potential of artificial agents in everyday life. Their research seems to indicate that such agents can also be useful in these situations. These findings can thus be applied to the AI companion robot, which should provide social support and combat loneliness.
  • Odekerken-Schröder et al. [24] study the role of the companion robot Vector during the COVID-19 pandemic. They found that companion robots can help reduce feelings of loneliness by building supportive relationships.
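The interruption and break findings above suggest a simple policy for a break-prompting agent: a break prompt is issued only when enough work time has elapsed (in the spirit of Henning et al.'s hourly breaks) and the user's current problem-state load is low (following Borst et al.). The thresholds and the load measure below are invented placeholders, not values from the cited studies:

```python
# Sketch of a break-prompt policy combining the findings above:
# prompt only when a break is due AND interrupting is cheap.
# Thresholds are invented placeholders for illustration.
WORK_INTERVAL_MIN = 60.0    # minutes of work before a break is due
LOW_LOAD_THRESHOLD = 0.3    # problem-state load in [0, 1] considered "low"

def should_prompt_break(minutes_since_break: float,
                        problem_state_load: float) -> bool:
    """Return True only when a break is due and problem-state load is low."""
    break_due = minutes_since_break >= WORK_INTERVAL_MIN
    low_load = problem_state_load <= LOW_LOAD_THRESHOLD
    return break_due and low_load
```

Note that the policy only *offers* a prompt at a cheap moment rather than forcing an interruption, which is consistent with Cambo et al.'s finding that users prefer to self-interrupt and initiate breaks themselves.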


  1. Xiao, Y., Becerik-Gerber, B., Lucas, G., & Roll, S. C. (2021). Impacts of Working From Home During COVID-19 Pandemic on Physical and Mental Well-Being of Office Workstation Users. Journal of Occupational and Environmental Medicine, 63(3), 181–190.
  2. Werneck, A. O., Silva, D. R., Malta, D. C., Souza-Júnior, P. R. B., Azevedo, L. O., Barros, M. B. A., & Szwarcwald, C. L. (2021). Changes in the clustering of unhealthy movement behaviors during the COVID-19 quarantine and the association with mental health indicators among Brazilian adults. Translational Behavioral Medicine, 11(2), 323–331.
  3. Brom, C., Lukavský, J., Greger, D., Hannemann, T., Straková, J., & Švaříček, R. (2020). Mandatory Home Education During the COVID-19 Lockdown in the Czech Republic: A Rapid Survey of 1st-9th Graders’ Parents. Frontiers in Education, 5.
  4. Perez-Ortiz, M., Dormann, C., Rogers, Y., Bulathwela, S., Kreitmayer, S., Yilmaz, E., Noss, R., & Shawe-Taylor, J. (2021). X5Learn: A Personalised Learning Companion at the Intersection of AI and HCI. 26th International Conference on Intelligent User Interfaces, 70–74.
  5. Kessens, J. M., Neerincx, M. A., Looije, R., Kroes, M., & Bloothooft, G. (2009). Facial and vocal emotion expression of a personal computer assistant to engage, educate and motivate children. Proceedings - 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009.
  6. Chou, C. Y., Chan, T. W., & Lin, C. J. (2003). Redefining the learning companion: The past, present, and future of educational agents. Computers and Education, 40(3), 255–269.
  7. Cambo, S. A., Avrahami, D., & Lee, M. L. (2017). BreakSense: Combining physiological and location sensing to promote mobility during work-breaks. Conference on Human Factors in Computing Systems - Proceedings, 2017-May, 3595–3607.
  8. Rapp, A., Curti, L., & Boldi, A. (2021). The human side of human-chatbot interaction: A systematic literature review of ten years of research on text-based chatbots. International Journal of Human Computer Studies, 151, 102630.
  9. Angga, P. A., Fachri, W. E., Elevanita, A., Suryadi, & Agushinta, R. D. (2016). Design of chatbot with 3D avatar, voice interface, and facial expression. Proceedings - 2015 International Conference on Science in Information Technology: Big Data Spectrum for Future Information Economy, ICSITech 2015, 326–330.
  10. Looije, R., Cnossen, F., & Neerincx, M. A. (2006). Incorporating guidelines for health assistance into a socially intelligent robot. Proceedings - IEEE International Workshop on Robot and Human Interactive Communication, 515–520.
  11. Go, E., & Sundar, S. S. (2019). Humanizing chatbots: The effects of visual, identity and conversational cues on humanness perceptions. Computers in Human Behavior, 97, 304–316.
  12. Sinatra, A. M., Pollard, K. A., Files, B. T., Oiknine, A. H., Ericson, M., & Khooshabeh, P. (2021). Social fidelity in virtual agents: Impacts on presence and learning. Computers in Human Behavior, 114, 106562.
  14. Araujo, T. (2018). Living up to the chatbot hype: The influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Computers in Human Behavior, 85, 183–189.
  15. Niemi, H. M., & Kousa, P. (2020). A Case Study of Students’ and Teachers’ Perceptions in a Finnish High School during the COVID Pandemic. International Journal of Technology in Education and Science, 4(4), 352–369.
  16. Lester, J. C., Barlow, S. T., Converse, S. A., Stone, B. A., Kahler, S. E., & Bhogal, R. S. (1997). Persona effect: Affective impact of animated pedagogical agents. Conference on Human Factors in Computing Systems - Proceedings, 359–366.
  17. Sharma, K., Giannakos, M., & Dillenbourg, P. (2020). Eye-tracking and artificial intelligence to enhance motivation and learning. Smart Learning Environments, 7(1).
  18. Abbasi, S., & Kazi, H. (2014). Measuring effectiveness of learning chatbot systems on Student’s learning outcome and memory retention. In Asian Journal of Applied Science and Engineering (Vol. 3).
  19. Grover, T., Rowan, K., Suh, J., McDuff, D., & Czerwinski, M. (2020). Design and evaluation of intelligent agent prototypes for assistance with focus and productivity at work. International Conference on Intelligent User Interfaces, Proceedings IUI, 20, 390–400.
  20. Baethge, A., & Rigotti, T. (2013). Interruptions to workflow: Their relationship with irritation and satisfaction with performance, and the mediating roles of time pressure and mental demands. Work & Stress, 27(1), 43–63.
  21. Borst, J. P., Taatgen, N. A., & Van Rijn, H. (2015). What makes interruptions disruptive? A process-model account of the effects of the problem state bottleneck on task interruption and resumption. Conference on Human Factors in Computing Systems - Proceedings, 2015-April, 2971–2980.
  22. Henning, R. A., Jacques, P., Kissel, G. V., Sullivan, A. B., & Alteras-Webb, S. M. (1997). Frequent short rest breaks from computer work: Effects on productivity and well-being at two field sites. Ergonomics, 40(1), 78–91.
  23. Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., DeCero, E., & Loggarakis, A. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research, 22(3).
  24. Odekerken-Schröder, G., Mele, C., Russo-Spena, T., Mahr, D., & Ruggiero, A. (2020). Mitigating loneliness with companion robots in the COVID-19 pandemic and beyond: an integrative framework and research agenda. Journal of Service Management, 31(6), 1149–1162.