PRE2019 3 Group2


Research on Idle Movements for Robots


Abstract

Group Members

Name | Study | Student ID
Stijn Eeltink | Mechanical Engineering | 1004290
Sebastiaan Beers | Mechanical Engineering | 1257692
Quinten Bisschop | Mechanical Engineering | 1257919
Daan van der Velden | Mechanical Engineering | 1322818
Max Cornielje | Mechanical Engineering | 1381989

Planning V1

Division of work
The research is performed over a total of eight weeks. During these weeks, two experiments are carried out:
1. Experiment A: .....(survey??);
2. Experiment B:.......


Week Start date To Do & Milestones Responsible team members
1 3 Feb. Determining the subject

Milestones:
1. Choose subject;
2. Make the objectives, approach and task division;
3. Describe the users and the state-of-the-art;
4. Contact the person responsible for the NAO robot;


Responsible members:

1. Everyone;
2. Max and Sebastiaan;
3. Daan, Quinten and Stijn;
4. Max.

2 10 Feb. Setting up experiments

Milestones:
1. Update wiki with feedback;
2. Work out experiment A and experiment B based on literature study;
3. Set up testing facility;
4. Approach test persons.


Responsible members:

1. Sebastiaan;
2. Max and Daan;
3. Stijn and Sebastiaan;
4. Quinten.

3 17 Feb. Doing experiment A

Milestones:
1. Update wiki with feedback
2. Do experiment A.


Responsible members:

1. Stijn;
2. Everyone.

24 Feb. Break: Buffer for unfinished work in weeks 1-3
4 2 March Process experiment A and start preparing for the NAO robot

Milestones:
1. Update wiki with feedback;
2. Process the data from experiment A;
3. Further workout experiment B;
4. Start preparing to work with the NAO robot (software, etc.).

Responsible members:

1. Quinten;
2. Stijn;
3. Daan and Max;
4. Sebastiaan.

5 9 March Finalize preparation of experiment B and test the NAO robot

Milestones:
1. Update wiki with feedback;
2. Receive the NAO robot;
3. Test the NAO robot and make sure it can perform the required movements;
4. Finalize experiment B.


Responsible members:

1. Max;
2. Everyone;
3. Sebastiaan and Max;
4. Quinten and Stijn.

6 16 March Performing experiment B and processing results

Milestones:
1. Update wiki with feedback;
2. Perform experiment B;
3. Process results from experiment B.

Responsible members:

1. Daan;
2. Sebastiaan, Quinten and Max;
3. Stijn and Daan.

7 23 March Evaluate experiments, draw conclusions and work on wiki

Milestones:
1. Update wiki with feedback;
2. Conclusion and evaluation of experiment A;
3. Conclusion and evaluation of experiment B;
4. Determine follow up study;
5. Work on wiki.

Responsible members:

1. Sebastiaan;
2. Quinten and Max;
3. Stijn and Daan;
4. Sebastiaan;
5. Everyone.

8 30 March Finalize wiki and final presentation

Milestones:
1. Finalize wiki;
2. Prepare final presentation;
3. Peer review.

Responsible team members:

1. Sebastiaan, Quinten, Daan and Max;
2. Stijn;
3. Everyone.

Introduction

Problem statement

In an ideal world, and in the future, robots will interact with humans in a socially intelligent way: robots will demonstrate human-like social intelligence, and non-experts will no longer be able to distinguish robots from human agents. To accomplish this, robots need to develop much further. Not only does the social intelligence of robots need to increase considerably, but so does the way they move. Nowadays, robots do not move the way humans do. For instance, when moving an arm to grab something, humans tend to overshoot a bit, whereas a robot specifies the target and moves along the shortest path to the object. Humans take the path of least resistance, which means they also use their surroundings to reach their target, for example by leaning on a table to cancel out the gravitational force, and they use their joints more than robots do. Another big obstacle to making a robot's motion look human is idle movement. For humans, and every other living creature, it is physically impossible to stand precisely still. Robots, however, stand completely lifeless when not in action. This creates a problem for the interaction between the robot and the corresponding person: it is unclear whether the robot is turned on and able to respond, and it also feels unnatural. In addition, humans are usually doing or holding something while having a conversation. To improve the interaction between humans and robots, human idle movements have to be examined: which idle movements are most beneficial for human-robot interaction, and which are realistic and manageable for a robot to perform without looking too strange. In this research, we will address these questions by observing human idle movements, testing robot idle movements, and investigating which idle movements participants prefer.

Objectives

It is still very hard to make a robot appear natural and human-like, as robots tend to come across as static (whether they move or not). Since a more natural human-robot interaction is desired, the behavior of social robots needs to be improved on different levels. In this project, the main focus will be on the movements that make a robot appear more natural and lifelike: the idle movements. The objective of the research is to find out which idle movements make a robot appear more natural. The research builds on previous work on this subject, and more information will be gathered through interviews and surveys to get the opinion of possible future users (these users are described in the chapter 'Users'). Next to this, experiments will be performed with the NAO robot: the NAO robot will perform different idle movements and future users give their responses to these movements. The acquired responses provide data that can be used to determine which idle movements make a robot appear more lifelike. Since humanoid robots will keep improving, expectations about the most important idle movements in the future will also be given. Altogether, we hope that this information gives greater insight into the use of idle movements on humanoid robots, so it can be used in future research and projects on this subject.

Users

Approach, Milestones and Deliverables

Approach

To gain knowledge about idle movements, observations have to be made of humans. These observations can be done on people walking (or standing) around public spaces, such as the university campus or the train. Furthermore, videos can be watched on, for example, YouTube, as people will in general show the same idle movements on camera as off camera. The noticed idle movements can be listed in a table and tallied. By counting the different idle movements of humans, the most important and perhaps most accepted ones will come forward.
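As a minimal sketch of how the tallied observations could be processed, the snippet below counts observation labels with Python's standard library. The movement names and counts are illustrative placeholders, not results of the actual observations.

```python
from collections import Counter

# Hypothetical observation log: one label per observed idle movement.
# The movement names below are placeholders, not real observation data.
observations = [
    "shifting_weight", "touching_hair", "crossing_arms",
    "shifting_weight", "looking_around", "shifting_weight",
    "touching_hair", "checking_phone",
]

# Tally how often each idle movement was observed.
tally = Counter(observations)

# List the movements from most to least frequent; this ranking can be used
# to select candidate idle movements for the surveys and the NAO experiment.
for movement, count in tally.most_common():
    print(movement, count)
```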

Not to forget that various research in this area has already been done. Papers, such as those by R. Cuijpers and/or E. Torta, contain a lot of information about which idle movements are considered important. Therefore, it is important to read the state-of-the-art papers carefully. The state of the art will be explained briefly in the chapter ‘State of the Art’.

Surveys will also be used. These surveys contain tick boxes for different idle movements; the respondent has to tick the box of every movement they perform. The list of idle movements will be based on the observations and on the state-of-the-art papers. Next to that, a blank space is left so respondents can fill in an idle movement of their own. This way, overlooked idle movements can be taken into account as well. A sketch of how such responses could be aggregated is given below.
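To illustrate the aggregation of completed surveys, the sketch below assumes each filled-in survey is stored as a row in a CSV file with one column per tick box plus a free-text 'other' column; the file name and column names are hypothetical.

```python
import csv
from collections import Counter

# Hypothetical tick-box columns (values "1" if ticked, "0" otherwise),
# plus a free-text "other" column for movements the list overlooked.
MOVEMENT_COLUMNS = ["shifting_weight", "touching_hair", "crossing_arms"]

counts = Counter()
other_answers = []

with open("survey_responses.csv", newline="") as f:  # placeholder file name
    for row in csv.DictReader(f):
        for column in MOVEMENT_COLUMNS:
            if row.get(column) == "1":      # the box for this movement was ticked
                counts[column] += 1
        if row.get("other", "").strip():    # respondent wrote an extra movement
            other_answers.append(row["other"].strip())

# Most frequently ticked movements and any overlooked ones suggested by respondents.
print(counts.most_common())
print(other_answers)
```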

Interviews can also give clear insight into the importance of idle movements. The interviews will be held with personnel of the TU/e; this personnel should have knowledge about social robots and their purpose. Staff members with broad knowledge of social robots include R. Cuijpers, M. Steinbuch, E. Torta and E. Barakova. The questions that can be asked are:

  • How do you see the future of social robots?
  • What do you think that the importance is of idle movements in this future?
  • What idle movement do you think is the best?
  • Do you think that the technology is already here to achieve idle movements?
  • ………
  • ………

By asking these questions, a clear picture of their opinions emerges. These opinions can be used to adjust the view on the users and the state of the art. The interviewed staff might give answers that contradict what has been found, but that only yields interesting contradictions which are useful for the research.

At last, an experiment can be done using the NAO robot. The experiment makes use of a large number of participants (which will be the users, see ‘Users’). The NAO robot will hold a conversation with the participant for x amount of time. This is done multiple times (depending on the number of idle movements used): once with the NAO robot not using any idle movements, and then with a different idle movement for every following conversation. The idle movements used will be based on the research listed above and are presented in the same order for every participant. After each conversation, the participant fills in the Godspeed questionnaire [1]. The Godspeed questionnaire also includes a question about animacy, which should also be answered on a scale from 1 to 10. Using the data of this experiment, a diagram can be made of the participants' responses to the various idle movements, from which the best idle movement can be determined. The result may also be that a combination of idle movements comes out best. A sketch of how such a session could be scripted on the NAO robot is given below.
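As a sketch of how such a session could be scripted, the snippet below uses the NAOqi Python SDK (ALProxy) to run example conditions: the robot speaking without idle movements and the robot speaking with a simple idle behaviour enabled. The robot's IP address, the spoken sentence, the chosen joint and the angle values are assumptions for illustration, not the final experiment design.

```python
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"   # placeholder address of the NAO robot on the network
PORT = 9559                 # default NAOqi port

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
motion = ALProxy("ALMotion", ROBOT_IP, PORT)
posture = ALProxy("ALRobotPosture", ROBOT_IP, PORT)

# Stiffen the motors and bring the robot to a standing posture before the session.
motion.wakeUp()
posture.goToPosture("Stand", 0.5)

def condition_no_idle():
    # Baseline condition: the robot speaks while standing completely still.
    motion.setBreathEnabled("Body", False)
    tts.say("Hello, my name is NAO. How are you today?")

def condition_breathing_idle():
    # Idle-movement condition: NAOqi's built-in breathing animation makes the
    # body sway slightly while the robot speaks.
    motion.setBreathEnabled("Body", True)
    tts.say("Hello, my name is NAO. How are you today?")

def condition_head_idle():
    # Idle-movement condition: a small, slow head turn as a simple idle gesture.
    # The angle (radians) and speed fraction are illustrative values.
    motion.setAngles("HeadYaw", 0.2, 0.05)
    tts.say("Hello, my name is NAO. How are you today?")
```

After each condition, the participant would be handed the Godspeed questionnaire before the next conversation starts.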


State of the Art

  1. Torta, E. (2014). Approaching independent living with robots. Eindhoven: Technische Universiteit Eindhoven [2]
  2. Waldemar Karwowski (2007). Worker selection of safe speed and idle condition in simulated monitoring of two industrial robots [3]
  3. Raymond H. Cuijpers, Marco A. M. H. Knops (2015). Motions of Robots Matter! The Social Effects of Idle and Meaningful Motions [4]
  4. Toru Nakata, Tomomasa Sato and Taketoshi Mori (1998). Expression of Emotion and Intention by Robot Body Movement [5]
  5. Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono (2003). Body Movement Analysis of Human-Robot Interaction [6]
  6. Thibault Asselborn, Wafa Johal and Pierre Dillenbourg (2017). Keep on moving! Exploring anthropomorphic effects of motion during idle moments [7]
  7. Cooney, M., Kanda, T., Alissandrakis, A., & Ishiguro, H. (2014). Designing enjoyable motion-based play interactions with a small humanoid robot. International Journal of Social Robotics, 6(2), 173-193. [8]
  8. Kocoń, M., & Emirsajłow, Z. (2012). Modelling the idle movements of human head in three-dimensional virtual environments. Pomiary Automatyka Kontrola, 58(12), 1121-1123. [9]
  9. Beck, A., Hiolle, A., & Canamero, L. (2013). Using perlin noise to generate emotional expressions in a robot. In Proceedings of the Annual Meeting of the Cognitive Science Society (Vol. 35, No. 35). [10]
  10. Satake, S., Kanda, T., Glas, D. F., Imai, M., Ishiguro, H., & Hagita, N. (2009, March). How to approach humans? Strategies for social robots to initiate interaction. In Proceedings of the 4th ACM/IEEE international conference on Human robot interaction (pp. 109-116). [11]
  11. Obaid, M., Sandoval, E. B., Złotowski, J., Moltchanova, E., Basedow, C. A., & Bartneck, C. (2016, August). Stop! That is close enough. How body postures influence human-robot proximity. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 354-361). IEEE. [12]
  12. Rosenthal-von der Pütten, A. M., Krämer, N. C., & Herrmann, J. (2018). The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, 10(5), 569-582.[13]
  13. [14]
  14. Srinivasan, V., Murphy, R. R., & Bethel, C. L. (2015). A reference architecture for social head gaze generation in social robotics. International Journal of Social Robotics, 7(5), 601-616.[15]
  15. Jung, J., Kanda, T., & Kim, M. S. (2013). Guidelines for contextual motion design of a humanoid robot. International Journal of Social Robotics, 5(2), 153-169.[16]
  16. Straub, I. (2016). ‘It looks like a human!’The interrelation of social presence, interaction and agency ascription: a case study about the effects of an android robot on social agency ascription. AI & society, 31(4), 553-571.[17]
  17. Song, H., Min Joong, K., Jeong, S.-H., Hyen-Jeong, S., Dong-Soo K.: (2009). Design of Idle motions for service robot via video ethnography. In: Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2009), pp. 195–99 [18]
  18. Streck, A., Wolbers, T.: (2018). Using Discrete Time Markov Chains for Control of Idle Character Animation 17(8). [19]
  19. Kofinas, N., Orfanoudakis, E., Lagoudakis, M.,: (2014). Complete Analytical Forward and Inverse Kinematics for the NAO Humanoid Robot, 31(1), pp. 251-264 [20]
  20. Zhang, M., Chen, J., Wei, X., Zhang, D.: (2018). Work chain‐based inverse kinematics of robot to imitate human motion with Kinect, 7(8). [21]
  21. Zhu, M., Sun, H., Lan, R., Li, B.: (2011). Human motion retrieval using topic model, 4(10), pp. 469-476. [22]
  22. Aggarwal, J., Cai, Q.: (1999). Human Motion Analysis: A Review, 1(3), pp. 428-440. [23]

References

  1. Bartneck, C., Kulić, D., Croft, E. and Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International Journal of Social Robotics 1(1), 71–81.
  2. https://pure.tue.nl/ws/portalfiles/portal/3924729/766648.pdf
  3. https://www.tandfonline.com/doi/abs/10.1080/00140139108967335
  4. https://www.researchgate.net/publication/281841000_Motions_of_Robots_Matter_The_Social_Effects_of_Idle_and_Meaningful_Motions
  5. https://pdfs.semanticscholar.org/9921/b7f11e200ecac35e4f59540b8cf678059fcc.pdf
  6. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.98.3393&rep=rep1&type=pdf
  7. https://www.researchgate.net/publication/321813854_Keep_on_moving_Exploring_anthropomorphic_effects_of_motion_during_idle_moments
  8. https://link.springer.com/article/10.1007/s12369-013-0212-0
  9. https://www.infona.pl/resource/bwmeta1.element.baztech-article-BSW4-0125-0022
  10. https://escholarship.org/content/qt4qv84958/qt4qv84958.pdf
  11. https://dl.acm.org/doi/pdf/10.1145/1514095.1514117
  12. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7745155
  13. https://link.springer.com/article/10.1007/s12369-018-0466-7
  14. https://link.springer.com/content/pdf/10.1007/s10846-013-0015-4.pdf
  15. https://link.springer.com/article/10.1007/s12369-015-0315-x
  16. https://link.springer.com/article/10.1007/s12369-012-0175-6
  17. https://link.springer.com/article/10.1007/s00146-015-0632-5
  18. https://ieeexplore.ieee.org/abstract/document/5326062
  19. https://ieeexplore.ieee.org/document/8490450
  20. https://link.springer.com/article/10.1007/s10846-013-0015-4
  21. https://onlinelibrary.wiley.com/doi/epdf/10.4218/etrij.2018-0057
  22. https://onlinelibrary.wiley.com/doi/pdfdirect/10.1002/cav.432
  23. https://www.sciencedirect.com/science/article/pii/S1077314298907445