Merkouris, A., Garneli, V., & Chorianopoulos, K. (2021). Programming Human-Robot Interactions for Teaching Robotics within a Collaborative Learning Open Space: Robots Playing Capture the Flag Game. In CHI Greece 2021: 1st International Conference of the ACM Greek SIGCHI Chapter (p. 5). New York, NY, USA: Association for Computing Machinery.
Game-based competitive or cooperative robotics activities are an effective approach to exploring the child-robot interaction perspective. However, in most game-based robotics activities, robots act autonomously to achieve the goal. In this work, we aim to promote the child-robot interaction aspect through a multiplayer game in which one team of humans and robots collaborates to compete against another team of humans and cobots. We describe the design of an open space that will allow children to gain access, locally and remotely, and program robotic agents to play the traditional “Capture the Flag” game in a physical stadium-arena. Through this space, we intend to teach robotics, while programming human-robot interfaces, within a computer-supported game-based learning environment. We offer insights into the initial design of such an open space and the educational benefits of its use for the comprehension of abstract computational and STEM concepts.
Merkouris, A., & Chorianopoulos, K. (2019). Programming Embodied Interactions with a Remotely Controlled Educational Robot. ACM Trans. Comput. Educ., 19(4). New York, NY, USA: Association for Computing Machinery.
Contemporary research has explored educational robotics, but it has not examined the development of computational thinking in the context of programming embodied interactions. Apart from the goal of the robot and how the robot will interact with its environment, another important aspect that should be taken into consideration is whether and how the user will physically interact with the robot. We recruited 36 middle school students to participate in a six-session robotics curriculum designed to develop their computational thinking. Participants were asked to develop interfaces for the remote control of a robot using diverse interaction styles, from low-level to high-level embodiment, such as touch, speech, and hand and full-body gestures. We measured students’ perception of computing, examined their computational practices, and assessed the development of their computational thinking skills by analyzing the sophistication of the projects they created during a problem-solving task. We found that students who programmed combinations of low-embodiment interfaces, or interfaces with no embodiment, produced more sophisticated projects and adopted more sophisticated computational practices compared to those who programmed full-body interfaces. These findings suggest that there might be a tradeoff between the appeal and the cognitive benefit of rich embodied interaction with a remotely controlled robot. In future work, educational robotics research and competitions might be complemented with a hybrid approach that blends traditional autonomous robot movement with student enactment.
Merkouris, A., Chorianopoulou, B., Chorianopoulos, K., & Chrissikopoulos, V. (2019). Understanding the Notion of Friction Through Gestural Interaction with a Remotely Controlled Robot. Journal of Science Education and Technology, 28(3), 209–221. Springer.
Embodied interaction with tangible interactive objects can be beneficial for introducing abstract scientific concepts, especially for young learners. Nevertheless, there is limited comparative evaluation of alternative interaction modalities with contemporary educational technology, such as tablets and robots. In this study, we explore the effects of touch and gestural interaction with a tablet and a robot, in the context of a primary education physics course about the notion of friction. For this purpose, 56 students participated in a between-groups study that involved four computationally enhanced interventions, corresponding to different combinations of input and output modalities: (1) touch-virtual, (2) touch-physical, (3) hand gesture-virtual, and (4) hand gesture-physical. We measured students’ friction knowledge and examined their views. We found that the physical conditions had a greater learning impact on friction knowledge than the virtual conditions. Additionally, physical manipulation benefited those learners who had misconceptions or limited initial knowledge about friction. We also found that students who used the more familiar touchscreen interface demonstrated similar learning gains and reported higher usability compared to those using the hand-tilt interface. These findings suggest that user interface familiarity should be carefully balanced with user interface congruency, in order to establish accessibility to a scientific concept in a primary education context.
Merkouris, A., & Chorianopoulos, K. (2018). Programming Touch and Full-Body Interaction with a Remotely Controlled Robot in a Secondary Education STEM Course. In Proceedings of the 22nd Pan-Hellenic Conference on Informatics, PCI ’18 (pp. 225–229). New York, NY, USA: Association for Computing Machinery.
Contemporary research has introduced educational robotics in the classroom, but there are few studies about the effects of alternative embodied interaction modalities on computational thinking and science education. Twenty-six middle school students were asked to program interfaces for controlling the heading and speed of a robot using two types of embodied interaction modalities. We compared touch and full-body gestures to autonomous control, which does not require any embodied interaction. We assessed the development of their computational thinking skills by analyzing the projects they created during a problem-solving task, and examined their understanding of science concepts related to kinematics. We found that novice students preferred full-body interfaces, while advanced students moved toward more disembodied and abstract computational thinking. These findings might help target computing and science education activities to students of the appropriate age and ability levels.
Merkouris, A., & Chorianopoulos, K. (2018). Programming Human-Robot Interactions in Middle School: The Role of Mobile Input Modalities in Embodied Learning. In M. E. Auer & T. Tsiatsos (Eds.), Interactive Mobile Communication Technologies and Learning (pp. 457–464). Cham: Springer International Publishing.
Embodiment within robotics can serve as an innovative approach to attracting students to computer programming. Nevertheless, there is a limited number of empirical studies in authentic classroom environments to support this assumption. In this study, we explored the synergy between embodied learning and educational robotics through a series of programming activities. Thirty-six middle school students were asked to create applications for controlling a robot using diverse interaction modalities, such as touch, speech, and hand and full-body gestures. We measured students’ preferences, views, and intentions. Furthermore, we evaluated students’ interaction modality selections during a semi-open problem-solving task. The results revealed that students felt more confident about their programming skills after the activities. Moreover, participants chose interfaces that were attractive to them and congruent with the programming tasks.
Merkouris, A., Chorianopoulos, K., & Kameas, A. (2017). Teaching Programming in Secondary Education Through Embodied Computing Platforms: Robotics and Wearables. ACM Trans. Comput. Educ., 17(2). New York, NY, USA: Association for Computing Machinery.
Pedagogy has emphasized that physical representations and tangible interactive objects benefit learning especially for young students. There are many tangible hardware platforms for introducing computer programming to children, but there is limited comparative evaluation of them in the context of a formal classroom. In this work, we explore the benefits of learning to code for tangible computers, such as robots and wearable computers, in comparison to programming for the desktop computer. For this purpose, 36 students participated in a within-groups study that involved three types of target computer platform tangibility: (1) desktop, (2) wearable, and (3) robotic. We employed similar blocks-based visual programming environments, and we measured emotional engagement, attitudes, and computer programming performance. We found that students were more engaged by and had a higher intention of learning programming with the robotic rather than the desktop computer. Furthermore, tangible computing platforms, either robot or wearable, did not affect the students’ performance in learning basic computational concepts (e.g., sequence, repeat, and decision). Our findings suggest that computer programming should be introduced through multiple target platforms (e.g., robots, smartphones, wearables) to engage children.
Merkouris, A., & Chorianopoulos, K. (2015). Introducing Computer Programming to Children through Robotic and Wearable Devices. In Proceedings of the Workshop in Primary and Secondary Computing Education, WiPSCE ’15 (pp. 69–72). New York, NY, USA: Association for Computing Machinery.
Learning to program in computer code has been considered one of the pillars of contemporary education, with benefits that reach well beyond the skills required by the computing industry, into creativity and self-expression. Nevertheless, the execution of computer programs usually takes place on a traditional desktop computer, which has a limited repertoire of input and output interfaces to engage the user. On the other hand, pedagogy has emphasized that physical representations and tangible interactive objects benefit learning, especially for young students. In this work, we explore the benefits of learning to code for ubiquitous computers, such as robots and wearable computers, in comparison to programming for the desktop computer. For this purpose, thirty-six students participated in a within-groups study that involved three types of tangibility at the target computer platform: (1) desktop with Scratch, (2) wearable with Arduino LilyPad, and (3) robotic with Lego Mindstorms. Regardless of the target platform, we employed similar desktop visual programming environments (MIT Scratch, Modkit, and Enchanting), and we measured emotional engagement and assessed students’ programming skills. We found that students expressed more positive emotions while programming with the robotic rather than the desktop computer. Furthermore, the tangible computing platforms did not dramatically affect students’ performance in computational thinking.