Collaborators/Mentors: Takayuki Kanda, Hiroshi Ishiguro, Peter Kahn, Jolina Ruckert, Solace Shen, Heather Gary, Brian Gill
My Role: Speech and Behavior Design, Data Analysis, Literature Review, Research Report, Usability Testing
Timeline: 2013 - 2014 (2 weeks for gestures and eye gaze design)
*All photos belong to the HINTS Lab at the University of Washington
The Challenge
After we had designed the interaction patterns, creativity tasks, and speech scripts, I started creating Robovie's non-verbal behaviors. We specifically needed Robovie's gestures and eye gaze to accomplish three goals:
Guide the interaction, suggesting to participants when to look at the screen, when to work on the Zen Rock Garden, and so on.
Reinforce the robot's primary attitude: encouraging.
Ease tension.
Wizard of Oz Test
Robovie appeared autonomous to participants but was controlled behind the scenes. Robot controllers listened to participants and then delivered instructions to Robovie based on prewritten scripts.
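To give a concrete sense of how such a console can work, here is a minimal Wizard-of-Oz sketch in Python. The script lines, gesture names, and the send_to_robot() helper are hypothetical placeholders, not ATR's actual control interface.

```python
# Minimal Wizard-of-Oz console sketch (illustrative only).
# The operator listens to the participant, then presses a key to fire
# a prewritten line of speech plus its paired gesture.

SCRIPT = {
    "1": {"speech": "Great idea! Tell me more about it.", "gesture": "nod"},
    "2": {"speech": "Let's look at the screen together.", "gesture": "point_screen"},
    "3": {"speech": "Shall we try placing a rock here?", "gesture": "point_garden"},
}

def send_to_robot(speech: str, gesture: str) -> None:
    # Placeholder: in the real setup this would call the robot's control software.
    print(f"[ROBOT] says {speech!r}, performs {gesture!r}")

def run_console() -> None:
    while True:
        key = input("Script line (1-3, q to quit): ").strip()
        if key == "q":
            break
        entry = SCRIPT.get(key)
        if entry:
            send_to_robot(entry["speech"], entry["gesture"])
        else:
            print("No such script line.")

if __name__ == "__main__":
    run_console()
```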
The Robot
I started off by understanding Robovie's technical details.
Storyboards
I visualized the script as storyboards and designed Robovie's behaviors. Based on the problem statement, I sorted them into "must have" behaviors and "nice to have" behaviors.
"Must have" behaviors are essential for moving the interaction forward, while "nice to have" behaviors reinforce Robovie's personality and build rapport between participants and Robovie.
Implementation
Robovie is a Japanese robot developed by researchers at the Advanced Telecommunications Research Institute International (ATR). ATR developed custom robotics control software specifically for Robovie's speech and non-verbal behaviors.
We programmed combination gestures (e.g., pointing, waving, shaking hands) into Robovie. Simple gestures such as "turn the head" and "maintain eye contact" could be triggered by the robot controllers directly from the console.
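ATR's control software is proprietary, so the sketch below only illustrates the general idea of composing a combination gesture from timed joint-angle keyframes. The joint names, angles, and the play_keyframes() helper are assumptions for illustration.

```python
# Sketch of how a combination gesture (e.g., pointing at the screen)
# can be built from joint-angle keyframes. Joint names and angle values
# are hypothetical; ATR's actual software uses its own format.
import time

POINT_AT_SCREEN = [
    # (seconds into gesture, {joint: angle in degrees})
    (0.0, {"right_shoulder_pitch": 0,  "right_elbow": 0,  "head_yaw": 0}),
    (0.8, {"right_shoulder_pitch": 70, "right_elbow": 20, "head_yaw": 30}),
    (2.0, {"right_shoulder_pitch": 70, "right_elbow": 20, "head_yaw": 30}),  # hold the point
    (2.8, {"right_shoulder_pitch": 0,  "right_elbow": 0,  "head_yaw": 0}),   # return to rest
]

def play_keyframes(keyframes) -> None:
    """Step through keyframes, sending each pose to the robot at its timestamp."""
    start = time.time()
    for timestamp, pose in keyframes:
        # Wait until this keyframe is due.
        delay = timestamp - (time.time() - start)
        if delay > 0:
            time.sleep(delay)
        # Placeholder for the actual motor command.
        print(f"t={timestamp:.1f}s -> set joints {pose}")

play_keyframes(POINT_AT_SCREEN)
```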
User Tests
During the pilot test, I discovered some problems and made adjustments accordingly:
"I am unsure if Robovie is paying attention to what I am doing!"
This is a common issue in designing social robots. I studied social robotics pioneer Cynthia Breazeal's work to address it. Specifically, drawing on the theory of joint attention, I designed the robot to look at whatever the participant was looking at. Follow-up interviews with participants confirmed the effectiveness of this design.
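The sketch below shows one way joint-attention gaze can be approximated: given the position of whatever the participant is attending to, compute the head pan and tilt needed to face it. The object coordinates and function names are hypothetical; in our setup the human operator decided when to trigger the gaze shift.

```python
# Sketch of the joint-attention behavior: when the participant attends to a
# known object, the robot turns its head toward the same object.
import math

# Object positions relative to the robot's head, in meters (x forward, y left, z up).
OBJECTS = {
    "screen":      (1.2,  0.8,  0.3),
    "rock_garden": (0.9, -0.5, -0.4),
    "participant": (1.5,  0.0,  0.1),
}

def gaze_angles(target_xyz):
    """Return (pan, tilt) in degrees needed to face the target point."""
    x, y, z = target_xyz
    pan = math.degrees(math.atan2(y, x))                   # left/right
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))   # up/down
    return pan, tilt

def attend_to(object_name: str) -> None:
    pan, tilt = gaze_angles(OBJECTS[object_name])
    # Placeholder for the head-motor command.
    print(f"Look at {object_name}: pan={pan:.1f} deg, tilt={tilt:.1f} deg")

attend_to("rock_garden")  # participant starts placing rocks, so the robot looks too
```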
Participants' attention drifted away after thirty minutes.
Since the design session lasted more than half an hour, we needed techniques to keep participants engaged. We added a joke and paired Robovie's speech with gestures to help participants stay focused.
Outcome
To evaluate the design, we recruited forty-eight undergraduate students aged 18 to 25 and split them into two groups: the "Robovie group" interacted with the robot, and the "control group" went through the same design process without it.
Results showed that, on average, participants in the Robovie condition produced almost twice as many creative expressions as those in the control condition.