Storytelling and Reading

Our robots like to tell and read stories with children. When Tega and Jibo are listening to children’s stories, they can tell whether the child has learned a new word or a new sentence structure. When they are telling stories, they are curious to know if the child is engaged and having fun, so they closely monitor the child’s emotions and attention. Tega and Jibo also ask the child questions from time to time, to help children think and learn.

Our robots can figure out which storybook would help each child learn and practice new words and sentences. Every child learns and engages differently, so we want our robots to understand that.

Here is the storybook on a tablet. Our robots don’t have hands to touch the screen like we do, but they can use their brain power to tell the tablet what to do, like flipping pages! Isn’t that cool? The secret is that the robot and the tablet can talk to each other over Wi-Fi.
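
Under the hood, the robot sends short commands to the tablet app over the local Wi-Fi network. The snippet below is only a minimal sketch of that idea: the JSON-over-UDP message format, the tablet address, and the `flip_page` command name are all hypothetical, not the actual protocol used by Tega and Jibo.

```python
import json
import socket

# Hypothetical address of the storybook tablet on the shared Wi-Fi network.
TABLET_ADDRESS = ("192.168.1.42", 9000)

def send_tablet_command(command: str, **params) -> None:
    """Send a small JSON command to the storybook app (illustrative sketch only)."""
    message = json.dumps({"command": command, "params": params}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, TABLET_ADDRESS)

# The robot "flips the page" by sending a command instead of touching the screen.
send_tablet_command("flip_page", direction="next")
```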

Let’s look at the storybook again. You can tap the words and pictures in the storybook, and it will read the word to you. Sometimes Jibo and Tega might ask you, “What are the names of those goats?” and you can tell them “Billy Goats Gruff” or touch the words on the screen. You can also ask the robots, “Hey Jibo, how do you read this word?” and tap the word, and Jibo might help you read it: “Let me try, I think that word is grass!”
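
In the other direction, the tablet can tell the robot which word the child tapped so the robot knows how to respond. This is the same kind of sketch as above, under the same assumptions: the `word_tapped` event name and the `say()` helper standing in for the robot’s speech output are made up for illustration.

```python
import json
import socket

ROBOT_PORT = 9001  # hypothetical port the robot listens on

def say(text: str) -> None:
    """Placeholder for the robot's text-to-speech output."""
    print(f"Robot says: {text}")

def listen_for_tablet_events() -> None:
    """Receive tapped-word events from the storybook app and respond (sketch only)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", ROBOT_PORT))
        while True:
            data, _ = sock.recvfrom(1024)
            event = json.loads(data.decode("utf-8"))
            if event.get("event") == "word_tapped":
                say(f"Let me try, I think that word is {event['word']}!")

if __name__ == "__main__":
    listen_for_tablet_events()
```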

Here is a video of Tega and children telling stories to each other. Just like you, robots get smarter and smarter every time they tell and listen to stories!

Selected Publications:

Hae Won Park, Ishaan Grover, Samuel Spaulding, Louis Gomez, and Cynthia Breazeal. “A Model-free Affective Reinforcement Learning Approach to Personalization of a Social Robot Companion for Early Literacy Education.” In Association for the Advancement of Artificial Intelligence (AAAI), 2019. [Download]

Hae Won Park, Mirko Gelsomini, Jin Joo Lee, and Cynthia Breazeal. “Telling Stories to Robots: The Effect of Backchanneling on a Child’s Storytelling.” In ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2017. [Download]

Word Quest: An interactive robot game for vocabulary learning

The WordQuest game is being designed to help develop a child’s vocabulary. It is a collaborative game with a robot, similar to the classic game “I Spy,” in which a child and a robot take turns. The tablet may show a picture of an object from the scene and ask the player to say what it is, or it may say a word (and show the written word) and have the players take turns finding the picture of that word on the touchscreen tablet.

A child and robot can also explore the scene by looking around it, zooming in and out, and clicking on the various objects, and the tablet will tell them what each object is (and show its written word). The current version of the game has two scenes, “outdoor” and “indoor,” which together contain 50 animated, clickable objects for children to learn. Each game quest (e.g., “Can you find objects that are crimson?”) contains one target vocabulary word (e.g., “crimson”) for the child to learn.

The game is designed to help the child learn the target word’s meaning by taking turns with the robot and finding the objects in the scene that represent that meaning. When four correct objects have been collected, by either the child or the robot, the quest is completed.
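
One way to picture the quest mechanic described above is as a small data structure plus a completion check. The sketch below is illustrative only: the class, field names, and scene data are hypothetical, and the real game is implemented in Unity rather than Python.

```python
from dataclasses import dataclass, field

@dataclass
class Quest:
    """One WordQuest round built around a single target vocabulary word."""
    target_word: str        # e.g. "crimson"
    prompt: str             # e.g. "Can you find objects that are crimson?"
    matching_objects: set   # scene objects that fit the target word's meaning
    collected: set = field(default_factory=set)

    def collect(self, obj: str, player: str) -> bool:
        """Record a pick by the child or the robot; return True if it was correct."""
        if obj in self.matching_objects:
            self.collected.add(obj)
            print(f"{player} found {obj}!")
            return True
        return False

    @property
    def completed(self) -> bool:
        # A quest ends once four correct objects have been collected, by either player.
        return len(self.collected) >= 4

# Hypothetical quest in the "outdoor" scene.
quest = Quest(
    target_word="crimson",
    prompt="Can you find objects that are crimson?",
    matching_objects={"rose", "kite", "ladybug", "mailbox", "apple"},
)
for player, obj in [("child", "rose"), ("robot", "kite"),
                    ("child", "cloud"), ("robot", "ladybug"), ("child", "apple")]:
    quest.collect(obj, player)
print("Quest completed:", quest.completed)
```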

Additional technical information: The game is implemented in the Unity 3D game engine, which supports flexible updates to the learning curriculum. During game play, the robot may switch between “learner” and “tutor” roles to best promote the child’s learning, based on its assessment of the child’s learning progress. The computational model used to guide the robot’s actions is built using reinforcement learning.
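
For intuition, the role switch can be framed as a sequential decision problem. The following is a generic tabular Q-learning sketch, not the model from the published work: the state labels, reward, and hyperparameters are placeholders invented for illustration.

```python
import random
from collections import defaultdict

ROLES = ("learner", "tutor")

class RoleSwitchingAgent:
    """Tabular Q-learning sketch for choosing the robot's role before each quest."""

    def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.2):
        self.q = defaultdict(float)  # (state, role) -> estimated value
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def choose_role(self, state):
        # Epsilon-greedy: mostly pick the best-known role, sometimes explore.
        if random.random() < self.epsilon:
            return random.choice(ROLES)
        return max(ROLES, key=lambda role: self.q[(state, role)])

    def update(self, state, role, reward, next_state):
        best_next = max(self.q[(next_state, r)] for r in ROLES)
        target = reward + self.gamma * best_next
        self.q[(state, role)] += self.alpha * (target - self.q[(state, role)])

# Toy usage: a state might summarize recent learning progress and engagement.
agent = RoleSwitchingAgent()
role = agent.choose_role("low_progress")
# A reward could reflect the child's learning gain after the quest (made up here).
agent.update("low_progress", role, reward=1.0, next_state="improving")
```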

Selected Publications:

Chen, H., Park, H. W., Dong, X., and Breazeal, C. “Active Role Adaptation for Social Robot Learning Companion.” In IEEE International Conference on Robotics and Automation (ICRA), 2019. (in review)

Rhyme Racer: An interactive robot game for practicing phonological awareness

The Rhyme Racer game is designed to help children develop rhyming, phonological awareness, and pronunciation skills. It is a fast-paced, fun, and competitive game with a robot, in which children race the robot to identify and say a word that rhymes with the ‘target’ word in each round. After each round, the game uses information from the previous rounds to update its estimate of the child’s individual rhyming and pronunciation skills, and it adjusts the words in the next round to maximize the child’s learning.
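
As a rough illustration of that round-to-round adaptation, the sketch below picks the next target word from whichever rhyme family the child currently seems weakest on. The word lists, skill scores, and selection rule are hypothetical; the game’s actual curriculum and student model are described in the publications below.

```python
import random

# Hypothetical rhyme families and a running estimate of the child's skill (0 to 1).
rhyme_families = {
    "-at": ["cat", "hat", "bat", "mat"],
    "-og": ["dog", "log", "frog", "fog"],
    "-ake": ["cake", "lake", "snake", "rake"],
}
skill_estimate = {"-at": 0.8, "-og": 0.4, "-ake": 0.6}

def next_target_word() -> str:
    """Pick a target word from the rhyme family the child seems weakest on."""
    weakest_family = min(skill_estimate, key=skill_estimate.get)
    return random.choice(rhyme_families[weakest_family])

def record_round(family: str, correct: bool, rate: float = 0.3) -> None:
    """Nudge the skill estimate toward the observed outcome of the round."""
    skill_estimate[family] += rate * ((1.0 if correct else 0.0) - skill_estimate[family])

print(next_target_word())  # likely a word from "-og", the weakest family so far
record_round("-og", correct=True)
```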

Additional technical information: It is difficult to program robots and games for interactive social behavior during educational co-play, in part because the interaction takes place in a high-dimensional state space with noisy state dynamics and sparse rewards. Moreover, an effective program must balance multiple objectives across different time scales, such as teaching new words, acquiring information about the student to model their knowledge, and keeping the interaction fun and engaging. This project addresses these challenges by developing testbeds for learning algorithms, developing a computational framework that formulates educational co-play as a planning problem, and modeling student knowledge with Gaussian processes whose kernel combines the concept distance and phonetic similarity of children’s responses.
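
To give a flavor of that student-knowledge model, here is a small NumPy sketch that estimates mastery of unseen words from a few observed responses, using a Gaussian-process posterior with a kernel built from concept and phonetic distances. The distance values, length scales, and the way the two similarities are combined are invented for illustration and are not the kernel from the published work.

```python
import numpy as np

words = ["cat", "hat", "dog", "log"]

# Hypothetical pairwise word distances (smaller = more similar), symmetric, zero diagonal.
concept_dist = np.array([
    [0.0, 0.6, 0.3, 0.9],
    [0.6, 0.0, 0.8, 0.7],
    [0.3, 0.8, 0.0, 0.8],
    [0.9, 0.7, 0.8, 0.0],
])
phonetic_dist = np.array([
    [0.0, 0.1, 0.8, 0.7],
    [0.1, 0.0, 0.9, 0.7],
    [0.8, 0.9, 0.0, 0.1],
    [0.7, 0.7, 0.1, 0.0],
])

def kernel(l_concept=0.7, l_phonetic=0.5):
    """Combined kernel: words close in either sense predict similar mastery."""
    return np.exp(-(concept_dist / l_concept) ** 2) * np.exp(-(phonetic_dist / l_phonetic) ** 2)

K = kernel()

# Observations: the child read "cat" correctly (1.0) and missed "log" (0.0).
observed = [words.index("cat"), words.index("log")]
y = np.array([1.0, 0.0])
unseen = [words.index("hat"), words.index("dog")]

# Standard GP posterior mean for the unseen words.
noise = 1e-2
K_oo = K[np.ix_(observed, observed)] + noise * np.eye(len(observed))
K_uo = K[np.ix_(unseen, observed)]
posterior_mean = K_uo @ np.linalg.solve(K_oo, y)

for idx, score in zip(unseen, posterior_mean):
    print(f"Estimated mastery of '{words[idx]}': {score:.2f}")
```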

Selected Publications:

“A Social Robot System for Modeling Children’s Word Pronunciation.” In International Conference on Autonomous Agents and Multiagent Systems (AAMAS), 2018.

“Learning Behavior Policies for Interactive Educational Play.” In Robotics: Science and Systems (RSS) Workshop on Models, Algorithms, and HRI.