Pepper's interactive UI
Pepper is the world's first social humanoid robot able to recognize faces and basic human emotions. He was optimized for human interaction and can engage with people through conversation and his touch screen.
About the project
This is an ongoing project that a friend and I are conducting with Ericsson as part of their student initiative. We were challenged with the task of looking into the human-robot interaction that takes place when people come to talk to Pepper - an interesting and exciting project that we now pursue in our spare time.
So, what's up with Pepper?
Pepper can almost be seen as an Ericsson employee these days, complete with his own name tag, and he often represents the company at events, exhibitions and fairs. He draws a lot of attention and gets to interact with and meet many curious people. However, in large halls or crowded rooms, Pepper has great trouble distinguishing and understanding what people are saying. This is the issue we want to address.
What are we going to do?
At the moment, when Pepper is representing Ericsson at events, the tablet display on his chest is not used at all - you can only interact by talking. Since this is not feasible in loud environments, we are going to design a simple concept for how Pepper's screen can be used to facilitate interaction through touch. The objective is to look into which topics are worth offering for conversation and what such a UI might look like.
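To give a feel for what using the tablet could look like technically, here is a minimal sketch of displaying a web-based touch UI on Pepper's chest screen. It assumes the NAOqi Python SDK (qi) and the ALTabletService API; the robot address and the UI page URL are placeholders, and the actual touch UI content would live in the web page itself - this is a sketch of the idea, not our final design.

```python
import qi

ROBOT_URL = "tcp://<pepper-ip>:9559"              # placeholder: Pepper's network address
UI_URL = "http://<our-server>/pepper-ui/index.html"  # placeholder: hypothetical touch UI page

def show_touch_ui():
    """Connect to Pepper and show a web-based touch UI on the chest tablet."""
    session = qi.Session()
    session.connect(ROBOT_URL)

    tablet = session.service("ALTabletService")
    # Load the UI page into the tablet's webview and bring it to the front.
    tablet.loadUrl(UI_URL)
    tablet.showWebview()

if __name__ == "__main__":
    show_touch_ui()
```

With a setup like this, the conversation topics shown on screen could simply be buttons in the web page, and tapping one could trigger the matching dialogue on the robot.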