I am a researcher at Microsoft Research AI in the Perception and Interaction Group. My research interests include designing, building, and evaluating socially interactive technologies that are physically situated in the open world, particularly embodied virtual agents and robots. I draw inspiration from sociolinguistics, microsociology, conversation analysis, and social psychology to design fine-grained coordinated behaviors that enable embodied agents to engage in more fluid interactions and collaborations with people.
I received my PhD from the University of Wisconsin-Madison, where my dissertation centered on effective social gaze behaviors for human-robot and human-agent interaction.
Research on Situated Interaction
When people interact with each other, they engage in a rich, highly coordinated, mixed-initiative process, regulated through both verbal and non-verbal channels. In contrast, although their perceptual abilities are improving, computers remain largely unaware of their physical surroundings and of the “physics” of human interaction. The Situated Interaction project at MSR aims to close this gap.
Platform for Situated Intelligence
To enable faster progress in this challenging research area, we are developing the Platform for Situated Intelligence—an open, extensible platform that lowers the barrier to entry for developing multimodal, integrative-AI systems. Check out our recent beta release on GitHub. More information about the project can also be found in this blog post.