I am a researcher at Microsoft Research AI in the Perception and Interaction Group. My research involves designing, building, and evaluating socially interactive technologies that are physically situated in the open world, particularly embodied virtual agents and robots. My goal is to enable natural, language-based interactions with embodied technologies that can both perceive and appropriately respond to the verbal and nonverbal social cues of human conversation. To accomplish this, I explore techniques that improve an agent’s awareness of the social context around it, and I couple that enhanced representation with behaviors that improve the agent’s task performance and social capabilities, user acceptance, and rapport — in general, enabling these technologies to be better “citizens” of the environments they inhabit.
A summary of my PhD thesis in three minutes.
A five-video playlist from UW-Madison’s “Science Narratives” series centered on the UW-Madison Human-Computer Interaction Lab and our social robotics research.
My talk at the UW CSE Robotics Seminar on gaze mechanisms for embodied agents.
Wired (UK), 2016: “This robot changes how it looks depending on your personality”
IEEE Spectrum (US), 2016: “This robot changes how it looks at you to match your personality”
Popular Science (US), 2014: “Robots seem more thoughtful if they glance away while they talk”
New Scientist (UK), 2014: “The robot tricks to bridge the uncanny valley”