Spatial Patterns of Behavior in HRI Under Environmental Spatial Constraints
Robots need the ability to recognize social group conversations to effectively adapt their behavior to different social contexts in dynamic environments. One way to give them this ability is to provide methods for identifying the spatial patterns of human behavior that typically emerge during social conversations. These spatial patterns are often observed as face-to-face, side-by-side, or circular spatial arrangements; however, the specific type of arrangement that emerges ultimately depends on many social factors, including environmental spatial constraints.
This project provides the empirical knowledge and methods needed to incorporate spatial constraints into the way robots reason about human (and robot) spatial formations. In particular, we focus on studying:
1) How do spatial constraints influence conversational group formations in HRI?
2) How can robots detect these formations under spatial constraints?
3) How can robots autonomously generate appropriate spatial behavior to sustain conversations in spatially constrained environments?
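As a rough illustration of the detection problem in question 2, the sketch below uses a simple geometric heuristic (not this project's actual method): in a circular arrangement, each participant tends to face a shared interaction space, so projecting each person's position forward along their body orientation should yield points that cluster near a common center. The pose format, stride distance, and tolerance are all assumptions for illustration.

```python
import math

# Hypothetical sketch: test whether a set of people plausibly form a
# circular conversational arrangement by projecting each person's position
# a fixed "stride" forward along their body orientation and checking that
# the projected points (candidate shared centers) agree.

STRIDE = 0.8            # assumed distance (m) from a person to the shared center
CENTER_TOLERANCE = 0.5  # assumed max spread (m) of projected centers for a group

def projected_center(person):
    """Project a (x, y, theta) pose forward by STRIDE meters."""
    x, y, theta = person
    return (x + STRIDE * math.cos(theta), y + STRIDE * math.sin(theta))

def is_conversational_group(people):
    """Return True if everyone's projected center lands close together."""
    centers = [projected_center(p) for p in people]
    cx = sum(c[0] for c in centers) / len(centers)
    cy = sum(c[1] for c in centers) / len(centers)
    mean_dist = sum(math.hypot(c[0] - cx, c[1] - cy) for c in centers) / len(centers)
    return mean_dist < CENTER_TOLERANCE

# Two people facing each other ~1.6 m apart share a projected center:
pair = [(0.0, 0.0, 0.0), (1.6, 0.0, math.pi)]
print(is_conversational_group(pair))  # True
```

Real detectors must go further than this: a wall or piece of furniture can force a non-circular arrangement (e.g., side-by-side), which is exactly why environmental spatial constraints need to enter the reasoning.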
This material is based upon work supported by the National Science Foundation under Grant No. (IIS-1924802). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.