Increasingly small computational devices and advances in input and interaction technology are revolutionizing the way users interact with technology through new natural user interfaces (NUIs). These interfaces provide natural ways of interacting with technology using touch, speech, gestures, handwriting, and vision. Their goal is to lower the barriers to discovery and use imposed by traditional interfaces, so that computing technology acts as a natural and dynamic partner rather than a mere tool. However, interaction with NUIs continues to mimic the traditional point-and-click paradigm of desktop computers and thus reinforces the idea that technology is a tool and not a partner. In this talk, I will present my vision of the future of human-machine interaction and then discuss my lab’s research projects that aim to make this vision a reality. In particular, I will discuss our research on how human-human nonverbal communication (e.g., gesture, gaze, and facial expressions) can be leveraged to create natural multimodal interfaces that foster collaboration between humans and technology. Lastly, I will describe a new DARPA-funded project that aims to develop an Augmented Reality computational partner to assist with various sequential tasks.
Dr. Jaime Ruiz is an Associate Professor in the Department of Computer & Information Science & Engineering at the University of Florida, where he directs the Ruiz HCI Lab. Before joining the University of Florida in 2016, he was a faculty member at Colorado State University. His primary research is in the field of Human-Computer Interaction, focusing on multimodal and natural user interfaces. Dr. Ruiz received his Ph.D. in Computer Science from the University of Waterloo. He also holds an M.S. in Computer Science from San Francisco State University and a B.S. in Psychology from the University of California, Davis. Dr. Ruiz’s work has been funded by NSF, DARPA, NIH, USDA, and Google. In 2018, he received an NSF CAREER award for his project on next-generation multimodal interfaces.