Keynote Speakers
Speaker: Hrvoje Benko, Director, Research Science, Meta Reality Labs Research
Title: Beyond Words: Improving Interactions for AI-Assisted AR
Abstract:
Augmented Reality technology combined with an Artificial Intelligence assistant holds the promise of providing assistance at the right time, in the right place, and with the right information. The vision of always-on AI-assisted AR that can be used continuously for an entire day depends on solving many difficult problems, including display technology, computing power, batteries, localization, tracking, and contextual sensing, in addition to delivering the multimodal AI models that power the inference and assistance. However, to deliver truly useful all-day mobile AI+AR experiences, we must solve the fundamental problem of how to interact with this technology effectively, in ways that go beyond simply speaking to it. Solving the interaction problem requires that we invent novel sensing and haptic technologies for all-day wearable devices, and that we leverage increasingly powerful contextual AI understanding to deliver truly frictionless and expressive experiences. In this talk, I will cover recent advances in this research area from Meta Reality Labs Research.
Speaker Bio:
Dr. Hrvoje Benko is a Director of Research Science at Meta Reality Labs Research, where he is developing novel interactions, devices, and interfaces for Contextualized AI, Augmented Reality, and Virtual Reality. He leads efforts to invent novel wearable devices that enable people to use their gestures, gaze, and voice to express themselves, while harnessing a contextualized understanding of their environment for better interactions. He currently leads a multi-disciplinary organization of scientists and engineers with expertise in human-computer interaction, computer vision, machine learning, AI, design, neuroscience, and cognitive psychology.
He is an expert in the field of Human-Computer Interaction (HCI), where he has coauthored more than 80 scientific articles and holds more than 70 issued patents. His research has received 13 best paper awards or honorable mentions at top HCI conferences, and he received the 2022 ACM UIST Lasting Impact Award for his co-authored work “OmniTouch: Wearable Multitouch Interaction Everywhere”. He has been active in organizing the ACM User Interface Software and Technology (UIST) conference, the premier technical conference in HCI, serving as program chair in 2012 and general chair in 2014. He sits on the editorial board of TOCHI, the premier journal in the HCI field.
He also holds an Affiliate Full Professor position at the University of Washington Information School. Prior to his current role at Meta, he was a Principal Researcher at Microsoft Research, where he worked on novel haptic handheld devices, multi-touch interactions, large-scale projection-mapping environments, and novel AR/VR interactions and technologies. He received his Ph.D. in Computer Science from Columbia University in 2007 for research on mobile augmented reality and multi-touch interactive technologies. In 2023, he was inducted into the SIGCHI Academy for his research contributions to the field of Human-Computer Interaction.
Speaker: Maud Marchal, University of Rennes, INSA
Title: Exploring novel haptic modalities within mixed environments
Abstract:
Recent years have seen the emergence of novel haptic technologies that provide a variety of haptic sensations across the user’s body. However, while these technologies are now available, mixed reality systems still struggle to convey compelling haptic sensations for 3D interaction.
This talk presents some of our latest contributions on the design of haptic interaction techniques within mixed environments. It will illustrate the use of haptic technologies across the different components of the 3D interaction loop, from perceptual considerations to algorithmic implementation, including the design of innovative haptic interfaces and 3D interaction metaphors. Finally, the talk introduces some of the next challenges in Mixed Reality for enhancing 3D interaction through multimodal haptic feedback.
Speaker Bio:
Maud Marchal is a Full Professor in Computer Science at Univ. Rennes, INSA, and has been a Junior Member of the Institut Universitaire de France since 2018. She has worked on physics-based simulation since her PhD, completed in 2006 at University Joseph Fourier, Grenoble. Since joining INSA in 2008, she has explored and contributed to novel VR applications, building expertise in multi-sensory feedback, 3D interaction techniques, and interactive physics-based simulations. She is the Principal Investigator of an ERC Consolidator Grant on multimodal haptics in Virtual Reality. She serves on the program committees of major conferences in computer graphics, virtual reality, and haptics, and is an Associate Editor of IEEE Transactions on Visualization and Computer Graphics, IEEE Transactions on Haptics, Computers & Graphics, and ACM Transactions on Applied Perception. She has notably served as Program Chair of the IEEE Virtual Reality Conference in 2018, 2020, and 2021, Program Chair of the IEEE Symposium on Mixed and Augmented Reality in 2021 and 2023, and General Chair of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation in 2018 and EuroHaptics in 2024.