Demos
DEMO1115: Demonstration of FIRE: Mid-Air Thermo-Tactile Display
Authors
Yatharth Singhal, Haokun Wang, Jin Ryong Kim
Abstract
We demonstrate an ultrasound haptic-based mid-air thermo-tactile display system. We design a proof-of-concept thermo-tactile feedback system with an open-top chamber, heat modules, and an ultrasound haptic display. Our method directs heated airflow toward the focused pressure point produced by the ultrasound display to deliver thermal and tactile cues in mid-air simultaneously. We demonstrate our system with three different VR environments (campfire, kitchen, and candle) to show the rich user experience of integrating thermal and tactile feedback.
DEMO1109: PORI: POrtal Recall Interface for Relationship Memories
Authors
Nahyeon Kim, Chungnyeong Lee, Jusub Kim
Abstract
We introduce a new AR- and AI-powered mobile app for capturing and reliving personal relationship memories. Photos and videos are limited as media for recording such memories: they cannot capture interactions with the subject of the memory, and they make an immersive re-experience difficult because the subject appears in a different visual environment from the one in which the user replays the memory. We developed a next-generation personal memory recording and playback app that allows people to relive their everyday interpersonal experiences more vividly than photos and videos do.
DEMO1086: Demonstrating XDTK: Prototyping Multi-Device Interaction and Arbitration in XR
Authors
Eric J Gonzalez, Ishan Chatterjee, Khushman Patel, Mar Gonzalez-Franco, Andrea Colaço, Karan Ahuja
Abstract
The interaction space of XR head-mounted devices can be extended by leveraging other digital devices, such as phones, tablets, and smartwatches. We present a demonstration of XDTK (Cross-Device Toolkit), an open-source prototyping toolkit for multi-device interactions in XR. The toolkit consists of two parts: (1) an Android app that runs on client devices and surfaces pose, touch, and other sensor data to (2) a Unity server that can be added to any Unity-based XR application. For this demo, we apply XDTK to a few example applications, including multi-device arbitration. By leveraging pose data from each device, we can infer which device the user is gazing at and seamlessly hand off control and display between multiple devices. We also show examples leveraging a tablet for sketching and a smartwatch for menu navigation.
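The gaze-based arbitration described above can be sketched as follows; this is a minimal illustration rather than XDTK's actual API, and the function name, angular threshold, and NumPy dependency are assumptions:

```python
import numpy as np

def select_gazed_device(head_pos, head_forward, device_positions, max_angle_deg=15.0):
    """Return the index of the device the user is most likely gazing at,
    or None if no device lies within the angular threshold."""
    head_forward = np.asarray(head_forward, dtype=float)
    head_forward = head_forward / np.linalg.norm(head_forward)
    best_idx, best_angle = None, max_angle_deg
    for i, dev_pos in enumerate(device_positions):
        to_device = np.asarray(dev_pos, dtype=float) - np.asarray(head_pos, dtype=float)
        to_device = to_device / np.linalg.norm(to_device)
        angle = np.degrees(np.arccos(np.clip(np.dot(head_forward, to_device), -1.0, 1.0)))
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx
```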
DEMO1111: Using Skin Conductance Sensing for Empathic Mixed Reality Agents
Authors
Zhuang Chang, Kangsoo Kim, Kunal Gupta, Jamila Abouelenin, Zirui Xiao, Boyang Gu, Huidong Bai
Abstract
This demonstration shows how mixed reality agents (MiRAs) can create empathy with users by sensing their skin conductance level (SCL). We developed a Mixed Reality (MR) shooting game featuring a virtual human capable of responding to the user's SCL changes. The system can configure the agent's awareness of SCL to produce accurate, random, or no responses. This work demonstrates a novel approach to enhancing human-agent interaction in MR environments. Future directions include expanding emotional state monitoring by using multiple biosensors, integrating sensors for environmental awareness, and further developing machine learning models to improve the MiRA's emotional intelligence.
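As a rough sketch of how the three awareness modes could map SCL changes to agent behavior (the threshold, response strings, and function are illustrative assumptions, not the authors' implementation):

```python
import random

def agent_response(scl_delta, mode="accurate", threshold=0.05):
    """Choose an empathic response from an SCL change (e.g., in microsiemens).
    mode: 'accurate' uses the signal, 'random' ignores it, 'none' disables responses."""
    if mode == "none":
        return None
    aroused = scl_delta > threshold if mode == "accurate" else random.random() < 0.5
    return "calming response" if aroused else "encouraging response"
```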
DEMO1083: Accented Character Entry Using Physical Keyboards in Virtual Reality
Authors
Snehanjali Kalamkar, Verena Biener, Daniel Pauls, Leon Lindlein, Morteza Izadifar, Per Ola Kristensson, Jens Grubert
Abstract
In recent years, research on text entry in Virtual Reality (VR) has gained popularity, but the efficient entry of accented characters (i.e., characters with diacritical marks) in VR remains underexplored. Entering accented characters is supported on most soft keyboards through a long press on a base character and subsequent selection of the accented character. However, entering those characters on physical keyboards is still challenging, as they require the recall and entry of respective numeric codes. To address this issue, in this work, we investigate three techniques to support accented character entry on physical keyboards in VR. Specifically, we compare a context-aware numeric code technique not requiring users to recall a code, a key-press-only technique in which the accented characters are dynamically remapped to physical keys next to a base character, and a multimodal technique in which eye gaze is used to select the accented version of a base character previously selected by key-press on the keyboard. The results from our user study (n=18) reveal that both the key-press-only and the multimodal techniques outperform the baseline technique in terms of text entry speed.
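A minimal sketch of the key-press-only remapping idea (the variant and neighbor tables below are illustrative assumptions; real keyboard layouts and character sets differ):

```python
# Hypothetical accent variants and physical-key neighbors for a QWERTY layout.
ACCENT_VARIANTS = {"e": ["é", "è", "ê", "ë"], "a": ["à", "â", "ä"], "u": ["ù", "û", "ü"]}
KEY_NEIGHBORS = {"e": ["w", "r", "s", "d"], "a": ["q", "s", "z", "w"], "u": ["y", "i", "j", "k"]}

def remap_accents(base_char):
    """Map each accented variant of base_char onto a nearby physical key."""
    return dict(zip(KEY_NEIGHBORS.get(base_char, []), ACCENT_VARIANTS.get(base_char, [])))

# After the user types 'e', pressing 'w' would yield 'é', 'r' would yield 'è', and so on.
print(remap_accents("e"))  # {'w': 'é', 'r': 'è', 's': 'ê', 'd': 'ë'}
```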
DEMO1122: Enhance Flight Experience Through Wind-based Cross-modal Effect
Authors
Yuan Li, Jiayi Hu, Du Jin, Juro Hosoi, Rui ZHANG, Yuki Ban, Shinichi Warisawa
Abstract
Realizing an immersive flight experience in virtual reality is of great interest in both academia and industry. Existing methods tend to achieve this by providing an intuitive sense of motion or by physically suspending the user. However, these methods often place significant demands on the user's physical space or impose various restrictions on user interactions. In this study, we propose a new method that is expected to enhance the flight experience by achieving the sensation of floating without suspending the user. To do this, we developed a wind-based device that delivers airflow stimulation to the soles of the user's feet. A preliminary user study has been conducted, laying the foundation for our future work.
DEMO1094: Visibility Modulation of Aligned Spaces for Multi-User Telepresence
Authors
Taehei Kim, Jihun Shin, Hyeshim Kim, Hyuckjin Jang, Jiho Kang, Sung-Hee Lee
Abstract
We propose a multi-user Mixed Reality (MR) telepresence system that dynamically modulates the visibility of multiple users' rooms based on their locations. Our method optimizes room alignment to maximize a shared space, where users interact with common real or virtual objects and with each other. We also visualize each user's private, non-shared space to convey activities beyond the shared area. To address room overlap, we dynamically adjust private space visibility based on users' locations. Our system allows users to experience a seamlessly connected, shared multi-room environment. We also demonstrate a game that leverages the dynamic visibility modulation feature of our system.
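One plausible distance-based rule for the dynamic visibility adjustment (a sketch under assumed fade radii; the authors' actual modulation function is not specified here):

```python
import math

def private_space_alpha(remote_user_pos, region_center, inner=1.0, outer=3.0):
    """Render a remote user's private space fully when that user is near it
    and fade it out as they move away; returns an opacity in [0, 1]."""
    d = math.dist(remote_user_pos, region_center)
    if d <= inner:
        return 1.0
    if d >= outer:
        return 0.0
    return 1.0 - (d - inner) / (outer - inner)
```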
DEMO1106: Virtual Reality for Immersive Education in Orthopedic Surgery Digital Twins
Authors
Jonas Hein, Jan David Grunder, Lilian Calvet, Frédéric Giraud, Nicola Alessandro Cavalcanti, Fabio Carrillo, Philipp Fürnstahl
Abstract
TBA
DEMO1128: Virtual Dairy Farm: An Interactive Experience for Public Education
Authors
Anh Nguyen, Hyeongil Nam, Emma Windfeld, Michael Francis, Guillaume Lhermie, Kangsoo Kim
Abstract
Despite growing public interest in the origins and production methods of dairy products, driven by concerns about environmental impact, local sourcing, and ethics, a knowledge-trust gap persists between consumers and the dairy industry. To tackle this, our demo presents an immersive virtual farm simulation designed to provide realistic on-farm experiences to the public. Users can visit the virtual farm, explore various sites where dairy cows are raised, and learn about dairy production processes through this virtual experience. The simulation demonstrates significant potential as an effective tool for agricultural education.
DEMO1123: MetaGadget: IoT Framework for Event-Triggered Integration of User-Developed Devices into Commercial Metaverse Platforms
Authors
Ryutaro Kurai, Yuichi Hiroi, Takefumi Hiraki
Abstract
This demonstration introduces MetaGadget, an IoT framework designed to integrate user-developed devices into commercial metaverse platforms. Synchronizing virtual reality (VR) environments with physical devices has traditionally required a constant connection to VR clients, limiting flexibility and resource efficiency. MetaGadget overcomes these limitations by configuring user-developed devices as IoT units with server capabilities, supporting communication via HTTP within the commercial metaverse platform Cluster. This approach enables event-triggered device control without the need for persistent connections from metaverse clients. Through the demonstration, users will experience event-triggered interaction between VR and physical devices, as well as multi-user control of real-world devices from the VR space. Our framework is expected to reduce technical barriers to integrating VR spaces and custom devices, contribute to interoperability, and increase resource efficiency through event-triggered connections.
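A minimal sketch of a user-developed gadget acting as an HTTP server that reacts to event-triggered requests (the port, endpoint behavior, and JSON payload shape are assumptions; Cluster's actual calling convention is not shown):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class GadgetHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # A metaverse-side event (e.g., a trigger fired in the world) posts JSON here.
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        print("Received event:", event.get("type"), event.get("payload"))
        # ...drive the physical device (motor, lamp, etc.) here...
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), GadgetHandler).serve_forever()
```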
DEMO1124: Mitigating Latency Effects on Subjective Experience in Robot Teleoperation Using a VR-Enabled Virtual Spring
Authors
Du Jin, Rui ZHANG, Yuan Li, Yuki Ban, Shinichi Warisawa
Abstract
Remote robot teleoperation is crucial for tasks that require human oversight, yet latency can significantly impair operator performance and result in discomfort, breaks in presence, and increased workload. In this paper, we propose a new Virtual Reality-based robot teleoperation technique that is expected to alleviate the negative impacts of latency on the subjective experience. The technique allows users to manipulate a virtual robot synchronized with a real one by 'grabbing' a virtual spring attached to it. Controller vibration is also used to simulate spring force feedback, together creating an illusion that the position discrepancy is caused by the dynamics of the spring rather than by latency. We hypothesize that this approach can mitigate the negative effects of latency by making the robot's movement appear less strange and more natural. A user study was conducted to evaluate the effectiveness of the virtual spring and the controller vibration separately, using a 2×2 factorial design. The results suggest that the virtual spring enhanced user comfort and the sense of presence, whereas the controller vibration showed no clear benefits.
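A sketch of how the position discrepancy could be mapped to spring force and vibration amplitude (Hooke's law with assumed stiffness and saturation values; not the authors' exact parameters):

```python
import math

def spring_vibration(grab_pos, robot_pos, stiffness=50.0, max_force=25.0):
    """Treat the gap between the user's grab point and the latency-delayed robot
    pose as spring stretch (F = k * x) and map the force to a normalized
    controller vibration amplitude in [0, 1]."""
    stretch = math.dist(grab_pos, robot_pos)
    force = stiffness * stretch
    return min(force / max_force, 1.0)
```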
DEMO1132: AI-Powered Mixed Reality: Revolutionizing Training Methodologies
Authors
Bence Bihari, Bálint György Nagy, János Dóka, Balazs Sonkoly
Abstract
Training is a costly and time-consuming task, especially in industrial environments. Self-learning methods, customized curricula, mixed reality (MR), and automation in the training process can have great societal impact. However, the preparation of new educational materials poses additional challenges that could hinder technology adoption. In this demo, we propose a novel platform exploiting various AI technologies to facilitate and accelerate the creation of training applications, especially in MR environments. An expert needs to perform and explain a complex process only once, and our system automatically generates a step-by-step tutorial including video snippets and textual descriptions.
DEMO1120: Audyssey: Immersive Visualization of Popular Music Evolution
Authors
Dasol Lee, Jaesuk Lee, Semi Kwon, Jusub Kim
Abstract
In this demo, we present a method to visualize the changes in pop music trends from 1958 to 2024 using Billboard Hot 100 chart data. We collected audio sources from YouTube and classified the music data by decade, from the 1960s to the 2010s, using a CNN model. The classified music data was then projected into a 3D space using UMAP and implemented as a VR experience. This allows users to visually and audibly experience the similarities and changes in music trends over time. This demo offers a new way to intuitively understand the temporal evolution of popular music.
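The projection step can be sketched as follows; the embedding dimensionality and UMAP hyperparameters are assumptions, and random data stands in for the CNN features extracted from the audio:

```python
# pip install numpy umap-learn
import numpy as np
import umap

# Placeholder for per-track feature vectors from the decade-classification CNN.
embeddings = np.random.rand(500, 128)

reducer = umap.UMAP(n_components=3, n_neighbors=15, min_dist=0.1)
coords_3d = reducer.fit_transform(embeddings)  # (500, 3) positions for the VR scene
```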
DEMO1130: VHard: An XR UI for Kinesthetic Rehearsal of Rock Climbing Moves
Authors
Joel Kevles Salzman, Jace Li, Benjamin Yang, Steven Feiner
Abstract
Rock climbers frequently struggle to complete climbs due to the difficulty of practicing precise movements without actually attempting them. A standard strategy is to kinesthetically rehearse the hand and body positions required while on the ground. However, the lack of live feedback severely hampers this method because climbers can only roughly estimate the distances and positions involved. We introduce VHard, an extended reality (XR) application that visualizes the precision and accuracy of hand placements on a metric digital twin of a MoonBoard climbing wall. VHard augments kinesthetic rehearsal with XR feedback in order to assist climbers in completing difficult boulder problems.
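One common way to quantify accuracy and precision of repeated hand placements against a target hold (a sketch only; the metrics VHard actually visualizes may differ):

```python
import numpy as np

def placement_metrics(hand_positions, hold_position):
    """Accuracy: mean distance of placements from the target hold.
    Precision: spread of placements around their own centroid."""
    pts = np.asarray(hand_positions, dtype=float)
    accuracy = np.linalg.norm(pts - np.asarray(hold_position, dtype=float), axis=1).mean()
    precision = np.linalg.norm(pts - pts.mean(axis=0), axis=1).std()
    return accuracy, precision
```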
DEMO1104: An XR GUI for Visualizing Messages in ECS Architectures
Authors
Benjamin Yang, Xichen He, Jace Li, Carmine Elvezio, Steven Feiner
Abstract
Entity-Component-System (ECS) architectures are fundamental to many systems for developing extended reality (XR) applications. These applications often contain complex scenes and require intricate application logic to connect components, making debugging and analysis difficult. Graph-based tools have been created to show actions in ECS-based scene hierarchies, but few address interactions that go beyond traditional hierarchical communication. To address this, we present an XR GUI for Mercury (a toolkit for cross-component ECS communication) that allows developers to view and edit relationships and interactions between scene entities in Mercury.
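As an illustration of the kind of cross-entity message data such a GUI visualizes (hypothetical entities and message types, not Mercury's actual data model):

```python
from collections import defaultdict

# Hypothetical log of cross-component messages: (sender, receiver, message type).
messages = [
    ("Player", "Door", "OpenRequest"),
    ("Door", "AudioSystem", "PlaySound"),
    ("Player", "Inventory", "AddItem"),
]

graph = defaultdict(list)
for sender, receiver, msg in messages:
    graph[sender].append((receiver, msg))  # edges of the entity-message graph

for sender, edges in graph.items():
    print(sender, "->", edges)
```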
DEMO1112: Demonstration of Fiery Hands: Thermal Gloves through Thermal and Tactile Integration
Authors
Haokun Wang, Yatharth Singhal, Hyunjae Gil, Jin Ryong Kim
Abstract
We demonstrate a novel wearable thermal interface, designed and fabricated as a glove consisting of flexible thermoelectric (Peltier) devices and vibrotactile ERM motors. Our approach induces thermal sensations through thermal and tactile integration: it strategically places the thermal actuators on the back side of the fingers, presents thermal feedback at the location of the tactile actuators on the palmar side, and allows free-hand manipulation. We showcase our system with two different VR environments (water faucet and magic fireball) to illustrate the unique user experience of interacting with virtual objects.
DEMO1113: Demonstration of Thermal Flow Illusions with Tactile and Thermal Interaction
Authors
Yatharth Singhal, Daniel Honrales, Haokun Wang, Jin Ryong Kim
Abstract
We present a novel interaction technique called thermal motion, which creates an illusion of flowing thermal sensations by combining thermal and tactile actuators. This technique generates dynamic thermal referral illusions across multiple tactile points, making users perceive moving thermal cues. Our demonstration uses a sleeve form factor to illustrate this effect, effectively creating the illusion of moving thermal cues along users’ arms.
DEMO1114: Demonstration of Tangible Data: Immersive Data Exploration With Touch
Authors
Ayush Bhardwaj, Jin Ryong Kim
Abstract
We demonstrate an interactive 3D data visualization tool called TangibleData. TangibleData combines hand gestures and ultrasound mid-air haptic feedback to support 3D data exploration and interaction in VR. We showcase different types of 3D visualization datasets with different data encoding methods that convert data into visual and haptic representations for data interaction. We focus on an intuitive approach for each dataset, using mid-air haptics to improve the user's understanding. We transformed each dataset's inherent properties (velocity, vorticity, density, volume, and location) into haptics to provide a tangible experience for the user.
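A minimal sketch of one way a scalar data attribute could be encoded as mid-air haptic intensity (linear mapping with assumed bounds; not the authors' exact encoding):

```python
def haptic_intensity(value, v_min, v_max, i_min=0.2, i_max=1.0):
    """Linearly encode a data attribute (e.g., velocity or density) as an
    ultrasound focal-point intensity in [i_min, i_max]."""
    if v_max == v_min:
        return i_min
    t = max(0.0, min(1.0, (value - v_min) / (v_max - v_min)))
    return i_min + t * (i_max - i_min)
```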
DEMO1095: VR Simulation for Evaluating Pharmaceutical Delivery Assistance Systems from Sales Offices to Client Facilities
Authors
Shigeki Ozawa, Takumi Miyoshi, Ryosuke Ichikari, Takuya Miura, Takeshi Kurata
Abstract
In delivery processes from pharmaceutical wholesalers' sales offices to client facilities, an assistance system using handheld devices and paper slips is currently used. The introduction of AR assistance systems is beginning to be investigated to improve operations in these processes. However, as AR is not yet widely accepted in Japanese industry, we constructed a VR environment that makes the cycle of prototyping, comparison with existing systems, and sharing of results as efficient as possible. In this demo, participants will be able to try each assistance method in the VR environment.
DEMO1127: Lee Jungseop XR: An XR Documentary for Apple Vision Pro
Authors
Dasom Kim, Melodie Jannau, Yongsoon Choi, Sangyong Kim, Jusub Kim
Abstract
Artworks and artifacts are stored in collections in museums or art storage facilities. Visible storage became popular in museums worldwide around 2010, giving visitors a wider visual view of the exhibits and boosting accessibility and utilization. However, the appreciation of individual pieces may be hampered by this high-density display method, and curators generally find it difficult to design exhibitions around it. By overcoming physical constraints, eXtended Reality (XR) technology offers a workable solution that makes it easier to plan a variety of creative and unique exhibitions. This work proposes a new category of XR documentary content, developed in response to the advent of Apple's Vision Pro. The goal of the research is to determine whether XR technology can improve the experience of visible storage exhibitions, which might have a significant impact on how collections are seen and used in the future.
DEMO1118: Visual Guidance for Assembly Processes
Authors
Julian Kreimeier, Hannah Schieber, Shiyu Li, Alejandro Martin-Gomez, Daniel Roth
Abstract
TBA