Emerging Technologies Presentations
- Full Access
- Onsite Student Access
- Onsite Experience
- Virtual Full Access
- Virtual Basic Access
All 19 presentations are available on demand on the virtual platform from 6 December 2021 to 11 March 2022.
Of these, 13 Emerging Technologies presentations will also have physical exhibits onsite in Hall E, Tokyo International Forum, from 15 to 17 December 2021.
Live demonstrations and Q&A sessions for each presentation will take place at the dates and times listed below.
"Amazing Sketchbook the Ride": Driving a Cart in a 3DCG Scene Created from a Hand-Drawn Sketch
Date: Friday, 17 December 2021
Time: 10:00am-11:00am
Venue: Hall E, Experience Hall Talk Stage (B2F, E Block)
Description: We introduce "Amazing Sketchbook the Ride", an interactive experience in which users drive an actual cart and move freely through a 3DCG scene as if immersed in it. Users create the 3DCG scene they drive through by drawing a sketch of it themselves.
Collaborative Avatar Platform for Collecting Human Expertise
Date: Thursday, 16 December 2021
Time: 4:00pm-5:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: We present a system that merges two people's control inputs into one robot arm so they can collaborate.
It offers two control modes (division of roles, and mixing inputs at an adjustable ratio); we found that it enables movements impossible for either person alone and stabilizes the motion.
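As a rough sketch of the two control modes described above (the 6-DoF velocity-command format and all names below are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def blend_commands(cmd_a, cmd_b, alpha=0.5):
    """Mixing mode: weight two operators' 6-DoF velocity commands.
    alpha=1.0 gives operator A full control; 0.5 is an equal mix."""
    return alpha * np.asarray(cmd_a) + (1.0 - alpha) * np.asarray(cmd_b)

def split_roles(cmd_a, cmd_b):
    """Division-of-roles mode: A drives translation, B drives rotation."""
    return np.concatenate([np.asarray(cmd_a)[:3], np.asarray(cmd_b)[3:]])

# Two commands as [vx, vy, vz, wx, wy, wz]
a = [0.10, 0.00, 0.02, 0.0, 0.1, 0.0]
b = [0.08, 0.02, 0.00, 0.0, 0.0, 0.2]
print(blend_commands(a, b, alpha=0.7))
print(split_roles(a, b))
```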
Depth-Aware Dynamic Projection Mapping using High-speed RGB and IR Projectors
Date: Thursday, 16 December 2021
Time: 5:00pm-6:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: We report a new system combining a camera, an RGB projector, and a newly developed high-speed IR projector. It achieves 0.4 ms markerless 3D tracking by leveraging the small inter-frame motion. Building on these technologies, we demonstrate flexible, depth-aware dynamic projection mapping over the entire scene with 8 ms latency.
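One common way small inter-frame motion enables sub-millisecond tracking: at high frame rates the pose change per frame is tiny, so a single linearized point-to-plane registration step can suffice. A minimal numpy sketch of that idea (illustrative only; this is not the authors' published method):

```python
import numpy as np

def small_motion_step(src_pts, dst_pts, dst_normals):
    """One Gauss-Newton step on the point-to-plane error.
    Inputs are N x 3 arrays of matched points and destination normals.
    Returns (rotation_vector, translation) for the small inter-frame motion."""
    A = np.hstack([np.cross(src_pts, dst_normals), dst_normals])  # N x 6 Jacobian
    b = np.einsum('ij,ij->i', dst_normals, dst_pts - src_pts)     # N residuals
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```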
DroneStick: Flying Joystick as a Novel Type of Interface
Date: Wednesday, 15 December 2021
Time: 3:00pm-4:00pm
Venue: Virtual Platform
Description: DroneStick is a novel hands-free method for smooth interaction between a human and a robotic system via one of its agents. The flying joystick, itself part of a multi-robot system, consists of a flying drone and a coiled wire fitted with a vibration motor.
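A hedged sketch of how pulling the wire might map to a motion command, assuming the deflection is measured as a 3D vector (the dead zone, gain, and function names are illustrative, not from the paper):

```python
import numpy as np

def wire_to_velocity(deflection, dead_zone=0.02, gain=1.5):
    """Map the wire's deflection vector (metres) to a velocity command,
    with a dead zone so hover jitter does not move the follower robot."""
    d = np.asarray(deflection, dtype=float)
    mag = np.linalg.norm(d)
    if mag < dead_zone:
        return np.zeros(3)
    return gain * (mag - dead_zone) * d / mag  # direction of pull, scaled
```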
Frisson Waves: Sharing Frisson to Create Collective Empathetic Experiences for Music Performances
Date: Friday, 17 December 2021
Time: 1:00pm-2:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: Frisson is a mental experience accompanied by bodily reactions such as shivers, tingling skin, and goosebumps. However, the sensation cannot be shared with others and is rarely used in live performances. We propose Frisson Waves, a real-time system that detects, triggers, and shares frisson in a wave-like pattern during music performances.
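The wave-like sharing could, in principle, be scheduled by delaying each audience member's actuation in proportion to distance from the person whose frisson was detected. A minimal sketch under that assumption (seat layout, propagation speed, and all names are hypothetical):

```python
import math

def schedule_wave(seed_pos, audience, speed=2.0):
    """Return (person_id, delay_s) pairs: farther seats trigger later,
    so haptic actuation sweeps outward like a wave.
    audience: list of (person_id, (x, y)) seat positions in metres."""
    sx, sy = seed_pos
    return [(pid, math.hypot(x - sx, y - sy) / speed)
            for pid, (x, y) in audience]

print(schedule_wave((0.0, 0.0), [("a", (1.0, 0.0)), ("b", (3.0, 4.0))]))
# [('a', 0.5), ('b', 2.5)]
```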
GroundFlow: Multiple Flows Feedback for Enhancing Immersive Experience on the Floor in the Wet Scenes
Date: Wednesday, 15 December 2021
Time: 1:00pm-2:00pm
Venue: Virtual Platform
Description: We present GroundFlow, a water-recirculation system that provides multiple-flow feedback on the floor in immersive virtual reality. Our demonstration includes a virtual excursion that lets users experience different water flows and their corresponding wet scenes.
HoloBurner: Mixed reality equipment for learning flame color reaction by using aerial imaging display
Date: Friday, 17 December 2021
Time: 11:00am-12:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: The HoloBurner system is a mixed reality instrument for learning about flame color reactions in chemistry experiments. The system combines a real Bunsen burner with an aerial imaging display that shows a holographic flame at the tip of the burner.
Integration of stereoscopic laser-based geometry into 3D video using DLP Link synchronisation
Date: Wednesday, 15 December 2021
Time: 11:00am-12:00pm
Venue: Virtual Platform
Description: In this work, we demonstrate a laser-based display that integrates stereoscopic vector graphics into a dynamic scene generated by a conventional 3D video projection system. The laser graphics are drawn by a galvo-laser system with far greater brightness, contrast, and acuity than pixelated projectors.
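DLP Link shutter glasses alternate eyes on every projector frame, so integrating the laser graphics mainly means drawing the matching eye's vector path on each vsync. A skeletal sketch (the `galvo` object and its `draw` method are hypothetical placeholders):

```python
def draw_stereo_frame(frame_idx, left_path, right_path, galvo):
    """Frame-sequential stereo: even projector frames show the left eye,
    odd frames the right, matching the DLP Link shutter timing."""
    path = left_path if frame_idx % 2 == 0 else right_path
    galvo.draw(path)  # trace the vector path for the currently open eye
```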
LIPSYNC.AI: A.I. Driven Lips and Tongue Animations Using Articulatory Phonetic Descriptors and FACS Blendshapes
Date: Wednesday, 15 December 2021
Time: 10:00am-11:00am
Venue: Virtual Platform
Midair Haptic-Optic Display with Multi-Tactile Texture based on Presenting Vibration and Pressure Sensation by Ultrasound
Date: Friday, 17 December 2021
Time: 3:00pm-4:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: We demonstrate a midair haptic-optic display with multi-tactile textures using focused ultrasound. Participants can touch aerial 3D images with realistic textures without wearing any devices. The textures are rendered by simultaneously presenting vibration and static pressure sensations with ultrasound.
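Conceptually, simultaneous vibration and static pressure can be encoded as an amplitude envelope on the ultrasound carrier: a constant term for pressure plus a low-frequency term for vibrotactile texture. A toy sketch of that modulation (all frequencies and levels are illustrative, not the authors' parameters):

```python
import numpy as np

def ultrasound_drive(t, f_carrier=40e3, f_vib=200.0,
                     static_level=0.6, vib_depth=0.4):
    """Amplitude-modulated 40 kHz carrier: the constant envelope term
    produces static pressure, the low-frequency term adds vibration."""
    envelope = static_level + vib_depth * np.sin(2 * np.pi * f_vib * t)
    return envelope * np.sin(2 * np.pi * f_carrier * t)

t = np.linspace(0.0, 0.01, 4000)  # 10 ms of the drive signal
signal = ultrasound_drive(t)
```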
Multimodal Feedback Pen Shaped Interface and MR Application with Spatial Reality Display
Date: Thursday, 16 December 2021
Time: 12:00pm-1:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: A pen-shaped interface capable of providing the following multimodal feedback from virtual objects:
(1) linear feedback to express contact pressure;
(2) rotational feedback to simulate the friction of rubbing a virtual surface;
(3) vibrotactile feedback;
(4) auditory feedback to express contact information.
The demonstration presents an MR interaction system combining the pen-shaped interface with a Spatial Reality Display.
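A toy sketch of how one contact sample might be dispatched to those four channels (the `pen` object, its methods, and the friction gain are all hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Contact:
    pressure: float     # normal force against the virtual surface (N)
    slide_speed: float  # tangential rubbing speed (m/s)

def dispatch_feedback(c: Contact, pen):
    """Map one contact sample to the pen's four feedback channels."""
    pen.linear(c.pressure)                        # (1) contact pressure
    pen.rotational(0.4 * c.slide_speed)           # (2) friction while rubbing
    pen.vibrate(amplitude=min(1.0, c.pressure))   # (3) vibrotactile cue
    pen.audio(volume=min(1.0, c.pressure))        # (4) auditory cue
```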
Parallel Ping-Pong: Demonstrating Parallel Interaction through Multiple Bodies by a Single User
Date: Thursday, 16 December 2021
Time: 11:00am-12:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: Parallel Ping-Pong is a parallel interaction in which multiple bodies (here, two robot arms) are controlled by a single user. To reduce the user's workload when controlling multiple bodies, we added (1) automatic view transitions between the bodies' viewpoints, and (2) autonomous body motion that integrates the user's motion.
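One plausible policy for the automatic view transition is a time-to-arrival heuristic: show the viewpoint of whichever body's ball will reach its paddle first. A sketch under that assumption (the demo's actual policy is not specified here):

```python
import numpy as np

def select_view(ball_states, paddle_positions):
    """Return the index of the body whose ball arrives soonest.
    ball_states: list of (position, velocity) arrays, one per table."""
    def time_to_arrival(pos, vel, paddle):
        to_paddle = paddle - pos
        dist = np.linalg.norm(to_paddle)
        closing = np.dot(vel, to_paddle) / (dist + 1e-9)  # closing speed
        return dist / closing if closing > 0 else np.inf
    return int(np.argmin([time_to_arrival(p, v, pad)
                          for (p, v), pad in zip(ball_states, paddle_positions)]))
```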
Real-time Image-based Virtual Try-on with Measurement Garment
Date: Thursday, 16 December 2021
Time: 10:00am-11:00am
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: We propose a robust, real-time, image-based virtual try-on system. We formulate the problem as supervised image-to-image translation using a measurement garment, and we capture the training data with a custom actuated mannequin. The user only needs to wear the measurement garment to try on many different target garments.
Recognition of Gestures over Textiles with Acoustic Signatures
Date: Wednesday, 15 December 2021
Time: 12:00pm-1:00pm
Venue: Virtual Platform
Description: We demonstrate a method that turns a textured surface into an opportunistic input interface using a machine learning model pre-trained on the acoustic signals generated while scratching different types of fabric. A single, brief audio recording then suffices to characterize both the texture and the gesture that produced it.
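A minimal sketch of such a pipeline, assuming MFCC features and an SVM classifier (the authors' actual features, model, and all file names below are placeholders):

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def scratch_features(wav_path):
    """Summarize a short scratch recording as mean/std MFCC features."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Placeholder training data: labelled clips of (fabric, gesture) pairs.
train_paths = ["denim_swipe.wav", "wool_circle.wav"]
train_labels = ["denim/swipe", "wool/circle"]

clf = SVC().fit([scratch_features(p) for p in train_paths], train_labels)
print(clf.predict([scratch_features("new_scratch.wav")])[0])
```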
Self-Shape-Sensing Device with Flexible Mechanical Axes for Deformable Input Interface
Date: Thursday, 16 December 2021
Time: 2:00pm-3:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: A novel device capable of sensing its own shape, structured around a flexible mechanical axis that allows deformation with a wide degree of freedom and enables tangible control by hand. Users can intuitively interact with volumetric images on a spatial display by changing the device's shape.
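As a simplified illustration of self-shape sensing, joint-angle readings along the axis can be integrated into a reconstructed curve by chaining fixed-length segments. A planar sketch (the segment length and the 2D simplification are assumptions):

```python
import numpy as np

def reconstruct_shape(joint_angles, seg_len=0.05):
    """Integrate joint angles (radians) along the flexible axis into
    2D point positions by chaining fixed-length segments."""
    pts, pos, heading = [np.zeros(2)], np.zeros(2), 0.0
    for a in joint_angles:
        heading += a  # each joint bends the axis relative to the last segment
        pos = pos + seg_len * np.array([np.cos(heading), np.sin(heading)])
        pts.append(pos)
    return np.array(pts)

print(reconstruct_shape([0.0, 0.3, 0.3, -0.6]))  # a gentle S-bend
```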
Simultaneous Augmentation of Textures and Deformation Based on Dynamic Projection Mapping
Date: Thursday, 16 December 2021
Time: 1:00pm-2:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: We exploit human perceptual characteristics and dynamic projection mapping techniques to simultaneously overwrite both the texture and the apparent deformation of a real object.
With the developed 1,000 fps low-latency projector-camera system, we demonstrate a plaster figure turning into a colorful, flabby-looking object under projection.
The Aromatic Garden: Exploring new ways to interactively interpret narratives combining olfaction and vision including temporal change of scents using olfactory display
Date: Thursday, 16 December 2021
Time: 3:00pm-4:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: Nakamoto Laboratory (Tokyo Institute of Technology) presents a multi-sensory olfactory game, ‘The Aromatic Garden’, offering a unique user experience through the combination of new olfactory technology and art. Players must use their sense of smell to navigate and collect scents, making for an engaging and challenging experience.
VWind: Virtual Wind Sensation to the Ear by Cross-Modal Effects of Audio-Visual, Thermal, and Vibrotactile Stimuli
Date: Friday, 17 December 2021
Time: 2:00pm-3:00pm
Venue: Hall E, Experience Hall - E-Tech (B2F, E Block) & Virtual Platform
Description: We propose presenting virtual wind sensations through cross-modal effects. We developed VWind, a wearable headphone-type device that delivers vibrotactile and thermal stimuli in addition to visual scenarios and binaural sounds. We prepared demonstrations in which users feel wind blowing at the ear or are exposed to freezing winds.
Weighted Walking: Propeller-based On-leg Force Simulation of Walking in Fluid Materials in VR
Date: Wednesday, 15 December 2021
Time: 2:00pm-3:00pm
Venue: Virtual Platform
Description: Weighted Walking is a wearable device with ducted fans on the user's calf. The fans generate powerful thrust to simulate the forces on the user's lower limbs when moving through different fluid materials. It can also simulate walking in different gravity conditions, such as on another planet.
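The forces involved follow the standard quadratic drag law, F = ½ρC_dAv², so the fan thrust needed scales with the density of the simulated fluid. A worked sketch (the drag coefficient and calf area are illustrative values, not the authors' calibration):

```python
def fluid_drag(velocity, rho, cd=1.0, area=0.03):
    """Quadratic drag on a calf moving at `velocity` m/s through a fluid
    of density `rho` kg/m^3: F = 0.5 * rho * Cd * A * v^2 (newtons)."""
    return 0.5 * rho * cd * area * velocity ** 2

# The same 0.5 m/s leg swing in water vs. air:
print(fluid_drag(0.5, rho=1000.0))  # water: ~3.75 N of resistance
print(fluid_drag(0.5, rho=1.2))     # air: ~0.0045 N, effectively nothing
```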