Human-Computer Interaction + Human-Centered AI
Interactive, Mobile, Wearable, Ubiquitous Technology
Extended Reality (AR, VR, MR) / GenAI-XR / In-Car XR / Human-AV Interaction
Sports HAI / Physical AI for HCI / Rewiring Experience Across Realities
Interpretable, Inclusive, and Immersive Interaction for AI-Infused Physical Systems (Vehicles, Robots, Cities)
Ongoing
Bridging the Gap Between Reality and Virtuality: Actuated XR Systems Powered by Sensory Intelligence and Soft Robotics, NRF/MSIT (PI, 2024-2027)
Dynamic Real-Virtual Haptic Feedback Systems Based on Personal Robots, NRF/MOE (Mentor, 2024-2027)
SpaceTop: Spatial Computing HCI Technology for Everywhere XR Productivity Workstation, IITP/MSIT (2024-2031)
HCI + AI for Human-Centered Physical System Design (AI for HCI), GIST-MIT Research Collaboration (PI, 2021-2023 / 2024-2025)
Hyper-Connected Mobility Safety Technology Based on AI, KAIA/MOLIT (2023-2024 / 2025-2026)
Convergence Culture Virtual Studio for Realizing AI-based Metaverse, Innopolis Foundation/MSIT (2023-2024 / 2025-2026)
Inter-University Alliance for Cultivating R&D Experts in Future Vehicular Technologies, KIAT/MOTIE (2022-2026)
Advanced Human Resource Training for AI Core Technology, IITP/MSIT (2019-2028)
Completed
AI-Powered Intelligent Surfaces for Adaptive XR Environments in Vehicles, GIST Research Project (PI, 2024)
AI-based Game Simulation Technology to Support Online Game Content Production, KOCCA/MCST (2022-2024)
Graduate School of Convergence for FLEX Energy, KETEP/MOTIE (2020-2024)
Digital Contents Lab, RAPA/MSIT
GIST CarXR Lab: In-Car XR Platforms and Contents to Augment Passenger UX in Futuristic Vehicles (PI, XR Lab 2021)
GIST Metaverse Research Center: Metaverse with Immersive Transportation Methods - Walking and Driving XR for Meta-Mobility (PI, Metaverse Lab 2022)
Meta-Human with Immersive Mobility Platform and Multimodal XR Contents (PI, Metaverse Lab 2023)
Human-AI Collaboration for XR Digitization of Physical Experience, GIST Research Project (PI, 2023)
Meta-Human: A Virtual Cooperation Platform for Specialized Industrial Services, MOTIE (2022-2023)
Natural User Interface for Immersive Movement and Interaction in Metaverse Industrial Sites (PI, 2022)
Stylized Neural Rendering for 3D Meta-Human Editing/Authoring (PI, 2023)
Human-Centered Game AI Research Laboratory, NRF/MSIT (2021-2023)
Interactive Contents and Platform based on Multiple Scenario in Autonomous Vehicle, KOCCA/MCST (KETI, 2020-2022)
Research on Multi-Sensor Fusion Technology for Multi-User NUI Platform (PI, 2020)
System/Interface Design and Validation of Multi-User NUI Platform for Vehicle Demonstration (PI, 2021)
Usability Evaluation and Improvement of Multi-User NUI Platform (PI, 2022)
Performance Evaluation of AI/Big Data-Based Time-Series Price Prediction Service, Nature Mobility Co., Ltd. (2020-2021)
Human-Centered Services and Interaction Technologies for Intelligent Vehicles, GRI G.I. 4.0. (PI, 2020-2021)
Content Creation and Entertainment Technologies Based on Intelligent Authoring Tool to Enhance Accessibility of Social Communication Disabilities, KOCCA/MCST (2019-2021)
Visibility Test Methods and Tools for Public Signage, KOCCA/MCST (PI, 2019-2021)
In-Vehicle Cyber-Human Systems for Supporting Human-Centered Interaction with Machine Intelligence in Autonomous Vehicles, GIST AI (PI, 2019-2021)
Smart Culture Lens Based on Machine Learning for the Analysis of Elements of Visual Form in Korean Culture, KOCCA/MCST (2018-2019)
Contract Research (KETI) (2018-2019)
Gesture-Based UX Design for Vehicle AR NUI (PI, 2018)
Design and Validation of Gesture Recognition-Based User Input System for Vehicle AR NUI (PI, 2019)
HVI Testbed for Creating AI-based Driver Experience Models in Futuristic Driving Environments, GIST Demo (PI, 2018)
AI-Based Assessment of In-Situ Driver Experience in Autonomous Driving Situations, GIST AI (PI, 2018)
HRI Experience Sampling Test-bed with a Sensor Support, GIST Convergence (PI, 2017)
Sensor-Based Context-Aware System to Improve UX in Ubiquitous HCI Situations, GIST GUP (PI, 2017-2019)
Categories: HCI+Mobility, Accessibility, Locomotion, NUI, VR/AR, Education/Training, Haptics, Gamification, HCI+AI

Counterfactual Explanation-Based Badminton Motion Guidance Generation Using Wearable Sensors [Education/Training, HCI+AI]
Flip-Pelt: Motor-Driven Peltier Elements for Rapid Thermal Stimulation and Congruent Pressure Feedback in Virtual Reality [Haptics, VR/AR]
WatchCap: Improving Scanning Efficiency in People with Low Vision through Compensatory Head Movement Stimulation [Accessibility, Haptics, VR/AR]
GaitWay: Gait Data-Based VR Locomotion Prediction System Robust to Visual Distraction [Locomotion, HCI+AI, VR/AR]
Curving the Virtual Route: Applying Redirected Steering Gains for Active Locomotion in In-Car VR [HCI+Mobility, Locomotion, VR/AR]
ErgoPulse: Electrifying Your Lower Body With Biomechanical Simulation-based Electrical Muscle Stimulation Haptic System in VR [Haptics, Locomotion, VR/AR]
SYNC-VR: Synchronizing Your Senses to Conquer Motion Sickness for Enriching In-Vehicle Virtual Reality [HCI+Mobility, Haptics, VR/AR]
LumiMood: A Creativity Support Tool for Designing the Mood of a 3D Scene [HCI+AI]
The Way of Water: Exploring the Role of Interaction Elements in Usability Challenges with In-Car VR Experience [HCI+Mobility, VR/AR]
MultiSenseBadminton: Wearable Sensor-Based Biomechanical Dataset for Evaluation of Badminton Performance [Education/Training]
Engagnition: A Multi-Dimensional Dataset for Engagement Recognition of Children with Autism Spectrum Disorder [Accessibility, Education/Training]
Effect of Optical Flow and User VR Familiarity on Curvature Gain Thresholds for Redirected Walking [Locomotion, VR/AR]
Enhancing Wayfinding Experience in Low-Vision Individuals through a Tailored Mobile Guidance Interface [Accessibility, NUI, Locomotion, VR/AR]
A Study with Portal Display, a Head Pose-Responsive Video Teleconferencing System [NUI]
Enhancing Seamless Walking in Virtual Reality: Application of Bone-Conduction Vibration in Redirected Walking [Locomotion, VR/AR]
What and When to Explain? On-Road Evaluation of Explanations in Highly Automated Vehicles [HCI+Mobility, HCI+AI, VR/AR]
Designing Virtual Agent Human-Machine Interfaces Depending on the Communication and Anthropomorphism Levels in Augmented Reality [HCI+Mobility, VR/AR]
Multi-Layer Multi-Input Transformer Network (MuLMINet) with Weighted Loss [Education/Training, HCI+AI]
Assessing the Impact of AR HUDs and Risk Level on User Experience in Self-Driving Cars: Results from a Realistic Driving Simulation [HCI+Mobility, VR/AR]
Giant Finger: A Novel Visuo-Somatosensory Approach to Simulating Lower Body Movements in Virtual Reality [Locomotion, NUI, Haptics, VR/AR]
Simulating Urban Element Design with Pedestrian Attention: Visual Saliency as Aid for More Visible Wayfinding Design [HCI+AI]
Logogram VR: Treadmill-Coupled VR with Word Reflective Content for Embodied Logogram Learning [Education/Training, VR/AR]
Electrical, Vibrational, and Cooling Stimuli-Based RDW: Comparison of Various Vestibular Stimulation-Based Redirected Walking Systems [Locomotion, Haptics, VR/AR]
Evaluation of Visual, Auditory, and Olfactory Stimulus-Based Attractors for Intermittent Reorientation in Virtual Reality Locomotion [Locomotion, VR/AR]
Take-Over Requests after Waking in Autonomous Vehicles [HCI+Mobility, VR/AR]
Naturalistic Ways for Drivers to Intervene in the Vehicle System while Performing Non-Driving Related Tasks [HCI+Mobility, NUI]
Gaze-Head Input: Examining Potential Interaction with Immediate Experience Sampling in an Autonomous Vehicle [HCI+Mobility, NUI]
Cultural Heritage Design Element Labeling System With Gamification [Gamification]
A Cascaded Multimodal Natural User Interface to Reduce Driver Distraction [HCI+Mobility, NUI]
Designing a Crowdsourcing System for the Elderly: A Gamified Approach to Speech Collection [Gamification]
Toward Immersive Self-Driving Simulations: Reports from a User Study across Six Platforms [HCI+Mobility, VR/AR]
A New Approach to Studying Sleep in Autonomous Vehicles: Simulating the Waking Situation [HCI+Mobility, VR/AR]
MAXIM: Mixed-reality Automotive driving XIMulation [HCI+Mobility, VR/AR]
The goal of this project is to create an in-car information system that adapts the delivery timing of HCI demands based on in-situ models of driving context and cognitive load, so that drivers can navigate safely.
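As a minimal sketch of the adaptive-timing idea, the snippet below gates non-urgent in-car messages on an estimated cognitive-load signal. The AdaptiveScheduler and Message names and the 0.6 threshold are hypothetical illustrations, not the project's actual components.

```python
from collections import deque
from dataclasses import dataclass

LOAD_THRESHOLD = 0.6  # assumed normalized load above which delivery is deferred


@dataclass
class Message:
    text: str
    urgent: bool = False  # safety-critical messages bypass the load gate


class AdaptiveScheduler:
    """Defers non-urgent HCI demands until estimated driver load drops."""

    def __init__(self) -> None:
        self.pending: deque[Message] = deque()

    def submit(self, msg: Message, current_load: float) -> None:
        if msg.urgent or current_load < LOAD_THRESHOLD:
            self.deliver(msg)
        else:
            self.pending.append(msg)  # hold until a low-load moment

    def on_load_update(self, current_load: float) -> None:
        # Flush deferred messages once the estimated load falls again.
        while self.pending and current_load < LOAD_THRESHOLD:
            self.deliver(self.pending.popleft())

    @staticmethod
    def deliver(msg: Message) -> None:
        print(f"[deliver] {msg.text}")


if __name__ == "__main__":
    sched = AdaptiveScheduler()
    sched.submit(Message("New route has lighter traffic"), current_load=0.8)
    sched.submit(Message("Collision warning ahead", urgent=True), current_load=0.8)
    sched.on_load_update(current_load=0.3)  # deferred message delivered now
```

One design point the sketch makes explicit: urgent messages skip the gate entirely, so adapting delivery timing never delays safety warnings.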
In this project, we explored a novel navigation display system that uses augmented reality (AR) projection to minimize cognitive distance by overlaying driving directions directly on the windshield and the road ahead.
While in-car navigation systems enhance situational awareness, they also increase drivers' visual distraction and cognitive load. This project explores the efficacy of multimodal route guidance cues for safer driving.
The purpose of this project was to design features for car dashboard displays that are both functional and aesthetically pleasing.
This project aims to better support student learning by adapting computer-based tutoring to individual learning phases and real-time capabilities. In this manner, computer-based tutors may be more effective in supporting robust learning.
This project explored the effects of AR technology when combined with a range of 3D prototype applications.
The goal of this project is to improve perception and performance during touch-based interaction on personal electronic devices. Specifically, we have identified the appropriate fusion of visual, audio, and haptic cues during fingertip interaction with touch-screen images.
This project aims to understand users on the go in connected environments and to improve the quality of their ubiquitous HCI experience by making machine intelligence more human-centered.
The goal of this project is to help drivers safely interact with ubiquitous HCI demands and benefit from proactive information services in cars. Our prior and ongoing projects primarily explore the 'interruptive' nature of ubiquitous HCI demands in cars. We have been rigorously addressing the question of when to intervene, using sensor-based assessment technologies that estimate drivers' cognitive load in near real time.
This project seeks to develop a sensor-based method for tracking variation in cognitive processing loads.
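One plausible shape for such a method, sketched under explicit assumptions: summary statistics over short windows of a physiological signal feed a lightweight classifier that labels each window as low or high load. The 1 Hz sampling rate, the feature set, and the random-forest model are illustrative choices, not the project's confirmed pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 10  # seconds of signal per estimate, assuming 1 Hz sampling


def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of a signal (e.g., heart rate, pupil diameter)."""
    return np.array([window.mean(), window.std(), window.max() - window.min()])


def windows(signal: np.ndarray, size: int = WINDOW) -> np.ndarray:
    """Split a signal into non-overlapping windows, each reduced to features."""
    n = len(signal) // size
    return np.stack([extract_features(signal[i * size:(i + 1) * size])
                     for i in range(n)])


# Synthetic stand-in data: labels 0 = low load, 1 = high load.
rng = np.random.default_rng(0)
low = rng.normal(70, 2, size=600)    # calmer, steadier signal
high = rng.normal(85, 6, size=600)   # elevated, more variable signal
X = np.vstack([windows(low), windows(high)])
y = np.array([0] * 60 + [1] * 60)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
# In deployment, the latest window would be classified as it streams in.
print(clf.predict(windows(rng.normal(86, 6, size=WINDOW))))  # likely [1]
```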
This project investigates how dialog-based HCI demands interact with driver interruptibility. We are refining our key technology, obtained from Project 1 in this document, to predict the duration of driver interruptibility.
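Interruptibility-duration prediction can be framed as regression over similar windowed features. The sketch below uses synthetic data and a gradient-boosting model purely for illustration; the feature names and the assumed relationship between speed, load, and duration do not reflect the project's actual technology.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in: each row = [speed_kmh, steering_entropy, heart_rate].
X = rng.uniform([0, 0.0, 60], [120, 1.0, 110], size=(500, 3))
# Assumed ground truth: interruptible windows shrink with speed and load.
duration_s = 30 - 0.15 * X[:, 0] - 10 * X[:, 1] + rng.normal(0, 1.5, 500)
y = np.clip(duration_s, 0, None)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
model = GradientBoostingRegressor(random_state=1).fit(X_tr, y_tr)
print(f"held-out R^2: {model.score(X_te, y_te):.2f}")
# A dialog manager could then fit spoken prompts inside the predicted window.
```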
This proposed project aims to make machine intelligence-driven physical and non-physical interventions in computer-assisted driving more acceptable and dependable.