Welcome to HCI Tech Lab!

We are a multidisciplinary research group working on physical computing, natural user interfaces, and socially acceptable interactions. Our research enables novel interactions for Extended Reality (XR) through sensing and feedback technologies and wearable interfaces, with the aid of applied machine learning and intelligent authoring systems.

We envision natural user interactions that overcome physical, mental, and social barriers. To achieve this, we focus on:


  • Embedding Interactive Technologies
  • Advancing Interaction Techniques
  • Authoring User Interface & Experience



Learn more about HCI Tech Lab through this video (in Korean).

News

A paper featured on the SIGGRAPH Blog
Jul 4 2022
Our SIGGRAPH 2022 Emerging Technologies paper "Sense of Embodiment Inducement for People With Reduced Lower-body Mobility and Sensations With Partial-visuomotor Stimulation," in collaboration with LAVA Lab & Visual Cognition Lab, has been highlighted on the SIGGRAPH Blog!

A paper accepted to ECCV 2022
Jul 4 2022
"Sound-Guided Semantic Video Generation," in collaboration with Computer Vision Lab, has been accepted to ECCV 2022!

Summer Interns
June 24 2022
Welcome, summer interns (Dong Kyu, Min-yung, Jungmin)!

Recruiting Graduate Students
June 1 2022
We have positions for M.S. & Ph.D. students for Spring 2023. Please check the M.S./Ph.D. Open Positions page.

Undergraduate Research Internship
May 2022
We are looking for undergraduate research interns for Summer 2022.

Lab Meeting & Lunch Gathering
Apr 29 2022
Kicked off regular research meetings!

New Lab Member
Mar 2 2022
Welcome our new MS student Youjin!

A paper accepted to CVPR 2022
Mar 2 2022
"Sound-Guided Semantic Image Manipulation," in collaboration with Computer Vision Lab, has been accepted to CVPR 2022!

Winter Interns
Jan 17 2022
Welcome, KAIST undergraduate interns (Nicha, Yoonseo, Haebin)!

First Lab Meeting
Jan 13 2022
First lab meeting, including graduate students & undergraduate interns.

Lab Opening!
Jan 3 2022
Officially opened the lab space. Setting up the lab with members is a work in progress.

URP Individual Research Selected
Dec 21 2021
The individual research proposal by Zofia has been accepted. This is exploratory research on novel haptic interfaces.

New Lab Members
Dec 13 2021
Welcome our new M.S. student (Minjae) and interns (Zofia, Kyungeun)!

New Grant
Dec 1 2021
"Development of Open XR platform for high immersive collaboration," supported by the National Research Council of Science and Technology (NST) for 6 years.

New Lab Member
Nov 4 2021
Welcome our new MS student Jina!

Looking for interns
Oct 12 2021
We are looking for undergraduate research interns for 2021 Fall/Winter. Please check Undergraduate Research Internship.

Looking for graduate students
Oct 12 2021
I am looking for graduate students. Please see the M.S./Ph.D. page.

Lab Website Open!
Sep 27 2021
The beta version of the website is now open and will be actively updated.

Research Highlights

...

Join Us

Note: We currently have openings for Ph.D. and M.S. students.

We are excited to talk to strong candidates whose backgrounds or interests overlap with:

  • Software engineering: Human-Computer Interaction, Virtual/Augmented Reality, Machine Learning
  • Hardware engineering: Sensors, Embedded Systems, Mechanical Design, Signal Processing
  • HCI-related topics: UX/UI Design, Human Factors, User/Design Study

Prospective students, please check the pages below.


M.S./Ph.D. Students

Highlighted Publications

HapSense: A Soft Haptic I/O Device with Uninterrupted Dual Functionalities of Force Sensing and Vibrotactile Actuation
AUTHORS Sang Ho Yoon, Woo Suk Lee, Shantanu Thakurdesai, Di Sun, Flávio P. Ribeiro, James D. Holbery
IN PROCEEDINGS ACM User Interface Software and Technology Symposium (UIST), 2019
DOI PDF VIDEO PRESENTATION

iSoft: A Customizable Soft Sensor with Real-time Continuous Contact and Stretching Sensing
AUTHORS Sang Ho Yoon, Ke Huo, Yunbo Zhang, Guiming Chen, Luis Paredes, Subramanian Chidambaram, Karthik Ramani
IN PROCEEDINGS ACM User Interface Software and Technology Symposium (UIST), 2017
DOI PDF VIDEO PRESENTATION MEDIA

TRing: Instant and customizable interactions with objects using an embedded magnet and a finger-worn device
AUTHORS Sang Ho Yoon, Ke Huo, Yunbo Zhang, Guiming Chen, Luis Paredes, Subramanian Chidambaram, Karthik Ramani
IN PROCEEDINGS ACM User Interface Software and Technology Symposium (UIST), 2016
DOI PDF VIDEO PRESENTATION

Sound-Guided Semantic Image Manipulation
AUTHORS Seunghyun Lee, Wonseok Roh, Wonmin Byeon, Sang Ho Yoon, Chanyoung Kim, Jinkyu Kim*, Sangpil Kim*
TO APPEAR Conference on Computer Vision and Pattern Recognition (CVPR), 2022

SurfaceFlow: Large Area Haptic Display via Compliant Liquid Dielectric Actuators
AUTHORS Yitian Shao, Siyuan Ma, Sang Ho Yoon, Yon Visell, James Holbery
IN PROCEEDINGS IEEE Haptics Symposium, 2020
DOI PDF

iMold: Enabling Interactive Design Optimization for In-Mold Electronics
AUTHORS Jonathan Ting, Yunbo Zhang, Sang Ho Yoon, James D Holbery, Siyuan Ma
IN PROCEEDINGS ACM Conference on Human Factors in Computing Systems Late Breaking Work (CHI LBW), 2020
DOI PDF

Stress Monitoring using Multimodal Bio-sensing Headset
AUTHORS Joong Hoon Lee, Hannes Gamper, Ivan Tashev, Steven Dong, Siyuan Ma, Jacquelin Remaley, James D Holbery, Sang Ho Yoon
IN PROCEEDINGS ACM Conference on Human Factors in Computing Systems Late Breaking Work (CHI LBW), 2020
DOI PDF

MultiSoft: Soft Sensor Enabling Real-Time Multimodal Sensing with Contact Localization and Deformation Classification
AUTHORS Sang Ho Yoon, Luis Paredes, Ke Huo, Karthik Ramani
IN PROCEEDINGS ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 2018
DOI PDF VIDEO

Scenariot: Spatially mapping smart things within augmented reality scenes
AUTHORS Ke Huo, Yuanzhi Cao, Sang Ho Yoon, Zhuangying Xu, Guiming Chen, Karthik Ramani
IN PROCEEDINGS ACM Conference on Human Factors in Computing Systems (CHI), 2018
DOI PDF VIDEO

BikeGesture: user elicitation and performance of micro hand gesture as input for cycling
AUTHORS Yanke Tan*, Sang Ho Yoon*, Karthik Ramani
IN PROCEEDINGS ACM Conference on Human Factors in Computing Systems Late Breaking Work (CHI LBW), 2017
DOI PDF

Robust hand pose estimation during the interaction with an unknown object
AUTHORS Chiho Choi, Sang Ho Yoon, Chin-Ning Chen, Karthik Ramani
IN PROCEEDINGS International Conference on Computer Vision (ICCV), 2017
DOI PDF

TMotion: Embedded 3D Mobile Input using Magnetic Sensing Technique
AUTHORS Sang Ho Yoon, Ke Huo, Vinh P. Nguyen, Karthik Ramani
IN PROCEEDINGS International Conference on Tangible, Embedded, and Embodied Interaction (TEI), 2016
AWARD ACM UIST Best Poster Award (1st Place, 2015)
DOI PDF VIDEO

Wearable textile input device with multimodal sensing for eyes-free mobile interaction during daily activities
AUTHORS Sang Ho Yoon, Ke Huo, Karthik Ramani
IN Pervasive and Mobile Computing, 2016 (SCIE)
DOI PDF

TIMMi: Finger-worn Textile Input Device with Multimodal Sensing in Mobile Interaction
AUTHORS Sang Ho Yoon, Ke Huo, Vinh P. Nguyen, Karthik Ramani
IN PROCEEDINGS International Conference on Tangible, Embedded, and Embodied Interaction (TEI), 2015
DOI PDF VIDEO

BendID: Flexible interface for localized deformation recognition
AUTHORS Vinh P. Nguyen, Sang Ho Yoon, Ansh Verma, Karthik Ramani
IN PROCEEDINGS ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), 2014
DOI PDF VIDEO

HCI Tech Lab
KAIST, N5, Room 2346,
291 Daehak-ro, Yuseong-gu, Daejeon (34141)
Republic of Korea