kwmheen

Profile Picture

Kyoungwhan Mheen

M.S. Student in HCI Tech Lab at KAIST

Hello, I am a master's student in the Human-Centered Interactive Technologies Laboratory at the Graduate School of Culture Technology, KAIST, under the supervision of Prof. Sang Ho Yoon.

My research interests lie in gaze-based behavior and gaze-hand multimodal interaction. I am particularly fascinated by how eye movements can be used to anticipate user intentions and enhance interactive systems.

Autograph Image

Education

Education Image 1

Korea Advanced Institute of Science and Technology

2024 - Present

Graduate School of Culture Technology M.S.

Education Image 2

Hanyang University

2020 - 2023

Information Systems B.S.


Publications

Automatic Preparatory Object Rotation Interaction using Gaze Data in Virtual Reality

Kyoungwhan Mheen (KAIST), Min-yung Kim (KAIST), Jina Kim (KAIST), Sang Ho Yoon* (KAIST)

[Korea Computer Congress (KCC) 2024]

Worki-Yo: A Proposal for an Augmented Reality-Based Office Stretching System

Kyoungwhan Mheen, Heejeong Ko, Sungjae Choi, Sang Ho Yoon*

[HCI Korea 2025]

Combining Wall Displays and Augmented Reality for Interpreting Large Artworks in Museums

Hee Jeong Ko, Hyun Woo Kim, Kyoungwhan Mheen, Dooyoung Kim, Woontack Woo*

[HCI Korea 2025]


Prizes

Best Presentation Award

HMD-Based Trajectory Extraction for Seated Arm Stretching Based on Linear and Angular Trajectory Classification

[Korea Computer Graphics Society (KCGS) 2025]


Projects

What-if: Immersive fairy tale in re-imagined space

GCT501 Culture Technology|2024 Spring|Kyoungwhan Mheen, Kangeun Lee, Hyungwoo Jin, Danbinaerin Han

The "WhatIf" project combines Generative AI and Augmented Reality (AR) to create an immersive Tabletop Role-Playing Game (TRPG). By blending physical and digital elements, the game reimagines classic tales, offering players dynamic narratives and interactive gameplay, with the freedom and creativity similar to popular strategy games like "Civilization."


PreGraspHelper: Preparatory Manipulation of Virtual Objects for Grasping

GCT623 Interaction Sensing Principle & Application|2024 Spring|Kyoungwhan Mheen, Min-yung Kim

PreGraspHelper is an interaction concept for virtual reality (VR) environments that assists object manipulation before grasping: when the system detects a user's intent through gaze fixation on an object's handle, it automatically rotates the virtual object into an optimal orientation for grasping.
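A minimal sketch of this idea, in Python rather than the VR engine the actual prototype runs in: sustained gaze on a handle counts as grasp intent, after which the object eases toward a grasp-friendly pose. The class name, dwell threshold, and rotation speed below are illustrative assumptions, not the project's implementation.

import math

FIXATION_DWELL_S = 0.6      # assumed dwell time that counts as a fixation
ROTATION_SPEED_DEG = 90.0   # assumed preparatory rotation speed, degrees per second

class PreGraspHelper:
    def __init__(self, target_angle_deg: float):
        self.target_angle_deg = target_angle_deg  # grasp-friendly handle orientation
        self.dwell_s = 0.0                        # how long gaze has rested on the handle

    def update(self, gaze_on_handle: bool, current_angle_deg: float, dt: float) -> float:
        """Advance one frame; return the object's new yaw angle in degrees."""
        if not gaze_on_handle:
            self.dwell_s = 0.0                    # gaze left the handle: reset the dwell timer
            return current_angle_deg
        self.dwell_s += dt
        if self.dwell_s < FIXATION_DWELL_S:
            return current_angle_deg              # not yet a fixation, do nothing
        # Fixation confirmed: ease the object toward the grasp-friendly pose.
        error = self.target_angle_deg - current_angle_deg
        step = math.copysign(min(abs(error), ROTATION_SPEED_DEG * dt), error)
        return current_angle_deg + step

Calling update() once per frame with the current gaze hit-test result reproduces the preparatory rotation behavior described above.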


Worki-Yo: Improved Stretching Accessibility for Office Workers Using Immersive Augmented Reality

CTP445 Augmented Reality|2024 Spring|KAIST

Worki‐Yo is an AR‐based system designed to enhance office worker health by addressing ergonomic issues from prolonged desk work. Six validated exercises—linear (blue frame) and angular (red frame)—use color‐mapped velocity data to guide users. Wrist/Elbow, Pectoral, and Upper/Lower Back stretches are linear; Upper Body Twist, Shoulder, and Trunk stretches are angular. A velocity comparison reveals distinct movement patterns for each group. By integrating hand tracking, 3D trajectory feedback, and real‐time overlays, Worki‐Yo helps users maintain proper form and correct asymmetries.
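One plausible way to separate the two exercise groups from recorded hand trajectories is sketched below in Python; the straightness and swept-angle heuristics, thresholds, and data layout are assumptions for illustration, not the actual Worki-Yo classifier.

import numpy as np

def classify_stretch(points: np.ndarray, pivot: np.ndarray) -> str:
    """points: (N, 3) hand positions over time; pivot: (3,) assumed joint position."""
    steps = np.diff(points, axis=0)
    path_length = float(np.linalg.norm(steps, axis=1).sum())
    net_displacement = float(np.linalg.norm(points[-1] - points[0]))
    straightness = net_displacement / max(path_length, 1e-9)   # ~1.0 for straight-line motion

    # Total angle swept by the hand around the pivot joint.
    radii = points - pivot
    unit = radii / np.linalg.norm(radii, axis=1, keepdims=True)
    cosines = np.clip(np.einsum("ij,ij->i", unit[:-1], unit[1:]), -1.0, 1.0)
    swept_angle = float(np.arccos(cosines).sum())               # radians

    # Straight, low-rotation paths -> "linear"; large swept angles -> "angular".
    return "linear" if straightness > 0.8 and swept_angle < 1.0 else "angular"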


PeeKaBoo! - Personalized Emotional Evaluation Kernel Adjusted by Biofeedback-Optimized Observation

GCT565 Augmented Humans|2025 Autumn|KwangBin Lee, Jihun Chae

PeeKaBoo is a VR horror prototype that leverages multiple LLMs—such as an Actor, Rule Maker, and Self-Reflection module—working together to dynamically adapt fear elements in real time. By monitoring player gaze and behavior in Unity, each LLM refines or creates new "rules" based on a document-based memory system. This ensures prior interactions remain accessible for future decisions, ultimately delivering a more immersive and personalized horror experience.
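A conceptual Python sketch of the rules-plus-document-memory loop described above; the prompts, file format, and the call_llm placeholder are assumptions for illustration (the real prototype runs inside Unity with actual LLM calls).

import json
from pathlib import Path

MEMORY_FILE = Path("fear_rules.json")      # stand-in for the document-based memory

def call_llm(role: str, prompt: str) -> str:
    # Placeholder: the real system would call an LLM API for the given module
    # (Actor, Rule Maker, Self-Reflection); here we echo so the sketch runs end to end.
    return f"[{role}] response to: {prompt[:60]}"

def load_rules() -> list:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def game_tick(gaze_event: str, behavior_event: str) -> str:
    rules = load_rules()
    # Rule Maker: refine or add a rule from the latest player observation plus past rules.
    new_rule = call_llm("RuleMaker", f"gaze={gaze_event}; behavior={behavior_event}; rules={rules}")
    rules.append(new_rule)
    MEMORY_FILE.write_text(json.dumps(rules, indent=2))   # prior interactions stay on disk
    # Actor: choose the next fear element, constrained by the accumulated rules.
    return call_llm("Actor", f"pick the next scare given rules={rules}")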


Flace: Framing Spaces for Seamless Collaboration in Meta-Museum

GCT550 Augmented Reality|2025 Autumn|Hyun Woo Kim, Hee Jeong Ko

Flace is a system that integrates Wall Display and AR HMDs to support multi‐user viewing and personalized experiences. In the open viewing zone, visitors (with or without HMDs) share a communal display. In the docent interaction zone, AR HMDs and eye‐tracking enable asynchronous commentary and individualized guidance, enhancing engagement and accuracy. By providing both shared and personal interaction spaces, Flace shows promise for traditional museums and emerging Meta‐Museum contexts featuring remote or interactive exhibits.


Career

AlphaMind [Under Development]

2023.01.01. ~ Present (CEO, Frontend/Backend Developer)

Crypto Aggregation & Algorithmic Trading (Co. MK Associates)

Research Assistant

2023.03.01. ~ 2023.07.31.

Under the supervision of Prof. Youngjoon Won,
Blockchain Application Research (NFT Transaction Analysis)

Intelligent Robotics Research Center, KIST

2022.06.22. ~ 2022.08.31.

Under the supervision of Dr. Yoonseob Lim


Extra-Curricular Activities


6th President of FORIF (HYU Academic Division Central SW Club)

2021.03.01. ~ 2021.12.31.

FORIF Design Committee (2020.09.01. ~ 2020.12.31.)

FORIF Vice President (2022.03.01. ~ 2022.07.31.)

FORIF Administrator (2022.09.01. ~ 2022.12.31.)

FORIF C, Python, Data Analysis, Web Crawling, LLM Study Mentor (2021.03.01.~2023.07.31.)

"My beloved club, where the heart of my university life truly resides"



🏆 SKT AI Curriculum (3rd Place, Pharma-see)

2021.12.31. (AI Developer, UI/UX Designer)

Under the supervision of Prof. Youngjoon Won, YOLO-based Pill Recognition System


Contact

GitHub

@kwmheen

LinkedIn

kwmheen

Instagram

@kwmheen