We are recruiting undergraduate students for our 2026 Internship!
Lab Introduction
SEOULTECH IXLAB conducts cutting-edge research in Human-Computer Interaction (HCI). Our mission is to contribute to top-tier international conferences and journals while shaping the future of how humans interact with technology.
1. Immersive Spatial Computing
- User behavior/performance modeling
- [CHI '26] Steering through a dynamically varying path
- [CHI '26] How Video See-Through MR Affects Pointing on Physical Devices: From Smartphones to Large Monitors
- [CHI '26] Investigating the Impact of Lying Postures on Mixed Reality Interaction
- [CHI '25] Understanding User Behavior in Window Selection using Dragging for Multiple Targets
- [IJHCI '26] Effect of Onset Position of Ray Casting in Virtual Reality
- Interactive MR/AR/VR applications
- [CHI '26] Exploring Interface Design of MR Translation System for Everyday Interaction
- [CHI '25] Through the Looking Glass, and What We Found There: A Comprehensive Study of User Experiences with Pass-Through Devices in Everyday Activities
- [ISMAR '25] Exploring Interface Design of Translation System for Enhanced Immersion and Usability in Mixed Reality
- [IUI '25] A picture is worth a thousand words? Investigating the Impact of Image Aids in AR on Memory Recall for Everyday Tasks
2. Intelligence-Augmented & Generative Interaction
- Generative AI for Audio-Visual Interaction
- [IJHCS '26] SnapSound: Empowering everyone to customize sound experience with Generative AI
- [PeerJ CS '24] Mitigating Inappropriate Concepts in Text-to-Image Generation with Attention-guided Image Editing
- [CHI '23] Tingle Just for You: A Preliminary Study of AI-based Customized ASMR Content Generation
- LLM-based Human-Centered Interaction
- [CHI '26] Knight Path: Supporting Player Sense-making in Game Narratives via Visual Scaffolding and Spoiler-Aware AI
- [ISMAR '24] Public Speaking Q&A Practice with LLM-Generated Personas in Virtual Reality
- [SIGGRAPH Asia '24] Designing LLM Response Layouts for XR Workspaces in Vehicles
3. Multimodal Sensing Interfaces (Gaze, Speech, Gestures)
- [UIST '25] Silent Yet Expressive: Toward Seamless VR Communication through Emotion-aware Silent Speech Interfaces
- [CHI '25] Exploring Emotion Expression Through Silent Speech Interface in Public VR/MR
- [EAIT '25] Enhancing Learner Experience with Instructor Cues in Video Lectures: A Comprehensive Exploration and Design Discovery toward A Novel Gaze Visualization
- [IJHCI '24] EchoTap: Non-verbal Sound Interaction with Knock and Tap Gestures
- [ICMI '21] Knock & Tap: Classification and Localization of Knock and Tap Gestures using Deep Sound Transfer Learning
4. Human-Agent Interaction (New 2026 NRF Project)
- In 2026, we received a 5-year research grant from the NRF to develop a novel interaction framework for Human-Agent Interaction.
- Human-AI interaction is rapidly evolving from LLM-based applications to agent-based ecosystems.
- We explore how humans and agents can collaborate to create a better future.
- Main research topics include:
- Design of Tangible Agent interfaces and interaction
- Embodied Human-Agent Interaction
- Enhancing UI/UX of Human-Agent Interaction for Web, Mobile, XR environments
- Multi-User Multi-Agent Interaction
- … and your ideas!
Internship Duration
- 2026.04.20 ~ 2026.08.31
- An extension may be offered depending on how well the student fits our lab.
Target Candidates
- Students eager to conduct research integrating Human-Computer Interaction with AI/AX technologies.
- Students with a strong interest in at least one of our core research areas listed above.
- Students considering enrollment in Master's or Integrated BS-MS programs.
- 4th-year students: only students considering applying for the Master's program for Spring 2027 are eligible for selection.
- 3rd-year students: only students considering applying for the Integrated BS-MS program for Spring 2027 are eligible for selection.
Qualifications
- Attitude: Self-motivated, sincere, and a proactive team player
- Commitment: No planned long-term absences (e.g., exchange programs) during the internship
- Participation: Active involvement in lab activities during weekdays (semester and vacation)
- Activities: Able to attend domestic conferences, workshops, and seminars
- Skills: Basic programming proficiency
Requirements & Benefits
- Research Growth: Students will develop a deeper understanding of HCI and UI/UX fundamentals, VR/AR technologies, the AI agent ecosystem, mobile/web platforms, ML/DL applications, etc.
- Research Autonomy: Although our lab shares a common research theme, students are encouraged to propose and lead their own research projects based on their unique interests within the HCI field.
- Individualized Mentorship: Students will receive direct guidance from Professor Jin-Woo Jeong to help navigate academic growth and research direction.
- Work hours
- Master's/Ph.D. students: at least 40 hours/week
- Undergraduate students: flexible (semester), at least 35 hours/week (vacation)
- International Conference Travel Support
- Master's/Ph.D. students: at least 1 conference/year
- Undergraduate students: supported if an authored paper is accepted
- Stipend
- Master's students: at least 2,000,000 KRW/month + tuition fee (Future Talent Scholarship)
- Ph.D. students: at least 3,000,000 KRW/month + tuition fee (Future Talent Scholarship)
- Undergraduate students: 300,000 KRW/month (semester), 700,000 KRW/month (vacation), or 1,000,000 KRW/month + BEAR scholarship (for students committed to entering graduate school)
- Equipment: Personal office space and an individual PC with a dual-monitor setup
- Various research-purpose devices are provided (VR/MR HMDs, eye trackers, sensors, wearable devices, etc.)
Timeline
- Application Deadline: 2026.04.03 (Fri)
- Interviewee Notification: 2026.04.06 (Mon)
- Interview: 2026.04.07 (Tue) ~ 2026.04.10 (Fri)
- Result Announcement: 2026.04.13 (Mon)
- Internship Period: 2026.04.20 (Mon) ~ 2026.08.31 (Mon)
Application
- Undergraduate transcript, Cover letter (1 page including motivation), CV (optional)
- Application Form: Link
Contact
- Email: jinw.jeong@seoultech.ac.kr (Jin-Woo Jeong, PI)