Taesun Yeom
I'm a second-year M.S.–Ph.D. student at EffL (Efficient Learning Lab), POSTECH (advisor: Prof. Jaeho Lee).
My research centers on understanding the various phenomena that arise in deep neural networks.
Recently, I have focused on the theoretical analysis of deep learning, particularly on connecting implicit bias, learning dynamics, and generalization.
Before joining EffL, I received my bachelor's degree in Mechanical Engineering from Chung-Ang University and worked closely with Prof. Minhyeok Lee and Prof. Seokwon Lee.
Email | GitHub | Google Scholar | LinkedIn | CV
Publications
Generalization Analysis of Linear Knowledge Distillation
Taesun Yeom, Taehyeok Ha, and Jaeho Lee
Under review, 2026
Activation Quantization of Vision Encoders Needs Prefixing Registers
Seunghyeon Kim, Taesun Yeom, Jinho Kim, Wonpyo Park, Kyuyeun Kim, and Jaeho Lee
Under review, 2025
arXiv
Over-Alignment vs Over-Fitting: The Role of Feature Learning Strength in Generalization
Taesun Yeom, Taehyeok Ha, and Jaeho Lee
ICML, 2026
arXiv
Fast Training of Sinusoidal Neural Fields via Scaling Initialization
Taesun Yeom*, Sangyoon Lee*, and Jaeho Lee
ICLR, 2025
arXiv | code | OpenReview
DuDGAN: Improving Class-Conditional GANs via Dual-Diffusion
Taesun Yeom, Chanhoe Gu, and Minhyeok Lee
IEEE Access, 2024
paper | code
SuperstarGAN: Generative Adversarial Networks for Image-to-Image Translation in Large-Scale Domains
Kanghyeok Ko, Taesun Yeom, and Minhyeok Lee
Neural Networks, 2023
paper | code
Education
Teaching Experience
Services