Taesun Yeom
I'm a second-year M.S.–Ph.D. student at EffL (Efficient Learning Lab), POSTECH (advisor: Prof. Jaeho Lee).
My research primarily focuses on understanding various phenomena that arise in deep neural networks from a theoretical perspective.
My recent research interests include (1) Implicit biases and learning dynamics of neural networks, (2) Deep learning with infinite-dimensional functions (e.g., neural fields), and (3) Efficient deep learning.
Before joining EffL, I received my bachelor's degree in Mechanical Engineering from Chung-Ang University and worked closely with Prof. Minhyeok Lee and Prof. Seokwon Lee.
Email /
GitHub /
Google Scholar /
LinkedIn /
CV
Over-Alignment vs Over-Fitting: The Role of Feature Learning Strength in Generalization
Taesun Yeom, Taehyeok Ha, and Jaeho Lee
Under review, 2026
arxiv
We establish the existence of an optimal feature learning strength in classification and provide a theoretical explanation.
Activation Quantization of Vision Encoders Needs Prefixing Registers
Seunghyeon Kim, Jinho Kim, Taesun Yeom, Wonpyo Park, Kyuyeun Kim, and Jaeho Lee
Under review, 2025
arxiv
We propose simple post-training quantization (PTQ) methods for ViTs that can be applied on top of existing approaches.
On the Internal Representations of Graph Metanetworks
Taesun Yeom and Jaeho Lee
ICLR Workshop on Weight Space Learning, 2025
arxiv /
openreview
What do graph metanetworks learn?
Fast Training of Sinusoidal Neural Fields via Scaling Initialization
Taesun Yeom*, Sangyoon Lee*, and Jaeho Lee
International Conference on Learning Representations (ICLR), 2025
arxiv /
code /
openreview
We propose a simple yet effective method to accelerate neural field training.
DuDGAN: Improving Class-Conditional GANs via Dual-Diffusion
Taesun Yeom, Chanhoe Gu, and Minhyeok Lee
IEEE Access, 2024
paper /
code
We propose a dual-diffusion process to mitigate overfitting in class-conditional GANs.
SuperstarGAN: Generative Adversarial Networks for Image-to-Image Translation in Large-Scale Domains
Kanghyeok Ko, Taesun Yeom, and Minhyeok Lee
Neural Networks, 2023
paper /
code
This work extends GAN-based image-to-image translation to large-scale domains.
M.S.–Ph.D. in Artificial Intelligence
Pohang University of Science and Technology (POSTECH), South Korea
2024.09 – Present
B.S. in Mechanical Engineering
Chung-Ang University, South Korea
2018.03 – 2024.08