Taesun Yeom
I'm a first-year M.S.–Ph.D. student at EffL (Efficient Learning Lab), POSTECH (advisor: Prof. Jaeho Lee).
My research focuses on developing theory that closely aligns with empirical phenomena, toward a deeper understanding of neural networks.
Recently, I have been particularly interested in (1) the inductive biases of neural networks, (2) neural fields, and (3) implicit bias and learning dynamics.
Before joining EffL, I received my bachelor's degree in Mechanical Engineering from Chung-Ang University and worked closely with Prof. Minhyeok Lee and Prof. Seokwon Lee.
Email / GitHub / Google Scholar / LinkedIn / CV
On the Internal Representations of Graph Metanetworks
Taesun Yeom and Jaeho Lee
ICLR Workshop on Weight Space Learning, 2025
arxiv / openreview
What do graph metanetworks learn?
Fast Training of Sinusoidal Neural Fields via Scaling Initialization
Taesun Yeom*, Sangyoon Lee*, and Jaeho Lee
International Conference on Learning Representations (ICLR), 2025
arxiv / code / openreview
We propose a simple yet effective approach for accelerating neural field training.
DuDGAN: Improving Class-Conditional GANs via Dual-Diffusion
Taesun Yeom, Chanhoe Gu, and Minhyeok Lee
IEEE Access, 2024
paper / code
We propose a dual-diffusion process that aims to reduce overfitting in class-conditional GANs.
SuperstarGAN: Generative Adversarial Networks for Image-to-Image Translation in Large-Scale Domains
Kanghyeok Ko, Taesun Yeom, and Minhyeok Lee
Neural Networks, 2023
paper / code
Enabling GANs to perform image-to-image translation in large-scale domains.
M.S.–Ph.D. in Artificial Intelligence (2024.09 – Present)
Pohang University of Science and Technology (POSTECH), South Korea

B.S. in Mechanical Engineering (2018.03 – 2024.08)
Chung-Ang University, South Korea