Taesun Yeom

I'm a first-year M.S.-Ph.D. student at EffL (Efficient Learning Lab), POSTECH (advisor: Prof. Jaeho Lee). My research primarily focuses on gaining a deeper understanding of neural networks by developing theories that closely align with empirical phenomena.

Recently, I have been particularly interested in neural fields, implicit regularization, geometric deep learning, and learning dynamics. Feel free to reach out if you're interested in collaborating on any of these research topics.

Email  /  GitHub  /  Google Scholar  /  LinkedIn

Publications


On the Internal Representations of Graph Metanetworks


Taesun Yeom and Jaeho Lee
Under review, 2025

What do graph metanetworks learn?


Fast Training of Sinusoidal Neural Fields via Scaling Initialization


Taesun Yeom*, Sangyoon Lee*, and Jaeho Lee
International Conference on Learning Representations (ICLR), 2025
arXiv / OpenReview

We propose a simple yet effective approach for accelerating neural field training.


DuDGAN: Improving Class-Conditional GANs via Dual-Diffusion


Taesun Yeom, Chanhoe Gu, and Minhyeok Lee
IEEE Access, 2024
paper / code

We propose a dual-diffusion process, which aims to reduce overfitting of class-conditional GANs.


SuperstarGAN: Generative Adversarial Networks for Image-to-Image Translation in Large-Scale Domains


Kanghyeok Ko, Taesun Yeom, and Minhyeok Lee
Neural Networks, 2023
paper / code

Enabling GANs for image-to-image translation in large-scale domains.




Education

M.S.-Ph.D. in Artificial Intelligence
Pohang University of Science and Technology (POSTECH), South Korea
2024.09 – Present
B.S. in Mechanical Engineering
Chung-Ang University, South Korea
2018.03 – 2024.08

Work experience

Republic of Korea Army (ROKA)
Mandatory military service
2019.07 – 2021.02


Design and source code from Jon Barron's website