Prasanga Dhungel
Incoming Ph.D. Student in ML, CISPA/ELLIS | Focus: Efficient, Reliable & Interpretable ML
Hi, I am Prasanga, an incoming Ph.D. student in Machine Learning at CISPA Helmholtz / ELLIS Unit Saarbrücken. Starting in July 2026, I will join the Relational ML Lab under the supervision of Dr. Rebekka Burkholz, focusing on efficiency in machine learning.
My research centers on democratizing AI and challenging the “scale at all costs” paradigm. Through theoretically grounded algorithms, I aim to move beyond the traditional trade-off between efficiency and performance, making AI computationally accessible while preserving its predictive power.
I recently completed my Master’s degree in Informatics at TUM, where my thesis, supervised by Prof. Stephan Günnemann, explored efficient methods for pruning large-scale datasets via score extrapolation. Prior to this, I completed my undergraduate studies at the Institute of Engineering, Pulchowk Campus in Nepal.
Alongside my academic background, I have gained practical experience as a Machine Learning Engineer at E.ON and Naamche. During this time, I saw the stark gap between academic benchmarks and real-world deployment: witnessing calibration drift and limited explainability firsthand deeply informed my doctoral research focus. I now strive to build AI systems that are both highly efficient and fundamentally reliable.
Research Interests
The following areas represent my current research focus and published work:
- Robust and Interpretable ML: Building ML systems that are not only accurate but also reliable and explainable is crucial for real-world deployment. I’m particularly interested in developing methods that handle distribution shifts and outliers and provide meaningful explanations for model decisions, which is essential for domains like energy systems and critical infrastructure. The interplay between data geometry, the hypothesis space, and optimization dynamics dictates learnability. I therefore aim to explore how properties of the data (such as separability), the choice of architecture, and the loss function shape the trajectory of optimization in over-parameterized networks, and how this affects convergence and generalization.
- Data-Centric AI & Efficient Learning: My master’s thesis explored novel approaches to large-scale dataset pruning through score extrapolation, addressing the computational challenges of training on massive datasets. I’m particularly interested in viewing this through the lens of data geometry and optimization dynamics, and in leveraging that understanding to design more efficient learning algorithms. For example, gradient descent on the logistic loss over a linearly separable dataset converges in direction to the hard-margin SVM solution, so we can remove all the non-support vectors and still converge to the same solution (Soudry et al., 2017); see the sketch after this list.
- Machine Learning in Non-Euclidean Spaces: Graph neural networks and geometric deep learning open exciting possibilities for modeling complex relational data. My work has explored using geometric data structures for efficient learning, and I’m interested in applying these techniques to real-world problems involving networks, molecules, and spatial data.
- MLOps & Production ML: Bridging the gap between research and production is critical. I’m passionate about building scalable ML infrastructure, implementing robust monitoring systems, and establishing best practices that enable teams to deploy and maintain ML systems reliably at scale.
If you are interested in collaborating on research projects or discussing ideas, please feel free to reach out. I am always open to exploring new challenges and opportunities to create impactful solutions.
Beyond Work
I am a lover of literature, intellectual podcasts, and the beauty of nature. In my free time, you can find me lost in a good book, exploring the Bavarian Alps, or indulging in a thought-provoking film. I’m passionate about clear communication and enjoy breaking down complex technical concepts into accessible narratives. Thank you for visiting!