Topic
Deep unfolding transforms iterative optimization algorithms into trainable neural network layers. Each layer mimics one step of the algorithm, combining domain knowledge with the flexibility of deep learning. This approach offers interpretability, faster convergence, and performance gains, making it especially compelling for structured problems where traditional networks struggle to generalize effectively.

Applying deep unfolding to Gaussian Mixture Model (GMM) learning enhances the Expectation-Maximization (EM) process by embedding it in a trainable architecture. This hybrid approach can improve clustering accuracy, convergence speed, and robustness to noise, offering a powerful alternative to classical EM, especially in high-dimensional or complex data settings.
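To make the idea concrete, here is a minimal NumPy sketch of an unfolded EM forward pass for an isotropic GMM. The per-layer damping weights `alphas` stand in for the parameters that would be learned by backpropagation in a full implementation; this parameterization (and all function names) is a hypothetical illustration, not the project's fixed design. Setting every `alpha` to 1 recovers classical EM.

```python
import numpy as np

def gaussian_pdf(X, mu, var):
    # Isotropic Gaussian density of each row of X (N, D) for mean mu (D,)
    # and scalar variance var; returns shape (N,).
    D = X.shape[1]
    diff = X - mu
    return np.exp(-0.5 * np.sum(diff**2, axis=1) / var) / np.sqrt((2 * np.pi * var) ** D)

def unfolded_em(X, mus, vars_, pis, alphas):
    """One 'deep unfolded' forward pass: len(alphas) EM-like layers.

    alphas are hypothetical per-layer damping weights; in a trainable
    version they would be optimized by backpropagation through the layers.
    """
    for alpha in alphas:
        # E-step: responsibilities of each component for each sample.
        dens = np.stack(
            [pi * gaussian_pdf(X, mu, v) for pi, mu, v in zip(pis, mus, vars_)],
            axis=1,
        )  # shape (N, K)
        resp = dens / dens.sum(axis=1, keepdims=True)

        # M-step with learnable damping: alpha = 1 is the classical update.
        Nk = resp.sum(axis=0)
        new_mus = (resp.T @ X) / Nk[:, None]
        mus = [(1 - alpha) * m + alpha * nm for m, nm in zip(mus, new_mus)]
        pis = (1 - alpha) * np.asarray(pis) + alpha * (Nk / len(X))
        vars_ = [
            (1 - alpha) * v
            + alpha * np.sum(resp[:, k] * np.sum((X - mus[k]) ** 2, axis=1))
            / (Nk[k] * X.shape[1])
            for k, v in enumerate(vars_)
        ]
    return mus, vars_, pis
```

A usage example: on two well-separated 2-D clusters, a few unfolded layers with `alphas = [1.0] * 10` behave like ten classical EM iterations and recover the cluster means; learning smaller or layer-dependent `alphas` is where the deep-unfolding flexibility comes in.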
Path
Our main goal is to improve Expectation-Maximization (EM) using deep unfolding for selected use cases.
Prerequisites
There are no hard prerequisites, but the more programming and mathematics you know, the more fun you will have with the project.
What I offer
- A teammate/supervisor who is actually present.
- Possibility to be a co-author on a research-level publication.
- A BSc/MSc thesis project whose results will be used in production-level software for an enterprise-level project.
- I can probably provide you with an office.
- Nice private IT infrastructure to implement whatever wild ideas you have in mind.