Topic
Models in Probabilistic Generative AI capture uncertainty, generate realistic data, and work well with limited or noisy datasets. They are interpretable, support unsupervised learning, and enable principled reasoning. Their flexibility, robustness, and theoretical grounding make them valuable for data augmentation, simulation, anomaly detection, and tasks that require uncertainty estimation or structured data modeling. Priors are essential in probabilistic models: they encode prior beliefs and guide learning, especially when data are limited. Gaussian mixtures are particularly interesting as priors because they can model complex, multi-modal distributions effectively. This flexibility allows them to capture diverse patterns in real data, making them well suited to generative tasks and Bayesian inference.
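To make the idea concrete, here is a minimal, illustrative sketch in Python/NumPy (not project code; the mixture weights, means, and standard deviations are made-up values) showing how a Gaussian mixture prior places probability mass on several modes, which a single Gaussian cannot do:

```python
# Minimal sketch: a 1-D Gaussian mixture prior in NumPy.
# All parameters below are hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(0)

weights = np.array([0.5, 0.3, 0.2])   # mixing weights, sum to 1
means   = np.array([-2.0, 0.5, 3.0])  # one mean per mode
stds    = np.array([0.4, 0.7, 0.5])   # one std-dev per mode

def sample_prior(n):
    """Draw n samples: first pick a component, then sample from it."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comps], stds[comps])

def log_prior(z):
    """Log-density of the mixture, computed with log-sum-exp for stability."""
    z = np.atleast_1d(z)[:, None]                     # shape (n, 1)
    log_comp = (
        np.log(weights)
        - 0.5 * np.log(2 * np.pi * stds ** 2)
        - 0.5 * ((z - means) / stds) ** 2
    )                                                  # shape (n, K)
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))).ravel()

z = sample_prior(5)
print("samples:", z)
print("log p(z):", log_prior(z))
```

In a generative model such as a VAE, a prior like this would replace the usual standard normal over the latent variables, so the latent space itself can be multi-modal.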
Path
Our main goal is to investigate whether more expressive priors, such as Gaussian mixtures instead of a standard Gaussian, bring measurable benefits in the context of Probabilistic Generative AI.
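As a toy proxy for "does a richer prior help?" (not the project's evaluation protocol; the synthetic data and component counts are assumptions), one can compare the held-out log-likelihood of a single Gaussian against a two-component mixture on bimodal latent codes, e.g. with scikit-learn:

```python
# Toy comparison: single Gaussian vs. 2-component mixture on bimodal data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic bimodal "latent codes" standing in for encoder outputs.
train = np.concatenate([rng.normal(-2.0, 0.5, 1000),
                        rng.normal( 2.0, 0.5, 1000)])[:, None]
test  = np.concatenate([rng.normal(-2.0, 0.5, 500),
                        rng.normal( 2.0, 0.5, 500)])[:, None]

single  = GaussianMixture(n_components=1, random_state=0).fit(train)  # standard Gaussian
mixture = GaussianMixture(n_components=2, random_state=0).fit(train)  # mixture prior

# score() returns the average per-sample log-likelihood; higher is better.
print("single Gaussian prior :", single.score(test))
print("2-component mixture   :", mixture.score(test))
```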
Prerequisite
There are no hard prerequisites, but the more programming and math you know, the more fun you can have with the project.
What I offer
- A teammate/supervisor who is actually present.
- Possibility of co-authoring a research-level publication.
- A BSc/MSc thesis project that will be used in production-level software for an enterprise-level project.
- I can probably provide you with an office.
- Nice private IT infrastructure to implement whatever wild ideas you have in mind.