Topic
Distances and divergences are crucial for understanding and working with Gaussian Mixture Models (GMMs). They quantify how similar or different two GMMs are, enabling tasks such as clustering, model comparison, and anomaly detection. Unlike simple parameter-space metrics, divergences such as the Kullback-Leibler (KL) divergence or the Wasserstein distance capture the structure of the underlying probability distributions, accounting for differences in both means and covariances. These measures are essential for optimizing GMM parameters, evaluating convergence, and performing model selection. Accurate distance computations also support applications in signal processing, computer vision, and machine learning, where nuanced distinctions between data distributions are vital for performance and interpretability.
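As a rough illustration of the quantities involved, here is a minimal Python sketch (assuming NumPy, SciPy, and scikit-learn are available; the function names are only illustrative): a Monte Carlo estimate of the KL divergence between two fitted GMMs, which has no closed form, and the closed-form 2-Wasserstein distance between two single Gaussian components.

```python
import numpy as np
from scipy.linalg import sqrtm
from sklearn.mixture import GaussianMixture

def mc_kl_divergence(gmm_p, gmm_q, n_samples=100_000):
    """Monte Carlo estimate of KL(p || q) for two fitted GaussianMixture models.

    The KL divergence between GMMs has no closed form, so we sample from p
    and average the log-density ratio log p(x) - log q(x).
    """
    X, _ = gmm_p.sample(n_samples)
    return float(np.mean(gmm_p.score_samples(X) - gmm_q.score_samples(X)))

def w2_gaussian(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2)."""
    S2_half = sqrtm(S2)
    cross = sqrtm(S2_half @ S1 @ S2_half)
    # sqrtm may return a complex array with negligible imaginary parts.
    w2_sq = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * np.real(cross))
    return float(np.sqrt(max(w2_sq, 0.0)))

# Hypothetical usage on synthetic data, just to show the interfaces:
rng = np.random.default_rng(0)
X1 = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(4, 1, (500, 2))])
X2 = np.vstack([rng.normal(1, 1, (500, 2)), rng.normal(5, 2, (500, 2))])
p = GaussianMixture(n_components=2, random_state=0).fit(X1)
q = GaussianMixture(n_components=2, random_state=0).fit(X2)
print("MC KL(p || q):", mc_kl_divergence(p, q))
print("W2 between first components:",
      w2_gaussian(p.means_[0], p.covariances_[0], q.means_[0], q.covariances_[0]))
```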
Path
Our main goal is to characterize different distances and divergences, and to assess their pros and cons, for Gaussian Mixture Models.
Prerequisite
There are no hard prerequisites, but the more programming and mathematics you know, the more fun you can have with the project.
What I offer
- A teammate/supervisor who is actually present.
- The possibility of co-authoring a research-level publication.
- A BSc/MSc thesis project whose results will be used in production-level software for an enterprise-level project.
- I can probably provide you with an office.
- Nice private IT infrastructure to implement whatever wild ideas you have in mind.