Title: | Statistical Learning for Large Dimensional Data by Finite Mixture Modeling |
Speaker: | Ms CHEN Xiao, Department of Mathematics, Hong Kong Baptist University, Hong Kong |
Time/Place: | 15:00 - 17:00, Zoom, Meeting ID: 929 7090 6533, Password: 992568 |
Abstract: | The goal of mixture modelling is to model data as a mixture of processes or populations with distinct patterns, recovering hidden group memberships for many kinds of models. While mixture models based on Gaussian distributions remain popular, they are sensitive to outliers and to components with differing tail behaviour, so robust mixture models have attracted increasing attention. In this thesis, we mainly consider replacing Gaussian component densities with exponential power (EP) distributions in mixture modelling. The thesis contributes to mixture modelling in three ways. First, families of mixtures of univariate and multivariate exponential power distributions are considered. The EP mixture model is an attractive alternative to Gaussian and t mixture models in model-based clustering and density estimation, since it can accommodate Gaussian, light-tailed, and heavy-tailed components at the same time. We use a penalized likelihood method to determine the number of components for mixtures of univariate and multivariate exponential power distributions and prove the consistency of this order selection procedure; the proposed algorithm outperforms classical order selection methods for EP mixture models and is not computationally intensive. Second, robust mixtures of regression models with EP distributions are introduced, providing a flexible framework for heterogeneous dependence on the observed covariates. Here the penalized log-likelihood is again used to select the number of components, and simulations and real data analyses illustrate the robustness of the proposed model and the performance of the penalized method in order selection. Lastly, we propose mixtures of robust probabilistic principal component analyzers with EP distributions and demonstrate their robustness through toy examples and real data analysis. This method can model high-dimensional nonlinear data with a combination of local linear models in the presence of outliers or heavy tails, and can be used for high-dimensional clustering and data generation. |
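To give a concrete feel for the ideas in the abstract, the following is a minimal illustrative sketch, not the thesis algorithm: it fits univariate EP mixtures using SciPy's gennorm density (the generalized normal, i.e. exponential power, family) by direct likelihood maximization, and selects the number of components with a BIC-type criterion standing in for the penalized-likelihood method described above. The simulated data, initialisation, and penalty are assumptions made purely for illustration.

```python
# Illustrative sketch only: univariate exponential power (EP) mixture fitting
# and order selection via a BIC-type penalty (a stand-in for the thesis's
# penalized-likelihood criterion). Data and settings are hypothetical.
import numpy as np
from scipy.stats import gennorm          # gennorm = exponential power family
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Toy data: one heavier-tailed and one lighter-tailed EP component.
x = np.concatenate([
    gennorm.rvs(1.2, loc=-2.0, scale=1.0, size=300, random_state=rng),
    gennorm.rvs(4.0, loc=3.0, scale=1.5, size=200, random_state=rng),
])

def neg_loglik(theta, x, k):
    # theta packs weights (softmax), locations, log-scales, log-shapes.
    w = np.exp(theta[:k]); w /= w.sum()
    mu = theta[k:2 * k]
    sigma = np.exp(theta[2 * k:3 * k])
    beta = np.exp(theta[3 * k:4 * k])
    dens = sum(w[j] * gennorm.pdf(x, beta[j], loc=mu[j], scale=sigma[j])
               for j in range(k))
    return -np.sum(np.log(dens + 1e-300))

def fit_ep_mixture(x, k):
    # Crude quantile-based initialisation; a real implementation would use EM.
    mu0 = np.quantile(x, np.linspace(0.1, 0.9, k))
    theta0 = np.concatenate([np.zeros(k), mu0,
                             np.log(np.full(k, x.std())), np.zeros(k)])
    res = minimize(neg_loglik, theta0, args=(x, k), method="Nelder-Mead",
                   options={"maxiter": 20000})
    n_params = 4 * k - 1                  # weights sum to one
    bic = 2 * res.fun + n_params * np.log(len(x))
    return res, bic

# Order selection: smallest BIC wins.
for k in (1, 2, 3):
    _, bic = fit_ep_mixture(x, k)
    print(f"k = {k}: BIC = {bic:.1f}")
```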
Title: | Mathematical Studies on Geometric Structures of Transmission Eigenfunctions and Inverse Problems |
Speaker: | Ms HE Youzi, Department of Mathematics, Hong Kong Baptist University, Hong Kong |
Time/Place: | 14:00 - 16:00, Zoom, Meeting ID: 977 6985 3164, Password: 601130 |
Abstract: | The interior transmission eigenvalue problem is a class of non-elliptic, non-selfadjoint, and nonlinear eigenvalue problems arising in wave scattering theory, and it is connected to the direct and inverse scattering problems in many ways. The intrinsic geometric structures of transmission eigenfunctions are not only mathematically interesting and intriguing but also practically important and useful. Inverse scattering theory itself plays a central role in mathematical physics, with applications in areas such as medical imaging, remote sensing, geophysics, signal processing, and nondestructive testing. This thesis is concerned with the mathematical study of geometric structures of interior transmission eigenfunctions and with an application to inverse problems. |
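For readers unfamiliar with the problem, a standard textbook formulation of the interior transmission eigenvalue problem for a medium with refractive index n supported in a domain D is recalled below; this is a generic statement, not necessarily the exact setting of the thesis.

```latex
% Standard interior transmission eigenvalue problem (generic textbook form):
% find k and a nontrivial pair (w, v) such that
\[
\begin{aligned}
\Delta w + k^{2} n(x)\, w &= 0 && \text{in } D,\\
\Delta v + k^{2} v &= 0 && \text{in } D,\\
w = v, \qquad \partial_{\nu} w &= \partial_{\nu} v && \text{on } \partial D,
\end{aligned}
\]
% where \nu is the outward unit normal to \partial D. Values of k admitting a
% nontrivial solution are the transmission eigenvalues, and the corresponding
% (w, v) are the transmission eigenfunctions whose geometric structures are
% studied in the thesis.
```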
Prof. M. Cheng, Dr. Y. S. Hon, Dr. K. F. Lam, Prof. L. Ling, Dr. T. Tong and Prof. L. Zhu have been awarded research grants by the Hong Kong Research Grants Council (RGC). Congratulations!