Title: | Estimation and Inference of Individual Treatment Effect via Bayesian Mixture Model |
Speaker: | Ms. WANG Juan, Department of Mathematics, Hong Kong Baptist University, HKSAR |
Time/Place: | 15:00 - 16:30, FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | Treatment effects are a widely studied topic; examples include the effect of a drug on health outcomes and the evaluation of government programs or public policies. In this work, I propose a Bayesian mixture model to construct a joint distribution for the two potential outcomes. Latent variables account for the unobservable correlation between the two potential outcomes. Consequently, the relationship between the two potential outcomes and inference for the individual treatment effect can be studied. I consider these problems from both simulation and theoretical perspectives. |
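The key idea above, a latent variable inducing correlation between potential outcomes that are never observed jointly, can be illustrated with a small simulation. The Python sketch below uses a hypothetical two-component Gaussian mixture; the component means, mixing weight, and noise scale are invented for illustration and are not the model from the talk.

# Hypothetical sketch: a latent mixture component drives both potential
# outcomes, inducing dependence between Y(0) and Y(1) even though the two
# are never observed together. All numbers below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Latent class z ~ Bernoulli(0.4) shared by both potential outcomes.
z = rng.binomial(1, 0.4, size=n)

# Component-specific means for (Y(0), Y(1)) -- assumed values.
mu = np.array([[0.0, 1.0],   # component 0
               [2.0, 4.0]])  # component 1
y0 = rng.normal(mu[z, 0], 1.0)
y1 = rng.normal(mu[z, 1], 1.0)

ite = y1 - y0  # individual treatment effects
print(f"mean ITE: {ite.mean():.2f}, sd of ITE: {ite.std():.2f}")
print(f"corr(Y0, Y1): {np.corrcoef(y0, y1)[0, 1]:.2f}")

Here the correlation between the two potential outcomes is entirely explained by the latent class, which is the role latent variables play in the abstract's joint model.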
Title: | The Visualization of the Neural Network Landscape, Loss Convergence with Neuron Increasing, and Layer Accuracy Increasing |
Speaker: | Mr LAI Jianfa, Department of Mathematics, Hong Kong Baptist University, HKSAR |
Time/Place: | 16:00 - 17:30, FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | Neural networks have achieved success in many different areas, but because their loss functions are non-convex, it is not easy to sketch the loss landscape. It is widely believed that training a network with Stochastic Gradient Descent (SGD) works because the loss landscape either has no bad local minima or has local minimizers that generalize similarly to the global minimizer. In this paper, I present a Minimum Neural Network model to illustrate the properties of the neural network landscape, fully accounting for the initialization scheme, the signal-to-noise ratio, and the training sample size. Moreover, I find two interesting phenomena, 'Loss Convergence with Neuron Increasing' and 'Layer Accuracy Increasing', which may provide important evidence on why neural networks are able to learn by increasing width and depth. |
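One common way to "sketch" a non-convex loss landscape, in the spirit of the abstract, is to plot the loss along a random direction through a point in parameter space. The Python sketch below does this for a toy one-hidden-layer network; the data, width, and initialization scale are illustrative assumptions and do not reproduce the speaker's Minimum Neural Network model.

# Illustrative 1-D slice of a tiny network's loss surface: perturb the
# weights along a random direction and record the loss at each step.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # noisy 1-D signal

def mse_loss(W1, w2):
    h = np.tanh(X @ W1)            # hidden activations, shape (200, width)
    return np.mean((h @ w2 - y) ** 2)

width = 16
W1 = rng.normal(size=(2, width)) / np.sqrt(2.0)   # assumed init scheme
w2 = rng.normal(size=width) / np.sqrt(width)

# Random direction with the same shape as the parameters.
d1 = rng.normal(size=W1.shape)
d2 = rng.normal(size=w2.shape)

alphas = np.linspace(-2.0, 2.0, 101)
losses = [mse_loss(W1 + a * d1, w2 + a * d2) for a in alphas]

plt.plot(alphas, losses)
plt.xlabel("step along a random direction")
plt.ylabel("MSE loss")
plt.title("1-D slice of a small network's loss landscape")
plt.show()

Repeating this for networks of increasing width or depth is one simple way to probe phenomena like the two the abstract names.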
Title: | Classification with Incomplete and Inexact Supervision |
Speaker: | Ms CHEN Xiao, Department of Mathematics, Hong Kong Baptist University, HKSAR |
Time/Place: | 10:00 - 11:30, FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | The goal of supervised classification is to learn a mapping from input variables x to class labels y, given a training set of pairs (xi, yi). However, datasets are usually hand-labelled, so the amount of labelled data is small, and we often do not have enough supervision to train an effective discriminative model. Since unlabelled data usually costs little to obtain, we can use it to enhance classification performance. This process is referred to as semi-supervised learning, a kind of incomplete-supervision problem, and many methods have been proposed for it. If we can obtain some inexact supervision on the unlabelled data at little cost, can we also use this additional information to improve classification accuracy? For example, suppose we divide the labels into several larger groups and can cheaply determine which group an unlabelled data point belongs to. Can this inexact supervision help enhance classification performance? In this research, I start from the MNIST dataset. Assume that we have only a small amount of labelled data, while for the unlabelled data we know some coarse features of the labels; for example, {0, 1, . . . , 9} is divided into two parts, and we know which part the label of a data point belongs to. I want to study how much information this binary label can offer, and I aim to use this kind of inexact supervision to enhance classification performance. |
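To make the coarse binary label concrete, the Python sketch below trains a classifier on a small labelled set and then uses the group label (digit < 5 versus >= 5) to mask out impossible classes on the remaining data. It uses scikit-learn's small digits dataset as a stand-in for MNIST, and the masking baseline is just one simple way to exploit inexact supervision, not the method proposed in the talk.

# Hypothetical baseline: coarse group labels restrict which fine labels
# an "unlabelled" point can take, so we zero out the other classes.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_lab, X_unlab, y_lab, y_unlab = train_test_split(
    X, y, train_size=100, stratify=y, random_state=0)  # only 100 labelled

clf = LogisticRegression(max_iter=2000).fit(X_lab, y_lab)
proba = clf.predict_proba(X_unlab)  # columns follow clf.classes_ = 0..9

# Coarse binary supervision on the "unlabelled" set: is the digit >= 5?
group = (y_unlab >= 5)

# Mask out the probabilities of classes outside the known group.
in_low_half = np.arange(10) < 5           # True for digits 0-4
mask = np.where(group[:, None], ~in_low_half, in_low_half).astype(float)

plain = proba.argmax(axis=1)
masked = (proba * mask).argmax(axis=1)
print(f"accuracy without group labels: {(plain == y_unlab).mean():.3f}")
print(f"accuracy with group masking:   {(masked == y_unlab).mean():.3f}")

Comparing the two accuracy figures gives a rough sense of how much information the binary label carries, which is the question the abstract poses.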
We organize conferences and workshops every year. We hope to see you at one in the future.
Prof. M. Cheng, Dr. Y. S. Hon, Dr. K. F. Lam, Prof. L. Ling, Dr. T. Tong and Prof. L. Zhu have been awarded research grants by the Hong Kong Research Grants Council (RGC). Congratulations!