Colloquium/Seminar
Event(s) in February 2012
- Friday, 3rd February, 2012
Title: Ant Colony Optimization: Its foraging behavior in Optimization
Speaker: Dr. Yun-Chia LIANG, Department of Industrial Engineering & Management, Yuan Ze University, Taiwan
Time/Place: 11:30 - 12:30
FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
Abstract: Ant Colony Optimization (ACO), inspired by the foraging behavior of real ants, is considered one of the most successful algorithms in Swarm Intelligence (SI). Early applications of ACO were to combinatorial optimization problems such as the Traveling Salesman Problem (TSP) and the Quadratic Assignment Problem (QAP). However, recent applications have extended ACO to problems with continuous domains. Dr. Liang will share his research experience with ACO on both continuous and combinatorial problems. The application examples include the orienteering problem, the redundancy allocation problem, the on-line scheduling problem, and the curve-fitting problem.

- Wednesday, 8th February, 2012
Title: CMIV Colloquium: Laplace Operator and Heat Kernel for Shape Analysis
Speaker: Professor Jian SUN, Mathematical Sciences Center, Tsinghua University, China
Time/Place: 11:00 - 12:00
FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
Abstract: The Laplace-Beltrami operator (manifold Laplacian) is a fundamental geometric object associated to a Riemannian manifold and has many desirable properties. For instance, the eigenfunctions of the Laplacian form a natural basis for square-integrable functions on the manifold, analogous to Fourier harmonics for functions on a torus. In addition, the Laplace operator is intimately related to the heat diffusion process on a manifold, relating the geometry of the manifold to the properties of the heat flow. In the first part of the talk, I will present a method to detect intrinsic symmetry based on the eigenfunctions of the Laplace-Beltrami operator. In the second part of the talk, I will present a multi-scale intrinsic shape signature based on heat diffusion, which is both concise and provably informative.

- Tuesday, 14th February, 2012
Title: DLS: The Extremal Kähler Metrics on Toric Manifolds
Speaker: Prof. An-min Li, Sichuan University, China
Time/Place: 11:30 - 12:30 (Preceded by Reception at 11:00am)
SCT909, Cha Chi-ming Science Tower, HSH Campus, Hong Kong Baptist University
Abstract: We study the prescribed scalar curvature problem on toric manifolds. We will show that the uniform stability introduced by Donaldson is a necessary condition for the existence of a smooth solution in any dimension n. For the case n = 2, we prove that this condition is also sufficient. More precisely, we prove the following theorem:
Theorem. Let M be a compact toric surface and Δ its Delzant polytope. Let K ∈ C^∞(Δ̄) be an edge-nonvanishing function. If (M, K) is uniformly stable, then there is a smooth T²-invariant metric on M that solves the Abreu equation.
This talk is based on joint work with Bo-hui Chen and Li Sheng.
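For context, the Abreu equation referenced in the theorem can be written in symplectic coordinates on the Delzant polytope as follows. This is a standard formulation (sign and normalization conventions vary across the literature), not an equation quoted from the talk itself:

```latex
% u: the symplectic potential on the Delzant polytope \Delta,
% (u^{ij}): the inverse of the Hessian (u_{ij}) of u,
% K: the prescribed scalar curvature.
S(u) \;=\; -\sum_{i,j=1}^{n} \frac{\partial^{2} u^{ij}}{\partial x_{i}\,\partial x_{j}} \;=\; K
```

Prescribing scalar curvature thus becomes a fourth-order nonlinear PDE for the symplectic potential u on the polytope.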
- Friday, 24th February, 2012
Title: Model Selection in High Dimensional Semivarying Coefficient Models and Approximated Newton-Raphson Algorithm for SCAD Method
Speaker: Mr. Chen Chi, Department of Mathematics, Hong Kong Baptist University, Hong Kong
Time/Place: 11:30 - 12:30
FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
Abstract: The SCAD penalty method is a powerful tool for variable selection in high-dimensional linear regression models. In this work, the author applies the SCAD method to high-dimensional semivarying coefficient models and shows that the SCAD estimator enjoys the sparsity, continuity and oracle properties. Moreover, if the model has Gaussian noise, then the ordinary least squares estimator also has desirable properties such as asymptotic normality. Later in this work, the author introduces a new algorithm for the SCAD penalty method. The algorithm is based on the classical Newton-Raphson method, and the author will show that it has a faster convergence rate than the LLA method.

- Friday, 24th February, 2012
Title: Some Reliable Non-randomized Response Techniques for Sensitive Questions
Speaker: Miss Wu Qin, Department of Mathematics, Hong Kong Baptist University, Hong Kong
Time/Place: 13:30 - 14:30
FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
Abstract: Non-response and response biases occur when sensitive questions (e.g. income, tax evasion, drug taking) are asked directly. Non-randomized response models can be used to increase respondents' cooperation and the credibility of the analysis. The first part of my work considers reliable statistical procedures for testing the equality of two sensitive proportions via non-randomized response techniques. I derive the Wald, score and likelihood ratio tests. Simulation results show that the score test outperforms the other two. The second part modifies the existing non-randomized response models to make them more applicable and flexible. In our new models, the original assumptions of prior information on the non-sensitive question, and of independence between the sensitive and non-sensitive questions, can be dropped. Future work on sensitive-question analyses will be discussed.

- Friday, 24th February, 2012
Title: Variable Selection and Estimation for High-Dimensional Semi-parametric Regression Models
Speaker: Mr. Wang Tao, Department of Mathematics, Hong Kong Baptist University, Hong Kong
Time/Place: 14:30 - 15:30
FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
Abstract: Large data sets and high dimensionality characterize many contemporary statistical problems. Variable selection and dimension reduction are two fundamental tasks for statistical modeling in high-dimensional settings. Although parametric inference is among the most convenient and popular approaches, linear or generalized linear models are not complex enough to capture the underlying relationship between the response variable and its associated predictors. To achieve greater realism, semi-parametric modeling is frequently used to balance modeling biases against the curse of dimensionality. In the first part, we consider variable selection, in the "small n, large p" setting, for a general class of models with the single-index structure. We propose non-convex penalized least-squares estimation, and show that theoretical results for linear models extend in parallel to general single-index models without distributional constraints on the error, at the cost of mild conditions on the predictors. As an important application, in the second part we consider the extension of the new methodology to partially linear single-index models. Finally, we briefly discuss some of our ongoing work on sparse estimation for general semi-parametric models, including sparse inverse regression estimation and penalized minimum average variance estimation.
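Non-convex penalties such as SCAD appear in two of the talks above (Mr. Chen Chi's and Mr. Wang Tao's). As a minimal illustration of the ingredient they share, the SCAD penalty of Fan and Li (2001) and its first derivative can be sketched as below; the function names are our own and the choice a = 3.7 follows Fan and Li's suggestion, so this is an illustrative sketch rather than code from either talk:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty, applied elementwise to |theta|.

    Piecewise definition (a > 2; a = 3.7 as suggested by Fan & Li):
      lam*|t|                              for |t| <= lam
      (2*a*lam*|t| - t^2 - lam^2)
        / (2*(a-1))                        for lam < |t| <= a*lam
      lam^2*(a+1)/2                        for |t| > a*lam
    """
    t = np.abs(np.asarray(theta, dtype=float))
    linear = lam * t
    quad = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))
    const = lam**2 * (a + 1) / 2
    return np.where(t <= lam, linear, np.where(t <= a * lam, quad, const))

def scad_derivative(theta, lam, a=3.7):
    """Derivative of the SCAD penalty in |theta|.

    Equals lam near zero (sparsity) and vanishes beyond a*lam,
    which is what gives SCAD its near-unbiasedness for large coefficients.
    """
    t = np.abs(np.asarray(theta, dtype=float))
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))
```

The flat region beyond a*lam is the feature both abstracts exploit: unlike the lasso, large coefficients are not shrunk, which underlies the oracle property, while the non-convexity is what motivates algorithms such as LLA or the approximated Newton-Raphson scheme of the first talk.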