Title: | Subgroup analysis for high-dimensional functional regression |
Speaker: | Dr. Xiaochen ZHANG, Zhongtai Securities Institute for Financial Studies, Shandong University, Jinan, China |
Time/Place: | 16:00 - 17:00, FSC1217
Abstract: | Subgroup analysis for scalar data has been well studied in the literature, but much less has been done for functional data, especially in high-dimensional functional regression. In this study, we develop a high-dimensional functional regression model for simultaneous estimation and subgroup identification in a heterogeneous population. Under mild conditions, we establish the estimation and selection consistency of the proposed method. The analysis allows both the number of functional predictors and the number of subgroups to grow with the sample size. Simulation studies demonstrate the satisfactory performance of the proposed method, which is also illustrated through a real application.
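The two ingredients of the abstract, a functional regression part and a subgroup split, can be illustrated in a self-contained toy (this is not the talk's estimator: the functional coefficient is assumed known and a simple threshold stands in for the penalized subgroup identification; all sizes and numbers below are invented):

```python
import numpy as np

# Toy subgroup illustration: subjects fall into two latent subgroups
# with different intercepts in a scalar-on-function regression.
rng = np.random.default_rng(0)
n, T = 100, 50                       # subjects, grid points per curve
t = np.linspace(0, 1, T)
beta = np.sin(2 * np.pi * t)         # (assumed known) functional coefficient
X = rng.standard_normal((n, T))      # functional predictors on a grid
group = (np.arange(n) < n // 2).astype(int)
alpha_true = np.where(group == 0, -2.0, 2.0)   # subgroup intercepts
y = alpha_true + X @ beta / T + 0.1 * rng.standard_normal(n)

# Remove the common functional part, then split subjects by a 1-D
# threshold on the residual intercepts (a stand-in for joint
# estimation with a fusion-type penalty).
alpha_hat = y - X @ beta / T
group_hat = (alpha_hat > alpha_hat.mean()).astype(int)
accuracy = max(np.mean(group_hat == group), np.mean(group_hat != group))
```

In the actual method, subgroup membership, the functional coefficients, and variable selection are handled jointly rather than in two separate steps as above.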
Title: | How Can We Learn from Possibly Unrelated Sources? A Source-Function Weighted-Transfer Learning |
Speaker: | Professor Lu LIN, Zhongtai Securities Institute for Financial Studies, Shandong University, Jinan, China |
Time/Place: | 15:00 - 16:00, FSC1217
Abstract: | The homogeneity, or more generally the similarity, between source domains and a target domain appears essential for positive transfer learning. In practice, however, the similarity condition is difficult to verify and is often violated. In this paper, instead of the commonly used similarity condition, we introduce a notion of seeming similarity, defined by non-orthogonality together with smoothness. Such a condition is naturally satisfied in common situations and even contains dissimilarity as a special case. Based on the seeming similarity together with an L_2-adjustment, a source-function weighted-transfer learning estimator (sw-TLE) is constructed. Through source-function weighting, transfer learning becomes adaptive in the sense that it is always positive in both similar and dissimilar scenarios. In particular, with homogeneous sources, the sw-TLE attains the parametric or semiparametric convergence rate even though the model under study is nonparametric. We also establish a hidden relationship between the source-function weighting estimator and the James-Stein estimator, which reveals the structural soundness of our methodology. Moreover, the strategy applies to both nonparametric and semiparametric models. Comprehensive simulation studies and real data analysis illustrate that the new strategy significantly outperforms its competitors and is comparable with the oracle estimator.
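The weighting idea can be sketched in a toy mean-estimation setting (this is my own simplified illustration, not the sw-TLE of the talk: sources are weighted by an inverse squared discrepancy to the target estimate, so a dissimilar source is automatically down-weighted; all distributions and constants are invented):

```python
import numpy as np

# Small target sample, three large sources: two similar, one not.
rng = np.random.default_rng(1)
target = rng.normal(1.0, 1.0, 20)
sources = [rng.normal(1.00, 1.0, 500),     # similar source
           rng.normal(1.05, 1.0, 500),     # similar source
           rng.normal(5.00, 1.0, 500)]     # dissimilar source
theta_t = target.mean()
theta_s = np.array([s.mean() for s in sources])

# Weight each source by the inverse squared discrepancy to the target
# estimate (an L_2-style adjustment), then combine with the target.
w = 1.0 / ((theta_s - theta_t) ** 2 + 1e-2)
w = w / w.sum()
theta_sw = 0.5 * theta_t + 0.5 * (w @ theta_s)
```

The point of the sketch is adaptivity: the dissimilar source receives weight near zero, so the transfer cannot hurt, while the similar sources reduce the variance of the small-sample target estimate.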
Title: | A Gromov-Wasserstein Geometric View of Spectrum-Preserving Graph Coarsening
Speaker: | Dr. Yifan Chen, Department of Computer Science, Hong Kong Baptist University |
Time/Place: | 15:00 - 16:00, FSC1217
Abstract: | In this talk, I will delve into a geometric perspective of graph coarsening, which is a technique for solving large-scale graph problems by working on a smaller version of the original graph. It has a long history in scientific computing and has recently gained popularity in machine learning, particularly in methods that preserve the graph spectrum. Compared to the previous spectral perspective, the proposed geometric perspective is especially useful when working with a collection of graphs, such as in graph classification and regression. Specifically, we consider a graph as an element on a metric space equipped with the Gromov-Wasserstein (GW) distance, and bound the difference between the distance of two graphs and their coarsened versions. Minimizing this difference can be done using the popular weighted kernel K-means method, which strengthens existing spectrum-preserving methods.
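The basic coarsening operation behind such methods can be written down in a few lines (a minimal sketch of coarsening by node aggregation, with a fixed partition in place of the kernel K-means optimization discussed in the talk; the graph is invented):

```python
import numpy as np

# Given a partition of the nodes into supernodes, the coarse graph's
# adjacency is A_c = P A P^T, where P aggregates nodes into groups.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
partition = [0, 0, 0, 1, 1, 1]             # two supernodes
P = np.zeros((2, 6))
for node, part in enumerate(partition):
    P[part, node] = 1.0
A_c = P @ A @ P.T                          # coarse adjacency
# Off-diagonal entries of A_c count edges across groups; each diagonal
# entry counts (twice) the edges inside that group, and the total edge
# weight of the graph is preserved.
cut = A_c[0, 1]
```

In the talk's setting, the quality of the partition is measured geometrically, by how little the GW distance between graphs changes under coarsening, and the partition itself is optimized by weighted kernel K-means.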
Title: | Stochastic Optimization, Stochastic Variational Inequality, and the Progressive Decoupling Algorithm |
Speaker: | Professor Jie Sun, National University of Singapore, Singapore, and Curtin University, Australia |
Time/Place: | 11:00 - 12:00, FSC1217
Abstract: | This talk introduces basic concepts of multistage stochastic optimization and its abstract form, the stochastic variational inequality (SVI). A scheme for solving SVI problems, called the progressive decoupling algorithm (PDA), is discussed, and preliminary computational results are reported.
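The decoupling idea can be shown on a two-scenario toy problem (this is the classical progressive hedging iteration, a special case of the progressive decoupling scheme; the problem data and penalty parameter are invented for the example):

```python
import numpy as np

# Toy problem: min_x E[(x - xi)^2] with xi in {1, 3}, equal probability,
# whose solution is the scenario mean x* = 2. Each scenario keeps its
# own copy of x; nonanticipativity is enforced via multipliers w.
xi = np.array([1.0, 3.0])
prob = np.array([0.5, 0.5])
r = 1.0                               # proximal penalty parameter
x = xi.copy()                         # scenario copies of the decision
w = np.zeros(2)                       # nonanticipativity multipliers
for _ in range(100):
    x_bar = prob @ x                  # project onto nonanticipativity
    w = w + r * (x - x_bar)           # dual (multiplier) update
    # each scenario solves, independently of the others:
    #   min (x_s - xi_s)^2 + w_s x_s + (r/2)(x_s - x_bar)^2
    x = (2 * xi - w + r * x_bar) / (2 + r)
x_star = prob @ x
```

The scenario subproblems decouple completely at each iteration, which is what makes the scheme attractive for large multistage problems.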
Title: | A decoupled, linear, and unconditionally energy stable finite element method for a two-phase ferrohydrodynamics model |
Speaker: | Professor Xiaoming He, Department of Mathematics and Statistics, Missouri University of Science and Technology, USA |
Time/Place: | 14:30 - 15:30, FSC1217
Abstract: | In this talk, we present numerical approximations of a phase-field model for two-phase ferrofluids, which consists of the Navier-Stokes equations, the Cahn-Hilliard equation, the magnetostatic equations, as well as the magnetic field equation. By combining the projection method for the Navier-Stokes equations and some subtle implicit-explicit treatments for coupled nonlinear terms, we construct a decoupled, linear, fully discrete finite element scheme to solve the highly nonlinear and coupled multi-physics system efficiently. The scheme is provably unconditionally energy stable and leads to a series of decoupled linear equations to solve at each time step. Through numerous numerical examples in simulating benchmark problems such as the Rosensweig instability and droplet deformation, we demonstrate the stability and accuracy of the numerical scheme. |
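The implicit-explicit treatment at the heart of such schemes can be demonstrated on a much simpler model (a 1-D Allen-Cahn toy, not the ferrofluid system: the stiff linear diffusion is solved implicitly while the nonlinearity is taken explicitly, so each step reduces to one linear solve; all parameters are invented):

```python
import numpy as np

# u_t = eps^2 u_xx - (u^3 - u) on a periodic grid.
N, L = 64, 2 * np.pi
eps, dt, steps = 0.3, 0.01, 500
x = np.linspace(0, L, N, endpoint=False)
h = L / N
u = 0.1 * np.cos(x)                       # small initial perturbation

# Periodic second-difference matrix and the implicit operator.
D2 = (-2 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)
      + np.eye(N, k=N - 1) + np.eye(N, k=-(N - 1))) / h**2
M = np.eye(N) - dt * eps**2 * D2
for _ in range(steps):
    rhs = u - dt * (u**3 - u)             # explicit nonlinear part
    u = np.linalg.solve(M, rhs)           # implicit linear solve
max_u = np.abs(u).max()                   # stays bounded (near 1)
```

In the ferrofluid model of the talk, the same principle is applied to a much larger coupled system, and the splitting additionally decouples the Navier-Stokes, Cahn-Hilliard, and magnetic equations from one another while keeping unconditional energy stability.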
Title: | Tractability of approximation by general shallow networks |
Speaker: | Dr. Tong Mao, Institute of Mathematical Sciences, Claremont Graduate University |
Time/Place: | 15:30 - 16:00, FSC1217
Abstract: | In this talk, we consider the approximation of functions given in an integral form; the corresponding function spaces are called variation spaces. Although these spaces have been studied in various works, we study them under some general assumptions. We concentrate on the constant terms and show that these terms are tractable (i.e., depend polynomially on the dimension) under some dimension-independent assumptions. Applications include approximation by power rectified linear unit networks, zonal function networks, and certain radial basis function networks, as well as the important problem of function extension to higher-dimensional spaces.
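A shallow network of the "power ReLU" type mentioned above is easy to exhibit concretely (a toy fit, not the talk's construction: inner weights are drawn at random and only the outer coefficients are fitted by least squares; the target function, width, and ranges are invented):

```python
import numpy as np

# Fit f(x) = sin(pi x) on [-1, 1] with a width-50 shallow network of
# ReLU^2 units: x -> sum_j c_j * max(w_j x + b_j, 0)^2.
rng = np.random.default_rng(3)
m = 50                                        # network width
x = np.linspace(-1, 1, 200)
f = np.sin(np.pi * x)
w = rng.uniform(-2, 2, m)                     # random inner weights
b = rng.uniform(-2, 2, m)                     # random biases
Phi = np.maximum(w[None, :] * x[:, None] + b[None, :], 0.0) ** 2
c, *_ = np.linalg.lstsq(Phi, f, rcond=None)   # outer coefficients
err = np.max(np.abs(Phi @ c - f))             # uniform error on the grid
```

The talk's question is, roughly, how the constants in such approximation rates scale with the input dimension, which this 1-D toy does not probe.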
Title: | On recovering signal from partial frame coefficients |
Speaker: | Dr. Fusheng Lyu, School of Mathematical Sciences, Nankai University |
Time/Place: | 15:00 - 15:30, FSC1217
Abstract: | Frame theory is widely used in data analysis and signal processing fields. The frame coefficients are usually redundant, which makes it possible to recover the original signal after some frame coefficients are lost. In this talk, we will discuss the feasibility and smoothness of recovering finite-dimensional signals from partially disordered and phaseless disordered frame coefficients. We propose self-located robust frames and real phase retrievable frames. Among them, the self-located robust frames ensure that the signal can be recovered smoothly from partial noisy disordered frame coefficients, and the real phase retrievable frames ensure that all real finite-dimensional signals can be recovered from partial phaseless disordered frame coefficients up to a sign. We will also give several characterizations and constructions of these two types of frames. |
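The redundancy that makes such recovery possible is simple to see in a small example (plain least-squares recovery after erasures, which illustrates only the redundancy; it does not handle the disordered or phaseless settings of the talk, and the frame and erasure pattern are invented):

```python
import numpy as np

# A frame of 6 vectors in R^3: the rows of F. Even after two of the six
# frame coefficients are lost, the remaining frame vectors still span
# R^3, so the signal is recoverable by least squares.
rng = np.random.default_rng(4)
F = rng.standard_normal((6, 3))          # rows are frame vectors
signal = np.array([1.0, -2.0, 0.5])
coeffs = F @ signal                      # frame coefficients
kept = [0, 2, 3, 5]                      # coefficients surviving erasure
recovered, *_ = np.linalg.lstsq(F[kept], coeffs[kept], rcond=None)
err = np.linalg.norm(recovered - signal)
```

The self-located robust frames of the talk address the harder situation in which it is not even known which coefficients survived (they are disordered), and the phase-retrievable variant drops the signs of the coefficients as well.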
Title: | Convergence theory of outcome weighted learning in precision medicine |
Speaker: | Prof. Daohong Xiang, School of Mathematical Sciences, Zhejiang Normal University |
Time/Place: | 14:30 - 15:00, FSC1217
Abstract: | Outcome-weighted learning (OWL) is one of the most popular algorithms for estimating optimal individualized treatment rules in precision medicine. This talk mainly studies the convergence theory of OWL for both bounded and unbounded clinical outcomes. Fast learning rates of OWL associated with the least squares loss, the exponential-hinge loss, and the r-norm SVM loss are derived explicitly.
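The OWL reduction itself is short to state in code (a stylized sketch, not the estimators analyzed in the talk: rule estimation becomes weighted classification with weights outcome/propensity, and a weighted linear fit stands in for the SVM-type losses; the data-generating model is invented):

```python
import numpy as np

# Randomized trial toy: one covariate x, treatment a in {-1, +1} with
# propensity 1/2, and reward that is higher when a matches sign(x),
# so the optimal rule is d(x) = sign(x).
rng = np.random.default_rng(5)
n = 2000
x = rng.uniform(-1, 1, n)
a = rng.choice([-1, 1], n)
reward = 1.0 + 0.8 * a * np.sign(x) + 0.1 * rng.standard_normal(n)
weights = np.maximum(reward, 0) / 0.5     # outcome / propensity, clipped at 0

# Weighted least-squares "classifier" f(x) = b0 + b1 x; rule = sign(f):
# patients with good outcomes pull the rule toward their own treatment.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.solve(X.T @ (weights[:, None] * X), X.T @ (weights * a))
rule = np.sign(X @ beta)
agreement = np.mean(rule == np.sign(x))   # match with the true rule
```

The talk's theory quantifies how fast such weighted-loss estimators approach the optimal rule, including when the reward (clinical outcome) is unbounded.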
The Department has a distinguished record in teaching and research. A number of faculty members have been recipients of relevant awards.
Dr S. Hon received the Early Career Award (21/22) from the Research Grants Council.