Title: | The importance to be discrete: optimized Schwarz domain decomposition methods for anisotropic elliptic equations |
Speaker: | Prof Laurence Halpern, Department of Mathematics, University Paris 13, France |
Time/Place: | 11:30 - 12:30 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | In the last two decades, optimized Schwarz methods have been introduced and analyzed for various equations. They are very well suited to anisotropic problems, but the values of the optimized coefficients may not be so good for strong anisotropies. We study here two new issues, linked to the size of the computational domain and to the discretization scheme. We show on two examples, the classical FV4 scheme and the DDFV scheme, how to treat these issues so as to optimize the transmission between the subdomains. |
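The Schwarz iteration underlying these methods can be illustrated on a 1-D model problem. The sketch below is the classical alternating Schwarz method with Dirichlet transmission conditions for -u'' = f; the optimized variants discussed in the talk replace these with Robin-type conditions whose coefficients are tuned (to the anisotropy and, as the abstract stresses, to the discretization and domain size). The subdomain split and function names here are illustrative, not taken from the talk.

```python
import numpy as np

def solve_dirichlet(a, b, ua, ub, f, n):
    # Solve -u'' = f on [a, b] with u(a)=ua, u(b)=ub by centered finite
    # differences on n interior points.
    h = (b - a) / (n + 1)
    x = np.linspace(a, b, n + 2)
    A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    rhs = f(x[1:-1]).astype(float)
    rhs[0] += ua / h**2      # fold boundary values into the right-hand side
    rhs[-1] += ub / h**2
    u = np.empty(n + 2)
    u[0], u[-1] = ua, ub
    u[1:-1] = np.linalg.solve(A, rhs)
    return x, u

def alternating_schwarz(f, n_iter=30, n=59):
    # Two overlapping subdomains [0, 0.6] and [0.4, 1]; each sweep solves on
    # one subdomain using the latest trace of the other as Dirichlet data.
    g1 = g2 = 0.0            # interface values at x = 0.6 and x = 0.4
    for _ in range(n_iter):
        x1, u1 = solve_dirichlet(0.0, 0.6, 0.0, g1, f, n)
        g2 = np.interp(0.4, x1, u1)
        x2, u2 = solve_dirichlet(0.4, 1.0, g2, 0.0, f, n)
        g1 = np.interp(0.6, x2, u2)
    return (x1, u1), (x2, u2)
```

For this 1-D Laplacian the iteration contracts the interface error by a fixed factor per sweep that depends only on the overlap; shrinking the overlap slows convergence, which is one motivation for the optimized transmission conditions.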
Title: | Forward and inverse eddy current problems |
Speaker: | Prof CHEN Junqing, Department of Mathematical Sciences, Tsinghua University, Beijing, China |
Time/Place: | 11:00 - 12:00 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | The complete mathematical model of electromagnetic fields is given by Maxwell's equations, but in many applications the eddy current approximation to Maxwell's equations is more tractable and widely used. In this talk, I will discuss forward and inverse eddy current problems. The forward problem is based on the A-phi model, with which one can couple the circuit variables to the field equations. We solve the problem with an adaptive edge element method and propose an optimal preconditioner for the discrete algebraic system. For the inverse problem, we study the eddy current field with small inclusions by asymptotic analysis. With the asymptotic result, we can simulate the forward problem with adaptive finite elements. Then we give detection and classification algorithms based on the leading-order term. Finally, I will show some numerical examples to illustrate the proposed algorithms. |
Title: | New Developments in Multiple Testing and Multivariate Testing for High-Dimensional Data |
Speaker: | Mr Zongliang HU |
Time/Place: | 10:30 - 12:30 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | This thesis aims to develop novel methods that advance multivariate testing and multiple testing for high-dimensional, small-sample-size data. |
Title: | An assembly and decomposition (AD) approach for constructing separable minorizing functions in a class of MM algorithms |
Speaker: | Prof Guo-Liang TIAN, Department of Mathematics, Southern University of Science and Technology, Shenzhen, China |
Time/Place: | 14:00 - 15:00 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | The minorization-maximization (MM) principle provides an important and useful tool for optimization problems and has a broad range of applications in statistics because of its conceptual simplicity, ease of implementation and numerical stability. A key step in developing an MM algorithm is to construct an appropriate minorizing function. This is quite challenging for many practitioners, as it has to be done case by case and its success often depends on a clever, case-specific use of Jensen's inequality or a similar one. To address this problem, in this paper we propose a new assembly and decomposition (AD) approach that constructs separable minorizing functions in a general class of MM algorithms. The AD approach constructs a minorizing function by employing two novel techniques which we refer to as the assembly technique (or A-technique) and the decomposition technique (or D-technique), respectively. The A-technique first introduces the notions of assemblies and complemental assemblies, consisting of several families of concave functions that have arisen in numerous applications. The D-technique then decomposes the high-dimensional objective function into a sum of one-dimensional functions to construct minorizing functions, as guided and facilitated by the A-technique. We demonstrate the utility of the proposed approach in diverse applications which result in novel algorithms with theoretical and numerical advantages. Extensive numerical studies are provided to assess its finite-sample performance. Further extensions of the AD techniques are also discussed. (This is joint work with Miss Xifen HUANG and Dr. Jinfeng XU) |
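The separable-surrogate idea at the heart of MM can be seen in a toy setting. The sketch below is not the AD approach of the talk; it is the standard quadratic-majorization construction for least squares, stated for minimization (the minorization-maximization case is the mirror image): the coupled quadratic is bounded by a separable surrogate that splits into one-dimensional problems, each solved in closed form.

```python
import numpy as np

def mm_least_squares(A, b, n_iter=500):
    # Minimize f(x) = ||Ax - b||^2 by majorizing the coupled quadratic with a
    # separable one:
    #   f(x) <= f(x_k) + g_k^T (x - x_k) + (L/2) ||x - x_k||^2,
    # where L = 2 * lambda_max(A^T A) bounds the Hessian 2 A^T A. The
    # surrogate is a sum of one-dimensional quadratics, so its minimizer is
    # available coordinatewise in closed form, giving the update below.
    L = 2 * np.linalg.eigvalsh(A.T @ A).max()
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = 2 * A.T @ (A @ x - b)   # gradient of f at the current iterate
        x = x - g / L               # minimizer of the separable surrogate
    return x
```

Each step decreases the surrogate and hence (by the majorization property) the objective, the monotonicity that gives MM algorithms their numerical stability.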
Title: | Nonconvex Optimization for Spectral Compressed Sensing |
Speaker: | Prof Ke WEI, School of Data Science, Fudan University, Shanghai, China |
Time/Place: | 10:30 - 11:30 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | Spectrally sparse data arise in many areas of science and engineering, for instance magnetic resonance imaging, fluorescence microscopy and radar imaging. Spectral compressed sensing is about reconstructing spectrally sparse data from incomplete information. Two different classes of nonconvex optimization algorithms are introduced to tackle this problem. They are developed by exploiting the low rank structure within the data in two different ways: one is based on the embedded manifold of low rank matrices, and the other on the factorization model of low rank matrices. Theoretical recovery guarantees will be presented for the proposed algorithms under certain random models, showing that the sampling complexity is essentially proportional to the intrinsic dimension of the problem rather than the ambient dimension. Empirical observations demonstrate the efficacy of the algorithms. |
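As a rough illustration of the second (factorization-based) idea, here is generic gradient descent on a low-rank factorization for matrix completion. This is only a simplified stand-in: the spectral compressed sensing problem of the talk works with structured (e.g. Hankel) matrices and comes with the recovery guarantees mentioned above, neither of which this toy reproduces. All names and parameters are illustrative.

```python
import numpy as np

def factored_completion(M_obs, mask, r, steps=2000, lr=0.01, seed=0):
    # Factorization model: seek X = U V^T with U, V of width r, minimizing
    # the squared error on the observed entries (mask == 1). The rank
    # constraint is enforced by the factorization itself, so plain gradient
    # descent on (U, V) suffices; small random initialization avoids the
    # saddle point at zero.
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    U = 0.1 * rng.standard_normal((m, r))
    V = 0.1 * rng.standard_normal((n, r))
    for _ in range(steps):
        R = mask * (U @ V.T - M_obs)                    # residual on observed entries
        U, V = U - lr * R @ V, V - lr * R.T @ U         # simultaneous gradient step
    return U @ V.T
```

The per-iteration cost scales with the factor width r rather than the full matrix rank budget, which is the practical appeal of factorization models over working directly with the ambient matrix.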
We organize conferences and workshops every year. We hope to see you at a future event.
Prof. M. Cheng, Dr. Y. S. Hon, Dr. K. F. Lam, Prof. L. Ling, Dr. T. Tong and Prof. L. Zhu have been awarded research grants by the Hong Kong Research Grants Council (RGC). Congratulations!