
Event(s) on December 2013

  • Monday, 2nd December, 2013

    Title: Fast Algorithms for High Frequency Waves and Related Inverse Problems
    Speaker: Prof. Jianliang QIAN, Department of Mathematics, Michigan State University, USA
    Time/Place: 11:00  -  12:00
    FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
    Abstract: Computational wave propagation has emerged as a fundamental, vigorously growing technology for modeling, design, and development in areas ranging from seismic and medical imaging to remote sensing and nanotechnology. I will give an overview of some newly developed fast algorithms for high frequency wave propagation, such as fast sweeping methods for eikonal equations, Gaussian beam methods for the Schrödinger equation based on fast wavepacket transforms, and multiscale Gaussian beam methods for wave equations based on fast multiscale wavepacket transforms. To demonstrate the power of these fast algorithms, I will outline two applications in inverse problems: traveltime tomography and photoacoustic tomography.
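
    The fast sweeping idea mentioned in the abstract can be illustrated compactly. The sketch below is an editorial illustration, not code from the talk: it solves the eikonal equation |∇u| = 1/speed on a 2-D grid with the Godunov upwind update and Gauss-Seidel sweeps in alternating orderings.

```python
import numpy as np

def fast_sweep_eikonal(speed, h, source, n_sweeps=4):
    """Solve |grad u| = 1/speed on a 2-D grid by Gauss-Seidel sweeps
    with the Godunov upwind discretization (fast sweeping method)."""
    n, m = speed.shape
    u = np.full((n, m), np.inf)
    u[source] = 0.0
    orderings = [
        (range(n), range(m)),
        (range(n - 1, -1, -1), range(m)),
        (range(n), range(m - 1, -1, -1)),
        (range(n - 1, -1, -1), range(m - 1, -1, -1)),
    ]
    for _ in range(n_sweeps):
        for rows, cols in orderings:
            for i in rows:
                for j in cols:
                    if (i, j) == source:
                        continue
                    # upwind neighbour values in each coordinate direction
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < n - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < m - 1 else np.inf)
                    f = h / speed[i, j]          # local slowness times h
                    if abs(a - b) >= f:          # one-sided update
                        cand = min(a, b) + f
                    else:                        # two-sided quadratic update
                        cand = 0.5 * (a + b + np.sqrt(2 * f * f - (a - b) ** 2))
                    u[i, j] = min(u[i, j], cand)
    return u
```

For a constant speed the result approximates the Euclidean distance from the source; values along the grid axes are exact, while diagonals carry the usual first-order overestimate.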

  • Tuesday, 3rd December, 2013

    Title: Regularized Approximate Cloaking of Acoustic and Electromagnetic Waves
    Speaker: Dr. Liu Hongyu, Department of Mathematics and Statistics, University of North Carolina, USA
    Time/Place: 11:30  -  12:30
    FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
    Abstract: In this talk, I will describe the recent theoretical and computational progress of our work on regularized transformation-optics cloaking. An ideal cloak makes use of singular metamaterials, posing great challenges for both theoretical analysis and practical fabrication. Regularizations are incorporated into the construction to avoid the singular structures.

  • Wednesday, 4th December, 2013

    Title: GPA: A statistical approach to prioritizing GWAS results by integrating pleiotropy information and annotation data
    Speaker: Dr. Can YANG, Yale Center for Statistical Genomics and Proteomics, Yale School of Medicine, USA
    Time/Place: 11:30  -  12:30
    FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
    Abstract: Genome-wide association studies (GWAS) suggest that a complex disease is typically affected by many genetic variants with small or moderate effects. Identification of these risk variants remains a very challenging problem. Traditional approaches focusing on a single GWAS dataset ignore relevant information revealed by the Big Data in genomics: (1) accumulating evidence suggests that different complex diseases are genetically correlated, i.e., multiple diseases share common genetic risk bases, a phenomenon known as "pleiotropy"; (2) SNPs are not equally important, and functionally annotated genetic variants have revealed a consistent pattern of enrichment. Thus, we proposed a novel statistical approach, named GPA, to perform integrative analysis of multiple GWAS datasets and functional annotation. Hypothesis testing procedures were developed to facilitate statistical inference of pleiotropy and enrichment of functional annotation. A computationally efficient EM algorithm was also developed to handle millions of SNP markers. We applied our approach to perform a systematic analysis of psychiatric disorders. Not only did GPA identify many weak signals, but it also revealed interesting genetic architectures of these disorders. The pleiotropy effect was very strong between bipolar disorder (BPD) and schizophrenia (SCZ). The SNPs in central nervous system genes were highly enriched for both BPD and SCZ. These results deepened our understanding of the genetic etiology of psychiatric disorders. In summary, GPA can serve as an effective tool for integrative data analysis in the era of Big Data in genomics.
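
    The two-group flavor of such models can be sketched with a toy EM algorithm. The code below is a hypothetical simplification, not the actual GPA model (which handles multiple GWAS datasets and annotation data jointly): p-values are modeled as a mixture of U(0,1) nulls and Beta(alpha,1) signals, a common textbook device for GWAS enrichment.

```python
import numpy as np

def em_pvalue_mixture(p, n_iter=200, tol=1e-8):
    """EM for a two-group model of GWAS p-values:
    null SNPs ~ U(0,1), non-null SNPs ~ Beta(alpha, 1) with alpha < 1."""
    p = np.clip(np.asarray(p, float), 1e-12, 1 - 1e-12)
    pi1, alpha = 0.1, 0.5                        # initial guesses
    for _ in range(n_iter):
        f1 = alpha * p ** (alpha - 1.0)          # Beta(alpha, 1) density
        gamma = pi1 * f1 / (pi1 * f1 + (1 - pi1))  # E-step: P(non-null | p)
        pi1_new = gamma.mean()                   # M-step: mixing proportion
        # weighted MLE of alpha for Beta(alpha, 1)
        alpha_new = -gamma.sum() / (gamma * np.log(p)).sum()
        converged = (abs(pi1_new - pi1) < tol and abs(alpha_new - alpha) < tol)
        pi1, alpha = pi1_new, alpha_new
        if converged:
            break
    lfdr = 1.0 - gamma                           # local false discovery rate
    return pi1, alpha, lfdr
```

On simulated data with 20% signals the estimated mixing proportion and Beta parameter land near their true values, and `lfdr` gives a per-SNP prioritization score.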

  • Tuesday, 10th December, 2013

    Title: Simulation of Compressible Plasma Flow in Circuit Breakers
    Speaker: Prof. JELTSCH Rolf, Seminar for Applied Mathematics, ETH Zurich, Switzerland
    Time/Place: 11:00  -  12:00
    FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
    Abstract: The main function of a circuit breaker is to switch off the electric current safely in the case of a fault current. A mechanical force separates the contacts, and an arc starts to burn between the two contacts. This plasma is described by the resistive magnetohydrodynamics (MHD) equations. The emphasis is on very high currents (10kA-200kA) and relatively high conductivity. Radiation is incorporated by adding a Stefan radiation term. To simulate the plasma in the arc, the Nektar code developed at Brown University is adapted and extended. It is based on discontinuous Galerkin (DG) methods allowing for triangular or quadrilateral meshes in 2D and hexahedral or tetrahedral meshes in 3D. GID is used for mesh generation. The code is extended to include Runge-Kutta time stepping, various accurate Riemann solvers for MHD, slope limiters, and $SF_6$ gas data. It operates on both serial and parallel computers with an arbitrary number of processors. The suitability of these Runge-Kutta discontinuous Galerkin (RKDG) methods is analysed. In particular, different numerical fluxes, different Riemann solvers and limiters, and low- and high-order approximations on smooth and non-smooth solutions are investigated. Numerical results are given. This work has been performed by Patrick Huguenot and Harish Kumar in their Ph.D. theses and by Vincent Wheatley.
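
    As a toy illustration of two building blocks named in the abstract (Runge-Kutta time stepping combined with an approximate Riemann-solver flux), the sketch below applies a Rusanov flux and SSP-RK2 to the scalar Burgers equation rather than the full MHD system; it is an editorial example, unrelated to the Nektar code.

```python
import numpy as np

def rusanov_flux(ul, ur):
    """Local Lax-Friedrichs (Rusanov) numerical flux for Burgers,
    f(u) = u^2/2, using the maximal local wave speed."""
    a = np.maximum(np.abs(ul), np.abs(ur))
    return 0.5 * (0.5 * ul**2 + 0.5 * ur**2) - 0.5 * a * (ur - ul)

def step_rk2(u, dx, dt):
    """One SSP-RK2 step of a first-order finite-volume scheme with
    periodic boundary conditions (a simplified stand-in for RKDG)."""
    def rhs(v):
        vm = np.roll(v, 1)                     # left neighbour of each cell
        flux = rusanov_flux(vm, v)             # flux at each left interface
        return -(np.roll(flux, -1) - flux) / dx
    u1 = u + dt * rhs(u)                       # forward Euler stage
    return 0.5 * (u + u1 + dt * rhs(u1))       # SSP average
```

Because the scheme is conservative, the cell average of the solution is preserved exactly (up to rounding), even after a shock forms.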

  • Wednesday, 18th December, 2013

    Title: Two-Stage Stochastic Optimization with Risk Measures and Risk Aversion
    Speaker: Prof. SUN Jie, Department of Decision Sciences, National University of Singapore, Singapore
    Time/Place: 11:00  -  12:00
    SCT909, Cha Chi-ming Science Tower, HSH Campus, Hong Kong Baptist University
    Abstract: Traditionally, a two-stage stochastic linear optimization problem is solved by using a risk-neutral approach, in which the mean of the second-stage optimal value is used to make a here-and-now decision. This approach can be replaced by a more realistic one that uses a coherent risk measure of the second-stage optimal value. This change has two major consequences. First, it provides flexibility in dealing with the second-stage problem; for example, it may accommodate risk aversion in the decision making. Second, it makes the second-stage optimization problem take the form of a worst-case optimization, and therefore reduces the requirement for knowledge of the distribution of the random variables involved. The impact on computation is also remarkable. It may allow one to avoid the so-called curse of dimensionality under reasonable assumptions, and thus to keep the problem computationally tractable.
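
    The effect of replacing the mean by a coherent risk measure can be seen in a toy two-stage problem. The sketch below is a hypothetical newsvendor example with invented prices, not from the talk: it compares the risk-neutral first-stage order with the one minimizing sample CVaR, computed via the Rockafellar-Uryasev representation.

```python
import numpy as np

def second_stage_loss(x, d, price=1.5, cost=1.0):
    """Newsvendor: buy x units at `cost`, sell min(x, d) at `price`.
    Returns the loss (negative profit) for each demand scenario d."""
    return cost * x - price * np.minimum(x, d)

def cvar(losses, alpha=0.9):
    """Sample CVaR via min_t t + E[(L - t)+]/(1 - alpha),
    whose minimizer t is the alpha-quantile (VaR) of the losses."""
    t = np.quantile(losses, alpha)
    return t + np.mean(np.maximum(losses - t, 0.0)) / (1 - alpha)

rng = np.random.default_rng(0)
demand = rng.uniform(50, 150, size=10_000)     # second-stage scenarios
orders = np.arange(0, 201)                     # candidate first-stage decisions
mean_obj = [second_stage_loss(x, demand).mean() for x in orders]
cvar_obj = [cvar(second_stage_loss(x, demand)) for x in orders]
x_neutral = orders[np.argmin(mean_obj)]        # risk-neutral order
x_averse = orders[np.argmin(cvar_obj)]         # CVaR-minimizing order
```

With these (invented) numbers the risk-neutral solution sits near the classical critical-fractile order of about 83 units, while the CVaR-averse decision orders noticeably less, hedging against low-demand scenarios.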

  • Wednesday, 18th December, 2013

    Title: Understanding Quantitative Risk and Uncertainty in Finance
    Speaker: Prof. Shige Peng, Shandong University, China
    Time/Place: 16:30  -  17:30 (Preceded by Reception at 4:00pm)
    RRS905, Sir Run Run Shaw Building, HSH Campus, Hong Kong Baptist University
    Abstract: For more than 100 years scientists have struggled to understand how to quantitatively measure and regulate financial risks. In this lecture we present some milestones in the research of this crucially important subject, contributed by Bachelier, Markowitz, Black and Scholes, among others. We also discuss some typical methods such as value at risk (VaR), expected shortfall, coherent risk measures, backward SDEs and nonlinear expectations.
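
    Of the methods listed, value at risk and expected shortfall are the easiest to demonstrate concretely. The historical-simulation sketch below is an editorial illustration of the standard definitions, not material from the lecture:

```python
import numpy as np

def var_es(returns, alpha=0.95):
    """Historical value-at-risk and expected shortfall of a return sample.
    Losses are the negated returns; VaR is the alpha-quantile of the
    losses, and ES is the mean loss at or beyond VaR."""
    losses = -np.asarray(returns, float)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es
```

ES is always at least as large as VaR, and unlike VaR it is a coherent risk measure, which is one motivation for the coherent framework discussed in the lecture.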

  • Friday, 27th December, 2013

    Title: On Estimation Efficiency in Dimension Reduction
    Speaker: Prof. MA Yanyuan, Department of Statistics, Texas A&M University, USA
    Time/Place: 11:30  -  12:30
    FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
    Abstract: We investigate the estimation efficiency of the central (mean) subspace in the framework of sufficient dimension reduction. We derive the semiparametric efficient score and study its practical applicability. Despite the difficulty caused by the potentially high-dimensional variance component, we show that efficient and/or locally efficient estimators can be constructed in practice. We conduct simulation studies and a real-data analysis to demonstrate the finite sample performance and efficiency gain of the proposed estimators in comparison with several existing methods. This is joint work with Liping Zhu.
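
    A classical starting point for estimating the central subspace, though not the efficient estimator of the talk, is sliced inverse regression (SIR). The sketch below shows the standard recipe: standardize the predictors, slice on the response, and take top eigenvectors of the weighted slice-mean covariance.

```python
import numpy as np

def sir(X, y, n_slices=10, d=1):
    """Sliced inverse regression: a classical estimator of directions
    spanning the central subspace in sufficient dimension reduction."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt                          # standardized predictors
    M = np.zeros((p, p))
    for sl in np.array_split(np.argsort(y), n_slices):
        m = Z[sl].mean(axis=0)                 # slice mean of Z
        M += (len(sl) / n) * np.outer(m, m)    # weighted outer product
    _, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, -d:]                   # top directions, original scale
    return B / np.linalg.norm(B, axis=0)
```

On simulated single-index data the leading SIR direction aligns closely with the true index vector.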

  • Monday, 30th December, 2013

    Title: Statistical Inference for Single-Index Panel Data
    Speaker: Dr. ZHU Liping, School of Statistics and Management, Shanghai University of Finance and Economics, China
    Time/Place: 11:30  -  12:30
    FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University
    Abstract: We study estimation and hypothesis testing in single-index panel data models with individual effects. Through regressing the individual effects on the covariates linearly, we convert the estimation problem in single-index panel data models to that in partially linear single-index models. The conversion is valid regardless of the individual effects being random or fixed. We propose an estimating equation approach, which has a desirable double robustness property. We show that our method is applicable in single-index panel data models with heterogeneous link functions. We further design a chi-square test to evaluate whether the individual effects are purely random or not. We conduct simulations to demonstrate the finite sample performance of the method and conduct a data analysis to illustrate its usefulness.


