Title: | From Sparse Irregular Spikes to Critical Avalanches: Cost-Efficient Neural Dynamics |
Speaker: | Prof Changsong Zhou, Department of Physics, Hong Kong Baptist University, Hong Kong |
Time/Place: | 16:00 - 17:00, Zoom (Meeting ID: 939 0417 7353, Passcode: 594986) |
Abstract: |
The brain is highly energy-consuming and is therefore under strong selective pressure to achieve cost-efficiency in both cortical connectivity and activity. Cortical neural circuits display highly irregular spiking in individual neurons, but variably sized collective firing, oscillations, and critical avalanches at the population level, all of which are functionally important for information processing. However, cost-efficiency as a design principle for cortical activity has rarely been studied; in particular, it is not clear how cost-efficiency relates to the ubiquitously observed multi-scale properties of irregular firing, oscillations, and neuronal avalanches. In this talk, I will first review key features of the brain as a complex dynamical network system. I will then briefly introduce our work demonstrating that these prominent multilevel dynamical properties can be observed simultaneously in a generic, biologically plausible neural circuit model that captures excitation-inhibition (E-I) balance and realistic synaptic conductance dynamics. Their co-emergence achieves minimal energy cost as well as maximal energy efficiency of information capacity: neuronal firing is coordinated and shaped by moderate synchrony, which reduces otherwise redundant spikes, while dynamical clustering is maintained in the form of neuronal avalanches. We propose a semi-analytical mean-field theory to derive the field equations governing the macroscopic network dynamics. It reveals that the E-I balanced state of the network, in which individual spiking is irregular, is characterized by a macroscopically stable state that can be either a fixed point or a periodic motion, with the transition between the two predicted by a Hopf bifurcation in the macroscopic field. An analysis of the impact of network topology, from random to modular networks, shows that locally dense connectivity under E-I balanced dynamics is the key "less-is-more" solution for achieving cost-efficient organization in neural systems.
Reference:
Dongping Yang, Haijun Zhou and Changsong Zhou, "Co-emergence of Multi-scale Cortical Activities of Irregular Firing, Oscillations and Avalanches Achieves Cost-efficient Information Capacity", PLoS Computational Biology 13, e1005384 (2017).
Junhao Liang, Tianshou Zhou and Changsong Zhou, "Hopf Bifurcation in Mean Field Explains Critical Avalanches in E-I Balanced Neuronal Networks: A Mechanism for Multiscale Variability", Frontiers in Systems Neuroscience (in press).
Shenjun Wang, Junhao Liang and Changsong Zhou, "Less is More: Wiring-Economical Modular Networks Support Self-Sustained Firing-Economical Neural Avalanches for Efficient Processing", submitted; arXiv: 2007.02511 (2020). |
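As background for the mean-field result mentioned in the abstract, a minimal Wilson-Cowan-type sketch may help; it is an illustration only, not the specific field equations derived in the papers above, and the rates r_E, r_I, the gain f, the couplings w and the drives I_E, I_I are generic placeholders:
\tau_E \, \dot r_E = -r_E + f\!\left(w_{EE} r_E - w_{EI} r_I + I_E\right), \qquad \tau_I \, \dot r_I = -r_I + f\!\left(w_{IE} r_E - w_{II} r_I + I_I\right).
Linearizing about a fixed point (r_E^{*}, r_I^{*}) gives a 2x2 Jacobian J; the fixed point loses stability to a limit cycle, i.e. a macroscopic oscillation, when \mathrm{tr}\, J crosses zero while \det J > 0. This fixed-point-to-periodic-motion transition is the same type of Hopf scenario referred to in the abstract.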
Title: | Turnpike control and deep learning |
Speaker: | Prof Enrique Zuazua, FAU Erlangen-Nürnberg, Germany |
Time/Place: | 16:00 - 17:00, Zoom (Meeting ID: 947 0888 1860) |
Abstract: | The turnpike principle asserts that, over long time horizons, optimal control strategies are nearly of a steady-state nature. In this lecture we shall survey some recent results on this topic and present some of its consequences for deep supervised learning. The lecture is based in particular on recent joint work with C. Esteve, B. Geshkovski and D. Pighin. |
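For orientation, the turnpike property can be illustrated on a standard linear-quadratic problem; this is a generic textbook sketch, not the specific setting of the lecture, and A, B, z, C, \mu are placeholders. Consider
\min_{u} \int_0^T \left( \|x(t) - z\|^2 + \|u(t)\|^2 \right) dt, \qquad \dot x = Ax + Bu, \quad x(0) = x_0,
together with its steady-state counterpart
\min_{(\bar x, \bar u)} \ \|\bar x - z\|^2 + \|\bar u\|^2 \quad \text{subject to} \quad A\bar x + B\bar u = 0.
Under suitable controllability and observability conditions, the exponential turnpike estimate
\|x_T(t) - \bar x\| + \|u_T(t) - \bar u\| \le C \left( e^{-\mu t} + e^{-\mu (T - t)} \right), \qquad t \in [0, T],
holds with C, \mu > 0 independent of T: except near the initial and final times, the optimal trajectory and control stay close to the steady optimum. One reading of the connection to deep supervised learning, in the continuous-depth view of residual networks, is that the layer index plays the role of time, so very deep trained networks are expected to behave almost stationarily across most of their depth.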
Title: | Euclidean Distance Matrix Optimization: Models, Algorithms and Applications |
Speaker: | Prof Hou-Duo Qi, CORMSIS (Centre for Operational Research, Management Science and Information Systems), University of Southampton, UK |
Time/Place: | 16:00 - 17:00, Zoom (Meeting ID: 940 9975 0728) |
Abstract: | Euclidean Distance Matrix (EDM) optimization has become a robust approach for analyzing dissimilarity data, owing to its capacity to handle hard constraints, the availability of fast algorithms, and guaranteed error bounds. It has found applications in machine learning (dimensionality reduction), engineering (sensor network localization), social sciences (multidimensional scaling), and computational chemistry (molecular conformation). This talk will give a brief review of the approach, showcasing its mathematical theory, algorithmic development, and performance on some challenging examples. We will also discuss an emerging and exciting application in computational finance, on portfolio selection. We end the talk with an open question on software implementation. |
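As a concrete and deliberately simple illustration of the dimensionality-reduction use case mentioned above, the sketch below implements classical multidimensional scaling from a Euclidean distance matrix in Python/NumPy. It shows only the textbook double-centering and eigendecomposition step, not the EDM optimization models and algorithms of the talk, and all names in it are illustrative.

import numpy as np

def classical_mds(D, dim=2):
    # D: n x n matrix of pairwise Euclidean distances; returns n x dim coordinates.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]            # keep the dim largest eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Usage: recover planar coordinates (up to rotation and translation) from distances.
X = np.random.default_rng(0).normal(size=(10, 2))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Y = classical_mds(D, dim=2)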
We organize conferences and workshops every year. We hope to see you at a future event.
Prof. M. Cheng, Dr. Y. S. Hon, Dr. K. F. Lam, Prof. L. Ling, Dr. T. Tong and Prof. L. Zhu have been awarded research grants by the Hong Kong Research Grants Council (RGC). Congratulations!