Learning noise-induced transitions by multi-scaling reservoir computing

Zequn Lin, Zhaofan Lu, Zengru Di, Ying Tang
Nonlinear Sciences, Adaptation and Self-Organizing Systems (nlin.AO), Machine Learning (cs.LG)
2023-09-10 16:00:00
Noise is usually regarded as adversarial to extracting the effective dynamics from time series, so conventional data-driven approaches aim to learn the dynamics while mitigating the effect of noise. However, noise can play a functional role, driving transitions between stable states in many natural and engineered stochastic systems. To capture such stochastic transitions from data, we show that reservoir computing, a type of recurrent neural network, can learn noise-induced transitions. We develop a concise training protocol for tuning hyperparameters, focusing on a pivotal hyperparameter that controls the time scale of the reservoir dynamics. The trained model generates accurate statistics of the transition time and the number of transitions. The approach applies to a wide class of systems, including a bistable system under a double-well potential driven by either white or colored noise. It also captures the asymmetry of the double-well potential, the rotational dynamics caused by non-detailed balance, and transitions in multi-stable systems. For experimental data of protein folding, it learns the transition time between folded states, offering the possibility of predicting transition statistics from a small dataset. The results demonstrate the capability of machine-learning methods in capturing noise-induced phenomena.
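The setup described in the abstract can be illustrated with a minimal sketch: simulate a bistable double-well system driven by white noise, train a leaky echo state network (a common reservoir-computing variant) with a leak rate `alpha` playing the role of the time-scale hyperparameter, and run it in closed loop. All parameter values (reservoir size, leak rate, spectral radius, noise strength) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the bistable double-well SDE dx = (x - x^3) dt + sigma dW
# (illustrative parameters, not taken from the paper)
dt, sigma, n_steps = 0.01, 0.5, 20000
x = np.empty(n_steps)
x[0] = 1.0
for t in range(n_steps - 1):
    x[t + 1] = x[t] + (x[t] - x[t] ** 3) * dt + sigma * np.sqrt(dt) * rng.normal()

# Leaky echo state network; alpha is the leak rate that sets the
# time scale of the reservoir dynamics (the pivotal hyperparameter)
n_res, alpha, rho, ridge = 300, 0.1, 0.9, 1e-6
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

def step(r, u):
    """One leaky-integration update of the reservoir state."""
    return (1 - alpha) * r + alpha * np.tanh(W @ r + W_in * u)

# Teacher forcing: drive the reservoir with the observed time series
r = np.zeros(n_res)
states = np.empty((n_steps - 1, n_res))
for t in range(n_steps - 1):
    r = step(r, x[t])
    states[t] = r

# Ridge-regression readout mapping the state at time t to x[t+1]
washout = 200
S, Y = states[washout:], x[washout + 1:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y)
rmse = np.sqrt(np.mean((S @ W_out - Y) ** 2))  # one-step training error

# Closed loop: feed predictions back as input to generate a new trajectory
r, u = states[-1].copy(), x[-1]
gen = np.empty(5000)
for t in range(5000):
    r = step(r, u)
    u = W_out @ r
    gen[t] = u
```

Tuning `alpha` against the observed transition statistics (mean transition time, number of transitions) is the kind of hyperparameter selection the abstract refers to; this sketch only shows the training and closed-loop machinery around it.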
PDF: Learning noise-induced transitions by multi-scaling reservoir computing.pdf