This year ICMNS has arranged optional pre-congress tutorials. The number of places is limited; registrations will be accepted on a first-come, first-served basis.
Date: June 23, 2019
Place: Auditorium 4, HC Ørsted Instituttet, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen
Mathieu Desroches (Sophia Antipolis - Méditerranée Centre, France)
Slow-fast dynamics in bursting neuron models
Zachary Kilpatrick (University of Colorado, Boulder, USA)
Part I: Chapman-Kolmogorov equations for models of decision-making and working memory.
Abstract: We will start by introducing one of the simplest, but most impactful, models in neuroscience and psychology — the drift-diffusion model (DDM). This model characterizes the instantaneous belief of an observer accumulating noisy evidence. Models of free-response decision-making tasks can be framed as first-passage-time problems, which can be analyzed using the corresponding backward Chapman-Kolmogorov (CK) equation.
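As a concrete illustration of this framing, the DDM and its first-passage statistics can be sketched in a few lines of Python. The parameter values (drift mu, noise sigma, thresholds at ±theta) are illustrative choices, not values from the tutorial:

```python
import numpy as np

# Minimal sketch of the drift-diffusion model (DDM): noisy evidence
# accumulates until it hits one of two decision thresholds (+theta / -theta).
# All parameter values here are illustrative, not from the tutorial.

rng = np.random.default_rng(0)

def ddm_trial(mu=0.5, sigma=1.0, theta=1.0, dt=1e-3, t_max=20.0):
    """Simulate one trial; return (choice, decision_time)."""
    x, t = 0.0, 0.0
    while t < t_max:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= theta:
            return +1, t   # choice favored by the drift
        if x <= -theta:
            return -1, t   # error
    return 0, t_max        # no decision within t_max

choices, times = zip(*(ddm_trial() for _ in range(1000)))
p_correct = np.mean(np.array(choices) == 1)
# For constant drift, p_correct should be close to the known expression
# 1 / (1 + exp(-2 * mu * theta / sigma**2)).
print(f"P(correct) ~ {p_correct:.3f}, mean RT ~ {np.mean(times):.3f}")
```

The Monte Carlo estimate above is exactly the kind of computation the backward CK equation replaces with a deterministic boundary-value problem.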
The DDM has recently been extended to account for dynamic environments within trials using nonlinear leak terms, and for the propagation of information across trials using history-dependent initial conditions. Dynamic environments often incorporate memoryless stochastic switching, which adds a dimension to the CK evolution equations to account for each drift direction. Nonetheless, the CK equations can be solved much more rapidly than performing Monte Carlo simulations to characterize the response statistics of the decision model. In the limits of long and short switching timescales, asymptotics can also be used to derive explicit approximate expressions for the distribution of the belief variable.
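To illustrate why solving the CK equation directly can beat Monte Carlo, here is a minimal finite-difference sketch that evolves the belief density under the forward CK (Fokker-Planck) equation with absorbing thresholds; the grid sizes and parameters are illustrative choices:

```python
import numpy as np

# Minimal finite-difference sketch (not the tutorial's code) of the forward
# CK / Fokker-Planck equation for the belief density p(x, t):
#   dp/dt = -mu * dp/dx + (sigma^2 / 2) * d^2 p / dx^2,
# with absorbing boundaries at the decision thresholds x = +/- theta.

mu, sigma, theta = 0.5, 1.0, 1.0
nx, dt = 201, 5e-5
x = np.linspace(-theta, theta, nx)
dx = x[1] - x[0]

p = np.zeros(nx)
p[nx // 2] = 1.0 / dx                     # belief starts at x = 0

for _ in range(int(0.5 / dt)):            # evolve to t = 0.5
    dpdx = np.gradient(p, dx)             # centered first derivative
    d2p = np.zeros_like(p)
    d2p[1:-1] = (p[2:] - 2.0 * p[1:-1] + p[:-2]) / dx**2
    p += dt * (-mu * dpdx + 0.5 * sigma**2 * d2p)
    p[0] = p[-1] = 0.0                    # absorb mass at the thresholds

survival = p.sum() * dx                   # P(no decision yet at t = 0.5)
print(f"P(undecided at t = 0.5) ~ {survival:.3f}")
```

A single sweep of this solver yields the full density (and hence all response statistics at once), whereas Monte Carlo requires many independent trials per condition.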
We will also discuss related models which introduce heterogeneities into a DDM corresponding to discrete attractors in working memory. The average velocity of the state variable can be estimated using a discrete Markov-chain approximation, which assumes the state dwells near attractors until transitioning at exponentially distributed times. This tutorial will focus on both motivating these stochastic models as simple descriptions of mammalian psychophysical behaviors and demonstrating useful analytical tools for approximating their statistics.
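The discrete Markov-chain picture described above can be sketched directly: the state dwells near an attractor for an exponentially distributed time before hopping to a neighbor. The rates and attractor spacing below are illustrative, not values from the tutorial:

```python
import numpy as np

# Sketch of the discrete Markov-chain approximation: the memory state dwells
# near one of a chain of attractors, hopping to a neighbour after an
# exponentially distributed dwell time. Rates and spacing are illustrative.

rng = np.random.default_rng(1)

def mean_velocity(r_plus=1.2, r_minus=0.8, spacing=0.1, t_total=5000.0):
    """Estimate the average velocity of the attractor-hopping state."""
    x = t = 0.0
    total_rate = r_plus + r_minus
    while t < t_total:
        t += rng.exponential(1.0 / total_rate)             # exponential dwell
        x += spacing if rng.random() < r_plus / total_rate else -spacing
    return x / t

v = mean_velocity()
# the long-time velocity approaches spacing * (r_plus - r_minus) = 0.04
print(f"estimated velocity ~ {v:.4f}")
```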
Part II: Asymptotic analysis of stochastic neural fields
Abstract: We discuss asymptotic methods for simultaneously analyzing the effects of noise, inhomogeneities, and long-range connections in neural field equations. Neural fields are nonlinear integrodifferential equations whose integral term describes the strength of connections between neurons in a large network. Our approach to studying their solutions is formal, in that we will construct solutions to homogeneous neural fields and then consider the effects of noise, inhomogeneities, and long-range connections perturbatively.
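A homogeneous neural field of the kind described above can be simulated in a few lines. This sketch uses an Amari-type model with a Heaviside firing rate; the Mexican-hat kernel, threshold, and grid are illustrative choices rather than the tutorial's specific equations:

```python
import numpy as np

# Formal sketch of a homogeneous 1D neural field (Amari-type) with a
# Heaviside firing rate:
#   du/dt = -u(x, t) + integral of w(x - y) * H(u(y, t) - kappa) dy.
# Kernel, threshold, and discretization below are illustrative choices.

L, n = 10.0, 401
x = np.linspace(-L, L, n)
dx = x[1] - x[0]

def w(d):
    """Local excitation with broader inhibition (Mexican hat)."""
    return np.exp(-d**2) - 0.5 * np.exp(-d**2 / 4)

W = w(x[:, None] - x[None, :]) * dx       # discretized integral operator
kappa, dt = 0.2, 0.1

u = np.exp(-x**2)                         # localized initial activity
for _ in range(400):                      # Euler steps to t = 40
    u += dt * (-u + W @ (u > kappa).astype(float))

width = dx * np.sum(u > kappa)            # width of the suprathreshold region
print(f"stationary bump width ~ {width:.2f}")
```

Starting from a localized initial condition, the activity relaxes to a stationary bump, the homogeneous solution around which the perturbative analysis proceeds.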
We are particularly interested in showing that more realistic network architectures can be utilized to improve network performance on working memory tasks. Persistent neural activity underlying short-term memory takes the form of a stationary bump solution to a neural field. A perturbative analysis yields an equation for the evolution of the bump’s centroid in the presence of network perturbations. For instance, introducing spatial heterogeneity into the network architecture establishes a multistable potential landscape, slowing the rate at which noise causes the bump to diffuse in space, as we will show using methods introduced in Part I. Ultimately, this improves the accuracy with which the network encodes memories.
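The effect can be caricatured with a Langevin sketch of the centroid alone. In a homogeneous network the centroid diffuses freely; heterogeneity adds an effective pinning force, here an assumed sinusoidal force standing in for the multistable potential landscape. All parameter values are illustrative:

```python
import numpy as np

# Schematic Langevin sketch of the bump centroid Delta(t):
#   homogeneous network:   d(Delta) = sqrt(2 D) dW        (free diffusion)
#   heterogeneous network: d(Delta) = -A sin(2 pi Delta) dt + sqrt(2 D) dW
# The sinusoidal pinning force is an assumed stand-in for the
# heterogeneity-induced potential; all parameter values are illustrative.

rng = np.random.default_rng(2)

def centroid_variance(A, D=0.05, dt=1e-2, t_end=50.0, trials=500):
    """Variance of the centroid position at t_end, over many realizations."""
    delta = np.zeros(trials)
    for _ in range(int(t_end / dt)):
        drift = -A * np.sin(2 * np.pi * delta)
        delta += drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal(trials)
    return delta.var()

var_free = centroid_variance(A=0.0)       # free diffusion: ~ 2 * D * t_end
var_pinned = centroid_variance(A=1.0)     # pinning suppresses the spread
print(f"free: {var_free:.2f}, pinned: {var_pinned:.2f}")
```

The pinned variance stays far below the free-diffusion value, mirroring the claim that heterogeneity slows the noise-induced wandering of the memory.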
If time permits, we will also introduce methods for analyzing networks with multiple layers, delays, negative feedback, and multiple interacting bumps, and show that similar approaches can be used to analyze wave propagation in stochastic neural fields.
A detailed program will be available later.