MBI Videos

CTW: Recent Advances in Statistical Inference for Mathematical Biology

  • Dennis Prangle, Mathematics & Statistics, Lancaster University
    Presentation: http://mbi.osu.edu/2011/rasmaterials/MBI_DennisPrangle.pdf

    Approximate Bayesian computation (ABC) is a powerful method for inference in statistical models with intractable likelihoods. Recently there has been much interest in using ABC for model choice, and concerns have been raised that the results are not robust to the choice of summary statistics. We propose a method of choosing useful summary statistics and apply it to population genetic and epidemiological examples.
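The rejection flavour of ABC that underlies this line of work can be sketched in a few lines. The toy normal model, flat prior, sample-mean summary, and tolerance below are illustrative assumptions, not the speaker's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: data are N(theta, 1); the summary statistic is the sample mean.
observed = rng.normal(2.0, 1.0, size=100)
s_obs = observed.mean()

def abc_rejection(n_draws, eps):
    """Keep prior draws whose simulated summary lies within eps of s_obs."""
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-10, 10)             # draw from a flat prior
        sim = rng.normal(theta, 1.0, size=100)   # simulate from the model
        if abs(sim.mean() - s_obs) < eps:        # compare summary statistics
            accepted.append(theta)
    return np.array(accepted)

post = abc_rejection(20000, eps=0.1)
print(len(post), post.mean())  # approximate posterior sample, centred near 2
```

Shrinking `eps` trades acceptance rate for accuracy, and the whole scheme is only as good as the chosen summary statistic — which is precisely the issue the talk addresses.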
  • Colin Gillespie
    In this talk I will analyse the effects of various treatments on cotton aphids (Aphis gossypii). The standard analysis of count data on cotton aphids determines parameter values by assuming a deterministic growth model and combines these with the corresponding stochastic model to make predictions of population sizes, depending on treatment. Here, we use an integrated stochastic model to capture the intrinsic stochasticity of both the observed aphid counts and the unobserved cumulative population size for all treatment combinations simultaneously.

    Unlike previous approaches, this allows us to explicitly explore and more accurately assess treatment interactions. Markov chain Monte Carlo methods and the moment closure technique are used within a Bayesian framework to integrate over the uncertainty associated with the unobserved cumulative population size and to estimate the resulting twenty-eight parameters. We restrict attention to data on aphid counts in the Texas High Plains obtained for three different levels of irrigation water, nitrogen fertiliser and block, but note that the methods we develop can be applied to a wide range of problems in population ecology.
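As a rough illustration of the moment closure technique mentioned above, the sketch below closes the mean and variance equations of a hypothetical logistic birth-death process under a normal closure; the rates and parameter values are invented for illustration and are not those of the aphid model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical logistic birth-death process: birth rate lam*N, death rate (mu/K)*N^2.
lam, mu, K = 1.0, 1.0, 100.0

def moment_odes(t, y):
    """Mean m and variance v under a normal moment closure, which closes the
    moment hierarchy by setting E[N^3] = m^3 + 3*m*v (zero third central moment)."""
    m, v = y
    dm = lam * m - (mu / K) * (v + m ** 2)
    dv = (lam * m + (mu / K) * (v + m ** 2)
          + 2.0 * lam * v - 4.0 * (mu / K) * m * v)
    return [dm, dv]

sol = solve_ivp(moment_odes, (0.0, 20.0), [5.0, 0.0])
m_end, v_end = sol.y[:, -1]
print(m_end, v_end)  # settles near the carrying capacity, with non-trivial variance
```

The payoff is that two coupled ODEs stand in for the full (intractable) jump-process distribution, which is what makes likelihood evaluation inside MCMC feasible.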
  • Joe Tien
    Bursting is a ubiquitous phenomenon in neuroscience which involves multiple time scales (fast spikes vs. long quiescent intervals). Parameter estimation for bursting models is difficult due to these multiple scales. I will describe an approach to parameter estimation for these models which utilizes the geometry underlying bursting. This is joint work with John Guckenheimer.
  • Douglas Bates, Department of Statistics, University of Wisconsin - Madison
    Presentation (slides version): http://mbi.osu.edu/2011/rasmaterials/ProfilingD.pdf
    Presentation (notes version): http://mbi.osu.edu/2011/rasmaterials/ProfilingN.pdf

    The use of Markov chain Monte Carlo methods for Bayesian inference has increased awareness of the need to examine the entire posterior distribution of the parameters (in the Bayesian sense) or, for those who prefer non-Bayesian techniques, the distribution of the parameter estimator. I will concentrate on non-Bayesian inference, although the techniques can also be applied to the posterior density in Bayesian methods. For many statistical models, including linear and generalized linear mixed-effects models, parameter estimates are defined as the optimizer of an objective function; e.g., the MLEs maximize the log-likelihood. Inference is then based upon the location of the optimizer and a local approximation at the optimizer, without assessing the validity of that approximation. This made sense when fitting a single model could involve many days of waiting for answers from shared computer systems. It doesn't make sense when models can be fit in a few seconds. By repeatedly fitting a model with a particular parameter held fixed, we can build up a profile of the objective with respect to that parameter and use this information to produce profile-based confidence intervals. But perhaps the most important aspect of the technique is the graphical presentation of the results, which forces us to consider the behavior of the estimator beyond the estimate and can cast doubt on many of the principles of inference and simulation that we hold dear.
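A minimal sketch of the profiling idea — refit with one parameter held fixed and compare the profiled objective to its minimum on the signed square-root scale — using a hypothetical exponential-decay regression (the model, data, and helper names are illustrative, not from the talk's mixed-model software):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(1)

# Hypothetical model: y = a * exp(-b * t) + noise; we profile the decay rate b.
t = np.linspace(0.0, 4.0, 40)
y = 3.0 * np.exp(-1.2 * t) + rng.normal(0.0, 0.1, t.size)
n = t.size

def sse(params):
    a, b = params
    return np.sum((y - a * np.exp(-b * t)) ** 2)

full = minimize(sse, [1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = full.x

def profile_sse(b):
    """Refit with b held fixed: the inner optimum over a is linear least squares."""
    x = np.exp(-b * t)
    a = (x @ y) / (x @ x)
    return np.sum((y - a * x) ** 2)

def zeta(b):
    """Signed square root of the profiled likelihood-ratio statistic."""
    dev = n * np.log(profile_sse(b) / full.fun)
    return np.sign(b - b_hat) * np.sqrt(max(dev, 0.0))

# 95% profile interval: the b values whose |zeta| stays below the chi-square cutoff.
cutoff = np.sqrt(chi2.ppf(0.95, df=1))
grid = np.linspace(b_hat - 0.5, b_hat + 0.5, 400)
inside = [b for b in grid if abs(zeta(b)) < cutoff]
print(b_hat, min(inside), max(inside))
```

Plotting `zeta` against `b` is the graphical check the talk emphasizes: a straight line means the local (Wald) approximation is trustworthy; curvature or asymmetry means it is not.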
  • Ed Ionides, Statistics, University of Michigan
    Presentation: http://mbi.osu.edu/2011/rasmaterials/mbi12_ionides.pdf

    Characteristic features of biological dynamic systems include stochasticity, nonlinearity, measurement error, unobserved variables, unknown system parameters, and even unknown system mechanisms. I will consider the resulting inferential challenges, with particular reference to pathogen/host systems (i.e., disease transmission). I will focus on statistical inference methodology which is based on simulations from a numerical model; such methodology is said to have the plug-and-play property. Plug-and-play methodology frees the modeler from an obligation to work with models for which transition probabilities are analytically tractable. A recent advance in plug-and-play likelihood-based inference for general partially observed Markov process models has been provided by the iterated filtering algorithm. I will discuss the theory and practice of iterated filtering.
  • Nicolas Brunel
    Parameter inference of ordinary differential equations from noisy data can be seen as a nonlinear regression problem within a parametric setting. The use of a classical statistical method such as Nonlinear Least Squares (NLS) gives rise to difficult and heavy optimization problems, owing to the corresponding badly posed inverse problem. Gradient matching algorithms use a smooth (nonparametric) estimate of the solution, from which a nonparametric estimate of the derivative is derived, giving rise to a natural criterion that is easier to optimize than NLS. We introduce here a new class of criteria based on a weak formulation of the ODE. The derived estimator can be viewed as a generalized moment estimator, which possesses nice statistical and computational properties. Finally, we consider several examples that illustrate the efficiency and versatility of the proposed method.
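Classical gradient matching, the baseline for the weak-formulation criteria above, can be sketched for a hypothetical logistic ODE (the smoothing level, data, and model are illustrative). The weak formulation would instead integrate the ODE against test functions, avoiding explicit differentiation of the smooth:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

# Hypothetical ODE: logistic growth x' = theta * x * (1 - x / 10), true theta = 1.5.
t = np.linspace(0.0, 5.0, 60)
x_true = 10.0 / (1.0 + (10.0 / 0.5 - 1.0) * np.exp(-1.5 * t))
y = x_true + rng.normal(0.0, 0.1, t.size)

# Step 1: smooth (nonparametric) estimate of the solution and its derivative.
spl = UnivariateSpline(t, y, s=t.size * 0.1 ** 2)
x_hat, dx_hat = spl(t), spl.derivative()(t)

# Step 2: match the estimated derivative to the vector field -- no ODE solver needed.
def criterion(theta):
    return np.sum((dx_hat - theta * x_hat * (1.0 - x_hat / 10.0)) ** 2)

res = minimize_scalar(criterion, bounds=(0.1, 5.0), method="bounded")
print(res.x)  # estimate of theta
```

Because the criterion never solves the ODE, each evaluation is cheap, which is the computational advantage over NLS that the abstract alludes to.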
  • Barbel Finkenstadt
    A central challenge in computational modeling of dynamic biological systems is parameter inference from experimental time course measurements. Here we present an overview of modeling approaches based on stochastic population dynamic models and their approximations. For an application on the mesoscopic scale, we present a two-dimensional continuous-time Bayesian hierarchical diffusion model which has the potential to address the different sources of variability that are relevant to the stochastic modelling of transcriptional and translational processes at the molecular level, namely: intrinsic noise, due to the stochastic nature of the birth and death processes involved in chemical reactions; extrinsic noise, arising from the cell-to-cell variation of kinetic parameters associated with these processes; and noise associated with the measurement process. Inference is complicated by the fact that only the protein, and rarely other molecular species, is observed, which typically entails problems of parameter identifiability in dynamical systems.

    For an application on the macroscopic scale, we introduce a mechanistic 'switch' model for encoding a continuous transcriptional profile of genes over time, with the aim of identifying the timing properties of mRNA synthesis, which is assumed to switch between periods of transcriptional activity and inactivity, each switch leading to a transition towards a new steady state, while mRNA degradation is an ongoing linear process. The model is rich enough to capture a wide variety of expression behaviours, including periodic genes. Finally, I will also give a brief introduction to some recent work on inferring the periodicity of the expression of circadian and other oscillating genes.

    Joint work with: Maria Costa, Dan Woodcock, Dafyd Jenkins, David Rand (all Warwick Systems Biology), Michal Komorowski (Imperial College London).
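The 'switch' model described above can be sketched as a linear ODE dm/dt = s(t) − δ·m with a piecewise-constant synthesis rate s(t): between switches, m relaxes exponentially towards the new steady state s/δ. The rates and switch times below are illustrative assumptions:

```python
import numpy as np

# Hypothetical 'switch' model: dm/dt = s(t) - delta*m, with the transcription
# rate s(t) toggling between an active level and zero at fixed switch times.
delta = 0.5
switch_times = [0.0, 4.0, 8.0]   # transcription toggles at t = 4 and t = 8
rates = [2.0, 0.0, 2.0]          # active, inactive, active

def mrna(t_grid, m0=0.0):
    """Piecewise-exponential solution: between switches, m relaxes towards the
    steady state s/delta; each switch starts a transition to a new steady state."""
    m, out, idx, prev_t = m0, [], 0, t_grid[0]
    for t in t_grid:
        while idx + 1 < len(switch_times) and t >= switch_times[idx + 1]:
            idx += 1                       # advance to the current regime
        s = rates[idx]
        m = s / delta + (m - s / delta) * np.exp(-delta * (t - prev_t))
        prev_t = t
        out.append(m)
    return np.array(out)

vals = mrna(np.linspace(0.0, 12.0, 121))
print(vals[40], vals[80], vals[-1])  # rise towards 4, decay while off, rise again
```

Inference in the talk's setting amounts to recovering the switch times and rates from noisy observations of such a profile.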
  • Subhash Lele
    Most ecological models are constructed to understand the relationship between environmental variables and an ecological response, be it site occupancy, population abundance, or changes to them. The usual regression models account for the environmental variation in the response, but in many cases the environmental variables themselves are measured with error. This is called an errors-in-variables model. Measurement error in the covariates leads to substantial issues with parameter estimability, and likelihood-based inference is computationally challenging. Bayesian inference using Markov chain Monte Carlo methods also runs into trouble because of convergence issues with the MCMC algorithm; these issues are especially severe with non-informative priors.

    Errors-in-variables models, linear and non-linear, can be formulated as hierarchical models. Data cloning is a recently developed computational technique for conducting likelihood-based analysis of general hierarchical models. In this work, we show that data cloning coupled with informative priors can circumvent the convergence issues with MCMC. We develop a new testing procedure to compare multivariate distributions using empirical characteristic functions and show its usefulness in diagnosing convergence of the MCMC algorithm in these tricky situations. More importantly, we show that data cloning not only facilitates parameter estimation but also diagnoses which parameters are estimable and which are not. This is essential for drawing scientifically meaningful inferences. We illustrate the method using various linear and non-linear regression models useful in ecology. We report the somewhat surprising result that a widely used population dynamics model, the Hassell model, is non-identifiable, but that a closely related generalized Beverton-Holt model is identifiable.

    Work done in collaboration with Khurram Nadeem.
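The mechanics of data cloning can be illustrated with a conjugate normal model, where the cloned posterior is available in closed form: raising the likelihood to the K-th power makes the posterior variance of an estimable parameter shrink like 1/K, so K times the posterior variance stabilizes at the asymptotic variance, whereas for a non-estimable parameter it would keep growing. All values below are invented for illustration:

```python
import numpy as np

# Data cloning sketch: clone the likelihood K times (conjugate normal mean model,
# known variance), so the cloned posterior variance is available in closed form.
n, sig2, prior_var = 30, 4.0, 100.0   # sample size, data variance, prior variance

def cloned_posterior_var(K):
    """Posterior variance of the mean when the likelihood is raised to the K-th power."""
    return 1.0 / (1.0 / prior_var + K * n / sig2)

for K in (1, 10, 100):
    # K * posterior variance stabilizes at the asymptotic variance sig2/n for an
    # estimable parameter; for a non-estimable one this product would keep growing.
    print(K, cloned_posterior_var(K), K * cloned_posterior_var(K))
```

In practice the cloned posterior is sampled by MCMC rather than computed analytically, and the scaled-variance diagnostic is what flags non-identifiable parameters.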
  • Carson Chow
    Presentation: http://mbi.osu.edu/2011/rasmaterials/mbibayes20121_chow.pdf
    Differential equations are often used to model biological and physiological systems. An important and difficult problem is how to estimate parameters and decide which among several candidate models is best. I will show in several examples how Bayesian and Markov chain Monte Carlo approaches provide a self-consistent framework to do both tasks. In particular, Bayesian parameter estimation provides a natural measure of parameter sensitivity, and Bayesian model comparison automatically evaluates models by rewarding fit to the data while penalizing the number of parameters.
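The fit-versus-complexity trade-off can be illustrated with BIC, a crude large-sample approximation to the Bayesian marginal likelihood (the data and candidate models below are invented; the talk's examples use full Bayesian computation rather than BIC):

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented data from a straight line; compare a linear fit against a quintic.
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.2, x.size)

def bic(degree):
    """BIC = n*log(RSS/n) + k*log(n): fit is rewarded, parameters are penalized."""
    coef = np.polyfit(x, y, degree)
    rss = np.sum((y - np.polyval(coef, x)) ** 2)
    k = degree + 2               # polynomial coefficients plus the noise variance
    n = x.size
    return n * np.log(rss / n) + k * np.log(n)

b1, b5 = bic(1), bic(5)
print(b1, b5)  # the straight line should win (lower BIC)
```

The quintic always achieves a smaller residual sum of squares, yet the complexity penalty makes the simpler, true model preferred — the automatic Occam's razor the abstract describes.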
  • Andrew Golightly, School of Mathematics & Statistics, Newcastle University
    Presentation: http://mbi.osu.edu/2011/rasmaterials/AGmbi12.pdf

    We consider the problem of performing Bayesian inference for the rate constants governing stochastic kinetic models. As well as considering inference for the resulting Markov jump process (MJP) we consider working with a diffusion approximation obtained by matching the infinitesimal mean and variance of the MJP to the drift and diffusion coefficients of a stochastic differential equation (SDE). We sample from the posterior distribution of the model parameters given observations at discrete times via recently proposed particle MCMC methods. In the case of the diffusion approximation we increase the efficiency of the inference algorithm by exploiting the structure of the SDE. We present results from two toy examples: a Lotka-Volterra system and a simple model of prokaryotic autoregulation.
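The diffusion approximation described above — matching the MJP's infinitesimal mean and variance — yields an SDE with drift S h(x) and diffusion matrix S diag(h(x)) Sᵀ, where S is the stoichiometry matrix and h the hazard vector. A minimal Euler-Maruyama simulation for a Lotka-Volterra system (the rate constants and the positivity guard are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

# Lotka-Volterra reactions: prey birth, predation, predator death.
c = np.array([0.5, 0.0025, 0.3])            # hypothetical rate constants
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])            # stoichiometry (rows: prey, predator)

def hazards(x):
    x1, x2 = x
    return np.array([c[0] * x1, c[1] * x1 * x2, c[2] * x2])

def euler_maruyama(x0, dt=0.01, n_steps=2000):
    """Simulate the diffusion approximation: drift S h(x), diffusion S diag(h) S^T."""
    x = np.array(x0)
    path = [x.copy()]
    for _ in range(n_steps):
        h = hazards(x)
        drift = S @ h
        chol = np.linalg.cholesky(S @ np.diag(h) @ S.T + 1e-9 * np.eye(2))
        x = x + drift * dt + np.sqrt(dt) * (chol @ rng.standard_normal(2))
        x = np.maximum(x, 1.0)   # crude guard to keep populations positive
        path.append(x.copy())
    return np.array(path)

path = euler_maruyama([100.0, 100.0])
print(path[-1])  # one realization after 20 time units
```

In a particle MCMC scheme, many such forward simulations serve as the particles, and the structure of the SDE (here the tractable Gaussian increments) is what the talk exploits for efficiency.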
