An automatic adaptive method to combine summary statistics in approximate Bayesian computation
To infer the parameters of mechanistic models with intractable likelihoods,
techniques such as approximate Bayesian computation (ABC) are increasingly
being adopted. One of the main disadvantages of ABC in practical situations,
however, is that parameter inference must generally rely on summary statistics
of the data. This is particularly the case for problems involving
high-dimensional data, such as biological imaging experiments. However, some
summary statistics contain more information about parameters of interest than
others, and it is not always clear how to weight their contributions within the
ABC framework. We address this problem by developing an automatic, adaptive
algorithm that chooses weights for each summary statistic. Our algorithm aims
to maximize the distance between the prior and the approximate posterior by
automatically adapting the weights within the ABC distance function.
Computationally, we use a nearest neighbour estimator of the distance between
distributions. We justify the algorithm theoretically based on properties of
the nearest neighbour distance estimator. To demonstrate the effectiveness of
our algorithm, we apply it to a variety of test problems, including several
stochastic models of biochemical reaction networks, and a spatial model of
diffusion, and compare our results with existing algorithms.
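The role of the weights inside an ABC distance function can be sketched in a few lines. Everything below (the toy Gaussian model, the uniform prior, the two summary statistics, and the fixed weight vector) is an illustrative assumption; the paper's algorithm additionally adapts the weights automatically rather than fixing them:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Toy stochastic model: Gaussian observations with unknown mean theta.
    return rng.normal(theta, 1.0, size=n)

def summaries(x):
    # Two summary statistics of differing informativeness about theta.
    return np.array([x.mean(), x.std()])

def abc_rejection(observed, weights, n_draws=5000, quantile=0.01):
    """Rejection ABC with a weighted Euclidean summary distance:
    keep the prior draws whose weighted distance to the observed
    summaries falls below an empirical quantile threshold."""
    s_obs = summaries(observed)
    thetas = rng.uniform(-5, 5, size=n_draws)  # uniform prior on theta
    dists = np.array([
        np.sqrt(np.sum(weights * (summaries(simulate(t)) - s_obs) ** 2))
        for t in thetas
    ])
    eps = np.quantile(dists, quantile)
    return thetas[dists <= eps]

observed = rng.normal(2.0, 1.0, size=50)
# Weight the mean statistic more heavily than the standard deviation.
posterior = abc_rejection(observed, weights=np.array([1.0, 0.1]))
print(f"accepted {len(posterior)} draws, posterior mean {posterior.mean():.2f}")
```

In the adaptive method of the abstract, the weight vector would itself be tuned to maximize a nearest-neighbour estimate of the distance between prior and approximate posterior; the sketch only shows where the weights enter the distance function.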
The impact of temporal sampling resolution on parameter inference for biological transport models
Imaging data has become widely available to study biological systems at
various scales, for example the motile behaviour of bacteria or the transport
of mRNA, and it has the potential to transform our understanding of key
transport mechanisms. Often these imaging studies require us to compare
biological species or mutants, and to do this we need to quantitatively
characterise their behaviour. Mathematical models offer a quantitative
description of a system that enables us to perform this comparison, but to
relate these mechanistic mathematical models to imaging data, we need to
estimate the parameters of the models. In this work, we study the impact of
collecting data at different temporal resolutions on parameter inference for
biological transport models by performing exact inference for simple velocity
jump process models in a Bayesian framework. This issue is prominent in a host
of studies because the majority of imaging technologies place constraints on
the frequency with which images can be collected, and the discrete nature of
observations can introduce errors into parameter estimates. In this work, we
avoid such errors by formulating the velocity jump process model within a
hidden states framework. This allows us to obtain estimates of the
reorientation rate and noise amplitude for noisy observations of a simple
velocity jump process. We demonstrate the sensitivity of these estimates to
temporal variations in the sampling resolution and extent of measurement noise.
We use our methodology to provide experimental guidelines for researchers
aiming to characterise motile behaviour that can be described by a velocity
jump process. In particular, we consider how experimental constraints resulting
in a trade-off between temporal sampling resolution and observation noise may
affect parameter estimates. Comment: Published in PLOS Computational Biology.
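A one-dimensional velocity jump process of the kind studied above is straightforward to simulate, which makes the trade-off between temporal sampling resolution and measurement noise concrete. All parameter names and values below are illustrative assumptions, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_vjp(lam=1.0, speed=1.0, sigma=0.05, t_end=10.0, dt=0.1):
    """Simulate a 1D velocity jump process observed at intervals dt.
    The particle moves at constant speed, reverses direction at
    reorientation rate lam (exponential waiting times), and each
    position observation carries Gaussian noise of amplitude sigma."""
    t, x, v = 0.0, 0.0, speed
    t_jump = rng.exponential(1.0 / lam)  # time of the next reorientation
    obs_times = np.linspace(dt, t_end, int(round(t_end / dt)))
    observations = []
    for t_obs in obs_times:
        while t_jump < t_obs:
            x += v * (t_jump - t)        # move up to the jump event
            t = t_jump
            v = -v                       # reorient (reverse direction)
            t_jump += rng.exponential(1.0 / lam)
        x += v * (t_obs - t)             # move up to the observation time
        t = t_obs
        observations.append(x + rng.normal(0.0, sigma))  # noisy observation
    return obs_times, np.array(observations)

times, obs = simulate_vjp()
print(f"{len(obs)} noisy observations, final position {obs[-1]:.2f}")
```

Coarsening `dt` or inflating `sigma` in this sketch mimics the experimental constraints discussed in the abstract: with fewer, noisier observations, reorientation events between frames go unseen, which is exactly the error the hidden-states formulation is designed to handle.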
Scorch marks from the sky
Daily sunshine duration is commonly reported at weather stations. Beyond the basic duration report, more information is available from the scorched cards of Campbell-Stokes sunshine recorders, such as estimates of direct-beam solar irradiance. Sunshine cards therefore potentially provide information on sky state, as inferred from solar-radiation data. Some sites have been operational since the late 19th century, so the cards also represent an underexploited historical record of sky state. Sunshine cards are thus an example of an archive source yielding data beyond the measurements originally sought.
Latitudinal Analysis of Twilight Hydroxyl Airglow
Latitudinal characterizations of twilight mesospheric hydroxyl volume emission rate (VER) from 2002 to 2005 are made possible using the SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) sensor, a ten-channel infrared radiometer onboard NASA's TIMED (Thermosphere Ionosphere Mesosphere Energetics and Dynamics) satellite. Implementation of a binning algorithm over time and geography provides global twilight characteristics from SABER radiometric channel 9 data, centered at λ = 1.64 µm for the OH (5,3) and OH (4,2) Meinel airglow band infrared emissions, and SABER radiometric channel 8 data, centered at λ = 2.06 µm for the OH (9,7) and OH (8,6) emissions. The findings show an equatorial effect in both infrared radiometric channels. Faster rise rates are observed at sunset, and slower fall rates at sunrise, near the equator when compared with rates calculated at midlatitudes. Both hydroxyl channels show the most distinct sunset equatorial effects in 2002, and the most distinct sunrise equatorial effects in 2005.
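The time-and-geography binning step mentioned above can be illustrated with synthetic data. The latitude-dependent signal, bin width, and sample sizes below are invented for illustration and do not reflect the actual SABER channel data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sample of (latitude, VER) measurements with a signal
# that peaks at the equator, plus measurement noise.
lat = rng.uniform(-60, 60, size=10000)
ver = 1.0 + 0.5 * np.cos(np.radians(lat)) + rng.normal(0.0, 0.1, 10000)

# Bin by latitude in 10-degree bands and average VER within each bin.
edges = np.arange(-60, 61, 10)
idx = np.digitize(lat, edges) - 1          # bin index for each sample
mean_ver = np.array([ver[idx == i].mean() for i in range(len(edges) - 1)])
print(mean_ver.round(2))
```

Averaging within geographic (and, in the study, also temporal) bins is what turns scattered satellite soundings into the global twilight characteristics the abstract describes; the equatorial bins here come out systematically higher, mirroring the reported equatorial effect.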
Equivalence of operations with respect to discriminator clones
For each clone C on a set A there is an associated equivalence relation,
called C-equivalence, on the set of all operations on A, which relates two
operations iff each one is a substitution instance of the other using
operations from C. In this paper we prove that if C is a discriminator clone on
a finite set, then there are only finitely many C-equivalence classes.
Moreover, we show that the smallest discriminator clone is minimal with respect
to this finiteness property. For discriminator clones of Boolean functions we
explicitly describe the associated equivalence relations. Comment: 17 pages.
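The C-equivalence relation in the abstract can be written out explicitly; the notation below is one plausible formalization of "each is a substitution instance of the other using operations from C", not necessarily the paper's own:

```latex
% f is n-ary, g is m-ary; each is obtained from the other by
% substituting operations from the clone C for the arguments:
\[
  f \sim_C g \iff
  \exists\, c_1,\dots,c_m \in C:\ f = g(c_1,\dots,c_m)
  \ \text{ and }\
  \exists\, d_1,\dots,d_n \in C:\ g = f(d_1,\dots,d_n).
\]
```

The paper's result is that when C is a discriminator clone on a finite set, this relation has only finitely many classes.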
Positive Impact? What factors affect access, retention and graduate outcomes for university students with a background of care or family estrangement?
The Kindergarten Journal, Summer 1910
Includes the essay "The Value of Character" by Elizabeth Harrison (page 28); "Personal Mention" (page 20) and "Alumnae Report" (page 29) by Edna Dean Baker; and "Extension" (page 25) by J.N. Crouse, co-principal of the Chicago Kindergarten College.
Journal editors: Mrs. Todd Lunsford and Mrs. Florence Capron.
An Analysis Of The Effectiveness Of Podcasting As A Supplemental Instructional Tool: A Pilot Study
Podcasting is the creation of audio or video files for use on iPods and other MP3 players. It allows the user to view or listen to downloadable files wherever and whenever desired. In higher education, podcasting is experiencing extraordinary growth. While a significant volume of literature exists both lauding and lamenting the incorporation of podcasts into university curricula, the authors were unable to find any empirical studies, in either the academic or popular press, evaluating the benefits or detriments attributable to educational applications of podcasting. This paper presents the pilot for an empirical study of the effectiveness of podcasting as a course supplement.
Podcasting In Higher Education: Does It Make A Difference?
Podcasting is a growing trend in higher education. Major software companies, such as Apple, have dedicated entire websites to podcasting. These podcasts are available to college students as supplemental material for specific coursework at their particular college or university. Unfortunately, due to the new and progressive nature of the technology, empirical studies of the effectiveness of this pedagogical device are rare. This paper presents an empirical study of the effectiveness of podcasting when incorporated as supplemental course material in a university course.