Data-driven modelling of biological multi-scale processes
Biological processes involve a variety of spatial and temporal scales. A
holistic understanding of many biological processes therefore requires
multi-scale models which capture the relevant properties on all these scales.
In this manuscript we review mathematical modelling approaches used to describe
the individual spatial scales and how they are integrated into holistic models.
We discuss the relation between spatial and temporal scales and its implications
for multi-scale modelling. Based on this overview of state-of-the-art modelling
approaches, we formulate key challenges in mathematical and computational
modelling of biological multi-scale and multi-physics processes. In particular,
we consider the availability of analysis tools for multi-scale models and
model-based multi-scale data integration. We provide a compact review of methods
for model-based data integration and model-based hypothesis testing. Furthermore,
novel approaches and recent trends are discussed, including computation time
reduction using reduced-order and surrogate models, which contribute to the
solution of inference problems. We conclude the manuscript by providing a few
ideas for the development of tailored multi-scale inference methods.
Comment: This manuscript will appear in the Journal of Coupled Systems and
Multiscale Dynamics (American Scientific Publishers).
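The surrogate-model idea mentioned in the abstract can be illustrated with a minimal, purely hypothetical sketch (the one-parameter decay model, the parameter grid, and the polynomial surrogate below are invented for illustration, not taken from the manuscript): an expensive simulator is evaluated once on a coarse parameter grid, and a cheap polynomial surrogate then stands in for it during parameter inference.

```python
import numpy as np

t = np.linspace(0.0, 5.0, 20)  # observation times

def expensive_model(k):
    """Stand-in for a costly multi-scale simulation: y(t) = exp(-k*t)."""
    return np.exp(-k * t)

# Build the surrogate: evaluate the model on a coarse grid of the parameter
# and fit a per-time-point degree-5 polynomial in k.
k_grid = np.linspace(0.1, 2.0, 15)
Y = np.array([expensive_model(k) for k in k_grid])            # shape (15, 20)
coeffs = [np.polyfit(k_grid, Y[:, j], 5) for j in range(t.size)]

def surrogate_model(k):
    """Cheap approximation of the expensive model at parameter k."""
    return np.array([np.polyval(c, k) for c in coeffs])

# Inference: least-squares fit of k to noisy data using only the surrogate.
rng = np.random.default_rng(0)
k_true = 0.7
data = expensive_model(k_true) + 0.01 * rng.standard_normal(t.size)

k_candidates = np.linspace(0.1, 2.0, 400)
sse = [np.sum((surrogate_model(k) - data) ** 2) for k in k_candidates]
k_hat = k_candidates[int(np.argmin(sse))]
print(f"true k = {k_true:.2f}, surrogate-based estimate = {k_hat:.2f}")
```

After the one-off grid evaluation, every candidate parameter costs only a polynomial evaluation instead of a full simulation, which is the computation-time reduction the abstract refers to.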
DISTq: An Iterative Analysis of Glucose Data for Low-Cost, Real-Time and Accurate Estimation of Insulin Sensitivity
Insulin sensitivity (SI) estimation has numerous uses in medical and clinical situations. However, high-resolution tests that are useful for clinical diagnosis and monitoring are often too intensive, long and costly for regular use. Simpler tests that mitigate these issues are not accurate enough for many clinical diagnostic or monitoring scenarios. The gap between these tests presents an opportunity for new approaches.
A graphical method for practical and informative identifiability analyses of physiological models: A case study of insulin kinetics and sensitivity
Background: Derivative-based a priori structural identifiability analyses of mathematical models can offer valuable insight into the identifiability of model parameters. However, these analyses are only capable of a binary confirmation of the mathematical distinction of parameters, and a positive outcome can begin to lose relevance when measurement error is introduced. This article presents an integral-based method that allows the observation of the identifiability of two-parameter models in the presence of assay error.
Methods: The method measures the distinction of the integral formulations of the parameter coefficients at the proposed sampling times. It can thus predict the susceptibility of the parameters to the effects of measurement error. The method is tested in silico with Monte Carlo analyses of a number of insulin sensitivity test applications.
Results: The method successfully captured the analogous nature of identifiability observed in Monte Carlo analyses of a number of cases, including protocol alterations, parameter changes and differences in participant behaviour. However, due to the numerical nature of the analyses, prediction was not perfect in all cases.
Conclusions: Although the current method has valuable and significant capabilities in terms of study or test protocol design, additional developments would further strengthen its predictive capability. Finally, the method captures the experimental reality that sampling error and timing can negate assumed parameter identifiability, and that identifiability is a continuous rather than discrete phenomenon.
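The Monte Carlo side of such analyses can be sketched roughly as follows (the two-parameter decay model, the 7% assay error level, and the sampling schedule below are invented for the sketch, not taken from the article): repeatedly perturb simulated measurements with assay-like noise, refit the parameters, and read practical identifiability off the spread of the estimates.

```python
import numpy as np

t = np.array([0.0, 5.0, 10.0, 20.0, 30.0, 45.0, 60.0])  # sampling times (min)

def model(p, q):
    """Invented two-parameter kinetics: y(t) = p * exp(-q * t / 60)."""
    return p * np.exp(-q * t / 60.0)

p_true, q_true = 10.0, 3.0
rng = np.random.default_rng(1)

estimates = []
for _ in range(500):
    # 7% multiplicative assay-like error on each sample
    noisy = model(p_true, q_true) * (1 + 0.07 * rng.standard_normal(t.size))
    # log-linear refit: log y = log p - q * (t / 60)
    slope, intercept = np.polyfit(t / 60.0, np.log(noisy), 1)
    estimates.append((np.exp(intercept), -slope))

est = np.array(estimates)
cv = est.std(axis=0) / est.mean(axis=0)  # spread = practical identifiability
print(f"CV of p estimates: {cv[0]:.3f}")
print(f"CV of q estimates: {cv[1]:.3f}")
```

A large coefficient of variation for a parameter would signal that, at these sampling times and this error level, the parameter is poorly identifiable in practice even if it is structurally identifiable.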
Integral-based filtering of continuous glucose sensor measurements for glycaemic control in critical care
Hyperglycaemia is prevalent in critical illness and increases the risk of further
complications and mortality, while tight control can reduce mortality by up to 43%.
Adaptive control methods are capable of highly accurate, targeted blood glucose
regulation using only the limited number of manual measurements available, as
patient discomfort and labour intensity restrict sampling. The option to obtain
greater data density using emerging continuous glucose sensing devices is
therefore attractive. However, the few such systems currently available can have
errors in excess of 20-30%, whereas typical bedside testing kits have errors of
approximately 7-10%. Despite the greater measurement frequency, these larger
errors significantly impact the resulting glucose and patient-specific parameter
estimates, and thus the control actions determined, creating an important safety
and performance issue. This paper models the impact of the Continuous Glucose
Monitoring System (CGMS, Medtronic, Northridge, CA) on model-based parameter
identification and glucose prediction. An integral-based fitting and filtering
method is developed to reduce the effect of these errors. A noise model is
developed based on CGMS data reported in the literature; it is slightly
conservative, with a mean Clarke Error Grid (CEG) correlation of R = 0.81
(range: 0.68-0.88) compared to a reported value of R = 0.82 in a critical care
study. Using 17 virtual patient profiles developed from retrospective clinical
data, this noise model was used to test the methods developed. Monte Carlo
simulation for each patient resulted in an average absolute one-hour glucose
prediction error of 6.20% (range: 4.97-8.06%) with an average standard deviation
per patient of 5.22% (range: 3.26-8.55%). All the methods and results are
generalisable to similar applications outside of critical care, such as less
acute wards and, eventually, ambulatory individuals. Clinically, the results show
one possible computational method for managing the larger errors encountered in
emerging continuous blood glucose sensors, thus enabling their more effective use
in clinical glucose regulation studies.
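A minimal sketch of why integral-based fitting damps sensor noise (the one-compartment decay model, the 20% error level, and the 5-minute sampling below are assumptions for illustration, not the paper's patient model): for dG/dt = -p*G, integrating gives G(end) - G(0) = -p * integral of G, so p can be identified from an integral of the measurements, which averages zero-mean noise, rather than from noisy point-wise differences.

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule (kept explicit for portability)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

rng = np.random.default_rng(2)
p_true = 0.02                    # assumed decay rate (1/min)
t = np.arange(0.0, 121.0, 5.0)   # 5-minute CGM-like sampling over 2 h
G_true = 10.0 * np.exp(-p_true * t)

int_est, pt_est = [], []
for _ in range(300):
    # ~20% multiplicative sensor error, the level quoted for early sensors
    G = G_true * (1 + 0.20 * rng.standard_normal(t.size))
    # integral-based estimate: p = (G(0) - G(end)) / integral of G
    int_est.append((G[0] - G[-1]) / trapezoid(G, t))
    # point-wise estimate: median of -(dG/dt)/G over successive pairs
    dGdt = np.diff(G) / np.diff(t)
    G_mid = 0.5 * (G[1:] + G[:-1])
    pt_est.append(np.median(-dGdt / G_mid))

sd_int, sd_pt = np.std(int_est), np.std(pt_est)
print(f"true p = {p_true}")
print(f"integral-based estimates: sd = {sd_int:.4f}")
print(f"point-wise estimates:     sd = {sd_pt:.4f}")
```

The integral-based estimator's spread is substantially smaller because summation over the whole window cancels much of the zero-mean sensor noise, whereas finite differences amplify it.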
A Minimal C-Peptide Sampling Method to Capture Peak and Total Pre-Hepatic Insulin Secretion in Model-Based Experimental Insulin Sensitivity Studies
Aims and Background:
Model-based insulin sensitivity testing via the intravenous glucose tolerance test (IVGTT) or similar is clinically very intensive due to the need for frequent sampling to accurately capture the dynamics of insulin secretion and clearance. The goal of this study was to significantly reduce the number of samples required in intravenous glucose tolerance test protocols to accurately identify C-peptide and insulin secretion characteristics.
Methods:
Frequently sampled IVGTT data from 12 subjects [5 normal glucose-tolerant (NGT) and 7 type 2 diabetes mellitus (T2DM)] were analyzed to calculate insulin and C-peptide secretion using a well-accepted C-peptide model. Samples were reduced in a series of steps based on the critical IVGTT profile points required for the
accurate estimation of C-peptide secretion. The full data set of 23 measurements was reduced to sets with six or four measurements. The peak secretion rate and total secreted C-peptide during 10 and 20 minutes
postglucose input and during the total test time were calculated. Results were compared to those from the
full data set using the Wilcoxon rank sum to assess any differences.
Results:
In each case, the calculated secretion metrics were largely unchanged, within expected assay variation, and not significantly different from results obtained using the full 23-measurement data set (P < 0.05).
Conclusions:
Peak and total C-peptide and insulin secretory characteristics can be estimated accurately in an IVGTT from as few as four systematically chosen samples, providing an opportunity to minimize sampling, cost, and burden.
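The underlying idea can be sketched with an invented two-component response (the basis functions, time constants, amplitudes, and sample times below are illustrative assumptions, not the study's C-peptide kinetics): once a parametric model of the response is assumed, a handful of well-placed samples suffices to identify its free parameters, and peak/total metrics then come from the fitted model rather than from dense sampling.

```python
import numpy as np

def basis(t):
    """Assumed two-component response: first-phase burst + slow tail."""
    t = np.asarray(t, dtype=float)
    return np.column_stack([t * np.exp(-t / 4.0), np.exp(-t / 60.0)])

A_true, B_true = 8.0, 2.0

def response(t):
    return basis(t) @ np.array([A_true, B_true])

# Dense 23-sample protocol vs a minimal 4-sample protocol
t_full = np.linspace(0.0, 120.0, 23)
t_min = np.array([0.0, 4.0, 20.0, 120.0])  # baseline, near-peak, decay, tail

# Fit the two amplitudes from only the four samples (linear least squares)
coef, *_ = np.linalg.lstsq(basis(t_min), response(t_min), rcond=None)
A_hat, B_hat = coef

def total(A, B, T=120.0):
    """Analytic integral of the fitted model over [0, T]."""
    burst = A * 16.0 * (1.0 - np.exp(-T / 4.0) * (1.0 + T / 4.0))  # int t*e^{-t/4}
    tail = B * 60.0 * (1.0 - np.exp(-T / 60.0))                    # int e^{-t/60}
    return burst + tail

def numerical_auc(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

auc_dense = numerical_auc(response(t_full), t_full)  # dense numerical AUC
auc_min = total(A_hat, B_hat)                        # model-based, 4 samples
print(f"fitted amplitudes ({A_hat:.3f}, {B_hat:.3f}) vs true ({A_true}, {B_true})")
print(f"total response: dense 23-sample {auc_dense:.1f}, model-based 4-sample {auc_min:.1f}")
```

With noiseless data the four samples recover the amplitudes exactly, so the model-based total matches the analytic value; the dense trapezoidal estimate is actually the less faithful of the two here, since it cannot resolve the sharp early burst.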
Systems approaches and algorithms for discovery of combinatorial therapies
Effective therapy of complex diseases requires control of highly non-linear
complex networks that remain incompletely characterized. In particular, drug
intervention can be seen as control of signaling in cellular networks.
Identification of control parameters presents an extreme challenge due to the
combinatorial explosion of control possibilities in combination therapy and to
the incomplete knowledge of the systems biology of cells. In this review paper
we describe the main current and proposed approaches to the design of
combinatorial therapies, including the empirical methods used now by clinicians
and alternative approaches suggested recently by several authors. New
approaches for designing combinations arising from systems biology are
described. We discuss in special detail the design of algorithms that identify
optimal control parameters in cellular networks based on a quantitative
characterization of control landscapes, maximizing utilization of incomplete
knowledge of the state and structure of intracellular networks. The use of new
technology for high-throughput measurements is key to these new approaches to
combination therapy and essential for the characterization of control
landscapes and implementation of the algorithms. Combinatorial optimization in
medical therapy is also compared with the combinatorial optimization of
engineering and materials science and similarities and differences are
delineated.
Comment: 25 pages
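The combinatorial-explosion point can be made concrete with a toy sketch (the drug names, scores, synergy/antagonism terms, and cost penalty below are entirely invented, not from the review): a simple greedy forward selection evaluates only on the order of n*k combinations of n drugs, against the 2^n subsets an exhaustive search would face.

```python
import itertools

DRUGS = ["d1", "d2", "d3", "d4", "d5"]

def efficacy(combo):
    """Invented response surface with one synergistic and one antagonistic pair."""
    score = sum({"d1": 3.0, "d2": 2.0, "d3": 1.5, "d4": 1.0, "d5": 0.5}[d]
                for d in combo)
    if {"d2", "d4"} <= set(combo):
        score += 4.0   # synergy
    if {"d1", "d3"} <= set(combo):
        score -= 6.0   # antagonism / toxicity
    return score - 1.2 * len(combo)  # cost per added drug

def greedy(max_size=3):
    """Forward selection: add the single best drug until no addition helps."""
    combo = []
    while len(combo) < max_size:
        candidates = [d for d in DRUGS if d not in combo]
        best = max(candidates, key=lambda d: efficacy(combo + [d]))
        if efficacy(combo + [best]) <= efficacy(combo):
            break
        combo.append(best)
    return combo

# Exhaustive baseline over all combinations of up to three drugs
best_exhaustive = max(
    (c for r in range(1, 4) for c in itertools.combinations(DRUGS, r)),
    key=efficacy,
)
print("greedy:     ", greedy())
print("exhaustive: ", list(best_exhaustive))
```

On this toy landscape greedy happens to find the exhaustive optimum, but it can miss combinations whose benefit only appears once several individually weak drugs are present, which is exactly why the review's landscape-characterization approaches matter.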