The percutaneous absorption of soman in a damaged skin porcine model and the evaluation of WoundStat™ as a topical decontaminant
PURPOSE: The aim of this study was to evaluate a candidate haemostat (WoundStat™), down-selected from previous in vitro studies, for efficacy as a potential skin decontaminant against the chemical warfare agent pinacolyl methylphosphonofluoridate (soman, GD) using an in vivo pig model. MATERIALS AND METHODS: An area of approximately 3 cm² was dermatomed from the dorsal ear skin to a nominal depth of 100 µm. A discrete droplet of ¹⁴C-GD (300 µg kg⁻¹) was applied directly onto the surface of the damaged skin at the centre of the dosing site. Animals assigned to the treatment group were given a 2 g application of WoundStat™ 30 s after GD challenge. The decontamination efficacy of WoundStat™ against GD was measured by direct quantification of the distribution of ¹⁴C-GD, together with routine determination of whole-blood cholinesterase and physiological measurements. RESULTS: WoundStat™ sequestered approximately 70% of the applied ¹⁴C-GD. Internal radiolabel recovery from treated animals was approximately 1% of the initially applied dose. Whole-blood cholinesterase levels fell to less than 10% of the original value by 15 min post WoundStat™ treatment and continued to decline gradually until the onset of apnoea or until euthanasia. All treated animals showed signs of GD intoxication that could be grouped into early (mastication, fasciculations and tremor), intermediate (miosis, salivation and nasal secretions) and late-onset (lacrimation, body spasm and apnoea) effects. Two of the six WoundStat™-treated animals survived the study duration. CONCLUSIONS: The current study has shown that the use of WoundStat™ as a decontaminant on damaged pig ear skin was unable to fully protect against GD toxicity. Importantly, the findings indicate that the use of WoundStat™ in GD-contaminated wounds would not exacerbate GD toxicity. These data suggest that absorbent haemostatic products may offer some limited functionality as wound decontaminants.
Prediction of survival probabilities with Bayesian Decision Trees
Practitioners use Trauma and Injury Severity Score (TRISS) models to predict the survival probability of an injured patient. The accuracy of TRISS predictions is acceptable for patients with up to three typical injuries, but unacceptable for patients with a larger number of injuries or with atypical injuries. Being based on a regression model, the TRISS methodology does not provide the predictive density required for accurate assessment of risk; moreover, the regression model is difficult to interpret. We therefore consider Bayesian inference for estimating the predictive distribution of survival. The inference is based on decision tree models which recursively split the data along explanatory variables, so practitioners can understand these models. We propose a Bayesian method for estimating the predictive density and show that it outperforms the TRISS method in terms of both goodness of fit and classification accuracy. The developed method has been made available for evaluation purposes as a stand-alone application.
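The TRISS baseline that this abstract compares against is a plain logistic regression on the Revised Trauma Score (RTS), Injury Severity Score (ISS), and an age indicator. A minimal sketch, using the commonly cited MTOS coefficients for blunt trauma (an assumption; published revisions of TRISS use different coefficient sets):

```python
import math

def triss_survival_probability(rts, iss, age_55_or_over,
                               b0=-0.4499, b1=0.8085, b2=-0.0835, b3=-1.7430):
    """Probability of survival under the TRISS logistic model.

    rts: Revised Trauma Score, iss: Injury Severity Score,
    age_55_or_over: 1 if the patient is 55 or older, else 0.
    Default coefficients are the widely cited MTOS values for
    blunt trauma (an assumption; revisions differ).
    """
    b = b0 + b1 * rts + b2 * iss + b3 * age_55_or_over
    return 1.0 / (1.0 + math.exp(-b))

# A younger patient with a high RTS and low ISS scores a higher Ps
# than an older patient with severe, multiple injuries.
ps_mild = triss_survival_probability(rts=7.84, iss=9, age_55_or_over=0)
ps_severe = triss_survival_probability(rts=4.09, iss=41, age_55_or_over=1)
```

The single logistic curve is exactly what makes TRISS both easy to fit and, as the abstract argues, poorly suited to patients whose injury profiles depart from the typical cases it was calibrated on.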
A Regression Tree Approach using Mathematical Programming
Regression analysis is a machine learning approach that aims to accurately predict the value of continuous output variables from certain independent input variables, via automatic estimation of their latent relationship from the data. Tree-based regression models are popular in the literature due to their flexibility in modelling higher-order non-linearity and their great interpretability. Conventionally, regression tree models are trained in a two-stage procedure: recursive binary partitioning is employed to produce a tree structure, followed by a pruning process that removes insignificant leaves, with the possibility of assigning multivariate functions to terminal leaves to improve generalisation. This work introduces a novel methodology of node partitioning which, in a single optimisation model, simultaneously performs the two tasks of identifying the break-point of a binary split and assigning multivariate functions to either leaf, thus leading to an efficient regression tree model. Using six real-world benchmark problems, we demonstrate that the proposed method consistently outperforms a number of state-of-the-art regression tree models and methods based on other techniques, with an average improvement of 7–60% in the mean absolute error (MAE) of the predictions.
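The conventional first stage described above, exhaustively scanning one variable for the break-point of a binary split, can be sketched as follows (piecewise-constant leaves for brevity; the paper's contribution is to choose the split and the multivariate leaf functions jointly in one optimisation model, which this sketch deliberately does not do):

```python
def best_split(x, y):
    """CART-style exhaustive search for the break-point minimising
    total squared error when each side is predicted by its mean."""
    def sse(vals):
        # Sum of squared errors around the mean of vals.
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    order = sorted(range(len(x)), key=lambda i: x[i])
    best_err, best_threshold = float("inf"), None
    for k in range(1, len(order)):
        left = [y[order[i]] for i in range(k)]
        right = [y[order[i]] for i in range(k, len(order))]
        # Midpoint between adjacent sorted x values.
        threshold = 0.5 * (x[order[k - 1]] + x[order[k]])
        total = sse(left) + sse(right)
        if total < best_err:
            best_err, best_threshold = total, threshold
    return best_threshold

# Data with a clear jump between x = 4 and x = 5: the scan recovers it.
xs = list(range(10))
ys = [1.0] * 5 + [10.0] * 5
```

Here `best_split(xs, ys)` returns `4.5`, the midpoint of the jump.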
Design of Experiments for Screening
The aim of this paper is to review methods of designing screening
experiments, ranging from designs originally developed for physical experiments
to those especially tailored to experiments on numerical models. The strengths
and weaknesses of the various designs for screening variables in numerical
models are discussed. First, classes of factorial designs for experiments to
estimate main effects and interactions through a linear statistical model are
described, specifically regular and nonregular fractional factorial designs,
supersaturated designs and systematic fractional replicate designs. Generic
issues of aliasing, bias and cancellation of factorial effects are discussed.
Second, group screening experiments are considered including factorial group
screening and sequential bifurcation. Third, random sampling plans are
discussed including Latin hypercube sampling and sampling plans to estimate
elementary effects. Fourth, a variety of modelling methods commonly employed
with screening designs are briefly described. Finally, a novel study
demonstrates six screening methods on two frequently-used exemplars, and their
performances are compared.
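Of the random sampling plans mentioned above, Latin hypercube sampling is the easiest to illustrate: each of the d input dimensions is divided into n equal strata, exactly one point is drawn in each stratum, and the strata are paired across dimensions by independent random permutations. A minimal sketch:

```python
import random

def latin_hypercube(n, d, rng=random):
    """Return n points in [0, 1)^d with exactly one point per
    axis-aligned stratum [i/n, (i+1)/n) in every dimension."""
    # One independent random permutation of the n strata per dimension.
    perms = [rng.sample(range(n), n) for _ in range(d)]
    sample = []
    for i in range(n):
        # Place the i-th point uniformly inside its assigned stratum
        # in each dimension.
        point = [(perms[j][i] + rng.random()) / n for j in range(d)]
        sample.append(point)
    return sample

pts = latin_hypercube(8, 2)
```

Projecting the 8 points onto either axis yields one point in each of the 8 strata, which is the one-dimensional uniformity property that makes Latin hypercube designs attractive for screening numerical models.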
Grand challenges in evolutionary developmental biology
EVO-DEVO'S IDENTITY There is a widespread consensus on the view that evolutionary developmental biology (evo-devo) is the discipline eventually born to fill the gap between evolutionary biology and developmental biology, following a divorce between these two fields that extended over more than half a century (Amundson, 2005). On closer inspection, however, this broadly accepted perspective discloses a wealth of questions, if looked at retrospectively, and of potentially divergent possibilities, if looked at prospectively. The slow pace of integration between the different threads that were converging into evo-devo was well expressed by Raff (2000) in a survey of the main issues in this field. Some 15 years ago Raff, one of the discipline's founding fathers, remarked that "What constitutes the fundamental problems for a science of evolutionary developmental biology (evo-devo) depends on whether the scientist is a developmental biologist, a paleontologist or an evolutionary biologist" and drafted a list of the issues that were hot at the time. Evo-devo has answered these questions only in part. However, the discipline is now mature enough to address a number of more precise, and more challenging, questions, as I will argue in this article. To date, two sets of problems have primarily been floated in discussions about the identity and research targets of evo-devo. On the one hand are those centered around the (controversial) notions of evolvability, robustness and constraint, in connection with the increasing appreciation of the intricacies of the genotype→phenotype map (Alberch, 1991; Altenberg, 1995; West-Eberhard, 2003; Pigliucci, 2010; Wagner and Zhang, 2011). On the other hand are those centered around the notions of origination, innovation, and novelty, the so-called "innovation triad." To Hendrikse et al. (2007), for example, evolvability is the key issue that justifies recognizing evo-devo as an autonomous discipline.
Others, e.g., Müller and Newman (2005), focus instead on the innovation triad. Unfortunately, for all these candidate core concepts of evo-devo, too many alternative definitions have been proposed (or, more dangerously, implicitly assumed), thus adding new items to the dramatically increasing series of biological terms on whose definition there seems to be more and more disagreement. Eventually, we should probably learn to accept that multiple notions associated with each of these terms deserve to be retained and perhaps recognized by adjectival specifications. Similar terminological refinement has been applied to other biological terms such as species (e.g., Claridge et al., 1997), homology (e.g., Minelli and Fusco, 2013a), and gene (e.g., Beurton et al., 2000). In discussing the concept of the gene in historical perspective, Müller-Wille and Rheinberger (2009) have sensibly recalled Friedrich Nietzsche's (1887; second essay, para. 13) dictum that "all concepts in which an entire process is semiotically concentrated elude definition; only that which has no history is definable." In addition to terminological ambiguity, there is another problem with the "innovation triad": these terms are all framed in terms of "origins." Framing definitions in terms of origin requires splitting the evolutionary sequence into two contiguous segments, "before" and "after" the origination of a new feature. This splitting is a natural consequence if origination indeed "refers to the specific causality of the generative conditions that underlie both the first origins and the later innovations of phenotypes" and especially "the very first beginnings of phenotypes, e.g., the origin of multicellular assemblies, of complex tissues, and of the generic forms that result from the self-organizational and physical principles of cell interaction (Newman, 1992, 1994).
In contrast, innovation [evolutionary modes and mechanisms] and novelty [their phenotypic outcome] designate the processes and results of introducing new characters into already existing phenotypic themes of a certain architecture (bodyplans)" (Müller and Newman, 2005, p. 490). This separation, however, is artificial. The better we know a process, the less we are able to identify its exact origins, these instead being determined by arbitrary choice. In science, and especially in biological disciplines with a strong historical dimension such as evolutionary biology and developmental biology, we should frame questions in terms of transitions rather than origins.
Morphological correlates to cognitive dysfunction in schizophrenia as studied with Bayesian regression
BACKGROUND: Relationships between cognitive deficits and brain morphological changes observed in schizophrenia are alternately explained by less gray matter in the cerebral cortex, by alterations in neural circuitry involving the basal ganglia, and by alterations in cerebellar structures and related neural circuitry. This work explored a model encompassing all of these possibilities to identify the strongest morphological relationships to cognitive skill in schizophrenia. METHODS: Seventy-one patients with schizophrenia and sixty-five healthy control subjects were characterized by neuropsychological tests covering six functional domains. Measures of sixteen brain morphological structures were taken using semi-automatic and fully manual tracing of MRI images, with the full set of measures completed on thirty of the patients and twenty controls. Group differences were calculated. A Bayesian decision-theoretic method identified the morphological features that best explained neuropsychological test scores in the context of a multivariate-response linear model with interactions. RESULTS: Patients performed significantly worse on all neuropsychological tests except some regarding executive function. The most prominent morphological observations were enlarged ventricles, reduced posterior superior vermis gray matter volumes, and increased putamen gray matter volumes in the patients. The Bayesian method associated putamen volumes with verbal learning, vigilance, and (to a lesser extent) executive function, while caudate volumes were associated with working memory. Vermis regions were associated with vigilance, executive function, and, less strongly, visuo-motor speed. Ventricular volume was strongly associated with visuo-motor speed, vocabulary, and executive function.
The neuropsychological tests that were strongly associated with ventricular volume showed only weak association with diagnosis, possibly because ventricular volume acted as a proxy for diagnosis. Diagnosis was strongly associated with the other neuropsychological tests, implying that the morphological associations for these tasks reflected morphological effects and not merely group volumetric differences. Interaction effects were rarely selected, indicating that volumetric relationships to neuropsychological performance were similar for patients and controls. CONCLUSION: The association of subcortical and cerebellar structures with verbal learning, vigilance, and working memory supports the importance of neural connectivity to these functions. The finding that a morphological indicator of diagnosis (ventricular volume) provided more explanatory power than diagnosis itself for visuo-motor speed, vocabulary, and executive function suggests that volumetric abnormalities in the disease are more important for cognition than non-morphological features.
Network Evolution of Body Plans
Segmentation in arthropod embryogenesis represents a well-known example of
body plan diversity. Striped patterns of gene expression that lead to the
future body segments appear simultaneously or sequentially in long and short
germ-band development, respectively. Regulatory genes relevant for stripe
formation are evolutionarily conserved among arthropods, therefore the
differences in the observed traits are thought to have originated from how the
genes are wired. To reveal the basic differences in the network structure, we
have numerically evolved hundreds of gene regulatory networks that produce
striped patterns of gene expression. By analyzing the topologies of the
generated networks, we show that the characteristics of stripe formation in
long and short germ-band development are determined by Feed-Forward Loops
(FFLs) and negative Feed-Back Loops (FBLs) respectively. Network architectures,
gene expression patterns and knockout responses exhibited by the artificially
evolved networks agree with those reported in the fly Drosophila melanogaster
and the beetle Tribolium castaneum. For other arthropod species, principal
network architectures that remain largely unknown are predicted. (35 pages, 4 figures and 1 table.)
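The stripe-forming role that the abstract attributes to feed-forward loops can be illustrated with a minimal threshold model (an incoherent type-1 FFL read out along a morphogen gradient; the thresholds are illustrative assumptions, not values fitted to Drosophila or Tribolium data): an input A activates both a repressor B and a target C, B requires more A than C does, so C switches on only at intermediate A levels, i.e. in a single stripe.

```python
def stripe_ffl(a, low=0.2, high=0.6):
    """Incoherent type-1 feed-forward loop as a Boolean threshold model.

    a: activator (morphogen) level in [0, 1].
    The target C is activated by A above `low` but repressed by B,
    which itself switches on only above `high`.
    Thresholds are illustrative assumptions.
    """
    b = a > high               # repressor B needs a strong input
    return (a > low) and not b  # target C: on at intermediate A only

# Reading the output along a linear gradient yields one stripe:
# off at low A, on in the middle, off again at high A.
gradient = [i / 20 for i in range(21)]
pattern = [stripe_ffl(a) for a in gradient]
```

A negative feedback loop, by contrast, would make the target oscillate in time rather than carve a static band out of a spatial gradient, which is the distinction the abstract draws between long and short germ-band segmentation.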
The Role of relA and spoT in Yersinia pestis KIM5+ Pathogenicity
The ppGpp molecule is part of a highly conserved regulatory system for mediating the growth response to various environmental conditions. This mechanism may represent a common strategy whereby pathogens such as Yersinia pestis, the causative agent of plague, regulate the virulence gene programs required for invasion, survival and persistence within host cells to match the capacity for growth. The products of the relA and spoT genes carry out ppGpp synthesis. To investigate the role of ppGpp in growth, protein synthesis, gene expression and virulence, we constructed a ΔrelA ΔspoT Y. pestis mutant. The mutant was no longer able to synthesize ppGpp in response to amino acid or carbon starvation, as expected. We also found that it exhibited several novel phenotypes, including a reduced growth rate and autoaggregation at 26°C. In addition, there was a reduction in the level of secretion of key virulence proteins, and the mutant was >1,000-fold less virulent than its wild-type parent strain. Mice vaccinated subcutaneously (s.c.) with 2.5×10⁴ CFU of the ΔrelA ΔspoT mutant developed high anti-Y. pestis serum IgG titers, were completely protected against s.c. challenge with 1.5×10⁵ CFU of virulent Y. pestis and partially protected (60% survival) against pulmonary challenge with 2.0×10⁴ CFU of virulent Y. pestis. Our results indicate that ppGpp represents an important virulence determinant in Y. pestis and that the ΔrelA ΔspoT mutant strain is a promising vaccine candidate to provide protection against plague.
Molecular platforms for targeted drug delivery
The targeted delivery of bioactive molecules to the appropriate site of action, one of the critical focuses of pharmaceutical research, improves therapeutic outcomes and increases safety at the same time; a concept envisaged by Ehrlich over 100 years ago when he described the "magic bullet" model. In the following decades, a considerable amount of research effort combined with enormous investment has carried selective drug targeting into clinical practice via the advent of monoclonal antibodies (mAbs) and antibody-drug conjugate derivatives. Additionally, a deeper understanding of the physiopathological conditions of disease has permitted the tailored design of targeted drug delivery platforms that carry drugs, many copies of the same drug, or different drugs in combination to the appropriate site of action, at least selectively or preferentially. The acquired know-how has provided the field with the design rationale to develop successful delivery systems that will provide new and improved means to treat many intractable diseases and disorders. In this review, we discuss a wide range of molecular platforms for drug delivery, focusing on those with most success in the clinic, given their potential for targeted therapies.
Chondroitin sulfates and their binding molecules in the central nervous system
Chondroitin sulfate (CS) is the most abundant glycosaminoglycan (GAG) in the central nervous system (CNS) matrix. Its sulfation and epimerization patterns give rise to different forms of CS, which enable it to interact specifically, and with significant affinity, with various signalling molecules in the matrix, including growth factors, receptors and guidance molecules. These interactions control numerous biological and pathological processes, during development and in adulthood. In this review, we describe the specific interactions of different families of proteins involved in various physiological and cognitive mechanisms with CSs in the CNS matrix. A better understanding of these interactions could promote the development of inhibitors to treat neurodegenerative diseases.