Impact of imperfect test sensitivity on determining risk factors: the case of bovine tuberculosis
Background
Imperfect diagnostic testing reduces the power to detect significant predictors in classical cross-sectional studies. Assuming that the misclassification in diagnosis is random, this can be dealt with by increasing the sample size of a study. However, the effects of imperfect tests in longitudinal data analyses are not as straightforward to anticipate, especially if the outcome of the test influences behaviour. The aim of this paper is to investigate the impact of imperfect test sensitivity on the determination of predictor variables in a longitudinal study.
Methodology/Principal Findings
To deal with imperfect test sensitivity affecting the response variable, we transformed the observed response variable into a set of possible temporal patterns of true disease status, whose prior probability was a function of the test sensitivity. We fitted a Bayesian discrete time survival model using an MCMC algorithm that treats the true response patterns as unknown parameters in the model. We applied our approach to epidemiological data of bovine tuberculosis outbreaks in England and investigated the effect of reduced test sensitivity on the determination of risk factors for the disease. We found that reduced test sensitivity changed which risk factors for an outbreak were selected in the ‘best’ model, and increased the uncertainty surrounding the parameter estimates when the model used a fixed set of risk factors associated with the response variable.
Conclusions/Significance
We propose a novel algorithm to fit discrete survival models for longitudinal data where values of the response variable are uncertain. When analysing longitudinal data, uncertainty surrounding the response variable will affect the significance of the predictors and should therefore be accounted for, either at the design stage by increasing the sample size or at the post-analysis stage by conducting appropriate sensitivity analyses.
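The latent-pattern construction described in the abstract can be sketched in a few lines. The toy code below is not the authors' implementation; it enumerates the true-status patterns compatible with an observed test sequence and weights each by a prior that depends only on the test sensitivity, assuming perfect specificity and a monotone (once-positive-stays-positive) infection course. In an MCMC, weights of this kind would serve as the prior over the latent true response.

```python
# Sketch of the latent-response idea: map an observed 0/1 test sequence to the
# set of compatible true-status patterns, weighting each by a prior that
# depends on the test sensitivity. Assumes perfect specificity and that a herd
# stays positive once truly infected (illustrative assumptions only).
from itertools import product

def candidate_true_patterns(observed, sensitivity):
    """Return [(true_pattern, prior_weight)] for an observed 0/1 test sequence."""
    candidates = []
    for true in product([0, 1], repeat=len(observed)):
        # monotone infection: once truly positive, stays positive
        if any(a > b for a, b in zip(true, true[1:])):
            continue
        weight = 1.0
        for obs, tru in zip(observed, true):
            if tru == 1:
                weight *= sensitivity if obs == 1 else (1.0 - sensitivity)
            elif obs == 1:          # test-positive but truly negative
                weight = 0.0        # ruled out under perfect specificity
                break
        if weight > 0.0:
            candidates.append((true, weight))
    total = sum(w for _, w in candidates)
    return [(t, w / total) for t, w in candidates]

# Observed: negative, negative, positive; an 80%-sensitive test
for pattern, prior in candidate_true_patterns((0, 0, 1), 0.8):
    print(pattern, round(prior, 3))
```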
Granitic Boulder Erosion Caused by Chaparral Wildfire: Implications for Cosmogenic Radionuclide Dating of Bedrock Surfaces
Rock surface erosion by wildfire is significant and widespread but has not been quantified in southern California or for chaparral ecosystems. Quantifying the surface erosion of bedrock outcrops and boulders is critical for determination of age using cosmogenic radionuclide techniques, as even modest surface erosion removes the accumulation of the cosmogenic radionuclides and causes a significant underestimate of age. This study documents the effects of the 2006 Esperanza Fire in southern California on three large granitic boulders. Spalled rock fragments were quantified by measuring the rock volume removed from each boulder. Between 7% and 55% of the total surface area of the boulders spalled in this single fire. The volume of spalled material, when normalized across the entire surface area, represents a mean surface lowering of 0.7–12.3 mm. Spalled material was thicker on the flanks of the boulders, and the height of the fire effects significantly exceeded the height of the vegetation prior to the wildfire. Surface erosion of boulders and bedrock outcrops as a result of wildfire spalling produces fresh surfaces that appear unaffected by chemical weathering. Such surfaces may be preferentially selected by researchers for cosmogenic surface dating because of their fresh appearance, leading to an underestimate of age.
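As a quick illustration of the normalization used above, the mean surface lowering is simply the spalled volume spread evenly over the boulder's entire surface area. The numbers below are made up for illustration and are not measurements from the study.

```python
# Back-of-the-envelope check of the "mean surface lowering" figure: spalled
# volume divided by total surface area. Illustrative values only.
spalled_volume_cm3 = 9_800          # total volume of spalled fragments (hypothetical)
surface_area_cm2 = 80_000           # total boulder surface area (hypothetical)

mean_lowering_mm = spalled_volume_cm3 / surface_area_cm2 * 10  # cm -> mm
print(f"mean surface lowering ≈ {mean_lowering_mm:.2f} mm")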
Notch Signaling Inhibited by IKAROS1 in Human T-Cell Acute Lymphoblastic Leukemia
The highly conserved Notch signaling pathway regulates cell growth, differentiation, survival and apoptosis. In hematopoiesis, Notch signaling drives commitment to the T-cell fate and promotes differentiation, T-cell receptor signaling and immune function. In T-cells, Notch signaling is oncogenic when constitutively active, promoting proliferation and survival while inhibiting differentiation. The majority of patients with T-cell acute lymphoblastic leukemia (T-ALL) have activating Notch mutations. Similarly, Notch activation is present in more than 75% of murine T-ALL models. In murine T-ALL, loss of Ikaros, a zinc finger transcriptional regulator, leads to disrupted differentiation and also cooperates with activated Notch signaling to promote T-ALL. Loss of Ikaros function most frequently results from alternative RNA splicing, which leads to expression of non-DNA-binding isoforms. In murine T-ALL, Notch and Ikaros are reciprocally regulated. Specifically, constitutive Notch3 activation promotes expression of non-DNA-binding Ikaros isoforms in vivo, leading to loss of Ikaros function. Additionally, Ikaros regulates Notch by competing for binding sites in the promoter regions of Notch target genes, Hes1 and pTα.
To determine whether this reciprocal regulation of Notch and Ikaros occurs in human T-ALL, we expressed active intracellular Notch receptors in human T-ALL cell lines and showed that Notch activation does not significantly increase the expression of non-DNA-binding Ikaros isoforms. While Notch activation has little effect on the pattern of Ikaros isoform expression in human T-ALL cell lines, Ikaros exerts a repressive effect on Notch signaling, similar to the Notch repression described in murine T-ALLs. Specifically, we demonstrated that Ikaros overexpression inhibits growth of human T-ALLs and that Ikaros downregulates expression of the Notch target genes Hes1 and Hes5 at the transcriptional level. Interestingly, Ikaros inhibits growth more effectively in the cell lines with stronger Notch activation. In addition, Ikaros was shown to inhibit the Notch target gene Hes1 in both Notch-dependent and Notch-independent manners. These data support the hypothesis that disruption of Ikaros contributes to proliferation of human T-ALL through de-repression of Notch signaling. In summary, we have demonstrated that Ikaros represses Notch signaling in human T-ALL.
Saving the World for All the Wrong Reasons: Extrinsic Motivation Reduces Favorability of Prosocial Acts
When observing the prosocial acts of others, people tend to be very concerned with the reasons for the act. A charitable donation motivated by concern for the charitable cause is seen as noble, while the same donation motivated by image enhancement is seen as disingenuous. In a series of six studies, participants consistently evaluated extrinsically motivated prosocial acts to be subjectively smaller and less impactful than identical but intrinsically motivated acts, and evaluated extrinsically motivated actors less favorably than intrinsically motivated actors. These effects were robust across different prosocial domains and across different types of acts, including the donation of money and time and conservation behaviors. These results demonstrate that motivation information causes people to violate strict adherence to principles of fungibility, using contextual information to evaluate equal fungible units differently. Two further studies establish that people will adjust their choices of products and resource allocation to punish extrinsically motivated actors and reward intrinsically motivated actors. The authors discuss these findings relative to formal principles of rationality, and propose an explanation of contextualized rationality. The implications of these findings for policy-making and implementation are discussed.
Strategy for efficient generation of numerous full-length cDNA clones of classical swine fever virus for haplotyping
Background
Direct molecular cloning of full-length cDNAs derived from viral RNA is an approach to identify the individual viral genomes within a virus population. This enables characterization of distinct viral haplotypes present during infection.
Results
In this study, we recover individual genomes of classical swine fever virus (CSFV) present in a pig infected with vKos, which was rescued from a cDNA clone corresponding to the highly virulent CSFV Koslov strain. Full-length cDNA amplicons (ca. 12.3 kb) were made by long RT-PCR, using RNA extracted from serum, and inserted directly into a cloning vector prior to detailed characterization of the individual viral genome sequences. The amplicons used for cloning were deep sequenced, which revealed low-level sequence variation (< 5%) scattered across the genome, consistent with the clone-derived origin of vKos. Numerous full-length cDNA clones were generated using these amplicons, and full-genome sequencing of individual cDNA clones revealed insights into the virus diversity and the haplotypes present during infection. Most cDNA clones were unique, containing several single-nucleotide polymorphisms, and phylogenetic reconstruction revealed a low degree of order.
Conclusions
This optimized methodology enables highly efficient construction of full-length cDNA clones corresponding to individual viral genomes present within RNA virus populations.
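The haplotype-grouping step can be illustrated with a short sketch: call single-nucleotide differences in each clone against the parental reference and pool clones with identical SNP profiles. This is not the authors' pipeline; the sequences below are toy examples and are assumed to be pre-aligned to the reference and of equal length.

```python
# Sketch of the haplotyping step: call SNPs in each full-length cDNA clone
# against the parental (vKos-like) reference and group clones with identical
# SNP profiles into haplotypes. Assumes aligned, equal-length toy sequences.
from collections import defaultdict

def snp_profile(reference, clone):
    """Return the set of (position, ref_base, clone_base) differences."""
    return frozenset(
        (i, r, c) for i, (r, c) in enumerate(zip(reference, clone)) if r != c
    )

def group_haplotypes(reference, clones):
    """Map each distinct SNP profile to the clone IDs that share it."""
    haplotypes = defaultdict(list)
    for clone_id, seq in clones.items():
        haplotypes[snp_profile(reference, seq)].append(clone_id)
    return haplotypes

reference = "ACGTACGTACGT"
clones = {"clone01": "ACGTACGTACGT",   # identical to reference
          "clone02": "ACGAACGTACGT",   # one SNP
          "clone03": "ACGAACGTACGT"}   # same SNP -> same haplotype as clone02

for profile, members in group_haplotypes(reference, clones).items():
    print(sorted(profile), members)
```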
Artificial intelligence approaches to predicting and detecting cognitive decline in older adults: A conceptual review.
Preserving cognition and mental capacity is critical to aging with autonomy. Early detection of pathological cognitive decline facilitates the greatest impact of restorative or preventative treatments. Artificial Intelligence (AI) in healthcare is the use of computational algorithms that mimic human cognitive functions to analyze complex medical data. AI technologies like machine learning (ML) support the integration of biological, psychological, and social factors when approaching diagnosis, prognosis, and treatment of disease. This paper serves to acquaint clinicians and other stakeholders with the use, benefits, and limitations of AI for predicting, diagnosing, and classifying mild and major neurocognitive impairments by providing a conceptual overview of this topic, with emphasis on the features explored and AI techniques employed. We present studies that fell into six categories of features used for these purposes: (1) sociodemographics; (2) clinical and psychometric assessments; (3) neuroimaging and neurophysiology; (4) electronic health records and claims; (5) novel assessments (e.g., sensors for digital data); and (6) genomics/other omics. For each category we provide examples of AI approaches, including supervised and unsupervised ML, deep learning, and natural language processing. AI technology, still nascent in healthcare, has great potential to transform the way we diagnose and treat patients with neurocognitive disorders.
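As a concrete, purely illustrative example of the supervised-ML category described above, the sketch below fits a logistic-regression classifier to synthetic stand-ins for sociodemographic and psychometric features. It assumes scikit-learn is available and is in no way a clinically validated model.

```python
# Toy supervised-ML illustration: predict a binary cognitive-status label from
# tabular features. All data are synthetic stand-ins, not patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(72, 8, n),      # age (years)
    rng.normal(27, 3, n),      # cognitive screening score (hypothetical scale)
    rng.integers(0, 2, n),     # binary risk marker
])
# synthetic label loosely driven by age and screening score
logits = 0.08 * (X[:, 0] - 72) - 0.4 * (X[:, 1] - 27) + 0.5 * X[:, 2] - 0.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```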
Effectiveness of Biological Surrogates for Predicting Patterns of Marine Biodiversity: A Global Meta-Analysis
The use of biological surrogates as proxies for biodiversity patterns is gaining popularity, particularly in marine systems where field surveys can be expensive and species richness high. Yet, uncertainty regarding their applicability remains because of inconsistency of definitions, a lack of standard methods for estimating effectiveness, and variable spatial scales considered. We present a Bayesian meta-analysis of the effectiveness of biological surrogates in marine ecosystems. Surrogate effectiveness was defined both as the proportion of surrogacy tests where predictions based on surrogates were better than random (i.e., low probability of making a Type I error; P) and as the predictability of targets using surrogates (R²). A total of 264 published surrogacy tests combined with prior probabilities elicited from eight international experts demonstrated that the habitat, spatial scale, type of surrogate and statistical method used all influenced surrogate effectiveness, at least according to either P or R². The type of surrogate used (higher-taxa, cross-taxa or subset taxa) was the best predictor of P, with the higher-taxa surrogates outperforming all others. The marine habitat was the best predictor of R², with particularly low predictability in tropical reefs. Surrogate effectiveness was greatest for higher-taxa surrogates at a <10-km spatial scale, in low-complexity marine habitats such as soft bottoms, and using multivariate-based methods. Comparisons with terrestrial studies in terms of the methods used to study surrogates revealed that marine applications still ignore some problems with several widely used statistical approaches to surrogacy. Our study provides a benchmark for the reliable use of biological surrogates in marine ecosystems, and highlights directions for future development of biological surrogates in predicting biodiversity.
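The first effectiveness measure, P, can be illustrated with a minimal conjugate sketch: an expert-elicited Beta prior updated by the number of surrogacy tests that beat random. The counts (beyond the 264 total tests quoted in the abstract) and prior parameters below are hypothetical, and the actual analysis was a far richer Bayesian meta-analysis.

```python
# Minimal conjugate sketch of the P measure: combine an expert-elicited Beta
# prior with the observed count of surrogacy tests that beat random.
# Prior parameters and the success count are illustrative, not the paper's data.
from scipy import stats

prior_a, prior_b = 4, 2          # expert prior leaning towards effective surrogates
tests_total = 264                # surrogacy tests in the meta-analysis
tests_better_than_random = 180   # hypothetical count

post = stats.beta(prior_a + tests_better_than_random,
                  prior_b + tests_total - tests_better_than_random)
lo, hi = post.ppf([0.025, 0.975])
print(f"posterior mean P = {post.mean():.2f}, 95% CrI = ({lo:.2f}, {hi:.2f})")
```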
Rubisco is not really so bad
Ribulose-1,5-bisphosphate carboxylase/oxygenase (Rubisco) is the most widespread carboxylating enzyme in autotrophic organisms. Its kinetic and structural properties have been intensively studied for more than half a century. Yet important aspects of the catalytic mechanism remain poorly understood, especially the oxygenase reaction. Because of its relatively modest turnover rate (a few catalytic events per second) and the competitive inhibition by oxygen, Rubisco is often viewed as an inefficient catalyst for CO2 fixation. Considerable efforts have been devoted to improving its catalytic efficiency, so far without success. In this review, we re-examine Rubisco's catalytic performance by comparison with other chemically related enzymes. We find that Rubisco is not especially slow. Furthermore, considering both the nature and the complexity of the chemical reaction, its kinetic properties are unremarkable. Although not unique to Rubisco, oxygenation is not systematically observed in enolate- and enamine-forming enzymes and cannot be considered as an inevitable consequence of the mechanism. It is more likely the result of a compromise between chemical and metabolic imperatives. We argue that a better description of the Rubisco mechanism is still required to better understand the link between CO2 and O2 reactivity and the rationale of Rubisco diversification and evolution.
C. B. and G. D. F. acknowledge funding by the Australian Government through the Australian Research Council Centre of Excellence for Translational Photosynthesis (Project CE140100015), and G. T. thanks the Australian Research Council for its support via a Fellowship under contract FT140100645.
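The CO2/O2 competition referred to above can be illustrated by treating carboxylation and oxygenation as competing Michaelis-Menten reactions at the same active site. The parameter values below are typical textbook magnitudes for a plant Rubisco, not figures taken from this review.

```python
# Rough illustration of Rubisco's CO2/O2 competition: carboxylation and
# oxygenation as competing Michaelis-Menten reactions at one active site.
# Parameter values are typical orders of magnitude, not data from the review.
kcat_c, Kc = 3.5, 10.0      # carboxylation: turnover (1/s), Km for CO2 (uM)
kcat_o, Ko = 2.0, 500.0     # oxygenation:  turnover (1/s), Km for O2 (uM)
CO2, O2 = 8.0, 250.0        # rough chloroplast stroma concentrations (uM)

v_c = kcat_c * CO2 / (Kc * (1 + O2 / Ko) + CO2)   # carboxylations per site per s
v_o = kcat_o * O2 / (Ko * (1 + CO2 / Kc) + O2)    # oxygenations per site per s
specificity = (kcat_c / Kc) / (kcat_o / Ko)        # CO2/O2 specificity factor

print(f"carboxylations per site per second: {v_c:.2f}")
print(f"oxygenations per site per second:  {v_o:.2f}")
print(f"specificity factor Sc/o: {specificity:.0f}")
```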