915 research outputs found
Transcriptional adaptation of Mycobacterium tuberculosis within macrophages: Insights into the phagosomal environment
Little is known about the biochemical environment in phagosomes harboring an infectious agent. To assess the state of this organelle we captured the transcriptional responses of Mycobacterium tuberculosis (MTB) in macrophages from wild-type and nitric oxide (NO) synthase 2-deficient mice before and after immunologic activation. The intraphagosomal transcriptome was compared with the transcriptome of MTB in standard broth culture and during growth in diverse conditions designed to simulate features of the phagosomal environment. Genes expressed differentially as a consequence of intraphagosomal residence included an interferon-γ- and NO-induced response that intensifies an iron-scavenging program, converts the microbe from aerobic to anaerobic respiration, and induces a dormancy regulon. Induction of genes involved in the activation and β-oxidation of fatty acids indicated that fatty acids furnish carbon and energy. Induction of σE-dependent, sodium dodecyl sulfate-regulated genes and genes involved in mycolic acid modification pointed to damage and repair of the cell envelope. Sentinel genes within the intraphagosomal transcriptome were induced similarly by MTB in the lungs of mice. The microbial transcriptome thus served as a bioprobe of the MTB phagosomal environment.
Necessary and sufficient conditions of solution uniqueness in $\ell_1$ minimization
This paper shows that the solutions to various convex $\ell_1$ minimization problems are \emph{unique} if and only if a common set of conditions is satisfied. This result applies broadly to the basis pursuit model, basis pursuit denoising model, Lasso model, as well as other models that either minimize $f(Ax-b)$ or impose the constraint $f(Ax-b)\le\sigma$, where $f$ is a strictly convex function. For these models, this paper proves that, given a solution $x^*$ and defining $I=\supp(x^*)$ and $s=\sign(x^*_I)$, $x^*$ is the unique solution if and only if $A_I$ has full column rank and there exists $y$ such that $A_I^T y=s$ and $|a_i^T y|<1$ for $i\notin I$. This condition is previously known to be sufficient for the basis pursuit model to have a unique solution supported on $I$. Indeed, it is also necessary, and applies to a variety of other models. The paper also discusses ways to recognize unique solutions and verify the uniqueness conditions numerically.
Comment: 6 pages; revised version; submitted
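The uniqueness certificate described above is straightforward to check numerically for the basis pursuit model. A minimal NumPy sketch, where the helper name and the toy matrix are our own illustration; only one candidate dual vector y (the least-squares one) is tried, so a False result does not by itself disprove uniqueness:

```python
import numpy as np

def check_uniqueness(A, x, tol=1e-9):
    """Check the certificate: A_I has full column rank and some y
    satisfies A_I^T y = sign(x_I) with |a_j^T y| < 1 off the support.
    Only the least-squares candidate y is tested (a sufficient check)."""
    I = np.flatnonzero(np.abs(x) > tol)          # support of x
    AI = A[:, I]
    if np.linalg.matrix_rank(AI) < len(I):       # full column rank?
        return False
    s = np.sign(x[I])
    y, *_ = np.linalg.lstsq(AI.T, s, rcond=None)
    if np.linalg.norm(AI.T @ y - s) > 1e-8:      # is A_I^T y = s feasible?
        return False
    off = np.setdiff1d(np.arange(A.shape[1]), I)
    return bool(off.size == 0 or
                np.max(np.abs(A[:, off].T @ y)) < 1 - tol)

A = np.array([[1.0, 0.0, 0.1],
              [0.0, 1.0, 0.1]])
x_star = np.array([1.0, -1.0, 0.0])
print(check_uniqueness(A, x_star))  # True: certificate holds
```

Here the support columns of A are the 2×2 identity, so y = (1, -1) satisfies the dual equation and the off-support correlation is 0 < 1.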
Phylogenetic and environmental context of a Tournaisian tetrapod fauna
The end-Devonian to mid-Mississippian time interval has long been known for its depauperate palaeontological record, especially for tetrapods. This interval encapsulates the time of increasing terrestriality among tetrapods, but only two Tournaisian localities previously produced tetrapod fossils. Here we describe five new Tournaisian tetrapods from two localities in their environmental context. A phylogenetic analysis retrieved three taxa as stem tetrapods, interspersed among Devonian and Carboniferous forms, and two as stem amphibians, suggesting a deep split among crown tetrapods. We also illustrate new tetrapod specimens from these and additional localities in the Scottish Borders region. The new taxa and specimens suggest that tetrapod diversification was well established by the Tournaisian. Sedimentary evidence indicates that the tetrapod fossils are usually associated with sandy siltstones overlying wetland palaeosols. Tetrapods were probably living on vegetated surfaces that were subsequently flooded. We show that atmospheric oxygen levels were stable across the Devonian/Carboniferous boundary, and did not inhibit the evolution of terrestriality. This wealth of tetrapods from Tournaisian localities highlights the potential for discoveries elsewhere. NERC consortium grants NE/J022713/1 (Cambridge), NE/J020729/1 (Leicester), NE/J021067/1 (BGS), NE/J020621/1 (NMS) and NE/J021091/1 (Southampton)
Precisely timed oculomotor and parietal EEG activity in perceptual switching
Blinks and saccades cause transient interruptions of visual input. To investigate how such effects influence our perceptual state, we analyzed the time courses of blink and saccade rates in relation to perceptual switching in the Necker cube. Both time courses of blink and saccade rates showed peaks at different moments along the switching process. A peak in blinking rate appeared 1,000 ms prior to the switching responses. Blinks occurring around this peak were associated with subsequent switching to the preferred interpretation of the Necker cube. Saccade rates showed a peak 150 ms prior to the switching response. The direction of saccades around this peak was predictive of the perceived orientation of the Necker cube afterwards. Peak blinks were followed and peak saccades were preceded by transient parietal theta band activity indicating the changing of the perceptual interpretation. Precisely timed blinks, therefore, can initiate perceptual switching, and precisely timed saccades can facilitate an ongoing change of interpretation.
Combining Shapley value and statistics to the analysis of gene expression data in children exposed to air pollution
<p>Abstract</p> <p>Background</p> <p>In gene expression analysis, statistical tests for differential gene expression provide lists of candidate genes having, individually, a sufficiently low <it>p</it>-value. However, the interpretation of each single <it>p</it>-value within complex systems involving several interacting genes is problematic. In parallel, in the last sixty years, <it>game theory </it>has been applied to political and social problems to assess the power of interacting agents in forcing a decision and, more recently, to represent the relevance of genes in response to certain conditions.</p> <p>Results</p> <p>In this paper we introduce a Bootstrap procedure to test the null hypothesis that each gene has the same relevance between two conditions, where the relevance is represented by the Shapley value of a particular coalitional game defined on a microarray data-set. This method, which is called <it>Comparative Analysis of Shapley value </it>(shortly, CASh), is applied to data concerning the gene expression in children differentially exposed to air pollution. The results provided by CASh are compared with the results from a parametric statistical test for testing differential gene expression. Both lists of genes provided by CASh and t-test are informative enough to discriminate exposed subjects on the basis of their gene expression profiles. While many genes are selected in common by CASh and the parametric test, it turns out that the biological interpretation of the differences between these two selections is more interesting, suggesting a different interpretation of the main biological pathways in gene expression regulation for exposed individuals. 
A simulation study suggests that CASh offers more power than the t-test for the detection of differential gene expression variability.</p> <p>Conclusion</p> <p>CASh is successfully applied to gene expression analysis of a data-set where the joint expression behavior of genes may be critical to characterize the expression response to air pollution. We demonstrate a synergistic effect between coalitional games and statistics that resulted in a selection of genes with a potential impact in the regulation of complex pathways.</p>
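The coalitional-game ingredient can be made concrete: the Shapley value of a player is its marginal contribution averaged over all orderings of the players. The sketch below is a generic toy illustration; the three-gene game is invented and is not the microarray game that CASh actually defines:

```python
from itertools import permutations

def shapley_values(players, v):
    """Exact Shapley value via all orderings; feasible only for small
    player sets, but enough to illustrate the definition."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: total / len(perms) for p, total in phi.items()}

# Toy relevance game: a coalition has value 1 only if it contains
# gene 'a' together with at least one of 'b' or 'c'.
v = lambda S: 1.0 if 'a' in S and S & {'b', 'c'} else 0.0
print(shapley_values(['a', 'b', 'c'], v))  # 'a': 2/3, 'b' and 'c': 1/6 each
```

The values sum to the grand-coalition value 1, and 'a' receives the largest share because it is pivotal in every winning coalition.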
Evaluation of Jackknife and Bootstrap for Defining Confidence Intervals for Pairwise Agreement Measures
Several research fields frequently deal with the analysis of diverse classification results of the same entities. This should imply an objective detection of overlaps and divergences between the formed clusters. The congruence between classifications can be quantified by clustering agreement measures, including pairwise agreement measures. Several measures have been proposed, and the importance of obtaining confidence intervals for the point estimate in the comparison of these measures has been highlighted. A broad range of methods can be used for the estimation of confidence intervals. However, evidence is lacking about which methods are appropriate for calculating confidence intervals for most clustering agreement measures. Here we evaluate the bootstrap and jackknife resampling techniques for the calculation of confidence intervals for clustering agreement measures. Contrary to what has been shown for some statistics, simulations showed that the jackknife performs better than the bootstrap at accurately estimating confidence intervals for pairwise agreement measures, especially when the agreement between partitions is low. The coverage of the jackknife confidence interval is robust to changes in cluster number and cluster size distribution.
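For a pairwise agreement measure, the leave-one-out jackknife evaluated here amounts to recomputing the measure n times with one entity removed. A minimal sketch using the (unadjusted) Rand index and a normal-approximation interval, both of which are our own illustrative choices rather than the study's exact setup:

```python
import numpy as np

def rand_index(a, b):
    """Fraction of entity pairs on which two partitions agree
    (placed together in both, or apart in both)."""
    a, b = np.asarray(a), np.asarray(b)
    n = len(a)
    agree = sum((a[i] == a[j]) == (b[i] == b[j])
                for i in range(n) for j in range(i + 1, n))
    return agree / (n * (n - 1) / 2)

def jackknife_ci(a, b, z=1.96):
    """Leave-one-out jackknife standard error and ~95% normal CI
    for the Rand index of partitions a and b."""
    a, b = np.asarray(a), np.asarray(b)
    n = len(a)
    stat = rand_index(a, b)
    loo = np.array([rand_index(np.delete(a, i), np.delete(b, i))
                    for i in range(n)])
    se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    return stat, (stat - z * se, stat + z * se)

a = [0, 0, 1, 1, 2]
b = [1, 1, 0, 0, 2]        # same partition under a relabeling
print(jackknife_ci(a, b))  # point estimate 1.0, degenerate CI
```

Because the Rand index depends only on pair co-membership, relabeled but identical partitions score 1.0 and every leave-one-out replicate agrees, giving a zero-width interval.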
Reconstructing ‘the Alcoholic’: Recovering from Alcohol Addiction and the Stigma this Entails
Public perception of alcohol addiction is frequently negative, whilst an important part of recovery is the construction of a positive sense of self. In order to explore how this might be achieved, we investigated how those who self-identify as in recovery from alcohol problems view themselves and their difficulties with alcohol and how they make sense of others’ responses to their addiction. Semi-structured interviews with six individuals who had been in recovery between 5 and 35 years and in contact with Alcoholics Anonymous were analysed using Interpretative Phenomenological Analysis. The participants were acutely aware of stigmatising images of ‘alcoholics’ and described having struggled with a considerable dilemma in accepting this identity themselves. However, to some extent they were able to resist stigma by conceiving of an ‘aware alcoholic self’ which was divorced from their previously unaware self and formed the basis for a new, more knowing and valued identity.
A proposed method to investigate reliability throughout a questionnaire
<p>Abstract</p> <p>Background</p> <p>Questionnaires are used extensively in medical and health care research and depend on validity and reliability. However, participants may differ in interest and awareness throughout long questionnaires, which can affect the reliability of their answers. A method is proposed for "screening" for systematic change in random error, which could assess changed reliability of answers.</p> <p>Methods</p> <p>A simulation study was conducted to explore whether systematic change in reliability, expressed as changed random error, could be assessed using unsupervised classification of subjects by cluster analysis (CA) and estimation of the intraclass correlation coefficient (ICC). The method was also applied to a clinical dataset from 753 cardiac patients using the Jalowiec Coping Scale.</p> <p>Results</p> <p>The simulation study showed a relationship between the systematic change in random error throughout a questionnaire and the slope between the estimated ICC for subjects classified by CA and successive items in the questionnaire. This slope was proposed as an awareness measure for assessing whether respondents provide only a random answer or one based on substantial cognitive effort. Scales from different factor structures of the Jalowiec Coping Scale had different effects on this awareness measure.</p> <p>Conclusions</p> <p>Even though the assumptions in the simulation study might be limited compared to real datasets, the approach is promising for assessing systematic change in reliability throughout long questionnaires. Results from a clinical dataset indicated that the awareness measure differed between scales.</p>
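The proposed measure, the slope of per-item ICC estimates for CA-derived subject groups across item position, can be simulated directly. In the sketch below the cluster labels are known rather than estimated by CA, and all parameters (subject count, item count, error growth) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def icc1(scores, groups):
    """One-way random-effects ICC(1): between-group vs within-group
    variance of one item's scores for grouped subjects."""
    labels = np.unique(groups)
    k = len(labels)
    n = np.array([(groups == g).sum() for g in labels])
    means = np.array([scores[groups == g].mean() for g in labels])
    ssb = np.sum(n * (means - scores.mean()) ** 2)
    ssw = sum(((scores[groups == g] - m) ** 2).sum()
              for g, m in zip(labels, means))
    msb = ssb / (k - 1)
    msw = ssw / (len(scores) - k)
    n0 = n.mean()                       # balanced-design approximation
    return (msb - msw) / (msb + (n0 - 1) * msw)

# 200 subjects in 2 latent clusters answer 20 items; random error
# grows over the questionnaire, so reliability (ICC) should decay.
n_sub, n_items = 200, 20
groups = rng.integers(0, 2, n_sub)
cluster_mean = np.where(groups == 0, -1.0, 1.0)
iccs = []
for j in range(n_items):
    noise_sd = 1.0 + 0.3 * j            # error increases with item index
    scores = cluster_mean + rng.normal(0, noise_sd, n_sub)
    iccs.append(icc1(scores, groups))
slope = np.polyfit(np.arange(n_items), iccs, 1)[0]
print(round(slope, 3))                  # negative: ICC declines across items
```

A clearly negative slope recovers the injected decline in answer reliability, which is the pattern the awareness measure is designed to flag.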
Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression
<p>Abstract</p> <p>Background</p> <p>When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling, representing a parametric approach. The SL technique consisted of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison.</p> <p>Results</p> <p>The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR.</p> <p>Conclusions</p> <p>The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.</p>
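The separation-boundary contrast is easy to reproduce on toy data: a perceptron trained on an explicit RBF kernel map learns a nonlinear exposure interaction that a linear boundary (such as logistic regression's) cannot. Everything below, the XOR-style data, the RBF map, and the plain perceptron standing in for LR, is our own illustrative sketch, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR-style case-control data: disease status depends on the
# interaction of two exposures, not on either one linearly.
X = rng.uniform(-1, 1, (400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

def rbf_features(X, centers, gamma=2.0):
    """Explicit RBF kernel map onto a fixed set of centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_perceptron(F, y, epochs=20):
    """Plain perceptron updates on feature matrix F."""
    w, b = np.zeros(F.shape[1]), 0.0
    t = 2 * y - 1                      # labels in {-1, +1}
    for _ in range(epochs):
        for f, ti in zip(F, t):
            if ti * (f @ w + b) <= 0:  # misclassified -> update
                w += ti * f
                b += ti
    return w, b

centers = X[rng.choice(len(X), 50, replace=False)]
Phi = rbf_features(X, centers)

w_k, b_k = train_perceptron(Phi, y)
acc_kernel = ((Phi @ w_k + b_k > 0).astype(int) == y).mean()

w_l, b_l = train_perceptron(X, y)      # linear boundary, like LR's
acc_linear = ((X @ w_l + b_l > 0).astype(int) == y).mean()
print(acc_linear, acc_kernel)          # kernel map captures the interaction
```

The kernel-mapped perceptron separates the four quadrants while the linear boundary stays near chance, which is the qualitative gap between the SL technique and LR that the study exploits.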