
    Direct Separation of Short Range Order in Intermixed Nanocrystalline and Amorphous Phases

    Diffraction anomalous fine-structure (DAFS) and extended x-ray absorption fine-structure (EXAFS) measurements were combined to determine short range order (SRO) about a single atomic type in a sample of mixed amorphous and nanocrystalline phases of germanium. EXAFS yields information about the SRO of all Ge atoms in the sample, while DAFS determines the SRO of only the ordered fraction. We determine that the first-shell distance distribution is bimodal; the nanocrystalline distance is the same as the bulk crystal, to within 0.01(2) Å, but the mean amorphous Ge-Ge bond length is expanded by 0.076(19) Å. This approach can be applied to many systems of mixed amorphous and nanocrystalline phases.
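    One way to read this "direct separation" is as a two-phase decomposition of the first-shell distance distribution. The relation below is a minimal sketch of that logic, not the paper's notation: x denotes the crystalline fraction, g_nc the nanocrystalline distribution (the part probed by DAFS at a Bragg reflection) and g_am the amorphous one; all three symbols are introduced here for illustration.

        \[
        g_{\mathrm{EXAFS}}(r) \;=\; x\,g_{\mathrm{nc}}(r) + (1 - x)\,g_{\mathrm{am}}(r)
        \quad\Longrightarrow\quad
        g_{\mathrm{am}}(r) \;=\; \frac{g_{\mathrm{EXAFS}}(r) - x\,g_{\mathrm{nc}}(r)}{1 - x}
        \]

    Under this sketch the EXAFS signal averages over both phases while DAFS pins down the crystalline term, so the amorphous contribution, and hence its expanded Ge-Ge bond length, can be isolated by subtraction.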

    ASO Author Reflections: Hematological Biomarkers of Survival in Cutaneous Melanoma


    On ‘Organized Crime’ in the illicit antiquities trade: moving beyond the definitional debate

    The extent to which ‘organized crime’ is involved in illicit antiquities trafficking is unknown and frequently debated. This paper explores the significance and scale of the illicit antiquities trade as a unique transnational criminal phenomenon that is often said to be perpetrated by, and to exhibit traits of, so-called ‘organized crime.’ The definitional debate behind the term ‘organized crime’ is considered as a potential problem impeding our understanding of its existence or extent in illicit antiquities trafficking, and a basic progression-based model is then suggested as a new tool for moving beyond the definitional debate in future research, one that may help to elucidate the actors, processes and criminal dynamics at work within the illicit antiquities trade from source to market. The paper concludes that researchers should focus not on the question of whether organized criminals (particularly in the traditionally conceived, mafia-type stereotypical sense) are involved in the illicit antiquities trade, but instead on the structure and progression of antiquities trafficking itself, which embodies both organized and criminal dynamics.

    Electrophysiological Heterogeneity of Fast-Spiking Interneurons: Chandelier versus Basket Cells

    In the prefrontal cortex, parvalbumin-positive inhibitory neurons play a prominent role in the neural circuitry that subserves working memory, and alterations in these neurons contribute to the pathophysiology of schizophrenia. Two morphologically distinct classes of parvalbumin neurons that target the perisomatic region of pyramidal neurons, chandelier cells (ChCs) and basket cells (BCs), are generally thought to share the same "fast-spiking" phenotype, which is characterized by a short action potential and high-frequency firing without adaptation. However, findings from studies in different species suggest that certain electrophysiological membrane properties might differ between these two cell classes. In this study, we assessed the physiological heterogeneity of fast-spiking interneurons as a function of two factors: species (macaque monkey vs. rat) and morphology (chandelier vs. basket). We showed previously that electrophysiological membrane properties of BCs differ between these two species. Here, for the first time, we report differences in ChC membrane properties between monkey and rat. We also found that a number of membrane properties differentiate ChCs from BCs. Some of these differences were species-independent (e.g., fast and medium afterhyperpolarization, firing frequency, and depolarizing sag), whereas the differences in first spike latency between ChCs and BCs were species-specific. Our findings indicate that different combinations of electrophysiological membrane properties distinguish ChCs from BCs in rodents and primates. Such electrophysiological differences between ChCs and BCs likely contribute to their distinctive roles in cortical circuitry in each species. © 2013 Povysheva et al.

    Using Neural Networks for Relation Extraction from Biomedical Literature

    Using different sources of information to support the automated extraction of relations between biomedical concepts contributes to the development of our understanding of biological systems. The primary comprehensive source of these relations is the biomedical literature. Several relation extraction approaches have been proposed to identify relations between concepts in biomedical literature, notably approaches based on neural network algorithms. The use of multichannel architectures composed of multiple data representations, as in deep neural networks, is leading to state-of-the-art results, and the right combination of data representations can eventually lead to even higher evaluation scores in relation extraction tasks. Here, biomedical ontologies play a fundamental role by providing semantic and ancestry information about an entity, and their incorporation has already been shown to improve on previous state-of-the-art results. Comment: Artificial Neural Networks book (Springer), Chapter 1.
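    The multichannel idea can be made concrete with a small sketch. The model below is illustrative only and is not the chapter's architecture: one channel encodes the sentence tokens, a second encodes ontology-ancestor identifiers for the candidate entity pair, and the two representations are concatenated before classification. All class names, dimensions and the toy inputs are assumptions.

        import torch
        import torch.nn as nn

        class TwoChannelRelationClassifier(nn.Module):
            """Illustrative two-channel model: sentence tokens + ontology ancestors."""

            def __init__(self, vocab_size, onto_size, embed_dim=64, hidden_dim=64, n_relations=2):
                super().__init__()
                # Channel 1: word tokens of the sentence mentioning both entities.
                self.word_embed = nn.Embedding(vocab_size, embed_dim)
                self.word_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
                # Channel 2: ontology-ancestor identifiers of the candidate entity pair.
                self.onto_embed = nn.Embedding(onto_size, embed_dim)
                self.onto_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
                # Classifier over the concatenated channel representations.
                self.classifier = nn.Linear(4 * hidden_dim, n_relations)

            def forward(self, word_ids, ancestor_ids):
                _, (w_h, _) = self.word_lstm(self.word_embed(word_ids))
                _, (o_h, _) = self.onto_lstm(self.onto_embed(ancestor_ids))
                # Final forward/backward hidden states of each channel, concatenated.
                sent = torch.cat([w_h[-2], w_h[-1]], dim=-1)
                onto = torch.cat([o_h[-2], o_h[-1]], dim=-1)
                return self.classifier(torch.cat([sent, onto], dim=-1))

        # Toy usage: a batch of 2 sentences (10 token ids) and 5 ancestor ids per pair.
        model = TwoChannelRelationClassifier(vocab_size=1000, onto_size=500)
        logits = model(torch.randint(0, 1000, (2, 10)), torch.randint(0, 500, (2, 5)))
        print(logits.shape)  # torch.Size([2, 2])

    The point of the sketch is only the wiring: each data representation gets its own encoder (channel), and the ontology channel is where semantic and ancestry information enters the model.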

    Integrity and Its Counterfeits: Implications for Economy, Business and Management

    While the concept of integrity has long been explored by great philosophers and thinkers, its application in modern and postmodern business and economic contexts has been underdeveloped. Little has been done to address the vagueness and paradoxicality of integrity and its shadow reality of counterfeits. The thematic collection that this paper complements, entitled ‘Integrity and Its Counterfeits: Implications for Economy, Business and Management’, contributes towards filling the gap between the abstract concept of integrity and its application to business and the economy, with particular attention to the ambiguous, equivocal and diverse meanings of the concept, the complex and dynamic practicality of integrity, and the grey and dark areas of business that lie outside integrity. This article introduces the background of the research theme and provides exemplary debates and emerging avenues of discussion on this topic.

    Adjusting for multiple prognostic factors in the analysis of randomised trials

    Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) which method of adjustment is best in terms of type I error rate and power, irrespective of the randomisation method. Methods: We used simulation to (1) determine whether a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on the outcome. Results: A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating that these conditions were not met in real datasets. Comparison of different analysis methods found that, with small sample sizes and a binary or time-to-event outcome, most analysis methods led to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios, so the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size.
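    As a concrete illustration of the two analysis choices compared above, the sketch below simulates a small two-factor trial with stratified randomisation and fits (a) a model adjusting for the covariate main effects and (b) a fully stratified model. It uses a continuous outcome and fixed stratum effects purely for simplicity; the sample size, effect sizes and variable names are invented, and this is not the paper's simulation code.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 400
        x1 = rng.integers(0, 2, n)          # prognostic factor 1 (binary)
        x2 = rng.integers(0, 2, n)          # prognostic factor 2 (binary)
        stratum = 2 * x1 + x2               # the 4 strata formed by all combinations

        # Stratified randomisation: 1:1 allocation within each stratum.
        treat = np.zeros(n, dtype=int)
        for s in range(4):
            idx = rng.permutation(np.where(stratum == s)[0])
            treat[idx[: len(idx) // 2]] = 1

        # Outcome with prognostic main effects, an interaction, and a treatment effect.
        y = 0.5 * treat + x1 + x2 + 0.5 * x1 * x2 + rng.normal(size=n)
        df = pd.DataFrame({"y": y, "treat": treat, "x1": x1, "x2": x2, "stratum": stratum})

        # (a) Adjust for the covariate main effects only.
        main = smf.ols("y ~ treat + x1 + x2", data=df).fit()
        # (b) Stratified analysis: adjust for every stratum (fixed effects here; the paper
        #     favours random effects for strata with binary or time-to-event outcomes and
        #     small samples).
        strat = smf.ols("y ~ treat + C(stratum)", data=df).fit()

        print("main effects:", main.params["treat"], main.bse["treat"])
        print("strata:      ", strat.params["treat"], strat.bse["treat"])

    Comparing the two treatment-effect estimates and their standard errors over many replications is, in miniature, the kind of type I error and power comparison the paper performs.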

    Groups whose word problem is a Petri net language

    There has been considerable interest in exploring the connections between the word problem of a finitely generated group as a formal language and the algebraic structure of the group. However, there are few complete characterizations that tell us precisely which groups have their word problem in a specified class of languages. We investigate which finitely generated groups have their word problem equal to a language accepted by a Petri net and give a complete classification, showing that a group has such a word problem if and only if it is virtually abelian.
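    For readers outside group theory, the language in question is the standard one: given a finite generating set Σ of a group G, the word problem is the set of words over the generators and their inverses that represent the identity (the notation below is introduced here, not taken from the paper).

        \[
        W(G,\Sigma) \;=\; \bigl\{\, w \in (\Sigma \cup \Sigma^{-1})^{*} \;:\; w =_{G} 1 \,\bigr\}
        \]

    The classification then says that W(G,Σ) is accepted by some Petri net exactly when G has an abelian subgroup of finite index.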

    Reporting on covariate adjustment in randomised controlled trials before and after revision of the 2001 CONSORT statement: a literature review

    Objectives: To evaluate the use and reporting of adjusted analyses in randomised controlled trials (RCTs) and to compare the quality of reporting before and after the revision of the CONSORT Statement in 2001. Design: Comparison of two cross-sectional samples of published articles. Data sources: Journal articles indexed on PubMed in December 2000 and December 2006. Study selection: Parallel-group RCTs with a full publication, carried out in humans and published in English. Main outcome measures: Proportion of articles reporting adjusted analysis; use of adjusted analysis; the reason for adjustment; the method of adjustment; and the reporting of adjusted analysis results in the main text and abstract. Results: In both cohorts, about 25% of studies reported adjusted analysis (84/355 in 2000 vs 113/422 in 2006). Compared with articles reporting only unadjusted analyses, articles that reported adjusted analyses were more likely to specify primary outcomes, involve multiple centres, perform stratified randomisation, be published in general medical journals, and recruit larger sample sizes. In both years, only a minority of articles (20% to 30%) explained why and how covariates were selected for adjustment. Almost all articles specified the statistical methods used for adjustment (99% in 2000 vs 100% in 2006), but only 5% and 10%, respectively, reported both adjusted and unadjusted results as recommended in the CONSORT guidelines. Conclusion: There was no evidence of change in the reporting of adjusted analysis results five years after the revision of the CONSORT Statement, and only a few articles adhered fully to the CONSORT recommendations.