105 research outputs found
The empirical analysis of non-problematic video gaming and cognitive skills: a systematic review
Videogames have become one of the most popular leisure activities worldwide, encompassing multiple game genres with different characteristics and levels of involvement required. Although a small minority of excessive players suffer detrimental consequences, including impairment of several cognitive skills (e.g., inhibition, decision-making), it has also been demonstrated that playing videogames can improve different cognitive skills. Therefore, the current paper systematically reviewed the empirical studies experimentally investigating the positive impact of videogames on cognitive skills. Following a number of inclusion and exclusion criteria, a total of 32 papers were identified as empirically investigating three specific skills: task-switching (eight studies), attentional control (22 studies), and sub-second time perception (two studies). Results demonstrated that, compared to control groups, non-problematic use of videogames can lead to improved task-switching, more effective top-down attentional control and processing speed, and increased sub-second time perception. Two studies highlighted that the impact of gaming on cognitive skills differs depending upon game genre. The studies reviewed suggest that videogame play can have a positive impact on cognitive processes for players.
Batch effect correction for genome-wide methylation data with Illumina Infinium platform
Background: Genome-wide methylation profiling has led to more comprehensive insights into gene regulation mechanisms and potential therapeutic targets. The Illumina Human Methylation BeadChip is one of the most commonly used genome-wide methylation platforms. Similar to other microarray experiments, methylation data are susceptible to various technical artifacts, particularly batch effects. To date, little attention has been given to issues related to normalization and batch effect correction for this kind of data.

Methods: We evaluated three common normalization approaches and investigated their performance in batch effect removal using three datasets with different degrees of batch effects generated from the HumanMethylation27 platform: quantile normalization at average β value (QNβ); two-step quantile normalization at probe signals implemented in the "lumi" package of R (lumi); and quantile normalization of A and B signals separately (ABnorm). Subsequent Empirical Bayes (EB) batch adjustment was also evaluated.

Results: Each normalization could remove a portion of batch effects, and their effectiveness differed depending on the severity of batch effects in a dataset. For the dataset with minor batch effects (Dataset 1), normalization alone appeared adequate and "lumi" showed the best performance. However, all methods left substantial batch effects intact in the datasets with obvious batch effects, and further correction was necessary. Without any correction, 50 and 66 percent of CpGs were associated with batch effects in Datasets 2 and 3, respectively. After QNβ, lumi, or ABnorm, the percentage of CpGs associated with batch effects was reduced to 24, 32, and 26 percent for Dataset 2, and 37, 46, and 35 percent for Dataset 3, respectively. Additional EB correction effectively removed such remaining non-biological effects. More importantly, the two-step procedure almost tripled the number of CpGs associated with the outcome of interest for the two datasets.

Conclusion: Genome-wide methylation data from the Infinium Methylation BeadChip can be susceptible to batch effects with profound impacts on downstream analyses and conclusions. Normalization can reduce part, but not all, of the batch effects. EB correction along with normalization is recommended for effective batch effect removal.
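All three normalization schemes compared in the abstract build on quantile normalization. As a minimal sketch (not the paper's code, and simplified to the QNβ case of normalizing β values directly, with ties broken by index order), the core operation in Python/NumPy is:

```python
import numpy as np

def quantile_normalize(beta):
    """Quantile-normalize a (CpGs x samples) matrix: every sample is
    forced onto the same reference distribution (the rank-wise mean
    across samples). Ties are broken by index order in this sketch."""
    ranks = beta.argsort(axis=0).argsort(axis=0)    # per-sample ranks
    reference = np.sort(beta, axis=0).mean(axis=1)  # mean of sorted columns
    return reference[ranks]

rng = np.random.default_rng(0)
beta = rng.random((6, 3))          # 6 CpGs x 3 samples of synthetic β values
qn = quantile_normalize(beta)
# after normalization, every sample (column) has identical sorted values
```

As the abstract's results show, this step alone leaves residual batch effects when they are severe, which is why the subsequent Empirical Bayes adjustment step is recommended.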
Lamin A Rod Domain Mutants Target Heterochromatin Protein 1α and β for Proteasomal Degradation by Activation of F-Box Protein, FBXW10
Lamins are major structural proteins of the nucleus and contribute to the organization of various nuclear functions. Mutations in the human lamin A gene cause a number of highly degenerative diseases, collectively termed laminopathies. Cells expressing lamin mutations exhibit abnormal nuclear morphology and altered heterochromatin organization; however, the mechanisms responsible for these defects are not well understood. The lamin A rod domain mutants G232E, Q294P and R386K are either diffusely distributed or form large aggregates in the nucleoplasm, resulting in aberrant nuclear morphology in various cell types. We examined the effects of these lamin mutants on the distribution of heterochromatin protein 1 (HP1) isoforms. HeLa cells expressing these mutants showed a heterogeneous pattern of HP1alpha and beta depletion but without altering HP1gamma levels. Changes in HP1alpha and beta were not observed in cells expressing wild-type lamin A or mutant R482L, which assembled normally at the nuclear rim. Treatment with proteasomal inhibitors led to restoration of levels of HP1 isoforms and also resulted in stable association of lamin mutants with the nuclear periphery, rim localization of the inner nuclear membrane lamin-binding protein emerin, and partial improvement of nuclear morphology. A comparison of the stability of HP1 isoforms indicated that HP1alpha and beta displayed increased turnover and higher basal levels of ubiquitination than HP1gamma. Transcript analysis of components of the ubiquitination pathway showed that a specific F-box protein, FBXW10, was induced several-fold in cells expressing lamin mutants. Importantly, ectopic expression of FBXW10 in HeLa cells led to depletion of HP1alpha and beta without alteration of HP1gamma levels. Mislocalized lamins can induce ubiquitin-mediated proteasomal degradation of certain HP1 isoforms by activation of FBXW10, a member of the F-box family of proteins that is involved in E3 ubiquitin ligase activity
Bivalent-Like Chromatin Markers Are Predictive for Transcription Start Site Distribution in Human
Deep sequencing of 5′ capped transcripts has revealed a variety of transcription initiation patterns, from narrow, focused promoters to wide, broad promoters. Attempts have already been made to model empirically classified patterns, but virtually no quantitative models for transcription initiation have been reported. Even though both genetic and epigenetic elements have been associated with such patterns, the organization of regulatory elements is largely unknown. Here, linear regression models were derived from a pool of regulatory elements, including genomic DNA features, nucleosome organization, and histone modifications, to predict the distribution of transcription start sites (TSS). Importantly, models including both active and repressive histone modification markers, e.g. H3K4me3 and H4K20me1, were consistently found to be much more predictive than models with only single-type histone modification markers, indicating the possibility of “bivalent-like” epigenetic control of transcription initiation. The nucleosome positions are proposed to be coded in the active component of such bivalent-like histone modification markers. Finally, we demonstrated that models trained on one cell type could successfully predict TSS distribution in other cell types, suggesting that these models may have a broader application range
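The abstract's central comparison is between linear regression models built on a single type of histone mark and models combining active and repressive marks. A hedged sketch of that comparison, on synthetic data (the feature values and the target "width" are fabricated for illustration, not the study's data), using only ordinary least squares:

```python
import numpy as np

def r_squared(X, y):
    """Fit ordinary least squares with an intercept and return the
    coefficient of determination (R^2) on the training data."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 200
h3k4me3 = rng.random(n)    # synthetic active-mark signal
h4k20me1 = rng.random(n)   # synthetic repressive-mark signal
# synthetic TSS-distribution "width" driven by both mark types
width = 2.0 * h3k4me3 - 1.5 * h4k20me1 + rng.normal(0.0, 0.1, n)

r2_single = r_squared(h3k4me3.reshape(-1, 1), width)
r2_both = r_squared(np.column_stack([h3k4me3, h4k20me1]), width)
# the two-mark ("bivalent-like") model explains more of the variance
```

Because the synthetic target depends on both marks, the combined model necessarily outperforms the single-mark one; the study's empirical finding is that the same pattern holds for real TSS distributions.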
The ever-expanding conundrum of primary osteoporosis: aetiopathogenesis, diagnosis, and treatment
Lawson criterion for ignition exceeded in an inertial fusion experiment
For more than half a century, researchers around the world have been engaged in attempts to achieve fusion ignition as a proof of principle of various fusion concepts. Following the Lawson criterion, an ignited plasma is one where the fusion heating power is high enough to overcome all the physical processes that cool the fusion plasma, creating a positive thermodynamic feedback loop with rapidly increasing temperature. In inertially confined fusion, ignition is a state where the fusion plasma can begin "burn propagation" into surrounding cold fuel, enabling the possibility of high energy gain. While "scientific breakeven" (i.e., unity target gain) has not yet been achieved (here target gain is 0.72, 1.37 MJ of fusion for 1.92 MJ of laser energy), this Letter reports the first controlled fusion experiment, using laser indirect drive, on the National Ignition Facility to produce capsule gain (here 5.8) and reach ignition by nine different formulations of the Lawson criterion
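The gain figures quoted in the abstract can be checked with simple arithmetic. In the sketch below, `capsule_absorbed_MJ` is an implied quantity (yield divided by the reported capsule gain), not a value stated in the abstract:

```python
# Gain bookkeeping for the reported NIF shot (numbers from the abstract).
laser_energy_MJ = 1.92   # laser energy delivered to the target
fusion_yield_MJ = 1.37   # fusion energy produced
capsule_gain = 5.8       # reported capsule gain

# Target gain: fusion yield over laser energy; < 1 means no "scientific breakeven".
target_gain = fusion_yield_MJ / laser_energy_MJ   # ~0.71 (abstract rounds to 0.72)

# Implied energy absorbed by the capsule, from the capsule-gain definition.
capsule_absorbed_MJ = fusion_yield_MJ / capsule_gain
```

The point of the distinction: the capsule multiplied the energy it absorbed nearly sixfold (ignition), even though the shot as a whole returned less fusion energy than the laser delivered.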
Risk-stratified treatment for drug-susceptible pulmonary tuberculosis
The Phase 3 randomized controlled trial, TBTC Study 31/ACTG A5349 (NCT02410772), demonstrated that a 4-month rifapentine-moxifloxacin regimen for drug-susceptible pulmonary tuberculosis was safe and effective. The primary efficacy outcome was 12-month tuberculosis disease-free survival, while the primary safety outcome was the proportion of grade 3 or higher adverse events during the treatment period. We conducted an analysis of demographic, clinical, microbiologic, radiographic, and pharmacokinetic data and identified risk factors for unfavorable outcomes and adverse events. Among participants receiving the rifapentine-moxifloxacin regimen, low rifapentine exposure was the strongest driver of tuberculosis-related unfavorable outcomes (HR 0.65 for every 100 µg∙h/mL increase, 95% CI 0.54–0.77). The only other risk factors identified were markers of higher baseline disease severity, namely Xpert MTB/RIF cycle threshold and extent of disease on baseline chest radiography (Xpert: HR 1.43 for every 3-cycle-threshold decrease, 95% CI 1.07–1.91; extensive disease: HR 2.02, 95% CI 1.07–3.82). From these risk factors, we developed a simple risk stratification to classify disease phenotypes as easier-, moderately harder-, or harder-to-treat TB. Notably, high rifapentine exposures were not associated with any predefined adverse safety outcomes. Our results suggest that the easier-to-treat subgroup may be eligible for further treatment shortening, while the harder-to-treat subgroup may need higher doses or longer treatment
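The exposure-response hazard ratio above is multiplicative on a proportional-hazards scale, so it compounds across exposure increments. A hedged sketch of that interpretation (the function is illustrative arithmetic, not the study's model, and the stratification thresholds are not reproduced here because the abstract does not state them):

```python
# HR 0.65 per 100 µg·h/mL increase in rifapentine exposure (AUC), per the abstract.
def relative_hazard(delta_auc_ug_h_per_ml, hr_per_100=0.65):
    """Multiplicative change in the hazard of an unfavorable outcome
    for a given increase in rifapentine AUC (proportional-hazards scale)."""
    return hr_per_100 ** (delta_auc_ug_h_per_ml / 100.0)

# A 200 µg·h/mL higher exposure implies roughly 0.65**2 ≈ 0.42x the hazard.
relative_hazard(200)
```

This compounding is why low rifapentine exposure dominates the unfavorable-outcome risk: each additional 100 µg∙h/mL of exposure cuts the hazard by about a third.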
Transimulation - protein biosynthesis web service
Although translation is the key step during gene expression, it remains poorly characterized at the level of individual genes. For this reason, we developed Transimulation - a web service measuring translational activity of genes in three model organisms: Escherichia coli, Saccharomyces cerevisiae and Homo sapiens. The calculations are based on our previous computational model of translation and experimental data sets. Transimulation quantifies mean translation initiation and elongation time (expressed in SI units), and the number of proteins produced per transcript. It also approximates the number of ribosomes that typically occupy a transcript during translation, and simulates their propagation. The simulation of ribosomes' movement is interactive and allows modifying the coding sequence on the fly. It also enables uploading any coding sequence and simulating its translation in one of the three model organisms. In such a case, ribosomes propagate according to the mean codon elongation times of the host organism, which may prove useful for heterologous expression. Transimulation was used to examine the evolutionary conservation of translational parameters of orthologous genes. Transimulation may be accessed at http://nexus.ibb.waw.pl/Transimulation (requires Java version 1.7 or higher). Its manual and source code, distributed under the GPL-2.0 license, are freely available at the website
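The idea of propagating ribosomes according to mean per-codon elongation times can be sketched very simply. The codon times below are hypothetical placeholders (Transimulation derives real, organism-specific values from its translation model and experimental data sets):

```python
# Hypothetical per-codon mean elongation times in seconds - illustrative only.
codon_time = {"ATG": 0.05, "GAA": 0.04, "CTG": 0.03, "TAA": 0.02}
DEFAULT_TIME = 0.05  # assumed fallback for codons not in the table

def mean_elongation_time(cds):
    """Sum of mean per-codon elongation times over a coding sequence."""
    assert len(cds) % 3 == 0, "CDS length must be a multiple of 3"
    codons = (cds[i:i + 3] for i in range(0, len(cds), 3))
    return sum(codon_time.get(c, DEFAULT_TIME) for c in codons)

mean_elongation_time("ATGGAACTGTAA")  # 0.05 + 0.04 + 0.03 + 0.02 = 0.14
```

Swapping in a different organism's codon-time table changes the predicted elongation time for the same sequence, which is the basis of the heterologous-expression use case mentioned above.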
Improvement of the high temperature oxidation of TaC-strengthened Co(Cr)-based cast superalloys by the addition of nickel
