A Factorization Law for Entanglement Decay
We present a simple and general factorization law for quantum systems shared
by two parties, which describes the time evolution of entanglement upon passage
of either component through an arbitrary noisy channel. The robustness of
entanglement-based quantum information processing protocols is thus easily and
fully characterized by a single quantity.
Comment: 4 pages, 5 figures
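As an illustrative aside (not part of the abstract above): assuming the factorization law takes the widely cited one-sided, two-qubit form C[(1 x $)|chi><chi|] = C[(1 x $)|Phi+><Phi+|] * C(|chi><chi|), with C the Wootters concurrence and $ the noisy channel, the minimal Python sketch below checks it numerically for an amplitude-damping channel; the channel, the random input state, and all parameters are placeholders chosen here, not taken from the paper.

```python
import numpy as np

SY = np.array([[0, -1j], [1j, 0]])
YY = np.kron(SY, SY)

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix."""
    rho_tilde = YY @ rho.conj() @ YY
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde))))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

def one_sided(kraus, rho):
    """Apply a single-qubit channel (given by Kraus operators) to the second qubit."""
    return sum(np.kron(np.eye(2), K) @ rho @ np.kron(np.eye(2), K).conj().T
               for K in kraus)

# Amplitude-damping channel with damping probability gamma (illustrative choice).
gamma = 0.3
kraus = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
         np.array([[0, np.sqrt(gamma)], [0, 0]])]

# Random pure two-qubit state |chi> and the maximally entangled |Phi+>.
rng = np.random.default_rng(0)
chi = rng.normal(size=4) + 1j * rng.normal(size=4)
chi /= np.linalg.norm(chi)
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

rho_chi = np.outer(chi, chi.conj())
rho_phi = np.outer(phi_plus, phi_plus.conj())

lhs = concurrence(one_sided(kraus, rho_chi))                         # entanglement after the channel
rhs = concurrence(one_sided(kraus, rho_phi)) * concurrence(rho_chi)  # factorized prediction
print(lhs, rhs)  # the two numbers agree if the assumed factorization law holds
```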
Numerical properties of staggered quarks with a taste-dependent mass term
The numerical properties of staggered Dirac operators with a taste-dependent
mass term proposed by Adams [1,2] and by Hoelbling [3] are compared with those
of ordinary staggered and Wilson Dirac operators. In the free limit and on
(quenched) interacting configurations, we consider their topological
properties, their spectrum, and the resulting pion mass. Although we also
consider the spectral structure, topological properties, locality, and
computational cost of an overlap operator with a staggered kernel, we call
attention to the possibility of using the Adams and Hoelbling operators without
the overlap construction. In particular, the Hoelbling operator could be used
to simulate two degenerate flavors without additive mass renormalization, and
thus without fine-tuning in the chiral limit.
Comment: 14 pages, 9 figures. V2: published version; important note added regarding Hoelbling fermions, otherwise minor changes
Understanding Needs, Identifying Opportunities: ICT in the View of Universal Design
This article offers food for thought elaborated by peer researchers who, drawing on their own studies and on the current literature on the relationship between Universal Design (UD) and Information and Communication Technologies (ICT), wish to share a few key issues raised by the involvement of end users in the design of products and services. Referring to approaches from different disciplines, it highlights key questions around which a debate could begin, focused on promoting inclusion and on how a closer relationship among these areas of knowledge can help bridge the gap between the potential of new technologies and the real, diversified needs of persons, thereby actively contributing to the empowerment of the community of belonging.
A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data
Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes zero observations and over-dispersion. The model utilises the fact that the correlation between numbers of fish caught increases when the distance in space and time between the fish decreases, and that the correlation between size groups in a haul increases when the difference in size decreases. Here the model is extended in two ways. Instead of assuming a size correlation on the natural scale, the model is further developed to allow for a transformed length scale. Furthermore, in the present application, the spatial- and size-dependent correlation between species was included. For cod (Gadus morhua) and whiting (Merlangius merlangus), a common structured size correlation was fitted, and a separable structure between the time and space-size correlation was found for each species, whereas more complex structures were required to describe the correlation between species (and space-size). The within-species time correlation is strong, whereas the correlations between the species are weaker over time but strong within the year.
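As an illustrative aside (not part of the abstract above): the sketch below shows, under assumed grids and correlation scales, how a separable time x space x size correlation can be built as a Kronecker product of one-dimensional correlations and used to generate Log Gaussian Cox Process counts. The exponential correlation functions, the log-length transform, and all numbers are placeholders; the paper's actual model additionally handles zero observations, over-dispersion, and the between-species correlation discussed above.

```python
import numpy as np

def exp_corr(x, scale):
    """Exponential correlation matrix over a 1-D coordinate vector."""
    d = np.abs(x[:, None] - x[None, :])
    return np.exp(-d / scale)

# Hypothetical grids: survey times (quarters), 1-D spatial positions (km),
# and log-transformed length classes (cm) -- placeholders, not survey data.
t = np.arange(4.0)
s = np.linspace(0.0, 100.0, 10)
l = np.log(np.array([10.0, 20.0, 30.0, 40.0, 50.0]))

# Separable correlation: Kronecker product of the one-dimensional correlations.
R = np.kron(exp_corr(t, 2.0), np.kron(exp_corr(s, 30.0), exp_corr(l, 0.5)))

# Log Gaussian Cox Process: latent Gaussian field -> log intensity -> Poisson counts.
rng = np.random.default_rng(1)
sigma2, mu = 1.0, np.log(5.0)
G = rng.multivariate_normal(np.zeros(R.shape[0]), sigma2 * R)
counts = rng.poisson(np.exp(mu + G)).reshape(len(t), len(s), len(l))
print(counts.shape, counts.sum())
```

For clarity the full correlation matrix is formed explicitly here; a real application would exploit the Kronecker (separable) structure instead of storing and factorizing the full matrix.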
Recognition of Face Identity and Emotion in Expressive Specific Language Impairment
Objective: To study face and emotion recognition in children with mostly expressive specific language impairment (SLI-E). Subjects and Methods: A test movie to study perception and recognition of faces and mimic-gestural expression was applied to 24 children diagnosed as suffering from SLI-E and an age-matched control group of normally developing children. Results: Compared to a normal control group, the SLI-E children scored significantly worse in both the face and expression recognition tasks with a preponderant effect on emotion recognition. The performance of the SLI-E group could not be explained by reduced attention during the test session. Conclusion: We conclude that SLI-E is associated with a deficiency in decoding non-verbal emotional facial and gestural information, which might lead to profound and persistent problems in social interaction and development. Copyright (C) 2012 S. Karger AG, Basel.
Prediction of the binding affinities of peptides to class II MHC using a regularized thermodynamic model
Background: The binding of peptide fragments of extracellular peptides to class II MHC is a crucial event in the adaptive immune response. Each MHC allotype generally binds a distinct subset of peptides, and the enormous number of possible peptide epitopes prevents their complete experimental characterization. Computational methods can utilize the limited experimental data to predict the binding affinities of peptides to class II MHC.
Results: We have developed the Regularized Thermodynamic Average, or RTA, method for predicting the affinities of peptides binding to class II MHC. RTA accounts for all possible peptide binding conformations using a thermodynamic average and includes a parameter constraint for regularization to improve accuracy on novel data. RTA was shown to achieve higher accuracy, as measured by AUC, than SMM-align on the same data for all 17 MHC allotypes examined. RTA also gave the highest accuracy on all but three allotypes when compared with results from 9 different prediction methods applied to the same data. In addition, the method correctly predicted the peptide binding register of 17 out of 18 peptide-MHC complexes. Finally, we found that suboptimal peptide binding registers, which are often ignored in other prediction methods, made significant contributions of at least 50% of the total binding energy for approximately 20% of the peptides.
Conclusions: The RTA method accurately predicts peptide binding affinities to class II MHC and accounts for multiple peptide binding registers while reducing overfitting through regularization. The method has potential applications in vaccine design and in understanding autoimmune disorders. A web server implementing the RTA prediction method is available at http://bordnerlab.org/RTA/.
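As an illustrative aside (not part of the abstract above): the sketch below shows the general idea of a thermodynamic (Boltzmann) average over all binding registers, as opposed to scoring only the best register. The 9-residue binding core, the position-specific energy table, the example peptide, and the RT constant are assumptions for illustration and are not the published RTA parameterization, in which the energies are fitted to binding data under a regularization constraint.

```python
import numpy as np

RT = 0.592  # kcal/mol at roughly 298 K (illustrative constant)

def register_energy(peptide, start, energies):
    """Sum of position-specific residue energies for a 9-residue binding core
    beginning at `start` (energies: core position -> residue -> energy)."""
    core = peptide[start:start + 9]
    return sum(energies[i][aa] for i, aa in enumerate(core))

def thermodynamic_affinity(peptide, energies):
    """Boltzmann-average the binding energy over every possible register,
    rather than scoring only the single best register."""
    e = np.array([register_energy(peptide, j, energies)
                  for j in range(len(peptide) - 9 + 1)])
    return -RT * np.log(np.sum(np.exp(-e / RT)))  # free energy of the register ensemble

# Toy position-specific energies (random placeholders, NOT fitted RTA parameters;
# in the real method these would be estimated from data with regularization).
amino_acids = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(0)
energies = {i: {aa: rng.normal() for aa in amino_acids} for i in range(9)}

print(thermodynamic_affinity("ALKVDYEEIGFHASQW", energies))
```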
Eta Carinae -- Physics of the Inner Ejecta
Eta Carinae's inner ejecta are dominated observationally by the bright
Weigelt blobs and their famously rich spectra of nebular emission and
absorption lines. They are dense (n_e ~ 10^7 to 10^8 cm^-3), warm (T_e ~ 6000
to 7000 K) and slow moving (~40 km/s) condensations of mostly neutral (H^0)
gas. Located within 1000 AU of the central star, they contain heavily
CNO-processed material that was ejected from the star about a century ago.
Outside the blobs, the inner ejecta include absorption-line clouds with similar
conditions, plus emission-line gas that has generally lower densities and a
wider range of speeds (reaching a few hundred km/s) compared to the blobs. The
blobs appear to contain a negligible amount of dust and have a nearly dust-free
view of the central source, but our view across the inner ejecta is severely
affected by uncertain amounts of dust having a patchy distribution in the
foreground. Emission lines from the inner ejecta are powered by photoionization
and fluorescent processes. The variable nature of this emission, occurring in a
5.54 yr event cycle, requires specific changes to the incident flux that hold
important clues to the nature of the central object.
Comment: This is Chapter 5 in a book entitled "Eta Carinae and the Supernova Impostors", Kris Davidson and Roberta M. Humphreys, editors (Springer)
Integrative Genomics Viewer
To the Editor:
Rapid improvements in sequencing and array-based platforms are resulting in a flood of diverse genome-wide data, including data from exome and whole-genome sequencing, epigenetic surveys, expression profiling of coding and noncoding RNAs, single nucleotide polymorphism (SNP) and copy number profiling, and functional assays. Analysis of these large, diverse data sets holds the promise of a more comprehensive understanding of the genome and its relation to human disease. Experienced and knowledgeable human review is an essential component of this process, complementing computational approaches. This calls for efficient and intuitive visualization tools able to scale to very large data sets and to flexibly integrate multiple data types, including clinical data. However, the sheer volume and scope of data pose a significant challenge to the development of such tools.
Funding: National Institute of General Medical Sciences (U.S.) (R01GM074024); National Cancer Institute (U.S.) (R21CA135827); National Human Genome Research Institute (U.S.) (U54HG003067)