A Hierarchical Approach to Protein Molecular Evolution
Biological diversity has evolved despite the essentially infinite complexity
of protein sequence space. We present a hierarchical approach to the efficient
searching of this space and quantify the evolutionary potential of our approach
with Monte Carlo simulations. These simulations demonstrate that non-homologous
juxtaposition of encoded structure is the rate-limiting step in the production
of new tertiary protein folds. Non-homologous "swapping" of low-energy
secondary structures increased the binding constant of a simulated protein
relative to base substitution alone. Applications of our approach
include the generation of new protein folds and modeling the molecular
evolution of disease. Comment: 15 pages, 2 figures, LaTeX style.
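The contrast between base substitution and non-homologous swapping can be illustrated with a deliberately toy Monte Carlo search (this is a sketch of the general idea, not the authors' model; the structure alphabet, target "fold", and motif set below are invented for illustration):

```python
import random

random.seed(0)
TARGET = "HHHHLLLLEEEE"            # toy target "fold": helix, loop, strand blocks
MOTIFS = ["HHHH", "LLLL", "EEEE"]  # pre-evolved low-energy secondary structures

def fitness(seq):
    # fraction of positions matching the target structure string
    return sum(a == b for a, b in zip(seq, TARGET)) / len(TARGET)

def substitute(seq):
    # point mutation: change one position to a random structure state
    i = random.randrange(len(seq))
    return seq[:i] + random.choice("HLE") + seq[i + 1:]

def swap_motif(seq):
    # non-homologous juxtaposition: splice in an entire motif at once
    m = random.choice(MOTIFS)
    i = random.randrange(len(seq) - len(m) + 1)
    return seq[:i] + m + seq[i + len(m):]

def evolve(moves, steps=500):
    seq = "L" * len(TARGET)
    for _ in range(steps):
        new = random.choice(moves)(seq)
        if fitness(new) >= fitness(seq):  # greedy Monte Carlo acceptance
            seq = new
    return fitness(seq)

f_sub = evolve([substitute])              # base substitution alone
f_swap = evolve([substitute, swap_motif]) # substitution plus motif swapping
print(f_sub, f_swap)
```

Because whole low-energy segments are juxtaposed in one move, the swap move set can traverse sequence space hierarchically rather than one residue at a time.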
A peak-clustering method for MEG group analysis to minimise artefacts due to smoothness
Magnetoencephalography (MEG), a non-invasive technique for characterizing brain electrical activity, is gaining popularity as a tool for assessing group-level differences between experimental conditions. One method for assessing task-condition effects involves beamforming, where a weighted sum of field measurements is used to tune activity on a voxel-by-voxel basis. However, this method has been shown to produce inhomogeneous smoothness differences as a function of signal-to-noise across a volumetric image, which can then produce false positives at the group level. Here we describe a novel method for group-level analysis with MEG beamformer images that utilizes the peak locations within each participant's volumetric image to assess group-level effects. We compared our peak-clustering algorithm with SnPM using simulated data. We found that our method was immune to artefactual group effects that can arise as a result of inhomogeneous smoothness differences across a volumetric image. We also used our peak-clustering algorithm on experimental data and found that regions were identified that corresponded with task-related regions identified in the literature. These findings suggest that our technique is a robust method for group-level analysis with MEG beamformer images.
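The core idea, clustering per-participant peak locations instead of comparing smooth images voxel-wise, can be sketched as follows (a schematic illustration, not the published algorithm; the synthetic peak coordinates, clustering radius, and greedy linkage rule are all assumptions made for this example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-participant peak locations (mm), one array per participant:
# one peak shared across the group near (40, -20, 50), one idiosyncratic peak.
participants = [
    np.vstack([[40, -20, 50] + rng.normal(0, 3, 3),  # shared task-related peak
               rng.uniform(-60, 60, 3)])              # participant-specific noise peak
    for _ in range(10)
]

def cluster_peaks(participants, radius=10.0):
    """Greedy single-linkage clustering of pooled peaks; returns, per cluster,
    the centroid and the number of distinct participants contributing to it."""
    peaks = [(i, p) for i, arr in enumerate(participants) for p in arr]
    clusters = []  # each entry: [centroid, participant-id set, member peaks]
    for pid, p in peaks:
        for c in clusters:
            if np.linalg.norm(c[0] - p) < radius:
                c[1].add(pid)
                c[2].append(p)
                c[0][:] = np.mean(c[2], axis=0)  # update running centroid
                break
        else:
            clusters.append([p.copy(), {pid}, [p]])
    return [(c[0], len(c[1])) for c in clusters]

clusters = cluster_peaks(participants)
best = max(clusters, key=lambda c: c[1])
print(np.round(best[0]), "supported by", best[1], "of 10 participants")
```

A cluster supported by most participants is a candidate group-level effect, and because only peak locations enter the test, inhomogeneous smoothness of the underlying images cannot create one.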
A POLARIZED PROTON TARGET
We have successfully conducted a series of experiments involving scattering of high energy pions and protons from a target containing polarized protons. Results of some of these experiments were reported at this conference, and in the literature. Proton polarizations as high as 65% have been measured; the average polarization during sustained data-taking has been typically 45%.
The Reionization History at High Redshifts I: Physical Models and New Constraints from CMB Polarization
The recent discovery of a high optical depth tau to Thomson scattering from
the WMAP data implies that significant reionization took place at redshifts
z~15. This discovery has important implications for the sources of
reionization, and allows, for the first time, constraints to be placed on
physical reionization scenarios out to redshift z~20. Using a new suite of
semi-analytic reionization models, we show that the high value of tau requires
a surprisingly high efficiency epsilon of the first generation of UV sources
for injecting ionizing photons into the intergalactic medium. We find that no
simple reionization model can be consistent with the combination of the WMAP
result with data from the z<6.5 universe. Satisfying both constraints requires
either of the following: (i) H_2 molecules form efficiently at z~20, survive
feedback processes, and allow UV sources in halos with virial temperatures
below Tvir=10^4 K to contribute substantially to reionization, or (ii) the
efficiency epsilon in halos with Tvir>10^4K decreased by a factor of ~ 30
between (z~20) and (z~6). We discuss the relevant physical issues to produce
either scenario, and argue that both options are viable, and allowed by current
data. In detailed models of the reionization history, we find that the
evolution of the ionized fractions in the two scenarios have distinctive
features that Planck can distinguish at 3 sigma significance. At the high WMAP
value for tau, Planck will also be able to provide tight statistical
constraints on reionization model parameters, and elucidate much of the physics
at the end of the Dark Ages. The sources responsible for the high optical depth
discovered by WMAP should be directly detectable out to z~15 by the James Webb
Space Telescope. Comment: cosmetic changes to figures; text unchanged.
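The link between a reionization history and the measured optical depth is the standard Thomson-scattering integral, tau(z) = integral of n_e sigma_T c dt. A minimal numerical sketch for sudden, complete hydrogen reionization (illustrative flat-LCDM parameters; helium ionization neglected, so this is not the paper's semi-analytic model):

```python
import numpy as np

# Approximate flat LCDM parameters (assumptions for illustration only)
H0 = 70 * 1e5 / 3.086e24      # Hubble constant, s^-1 (70 km/s/Mpc)
Om, OL = 0.3, 0.7             # matter and dark-energy densities
n_H0 = 1.9e-7                 # present-day hydrogen density, cm^-3
sigma_T = 6.652e-25           # Thomson cross-section, cm^2
c = 2.998e10                  # speed of light, cm/s

def tau(z_reion, n=10000):
    """Thomson optical depth assuming hydrogen is fully ionized
    below z_reion and neutral above it."""
    z = np.linspace(0.0, z_reion, n)
    H = H0 * np.sqrt(Om * (1 + z)**3 + OL)
    ne = n_H0 * (1 + z)**3                      # electron density, fully ionized
    integrand = ne * sigma_T * c / ((1 + z) * H)  # n_e sigma_T c dt/dz
    # trapezoidal integration (kept manual for NumPy-version independence)
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(z)))

print(f"tau(z_r=6)  = {tau(6):.3f}")
print(f"tau(z_r=15) = {tau(15):.3f}")
```

The steep growth of tau with reionization redshift is why a high measured tau pushes the onset of reionization out to z ~ 15-20.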
Seroprevalence of rubella antibodies and determinants of susceptibility to rubella in a cohort of pregnant women in Canada, 2008–2011
Long term control of rubella and congenital rubella syndrome relies on high population-level immunity
against rubella, particularly among women of childbearing age. In Canada, all pregnant women should be
screened so that susceptible new mothers can be offered vaccination for rubella before discharge. This
study was undertaken to estimate rubella susceptibility in a cohort of pregnant women in Canada and
to identify associated socio-economic and demographic factors. Biobanked plasma samples were
obtained from the Maternal-Infant Research on Environmental Chemicals (MIREC) study, in which pregnant
women were recruited between 2008 and 2011. Socio-demographic characteristics and obstetric
histories were collected. Second trimester plasma samples (n = 1,752) were tested for rubella-specific
IgG using an in-house enzyme-linked immunosorbent assay. The percentage of women with IgG titers
<5 IU/mL, 5–10 IU/mL, and ≥10 IU/mL were 2.3%, 10.1%, and 87.6%, respectively. Rates of seronegativity,
defined as <5 IU/mL, were 3.1% in women who had no previous live birth and 1.6% in women who had
given birth previously. Among the latter group, seronegativity was higher in women with high school
education or less (adjusted OR (aOR) 5.93, 95% CI 2.08–16.96) or with a college or trade school diploma
(aOR 3.82, 95% CI 1.45–10.12), compared to university graduates, and those born outside Canada (aOR
2.60, 95% CI 1.07–6.31). In conclusion, a large majority of pregnant women were found to be immune
to rubella. Further research is needed to understand inequalities in vaccine uptake or access, and more
effort is needed to promote catch-up measles-mumps-rubella vaccination among socioeconomically disadvantaged
and immigrant women of childbearing age.
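The three-way titer classification used in the study is straightforward to reproduce; the sketch below applies the published cut-offs to synthetic IgG titers (the log-normal titer distribution is an assumption chosen only to resemble the reported percentages, not the MIREC data):

```python
import random

random.seed(42)

def classify(titer_iu_per_ml):
    """Apply the study's cut-offs: <5 seronegative, 5-10 equivocal, >=10 immune."""
    if titer_iu_per_ml < 5:
        return "seronegative"
    if titer_iu_per_ml < 10:
        return "equivocal"
    return "immune"

# Synthetic titers for a cohort of the same size as the study (n = 1,752)
titers = [random.lognormvariate(3.5, 1.0) for _ in range(1752)]

counts = {}
for t in titers:
    category = classify(t)
    counts[category] = counts.get(category, 0) + 1

for category, n in counts.items():
    print(f"{category}: {100 * n / len(titers):.1f}%")
```

Grouping by covariates such as parity or education and comparing seronegativity rates across groups would then follow the same pattern.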
Complementary Patents and Market Structure
Many high technology goods are based on standards that require several essential patents owned by different IP holders. This gives rise to a complements and a double mark-up problem. We compare the welfare effects of two different business strategies dealing with these problems. Vertical integration of an IP holder and a downstream producer solves the double mark-up problem between these firms. Nevertheless, it may raise royalty rates and reduce output as compared to non-integration. Horizontal integration of IP holders solves the complements problem but not the double mark-up problem. Vertical integration discourages entry and reduces innovation incentives, while horizontal integration always benefits from entry and innovation.
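The double mark-up problem has a standard textbook illustration that makes the welfare loss concrete (a toy linear-demand example, not the paper's model; demand and cost normalizations are chosen for convenience):

```python
# Linear demand q = 1 - p; an upstream IP holder charges a per-unit royalty r,
# and the downstream producer has no other marginal cost.

def downstream_best_price(r):
    # downstream maximizes (p - r)(1 - p), giving p = (1 + r) / 2
    return (1 + r) / 2

def upstream_profit(r):
    # upstream earns r per unit on q = 1 - p(r)
    return r * (1 - downstream_best_price(r))

# Upstream picks r to maximize r(1 - r)/2; a grid search finds r = 1/2
r_star = max((i / 1000 for i in range(1001)), key=upstream_profit)
p_chain = downstream_best_price(r_star)  # 0.75: two successive mark-ups
p_integrated = 0.5                       # single integrated monopoly price
print(r_star, p_chain, p_integrated)
```

The unintegrated chain prices at 0.75 while an integrated firm would price at 0.5, so vertical integration raises output here; with several essential-patent holders, the complements problem stacks further mark-ups on top.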
Non-obviousness and Screening
The paper offers a novel justification for the non-obviousness patentability requirement. An innovation involves two stages: research results in a technology blueprint, which development transforms into a profitable activity. An innovator, who is either efficient or inefficient, must rely on outside finance for the development. Only patented technologies are developed. Strengthening the non-obviousness requirement alleviates adverse selection by discouraging inefficient innovators from doing research, but creates inefficiencies by excluding marginal innovations. We show that it is socially optimal to raise the non-obviousness requirement so as to exclude bad innovators; we also provide several robustness checks and discuss the policy implications.
On the Behaviour of General-Purpose Applications on Cloud Storages
Managing data over cloud infrastructures raises novel challenges with respect to existing and well studied approaches such as ACID and long running transactions. One of the main requirements is to provide availability and partition tolerance in a scenario with replicas and distributed control. This comes at the price of a weaker consistency, usually called eventual consistency. These weak memory models have proved to be suitable in a number of scenarios, such as the analysis of large data with Map-Reduce. However, due to the widespread availability of cloud infrastructures, weak storages are used not only by specialised applications but also by general purpose applications. We provide a formal approach, based on process calculi, to reason about the behaviour of programs that rely on cloud stores. For instance, one can check that the composition of a process with a cloud store ensures 'strong' properties through a wise usage of asynchronous message-passing.
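The eventual-consistency behaviour the paper reasons about formally can be shown operationally with a toy replicated store (a sketch only; the paper uses process calculi, not this data structure, and the replica and propagation model here are assumptions):

```python
import collections

class EventualStore:
    """Toy replicated key-value store: a write lands on one replica and is
    propagated asynchronously, so reads elsewhere may return stale values
    until propagation runs."""

    def __init__(self, n_replicas=3):
        self.replicas = [dict() for _ in range(n_replicas)]
        self.pending = collections.deque()  # unpropagated (src, key, value)

    def write(self, replica, key, value):
        self.replicas[replica][key] = value
        self.pending.append((replica, key, value))

    def read(self, replica, key):
        return self.replicas[replica].get(key)

    def sync(self):
        # deliver all pending updates to every other replica, in write order
        while self.pending:
            src, key, value = self.pending.popleft()
            for i, r in enumerate(self.replicas):
                if i != src:
                    r[key] = value

store = EventualStore()
store.write(0, "x", 1)
stale = store.read(1, "x")   # None: replica 1 has not seen the write yet
store.sync()
fresh = store.read(1, "x")   # 1: replicas converge once updates propagate
print(stale, fresh)
```

A process that only acts on a value after an explicit acknowledgement message (i.e. after the update is known to have propagated) recovers 'strong' behaviour on top of this weak store, which is the kind of property the calculus is designed to verify.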