MHC genomics and disease: Looking back to go forward
Ancestral haplotypes are conserved but extremely polymorphic kilobase sequences, which have been faithfully inherited over at least hundreds of generations in spite of migration and admixture. They carry susceptibility and resistance to diverse diseases, including deficiencies of CYP21 hydroxylase (47.1) and complement components (18.1), as well as numerous autoimmune diseases (8.1). The haplotypes are detected by segregation within ethnic groups rather than by SNPs and GWAS. Susceptibility to some other diseases, e.g., ankylosing spondylitis and narcolepsy, is carried by specific alleles shared by multiple ancestral haplotypes. The difference between these two types of association may explain the disappointment with many GWAS. Here we propose a pathway for combining the two different approaches: SNP typing is most useful after the conserved ancestral haplotypes have been defined by other methods.
Clifford algebras and universal sets of quantum gates
This paper shows an application of Clifford algebras to the construction of computationally universal sets of quantum gates for n-qubit systems. It is based on the well-known application of Lie algebras, together with the especially simple commutation law for Clifford algebras, which states that all basis elements either commute or anticommute.
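The commute-or-anticommute law is easy to verify numerically. Below is a minimal sketch using the Pauli matrices, which generate a Clifford algebra; the NumPy check is this summary's own illustration, not the paper's gate construction.

```python
import numpy as np

# Pauli matrices: generators of a Clifford algebra, used here to
# illustrate the "commute or anticommute" law cited in the abstract.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def anticommutator(a, b):
    return a @ b + b @ a

# Distinct generators anticommute: {X, Y} = {Y, Z} = {X, Z} = 0.
for a, b in [(X, Y), (Y, Z), (X, Z)]:
    assert np.allclose(anticommutator(a, b), np.zeros((2, 2)))

# Each generator squares to the identity.
for g in (X, Y, Z):
    assert np.allclose(g @ g, np.eye(2))
```

Products of such generators either commute or anticommute pairwise, which is the structural simplification the paper exploits alongside the Lie-algebra approach.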
Biological and economic management strategy evaluations of the eastern king prawn fishery
Stock assessment of the eastern king prawn (EKP) fishery, and the subsequent advice to management and industry, could be improved by addressing a number of issues. The recruitment dynamics of EKP in the northern parts of the fishery (i.e., North Reef to the Swain Reefs) need to be clarified. Fishers report that prawns from these areas recruit to the fishing grounds at sizes that result in suboptimal sizes/ages at first capture, and therefore localised growth overfishing. There is a need to assess alternative harvest strategies for the EKP fishery via computer simulation, particularly seasonal, monthly or lunar-based closures, to identify scenarios that improve the value of the catch, decrease costs and reduce the risk of overfishing, prior to implementing new management measures.
Use of Risk Assessment Tools to Guide Decision-Making in the Primary Prevention of Atherosclerotic Cardiovascular Disease: A Special Report from the American Heart Association and American College of Cardiology
Risk assessment is a critical step in the current approach to primary prevention of atherosclerotic cardiovascular disease. Knowledge of the 10-year risk for atherosclerotic cardiovascular disease identifies patients in higher-risk groups who are likely to have greater net benefit and a lower number needed to treat for both statins and antihypertensive therapy. Current US prevention guidelines for blood pressure and cholesterol management recommend use of the pooled cohort equations to start a process of shared decision-making between clinicians and patients in primary prevention. The pooled cohort equations have been widely validated and are broadly useful for the general US clinical population. However, they may systematically underestimate risk in patients from certain racial/ethnic groups, in those with lower socioeconomic status, and in those with chronic inflammatory diseases, and they may overestimate risk in patients with higher socioeconomic status or those closely engaged with preventive healthcare services. If uncertainty remains for patients at borderline or intermediate risk, or if the patient is undecided after a patient-clinician discussion that considers risk-enhancing factors (e.g., family history), additional testing with measurement of coronary artery calcium can be useful to reclassify risk estimates and improve the selection of patients for use or avoidance of statin therapy. This special report summarizes the rationale and evidence base for quantitative risk assessment, reviews strengths and limitations of existing risk scores, discusses approaches for refining individual risk estimates for patients, and provides practical advice regarding implementation of risk assessment and decision-making strategies in clinical practice.
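The link between baseline risk and number needed to treat (NNT) can be made concrete with simple arithmetic. The sketch below is illustrative only: the 25% relative risk reduction is an assumed value for this example, not a figure from the report, and the risk inputs are made up.

```python
# Illustrative only: number needed to treat (NNT) from a 10-year
# ASCVD risk estimate and an ASSUMED relative risk reduction.
# The 25% relative risk reduction is this sketch's assumption,
# not a guideline or trial value.

def nnt(baseline_risk, relative_risk_reduction):
    """NNT = 1 / absolute risk reduction."""
    absolute_risk_reduction = baseline_risk * relative_risk_reduction
    return 1.0 / absolute_risk_reduction

# A higher-risk patient (20% 10-year risk) has a much lower NNT than
# a borderline-risk patient (6%), i.e., greater expected net benefit.
print(round(nnt(0.20, 0.25)))  # 20
print(round(nnt(0.06, 0.25)))  # 67
```

This is the quantitative intuition behind the report's point that higher-risk groups have "greater net benefit and a lower number needed to treat".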
Exact Minimum Eigenvalue Distribution of an Entangled Random Pure State
A recent conjecture regarding the average of the minimum eigenvalue of the
reduced density matrix of a random complex state is proved. In fact, the full
distribution of the minimum eigenvalue is derived exactly for both the cases of
a random real and a random complex state. Our results are relevant to the
entanglement properties of eigenvectors of the orthogonal and unitary ensembles
of random matrix theory and quantum chaotic systems. They also provide a rare
exactly solvable case for the distribution of the minimum of a set of N strongly correlated random variables for all values of N (and not just for large N). (To appear in J. Stat. Phys.)
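The quantity studied above is easy to explore numerically. The sketch below samples random complex pure states on a small bipartite system and records the minimum eigenvalue of the reduced density matrix; the subsystem dimensions and sample size are illustrative choices, and this Monte Carlo estimate is not the paper's exact analytical result.

```python
import numpy as np

# Numerical illustration (not the paper's exact derivation): sample
# random complex pure states on an n x m bipartite system, form the
# reduced density matrix of the n-dimensional part, and record its
# minimum eigenvalue.
n, m = 2, 2  # illustrative subsystem dimensions
rng = np.random.default_rng(1)

def min_eigenvalue_reduced(rng, n, m):
    # Random complex Gaussian amplitudes, normalized to a pure state.
    psi = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))
    psi /= np.linalg.norm(psi)
    rho = psi @ psi.conj().T  # reduced density matrix (n x n, trace 1)
    return np.linalg.eigvalsh(rho).min()

samples = [min_eigenvalue_reduced(rng, n, m) for _ in range(2000)]
avg = float(np.mean(samples))

# Sanity checks: density-matrix eigenvalues lie in [0, 1], and the
# minimum of n eigenvalues summing to 1 cannot exceed 1/n.
assert 0.0 <= min(samples) and max(samples) <= 1.0 / n + 1e-12
print(f"mean minimum eigenvalue ~ {avg:.3f}")
```

Histogramming `samples` would approximate the full distribution whose exact form the paper derives.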
Quantization and Compressive Sensing
Quantization is an essential step in digitizing signals, and, therefore, an
indispensable component of any modern acquisition system. This book chapter
explores the interaction of quantization and compressive sensing and examines
practical quantization strategies for compressive acquisition systems.
Specifically, we first provide a brief overview of quantization and examine
fundamental performance bounds applicable to any quantization approach. Next,
we consider several forms of scalar quantizers, namely uniform, non-uniform,
and 1-bit. We provide performance bounds and fundamental analysis, as well as
practical quantizer designs and reconstruction algorithms that account for
quantization. Furthermore, we provide an overview of Sigma-Delta (ΣΔ)
quantization in the compressed sensing context, and also
discuss implementation issues, recovery algorithms and performance bounds. As
we demonstrate, proper accounting for quantization and careful quantizer design have a significant impact on the performance of a compressive acquisition system. (To appear in the Springer book "Compressed Sensing and Its Applications".)
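Two of the scalar quantizers discussed above can be sketched in a few lines. The step size and the Gaussian measurement model below are illustrative assumptions of this sketch, not parameters from the chapter.

```python
import numpy as np

# Sketch of two scalar quantizers from the chapter's taxonomy.
# The step size (0.25) and Gaussian "measurements" are illustrative.

def uniform_quantize(x, step=0.25):
    """Mid-rise uniform scalar quantizer."""
    return step * (np.floor(x / step) + 0.5)

def one_bit_quantize(x):
    """1-bit quantization keeps only the sign of each measurement."""
    return np.sign(x)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)  # stand-in for compressive measurements

xq = uniform_quantize(x)
# Mid-rise quantization error is bounded by half the step size.
assert np.max(np.abs(x - xq)) <= 0.25 / 2 + 1e-12

xb = one_bit_quantize(x)
assert set(np.unique(xb).tolist()) <= {-1.0, 1.0}
```

The step/2 error bound checked here is the elementary distortion guarantee that the chapter's performance bounds refine for compressive acquisition; 1-bit quantization discards all amplitude information, which is why dedicated reconstruction algorithms are needed in that regime.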
The rate of colonization by macro-invertebrates on artificial substrate samplers
The influence of exposure time upon macro-invertebrate colonization of modified Hester-Dendy substrate samplers was investigated over a 60-day period. The duration of exposure affected the number of individuals, the number of taxa and community diversity. The number of individuals colonizing the samplers reached a maximum after 39 days and then began to decrease, owing to the emergence of adult insects. Coefficients of variation for the four replicate samples retrieved on each sampling day fluctuated extensively throughout the study; no tendency toward increasing or decreasing coefficients of variation was noted with increasing time of sampler exposure. The number of taxa colonizing the samplers increased throughout the study period. The community diversity index, calculated for each sampling day, also tended to increase throughout the same period. This supports the hypothesis that an exposure period of 6 weeks, as recommended by the United States Environmental Protection Agency, may not always provide adequate opportunity for a truly representative community of macro-invertebrates to colonize multiplate samplers. Many of the taxa were collected in substantial proportions after periods of absence or extreme sparseness; this is attributed to the growth of periphyton and the accumulation of other materials that created food and new habitats suitable for colonization by new taxa. Investigation of the relationship between ‘equitability’ and length of exposure revealed that equitability did not vary with increasing exposure time in the way that diversity did.
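The two summary statistics used in the study can be sketched directly. The abstract does not specify which diversity index was used, so the Shannon index below is an assumption of this sketch, and all counts are made-up illustrative data.

```python
import math

# Sketch of two statistics from the study: a community diversity
# index (Shannon H', an assumption of this sketch) and the
# coefficient of variation across replicate samplers.
# All counts below are made up for illustration.

def shannon_diversity(counts):
    """H' = -sum(p_i * ln p_i) over taxon proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def coefficient_of_variation(values):
    """CV = sample standard deviation (n - 1) / mean."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return sd / mean

# Four replicate samplers retrieved on one hypothetical sampling day.
replicate_totals = [118, 95, 142, 101]
print(round(coefficient_of_variation(replicate_totals), 3))  # 0.185

# Hypothetical taxon counts pooled across the replicates.
taxon_counts = [200, 120, 80, 40, 10, 6]
print(round(shannon_diversity(taxon_counts), 3))  # 1.372
```

Tracking H' and the CV across retrieval days, as the study did, distinguishes growing community diversity from between-sampler variability.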
High-Intensity Statins Benefit High-Risk Patients: Why and How to Do Better
Review of the US and European literature indicates that most patients at high risk for atherosclerotic cardiovascular disease (ASCVD) are not treated with high-intensity statins, despite strong clinical-trial evidence of maximal statin benefit. High-intensity statins are recommended for 2 categories of patients: those with ASCVD (secondary prevention) and high-risk patients without clinical ASCVD. Most patients with ASCVD are candidates for high-intensity statins, with a goal of 50% or greater reduction in low-density lipoprotein cholesterol. A subgroup of patients with ASCVD are at very high risk and can benefit from the addition of nonstatin drugs (ezetimibe, with or without a bile acid sequestrant or bempedoic acid, and/or a proprotein convertase subtilisin/kexin type 9 inhibitor). High-risk primary prevention patients are those with severe hypercholesterolemia, diabetes with associated risk factors, and patients aged 40 to 75 years with a 10-year risk for ASCVD of 20% or greater. In patients with a 10-year risk of 7.5% to less than 20%, coronary artery calcium scoring is an option; if the coronary artery calcium score is 300 or more Agatston units, the patient can be up-classified to high risk. If high-intensity statin treatment is not tolerated in high-risk patients, a reasonable approach is to combine a moderate-intensity statin with ezetimibe. In very high-risk patients, proprotein convertase subtilisin/kexin type 9 inhibitors lower low-density lipoprotein cholesterol levels substantially and hence reduce risk as well.
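The primary-prevention thresholds described above (7.5%, 20%, 300 Agatston units) form a simple decision rule. The sketch below encodes only those thresholds from the text; the function name and return labels are this sketch's own, and it omits the many clinical nuances of the full guideline.

```python
# Decision-rule sketch of the up-classification logic described in the
# review. Thresholds (7.5%, 20%, 300 Agatston units) come from the
# text; labels and function name are this sketch's own simplification.

def statin_intensity_category(ten_year_risk, cac_score=None):
    """Classify a primary-prevention patient by the stated thresholds."""
    if ten_year_risk >= 0.20:
        return "high risk: high-intensity statin"
    if ten_year_risk >= 0.075:
        # Coronary artery calcium scoring is an option in this band.
        if cac_score is not None and cac_score >= 300:
            return "up-classified to high risk: high-intensity statin"
        return "intermediate risk: consider moderate-intensity statin"
    return "below the intermediate-risk band in this sketch"

print(statin_intensity_category(0.12, cac_score=350))
# → up-classified to high risk: high-intensity statin
```

A real implementation would also handle the severe-hypercholesterolemia and diabetes categories named in the review, which bypass the risk-score bands.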
Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector
A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb−1 of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.
Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC
Measurements of inclusive jet suppression in heavy ion collisions at the LHC
provide direct sensitivity to the physics of jet quenching. In a sample of
lead-lead collisions at sqrt(s_NN) = 2.76 TeV corresponding to an integrated
luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with
a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the
transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the
anti-kt algorithm with values for the distance parameter that determines the
nominal jet radius of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of
the jet yield is characterized by the jet "central-to-peripheral ratio," Rcp.
Jet production is found to be suppressed by approximately a factor of two in
the 10% most central collisions relative to peripheral collisions. Rcp varies
smoothly with centrality as characterized by the number of participating
nucleons. The observed suppression is only weakly dependent on jet radius and
transverse momentum. These results provide the first direct measurement of
inclusive jet suppression in heavy ion collisions and complement previous
measurements of dijet transverse energy imbalance at the LHC. (Submitted to
Physics Letters B; all figures, including auxiliary figures, are available at
http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02)
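The central-to-peripheral ratio R_cp described above compares per-event jet yields after normalizing each centrality class by its number of binary nucleon-nucleon collisions. The sketch below shows only that ratio's arithmetic; all numerical values are made up for illustration and are not ATLAS data.

```python
# Sketch of the "central-to-peripheral ratio" R_cp described in the
# abstract: jet yields normalized by the number of binary
# nucleon-nucleon collisions (N_coll) in each centrality class.
# All numbers below are invented for illustration, not ATLAS data.

def rcp(yield_central, ncoll_central, yield_peripheral, ncoll_peripheral):
    """R_cp = (central yield / N_coll) / (peripheral yield / N_coll)."""
    return (yield_central / ncoll_central) / (yield_peripheral / ncoll_peripheral)

# Suppression by roughly a factor of two in the most central
# collisions corresponds to R_cp near 0.5.
value = rcp(yield_central=1500.0, ncoll_central=1500.0,
            yield_peripheral=20.0, ncoll_peripheral=10.0)
print(value)  # → 0.5
```

Repeating this ratio per centrality bin, jet radius R, and pT bin gives the dependence the measurement reports as weak in R and pT.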