Whitney coverings and the tent spaces for the Gaussian measure
We introduce a technique for handling Whitney decompositions in Gaussian
harmonic analysis and apply it to the study of Gaussian analogues of the
classical tent spaces of Coifman, Meyer and Stein.
Comment: 13 pages, 1 figure. Revised version incorporating referee's comments. To appear in Arkiv för Matematik.
Second Order Perturbations of Flat Dust FLRW Universes with a Cosmological Constant
We summarize recent results concerning the evolution of second order
perturbations in flat dust irrotational FLRW models with Λ > 0. We
show that asymptotically these perturbations tend to constants in time, in
agreement with the cosmic no-hair conjecture. We solve numerically the second
order scalar perturbation equation, and very briefly discuss its all-time
behaviour and some possible implications for structure formation.
Comment: 6 pages, 1 figure. To be published in "Proceedings of the 5th Alexander Friedmann Seminar on Gravitation and Cosmology", Int. Journ. Mod. Phys. A (2002). Macros: ws-ijmpa.cls, ws-p9-75x6-50.cls
Maximum entropy models for antibody diversity
Recognition of pathogens relies on families of proteins showing great
diversity. Here we construct maximum entropy models of the sequence repertoire,
building on recent experiments that provide a nearly exhaustive sampling of the
IgM sequences in zebrafish. These models are based solely on pairwise
correlations between residue positions, but correctly capture the higher order
statistical properties of the repertoire. Exploiting the interpretation of
these models as statistical physics problems, we make several predictions for
the collective properties of the sequence ensemble: the distribution of
sequences obeys Zipf's law, the repertoire decomposes into several clusters,
and there is a massive restriction of diversity due to the correlations. These
predictions are completely inconsistent with models in which amino acid
substitutions are made independently at each site, and are in good agreement
with the data. Our results suggest that antibody diversity is not limited by
the sequences encoded in the genome, and may reflect rapid adaptation to
antigenic challenges. This approach should be applicable to the study of the
global properties of other protein families.
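The pairwise structure of such models can be illustrated with a toy sketch. The alphabet, fields and couplings below are entirely hypothetical (not the fitted zebrafish model): in a pairwise maximum-entropy model, the probability of a sequence σ is P(σ) ∝ exp(Σ_i h_i(σ_i) + Σ_{i<j} J_ij(σ_i, σ_j)), and Zipf's law would appear as rank-ordered probabilities falling roughly as 1/rank.

```python
import itertools
import math
import random

# Illustrative toy model, small enough to enumerate exactly.
ALPHABET = "ACD"   # hypothetical 3-letter alphabet
L = 3              # hypothetical sequence length

random.seed(0)
# Random single-site fields h_i(a) and pairwise couplings J_ij(a, b)
h = {(i, a): random.gauss(0, 0.5) for i in range(L) for a in ALPHABET}
J = {(i, j, a, b): random.gauss(0, 0.5)
     for i in range(L) for j in range(i + 1, L)
     for a in ALPHABET for b in ALPHABET}

def energy(seq):
    """E(sigma) = -sum_i h_i(sigma_i) - sum_{i<j} J_ij(sigma_i, sigma_j)."""
    e = -sum(h[(i, a)] for i, a in enumerate(seq))
    e -= sum(J[(i, j, seq[i], seq[j])]
             for i in range(L) for j in range(i + 1, L))
    return e

# Exact Boltzmann distribution over all |ALPHABET|**L sequences
seqs = ["".join(s) for s in itertools.product(ALPHABET, repeat=L)]
weights = [math.exp(-energy(s)) for s in seqs]
Z = sum(weights)                              # partition function
probs = sorted((w / Z for w in weights), reverse=True)

# A Zipf-like repertoire would show log P(rank) ~ -log(rank);
# here we just print the top of the rank-ordered distribution.
for rank, p in enumerate(probs[:5], start=1):
    print(rank, round(p, 4))
```

Fitting real fields and couplings so that the model reproduces the observed one- and two-point marginals (e.g. by likelihood gradient ascent) is the hard part that the paper's approach addresses; exact enumeration as above only works for tiny toy models.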
Big data and data repurposing – using existing data to answer new questions in vascular dementia research
Introduction:
Traditional approaches to clinical research have, as yet, failed to provide effective treatments for vascular dementia (VaD). Novel approaches to collation and synthesis of data may allow for time and cost efficient hypothesis generating and testing. These approaches may have particular utility in helping us understand and treat a complex condition such as VaD.
Methods:
We present an overview of new uses for existing data to progress VaD research. The overview is the result of consultation with various stakeholders, focused literature review and learning from the group’s experience of successful approaches to data repurposing. In particular, we benefitted from the expert discussion and input of delegates at the 9th International Congress on Vascular Dementia (Ljubljana, 16-18th October 2015).
Results:
We agreed on key areas that could be of relevance to VaD research: systematic review of existing studies; individual patient-level analyses of existing trials and cohorts; and linking electronic health record data to other datasets. We illustrated each theme with a case study of an existing project that has utilised this approach.
Conclusions:
There are many opportunities for the VaD research community to make better use of existing data. The volume of potentially available data is increasing and the opportunities for using these resources to progress the VaD research agenda are exciting. Of course, these approaches come with inherent limitations and biases: bigger datasets are not necessarily better datasets, and maintaining rigour and critical analysis will be key to optimising data use.
Unblocking Temperatures of Viscous Remanent Magnetism in Displaced Granitic Boulders, Icicle Creek Glacial Moraines (Washington, USA)
Viscous remanent magnetization (VRM) may partially overprint the original magnetization in rocks displaced by geomorphic events. An established theoretical relationship between the time and temperature of acquisition of VRM and the time and temperature of demagnetization suggests that laboratory demagnetization (unblocking) of VRM can be used to estimate the displacement age of rocks. We test this hypothesis at four nested glacial moraines in the Icicle Creek drainage of central Washington, the ages of which were previously determined by cosmogenic surface exposure dating. The moraines are composed primarily of granodiorite boulders, and magnetic remanence is carried dominantly by magnetite. Both the maximum and average partial VRM (pVRM) demagnetization temperatures (TD) increase with the relative age of the moraines. For the three younger moraines, the average TD yields an age comparable to the cosmogenic age, within the uncertainty of the pVRM acquisition temperature. Uncertainty in the acquisition and demagnetization temperatures can limit the utility of pVRM for absolute dating.
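The time-temperature relationship invoked above can be sketched, for single-domain grains in the Néel relaxation framework, by the standard Pullaiah-type approximation (which neglects the temperature dependence of spontaneous magnetization; the form and constants below are the textbook version, not necessarily the exact relation used in this study):

```latex
% A VRM acquired over time t_A at ambient temperature T_A (in kelvin)
% unblocks in a laboratory demagnetization step of duration t_D at
% temperature T_D when, approximately,
\[
  T_A \,\ln(C\, t_A) \;\approx\; T_D \,\ln(C\, t_D),
\]
% where C is the frequency factor, C \sim 10^{9}\text{--}10^{10}\ \mathrm{s^{-1}}.
% Solving for t_A from the measured unblocking temperature T_D and the
% known laboratory heating time t_D gives the displacement-age estimate.
```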
Two Distant Halo Velocity Groups Discovered by the Palomar Transient Factory
We report the discovery of two new halo velocity groups (Cancer groups A and B) traced by 8 distant RR Lyrae stars and observed by the Palomar Transient Factory (PTF) survey at R.A.~129°, Dec~20° (l~205°, b~32°). Located at 92 kpc from the Galactic center (86 kpc from the Sun), these are some of the most distant substructures in the Galactic halo known to date. Follow-up spectroscopic observations with the Palomar Observatory 5.1-m Hale telescope and the W. M. Keck Observatory 10-m Keck I telescope indicate that the two groups are moving away from the Galaxy at v_(gsr) = 78.0+-5.6 km s^(-1) (Cancer group A) and v_(gsr) = 16.3+-7.1 km s^(-1) (Cancer group B). The groups have velocity dispersions of σ_(v_(gsr)) = 12.4+-5.0 km s^(-1) and σ_(v_(gsr)) = 14.9+-6.2 km s^(-1), and are spatially extended (several kpc across), making it very unlikely that they are bound systems; they are more likely debris of tidally disrupted dwarf galaxies or globular clusters. Both groups are metal-poor (median metallicities of [Fe/H]^A = -1.6 dex and [Fe/H]^B = -2.1 dex) and have a somewhat uncertain (due to small sample size) metallicity dispersion of ~0.4 dex, suggesting dwarf galaxies as progenitors. Two additional RR Lyrae stars with velocities consistent with those of the Cancer groups have been observed ~25° east, suggesting a possible extension of the groups in that direction.
Medication decision-making for patients with renal insufficiency in inpatient and outpatient care at a US Veterans Affairs Medical Centre: a qualitative, cognitive task analysis.
Background:
Many studies identify factors that contribute to renal prescribing errors, but few examine how healthcare professionals (HCPs) detect and recover from an error or potential patient safety concern. Knowledge of this information could inform advanced error detection systems and decision support tools that help prevent prescribing errors.
Objective:
To examine the cognitive strategies that HCPs used to recognise and manage medication-related problems for patients with renal insufficiency.
Design:
HCPs submitted documentation about medication-related incidents. We then conducted cognitive task analysis interviews. Qualitative data were analysed inductively.
Setting:
Inpatient and outpatient facilities at a major US Veterans Affairs Medical Centre.
Participants:
Physicians, nurses and pharmacists who took action to prevent or resolve a renal-drug problem in patients with renal insufficiency.
Outcomes:
Emergent themes from interviews, as related to recognition of renal-drug problems and decision-making processes.
Results:
We interviewed 20 HCPs. Results yielded a descriptive model of the decision-making process, comprising three main stages: detect, gather information and act. These stages often followed a cyclical path, due largely to the gradual decline of patients' renal function. Most HCPs relied on being vigilant to detect patients' renal-drug problems rather than relying on systems to detect unanticipated cues. At each stage, HCPs relied on different cognitive cues depending on medication type: for renally eliminated medications, HCPs focused on gathering renal dosing guidelines, while for nephrotoxic medications, HCPs investigated the need for the particular medication therapy and, if warranted, safer alternatives.
Conclusions:
Our model is useful for trainees so they can gain familiarity with managing renal-drug problems. Based on our findings, improvements are warranted for three aspects of healthcare systems: (1) supporting the cyclical nature of renal-drug problem management via longitudinal tracking mechanisms, (2) providing tools to alleviate HCPs' heavy reliance on vigilance and (3) supporting HCPs' different decision-making needs for renally eliminated versus nephrotoxic medications.
Science data quality assessment for the Large Synoptic Survey Telescope
LSST will have a Science Data Quality Assessment (SDQA) subsystem for the assessment of the data products that will be produced during the course of a 10-year survey. The LSST will produce unprecedented volumes of astronomical data as it surveys the accessible sky every few nights. The SDQA subsystem will enable comparisons of the science data with expectations from prior experience and models, and with established requirements for the survey. While analogous systems have been built for previous large astronomical surveys, SDQA for LSST must meet a unique combination of challenges. Chief among them will be the extraordinary data rate and volume, which restricts the bulk of the quality computations to the automated processing stages, as revisiting the pixels for a post-facto evaluation is prohibitively expensive. The identification of appropriate scientific metrics is driven by the breadth of the expected science, the scope of the time-domain survey, the need to tap the widest possible pool of scientific expertise, and the historical tendency of new quality metrics to be crafted and refined as experience grows. Prior experience suggests that contemplative, off-line quality analyses are essential to distilling new automated quality metrics, so the SDQA architecture must support integrability with a variety of custom and community-based tools, and be flexible enough to embrace evolving QA demands. Finally, the time-domain nature of LSST means every exposure may be useful for some scientific purpose, so the model of quality thresholds must be sufficiently rich to reflect the quality demands of diverse science aims.
Antichain cutsets of strongly connected posets
Rival and Zaguia showed that the antichain cutsets of a finite Boolean
lattice are exactly the level sets. We show that a similar characterization of
antichain cutsets holds for any strongly connected poset of locally finite
height. As a corollary, we get such a characterization for semimodular
lattices, supersolvable lattices, Bruhat orders, locally shellable lattices,
and many more. We also consider a generalization to strongly connected
hypergraphs having finite edges.
Comment: 12 pages; v2 contains minor fixes for publication.
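The Boolean-lattice case cited first (Rival and Zaguia's result) can be checked directly on a small example. This brute-force sketch (illustrative code, not from the paper) verifies that every level set of the Boolean lattice B_3, i.e. the subsets of {0,1,2} of a fixed size ordered by inclusion, is an antichain cutset: a set of pairwise incomparable elements that meets every maximal chain.

```python
from itertools import combinations, permutations

ground = {0, 1, 2}
# All elements of B_3: subsets of the ground set, as frozensets
elements = [frozenset(c) for k in range(4) for c in combinations(ground, k)]

def maximal_chains():
    # Maximal chains in B_n correspond to orders of adding the n elements
    for order in permutations(ground):
        chain, cur = [frozenset()], set()
        for x in order:
            cur.add(x)
            chain.append(frozenset(cur))
        yield chain

def is_antichain(sets):
    # No two members comparable under (proper) inclusion
    return all(not (a < b or b < a) for a, b in combinations(sets, 2))

def is_cutset(sets):
    # Meets every maximal chain of the lattice
    return all(any(c in sets for c in chain) for chain in maximal_chains())

for k in range(4):
    level = [s for s in elements if len(s) == k]
    assert is_antichain(level) and is_cutset(level)
print("every level set of B_3 is an antichain cutset")
```

The paper's contribution goes the other way: for strongly connected posets of locally finite height, the level sets are the *only* antichain cutsets, which a brute-force check like this could also confirm on small instances.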