Faster algorithms for 1-mappability of a sequence
In the k-mappability problem, we are given a string x of length n and
integers m and k, and we are asked to count, for each length-m factor y of x,
the number of other factors of length m of x that are at Hamming distance at
most k from y. We focus here on the version of the problem where k = 1. The
fastest known algorithm for k = 1 requires time O(mn log n / log log n) and
space O(n). We present two algorithms that require worst-case time O(mn) and
O(n log^2 n), respectively, and space O(n), thus greatly improving the state of
the art. Moreover, we present an algorithm that requires average-case time and
space O(n) for integer alphabets if m = Ω(log n / log σ), where σ is the
alphabet size.
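As a concrete reference point, the quantity being counted can be computed with a naive quadratic-time baseline (an illustrative sketch with a hypothetical function name, not one of the paper's algorithms):

```python
def one_mappability(x: str, m: int, k: int = 1) -> list[int]:
    """Naive O(n^2 * m) baseline for k-mappability: for each length-m
    factor y of x, count the other length-m factors of x at Hamming
    distance at most k from y."""
    factors = [x[i:i + m] for i in range(len(x) - m + 1)]
    counts = []
    for i, y in enumerate(factors):
        c = 0
        for j, z in enumerate(factors):
            if i == j:
                continue
            # Hamming distance with early exit once k is exceeded
            d = 0
            for a, b in zip(y, z):
                if a != b:
                    d += 1
                    if d > k:
                        break
            if d <= k:
                c += 1
        counts.append(c)
    return counts

print(one_mappability("aabba", 2, 1))  # [2, 2, 2, 2]
```

The sketch exists only to pin down the problem statement; for k = 1 the algorithms above achieve O(mn) or O(n log^2 n) worst-case time.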
A fixed-target platform for serial femtosecond crystallography in a hydrated environment.
For serial femtosecond crystallography at X-ray free-electron lasers, which entails collection of single-pulse diffraction patterns from a constantly refreshed supply of microcrystalline sample, delivery of the sample into the X-ray beam path while maintaining low background remains a technical challenge for some experiments, especially where this methodology is applied to relatively low-ordered samples or those difficult to purify and crystallize in large quantities. This work demonstrates a scheme to encapsulate biological samples using polymer thin films and graphene to maintain sample hydration in vacuum conditions. The encapsulated sample is delivered into the X-ray beam on fixed targets for rapid scanning using the Roadrunner fixed-target system towards a long-term goal of low-background measurements on weakly diffracting samples. As a proof of principle, we used microcrystals of the 24 kDa rapid encystment protein (REP24) to provide a benchmark for polymer/graphene sandwich performance. The REP24 microcrystal unit cell obtained from our sandwiched in-vacuum sample was consistent with previously established unit-cell parameters and with those measured by us without encapsulation in humidified helium, indicating that the platform is robust against evaporative losses. While significant scattering from water was observed because of the sample-deposition method, the polymer/graphene sandwich itself was shown to contribute minimally to background scattering.
Deciding Quantifier-Free Presburger Formulas Using Parameterized Solution Bounds
Given a formula in quantifier-free Presburger arithmetic, if it has a
satisfying solution, there is one whose size, measured in bits, is polynomially
bounded in the size of the formula. In this paper, we consider a special class
of quantifier-free Presburger formulas in which most linear constraints are
difference (separation) constraints, and the non-difference constraints are
sparse. This class has been observed to commonly occur in software
verification. We derive a new solution bound in terms of parameters
characterizing the sparseness of linear constraints and the number of
non-difference constraints, in addition to traditional measures of formula
size. In particular, we show that the number of bits needed per integer
variable is linear in the number of non-difference constraints and logarithmic
in the number and size of non-zero coefficients in them, but is otherwise
independent of the total number of linear constraints in the formula. The
derived bound can be used in a decision procedure based on instantiating
integer variables over a finite domain and translating the input
quantifier-free Presburger formula to an equi-satisfiable Boolean formula,
which is then checked using a Boolean satisfiability solver. In addition to our
main theoretical result, we discuss several optimizations for deriving tighter
bounds in practice. Empirical evidence indicates that our decision procedure
can greatly outperform other decision procedures.
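The bound-then-instantiate strategy described above can be sketched on a toy conjunction of difference and sparse non-difference constraints (the formula, encoding, and 2-bit domain below are illustrative assumptions, not the paper's construction, and a real procedure would translate to SAT rather than enumerate):

```python
from itertools import product

def satisfiable(constraints, num_vars, bits):
    """Decide a conjunction of linear constraints over non-negative
    integers by instantiating every variable over the finite domain
    [0, 2**bits) -- the bound-then-search idea in miniature.
    Each constraint (coeffs, rhs) encodes sum(c * v) <= rhs."""
    for assignment in product(range(2 ** bits), repeat=num_vars):
        if all(sum(c * v for c, v in zip(coeffs, assignment)) <= rhs
               for coeffs, rhs in constraints):
            return assignment  # first satisfying assignment found
    return None

# Hypothetical formula over (x, y, z): two difference constraints,
# x - y <= -1 and y - z <= -1, plus one sparse non-difference
# constraint 2x + 3z <= 10.
phi = [((1, -1, 0), -1), ((0, 1, -1), -1), ((2, 0, 3), 10)]
print(satisfiable(phi, 3, 2))  # (0, 1, 2)
```

The point of the paper's parameterized bound is that `bits` can stay small when non-difference constraints are few and sparse, even if the formula has many difference constraints.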
Airborne observations of methane emissions from rice cultivation in the Sacramento Valley of California
Airborne measurements of methane (CH4) and carbon dioxide (CO2) were taken over the rice growing region of California's Sacramento Valley in the late spring of 2010 and 2011. From these and ancillary measurements, we show that CH4 mixing ratios were higher in the planetary boundary layer above the Sacramento Valley during the rice growing season than they were before it, which we attribute to emissions from rice paddies. We derive daytime emission fluxes of CH4 between 0.6 and 2.0% of the CO2 taken up by photosynthesis on a per carbon, or mole to mole, basis. We also use a mixing model to determine an average CH4/CO2 flux ratio of -0.6% for one day early in the growing season of 2010. We conclude the CH4/CO2 flux ratio estimates from a single rice field in a previous study are representative of rice fields in the Sacramento Valley. If generally true, the California Air Resources Board (CARB) greenhouse gas inventory emission rate of 2.7×10^10 g CH4/yr is approximately three times lower than the range of probable CH4 emissions (7.8-9.3×10^10 g CH4/yr) from rice cultivation derived in this study. We attribute this difference to decreased burning of the residual rice crop since 1991, which leads to an increase in CH4 emissions from rice paddies in succeeding years, but which is not accounted for in the CARB inventory. © 2012. American Geophysical Union. All Rights Reserved.
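The factor-of-three comparison is simple arithmetic on the figures quoted above; a quick check:

```python
# Figures quoted in the abstract (grams of CH4 per year)
carb_inventory = 2.7e10
study_low, study_high = 7.8e10, 9.3e10

ratio_low = study_low / carb_inventory
ratio_high = study_high / carb_inventory
print(f"{ratio_low:.1f}x to {ratio_high:.1f}x")  # 2.9x to 3.4x
```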
Considering the Case for Biodiversity Cycles: Reexamining the Evidence for Periodicity in the Fossil Record
Medvedev and Melott (2007) have suggested that periodicity in fossil
biodiversity may be induced by cosmic rays which vary as the Solar System
oscillates normal to the galactic disk. We re-examine the evidence for a 62
million year (Myr) periodicity in biodiversity throughout the Phanerozoic
history of animal life reported by Rohde & Mueller (2005), as well as related
questions of periodicity in origination and extinction. We find that the signal
is robust against variations in methods of analysis, and is based on
fluctuations in the Paleozoic and a substantial part of the Mesozoic.
Examination of origination and extinction is somewhat ambiguous, with results
depending upon the procedure. Origination and extinction intensity as defined by Rohde & Mueller (RM)
may be affected by an artifact at 27 Myr in the duration of stratigraphic
intervals. Nevertheless, when a procedure free of this artifact is implemented,
the 27 Myr periodicity appears in origination, suggesting that the artifact may
ultimately be based on a signal in the data. A 62 Myr feature appears in
extinction, when this same procedure is used. We conclude that evidence for a
periodicity at 62 Myr is robust, and evidence for periodicity at approximately
27 Myr is also present, albeit more ambiguous.
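The kind of spectral test used to detect such periodicities can be illustrated on a synthetic series (purely illustrative; the published analyses work on the detrended fossil diversity curve, and the helper below is a hypothetical name):

```python
import math

def dominant_period(samples, dt):
    """Return the period of the strongest Fourier component of an
    evenly sampled series (mean removed), in the units of dt.
    A bare-bones stand-in for a periodicity search."""
    n = len(samples)
    mean = sum(samples) / n
    xs = [s - mean for s in samples]
    best_power, best_period = 0.0, None
    for f in range(1, n // 2 + 1):  # DFT bins up to the Nyquist frequency
        re = sum(x * math.cos(2 * math.pi * f * t / n) for t, x in enumerate(xs))
        im = sum(x * math.sin(2 * math.pi * f * t / n) for t, x in enumerate(xs))
        power = re * re + im * im
        if power > best_power:
            best_power, best_period = power, n * dt / f
    return best_period

# Synthetic "diversity" curve: a pure 62 Myr cycle sampled every 1 Myr
series = [math.sin(2 * math.pi * t / 62) for t in range(496)]
print(dominant_period(series, 1))  # 62.0
```

Real data require detrending and significance testing against a noise background, which is where the robustness checks described above come in.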
Revealing the electronic structure of a carbon nanotube carrying a supercurrent
Carbon nanotubes (CNTs) are not intrinsically superconducting but they can
carry a supercurrent when connected to superconducting electrodes. This
supercurrent is mainly transmitted by discrete entangled electron-hole states
confined to the nanotube, called Andreev Bound States (ABS). These states are a
key concept in mesoscopic superconductivity as they provide a universal
description of Josephson-like effects in quantum-coherent nanostructures (e.g.
molecules, nanowires, magnetic or normal metallic layers) connected to
superconducting leads. We report here the first tunneling spectroscopy of
individually resolved ABS, in a nanotube-superconductor device. Analyzing the
evolution of the ABS spectrum with a gate voltage, we show that the ABS arise
from the discrete electronic levels of the molecule and that they reveal
detailed information about the energies of these levels, their relative spin
orientation and the coupling to the leads. Such measurements hence constitute a
powerful new spectroscopic technique capable of elucidating the electronic
structure of CNT-based devices, including those with well-coupled leads. This
is relevant for conventional applications (e.g. superconducting or normal
transistors, SQUIDs) and quantum information processing (e.g. entangled
electron-pair generation, ABS-based qubits). Finally, our device is a new type
of dc-measurable SQUID.
Biodiversity Loss and the Taxonomic Bottleneck: Emerging Biodiversity Science
Human domination of the Earth has resulted in dramatic changes to global and local patterns of biodiversity. Biodiversity is critical to human sustainability because it drives the ecosystem services that provide the core of our life-support system. As we, the human species, are the primary factor leading to the decline in biodiversity, we need detailed information about the biodiversity and species composition of specific locations in order to understand how different species contribute to ecosystem services and how humans can sustainably conserve and manage biodiversity. Taxonomy and ecology, two fundamental sciences that generate the knowledge about biodiversity, are associated with a number of limitations that prevent them from providing the information needed to fully understand the relevance of biodiversity in its entirety for human sustainability: (1) biodiversity conservation strategies that tend to be overly focused on research and policy on a global scale with little impact on local biodiversity; (2) the small knowledge base of extant global biodiversity; (3) a lack of much-needed site-specific data on the species composition of communities in human-dominated landscapes, which hinders ecosystem management and biodiversity conservation; (4) biodiversity studies with a lack of taxonomic precision; (5) a lack of taxonomic expertise and trained taxonomists; (6) a taxonomic bottleneck in biodiversity inventory and assessment; and (7) neglect of taxonomic resources and a lack of taxonomic service infrastructure for biodiversity science. These limitations are directly related to contemporary trends in research, conservation strategies, environmental stewardship, environmental education, sustainable development, and local site-specific conservation. Today’s biological knowledge is built on the known global biodiversity, which represents barely 20% of what is currently extant (commonly accepted estimate of 10 million species) on planet Earth. 
Much remains unexplored and unknown, particularly in hotspot regions of Africa, Southeast Asia, and South and Central America, including many developing or underdeveloped countries, where localized biodiversity is scarcely studied or described. "Backyard biodiversity", defined as local biodiversity near human habitation, refers to the natural resources and capital for ecosystem services at the grassroots level, which urgently needs to be explored, documented, and conserved as it is the backbone of sustainable economic development in these countries. Beginning with early identification and documentation of local flora and fauna, taxonomy has documented global biodiversity and natural history based on the collection of "backyard biodiversity" specimens worldwide. However, this branch of science suffered a continuous decline in the latter half of the twentieth century, and has now reached a point of potential demise. At present there are very few professional taxonomists and trained local parataxonomists worldwide, while the need for, and demands on, taxonomic services by conservation and resource management communities are rapidly increasing. Systematic collections, the material basis of biodiversity information, have been neglected and abandoned, particularly at institutions of higher learning. Considering the rapid increase in the human population and urbanization, human sustainability requires new conceptual and practical approaches to refocusing and energizing the study of the biodiversity that is the core of natural resources for sustainable development and biotic capital for sustaining our life-support system.
In this paper we aim to document and extrapolate the essence of biodiversity, discuss the state and nature of the taxonomic demise and the trends of recent biodiversity studies, and suggest reasonable approaches to a biodiversity science that would facilitate the expansion of global biodiversity knowledge and create useful data on backyard biodiversity worldwide towards human sustainability.
Patterns of primary care and mortality among patients with schizophrenia or diabetes: a cluster analysis approach to the retrospective study of healthcare utilization
Abstract Background Patients with schizophrenia have difficulty managing their medical healthcare needs, possibly resulting in delayed treatment and poor outcomes. We analyzed whether patients reduced primary care use over time, differentially by diagnosis with schizophrenia, diabetes, or both schizophrenia and diabetes. We also assessed whether such patterns of primary care use were a significant predictor of mortality over a 4-year period. Methods The Veterans Healthcare Administration (VA) is the largest integrated healthcare system in the United States. Administrative extracts of the VA's all-electronic medical records were studied. Patients over age 50 and diagnosed with schizophrenia in 2002 were age-matched 1:4 to diabetes patients. All patients were followed through 2005. Cluster analysis explored trajectories of primary care use. Proportional hazards regression modelled the impact of these primary care utilization trajectories on survival, controlling for demographic and clinical covariates. Results Patients comprised three diagnostic groups: diabetes only (n = 188,332), schizophrenia only (n = 40,109), and schizophrenia with diabetes (Scz-DM, n = 13,025). Cluster analysis revealed four distinct trajectories of primary care use: consistent over time, increasing over time, high and decreasing, low and decreasing. Patients with schizophrenia only were likely to have low-decreasing use (73% schizophrenia-only vs 54% Scz-DM vs 52% diabetes). Increasing use was least common among schizophrenia patients (4% vs 8% Scz-DM vs 7% diabetes) and was associated with improved survival. Low-decreasing primary care, compared to consistent use, was associated with shorter survival controlling for demographics and case-mix. The observational study was limited by reliance on administrative data. Conclusion Regular primary care and high levels of primary care were associated with better survival for patients with chronic illness, whether psychiatric or medical. 
For schizophrenia patients, with or without comorbid diabetes, primary care offers a survival benefit, suggesting that innovations in treatment retention targeting at-risk groups can offer significant promise of improving outcomes.
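The abstract does not specify the clustering algorithm; as an illustrative stand-in, trajectories of yearly visit counts can be grouped with a minimal k-means sketch (synthetic data and two clusters for brevity, versus the study's four trajectories over VA records):

```python
def kmeans(points, k, iters=20):
    """Minimal k-means over fixed-length trajectories; each point is a
    tuple of yearly primary-care visit counts.  Seeds with the first k
    points for determinism."""
    centers = [list(points[i]) for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            groups[nearest].append(p)
        for j, g in enumerate(groups):
            if g:  # recompute each centroid as the member-wise mean
                centers[j] = [sum(col) / len(g) for col in zip(*g)]
    return centers, groups

# Hypothetical 4-year trajectories: "consistent" vs "low and decreasing"
trajs = [(6, 6, 6, 6), (7, 6, 7, 6), (3, 2, 1, 0), (4, 2, 1, 1)]
centers, groups = kmeans(trajs, 2)
print(groups[0])  # [(3, 2, 1, 0), (4, 2, 1, 1)]
print(groups[1])  # [(6, 6, 6, 6), (7, 6, 7, 6)]
```

In the study itself, the resulting cluster memberships were then entered as predictors in a proportional hazards model of 4-year survival.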
A hippocampal Cdk5 pathway regulates extinction of contextual fear
Treatment of emotional disorders involves the promotion of extinction processes, which are defined as the learned reduction of fear. The molecular mechanisms underlying extinction have only begun to be elucidated. By employing genetic and pharmacological approaches in mice, we show here that extinction requires downregulation of Rac-1 and cyclin-dependent kinase 5 (Cdk5), and upregulation of p21-activated kinase-1 (PAK-1) activity. This is physiologically achieved by a Rac-1–dependent relocation of the Cdk5 activator p35 from the membrane to the cytosol and dissociation of p35 from PAK-1. Moreover, our data suggest that Cdk5/p35 activity prevents extinction in part by inhibition of PAK-1 activity in a Rac-1–dependent manner. We propose that extinction of contextual fear is regulated by counteracting components of a molecular pathway involving Rac-1, Cdk5 and PAK-1. Our data suggest that this pathway could provide a suitable target for therapeutic treatment of emotional disorders. Funding: National Institutes of Health (U.S.) (Grant NS051874); Alexander von Humboldt-Stiftung (German Research Foundation Fellowship); European Neuroscience Institute Goettingen.
Rationale, design and conduct of a randomised controlled trial evaluating a primary care-based complex intervention to improve the quality of life of heart failure patients: HICMan (Heidelberg Integrated Case Management) : study protocol
Background: Chronic congestive heart failure (CHF) is a complex disease with rising prevalence, compromised quality of life (QoL), unplanned hospital admissions, high mortality and therefore high burden of illness. The delivery of care for these patients has been criticized and new strategies addressing crucial domains of care have been shown to be effective on patients' health outcomes, although these trials were conducted in secondary care or in highly organised Health Maintenance Organisations. It remains unclear whether comprehensive primary care-based case management for the treating general practitioner (GP) can improve patients' QoL. Methods/Design: HICMan is a randomised controlled trial with patients as the unit of randomisation. The aim is to evaluate a structured, standardized and comprehensive complex intervention for patients with CHF in a 12-month follow-up trial. Patients in the intervention group receive specific patient leaflets and documentation booklets as well as regular monitoring and screening by a previously trained practice nurse, who gives feedback to the GP upon urgency. Monitoring and screening address aspects of disease-specific self-management, (non-)pharmacological adherence and psychosomatic and geriatric comorbidity. GPs are invited to provide tailored structured counselling 4 times during the trial and receive additional feedback on pharmacotherapy relevant to prognosis (data from the baseline documentation). Patients in the control group receive usual care from their GPs, who were introduced to guideline-oriented management and a tailored health counselling concept. The main outcome measure for patients' QoL is the physical functioning scale of the SF-36 health questionnaire at 12-month follow-up.
Secondary outcomes are disease-specific QoL measured by the Kansas City Cardiomyopathy Questionnaire (KCCQ), depression and anxiety disorders (PHQ-9, GAD-7), adherence (EHFScBS and SANA), quality of care measured by an adapted version of the Patient Assessment of Chronic Illness Care questionnaire (PACIC), and NT-proBNP. In addition, comprehensive clinical data are collected about health status, comorbidity, medication and health care utilisation. Discussion: As the targeted patient group is mostly cared for and treated by GPs, a comprehensive primary care-based guideline implementation including somatic, psychosomatic and organisational aspects of the delivery of care (HICMan) is a promising intervention applying proven strategies for optimal care. Trial registration: Current Controlled Trials ISRCTN30822978.
