The current and future role of artificial intelligence in optimizing donor organ utilization and recipient outcomes in heart transplantation
Heart failure (HF) is a leading cause of morbidity and mortality in the United States. While medical management and mechanical circulatory support have advanced significantly in recent years, orthotopic heart transplantation (OHT) remains the most definitive therapy for refractory HF. OHT has seen steady improvement in patient survival and quality of life (QoL) since its inception, with one-year mortality now under 8%. However, a significant number of HF patients are unable to receive OHT due to the scarcity of donor hearts. The United Network for Organ Sharing has recently revised its organ allocation criteria in an effort to provide more equitable access to OHT. Despite these changes, many potential donor hearts are nevertheless rejected. Arbitrary regulations from the Centers for Medicare and Medicaid Services, and fear of repercussions if one-year survival falls below established thresholds, have led to a current state of excessive risk aversion in deciding which organs are accepted for OHT. Furthermore, non-standardized utilization of extended-criteria donors and donation after circulatory death exacerbates the organ shortage. Data-driven systems can improve donor-recipient matching, better predict patient QoL post-OHT, and decrease needless organ waste through more uniform application of acceptance criteria. We therefore propose a data-driven future for OHT and a move to patient-centric and holistic transplantation care processes.
Scheme dependence of NLO corrections to exclusive processes
We apply the so-called conformal subtraction scheme to predict perturbatively
exclusive processes beyond leading order. Taking into account evolution
effects, we study the scheme dependence for the photon-to-pion transition form
factor and the electromagnetic pion form factor at next-to-leading order for
different pion distribution amplitudes. Relying on the conformally covariant
operator product expansion and using the known higher order results for
polarized deep inelastic scattering, we are able to predict perturbative
corrections to the hard-scattering amplitude of the photon-to-pion transition
form factor beyond next-to-leading order in the conformal scheme restricted to
the conformal limit of the theory. (RevTeX, 25 pages, 2 figures, 5 tables;
to be published in Phys. Rev.)
The PMIP4 contribution to CMIP6 – Part 2: two interglacials, scientific objective and experimental design for Holocene and last interglacial simulations
Two interglacial epochs are included in the suite of Paleoclimate Modeling Intercomparison Project (PMIP4) simulations in the Coupled Model Intercomparison Project (CMIP6). The experimental protocols for Tier 1 simulations of the mid-Holocene (midHolocene, 6000 years before present) and the Last Interglacial (lig127k, 127,000 years before present) are described here. These equilibrium simulations are designed to examine the impact of changes in orbital forcing at times when atmospheric greenhouse gas levels were similar to those of the preindustrial period and the continental configurations were almost identical to modern. These simulations test our understanding of the interplay between radiative forcing and atmospheric circulation, and the connections among large-scale and regional climate changes giving rise to phenomena such as land-sea contrast and high-latitude amplification in temperature changes, and responses of the monsoons, as compared to today. They also provide an opportunity, through carefully designed additional CMIP6 Tier 2 and Tier 3 sensitivity experiments of PMIP4, to quantify the strength of atmosphere, ocean, cryosphere, and land-surface feedbacks. Sensitivity experiments are proposed to investigate the role of freshwater forcing in triggering abrupt climate changes within interglacial epochs. These feedback experiments naturally lead to a focus on climate evolution during interglacial periods, which will be examined through transient experiments. Analyses of the sensitivity simulations will also focus on interactions between extratropical and tropical circulation, and the relationship between changes in mean climate state and climate variability on annual to multi-decadal timescales. 
The comparative abundance of paleoenvironmental data and of quantitative climate reconstructions for the Holocene and Last Interglacial makes these two epochs ideal candidates for systematic evaluation of model performance, and such comparisons will shed new light on the importance of external feedbacks (e.g., vegetation, dust) and the ability of state-of-the-art models to simulate climate changes realistically.
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ
point-source depth in a single visit in r will be ~24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg^2 with
Dec. < +34.5 deg, and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320–1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe
an 18,000 deg^2 region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world. (57 pages,
32 color figures; a version with high-resolution figures is available from
https://www.lsst.org/overvie)
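The quoted single-visit and coadded depths are linked by a standard stacking argument: coadding N equal-depth, background-limited visits deepens the 5σ point-source limit by about 1.25 log10(N) mag. A minimal Python sketch of this relation (the ~184-visit count per band is an illustrative assumption, not a number from the abstract):

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """5-sigma coadded point-source depth from n equal-depth visits.

    Stacking n background-limited exposures improves the flux limit by
    sqrt(n), i.e. the magnitude limit deepens by 2.5*log10(sqrt(n)) =
    1.25*log10(n). This is a simplification: real visits vary in depth.
    """
    return single_visit_depth + 1.25 * math.log10(n_visits)

# With the ~24.5 mag single-visit r-band depth and an assumed ~184
# visits, the coadd lands near the r ~ 27.5 final depth quoted above.
print(round(coadded_depth(24.5, 184), 2))  # ~27.3
```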
Identification and thermochemical analysis of high-lignin feedstocks for biofuel and biochemical production
Background - Lignin is a highly abundant biopolymer synthesized by plants as a complex component of plant secondary cell walls. Efforts to utilize lignin-based bioproducts are needed. Results - Herein we identify and characterize the composition and pyrolytic deconstruction characteristics of high-lignin feedstocks. Feedstocks displaying the highest levels of lignin were identified as drupe endocarp biomass arising as agricultural waste from horticultural crops. By performing pyrolysis coupled to gas chromatography-mass spectrometry, we characterized lignin-derived deconstruction products from endocarp biomass and compared these with switchgrass. By comparing individual pyrolytic products, we document higher amounts of acetic acid, 1-hydroxy-2-propanone, acetone and furfural in switchgrass compared to endocarp tissue, which is consistent with high holocellulose relative to lignin. By contrast, greater yields of lignin-based pyrolytic products such as phenol, 2-methoxyphenol, 2-methylphenol, 2-methoxy-4-methylphenol and 4-ethyl-2-methoxyphenol arising from drupe endocarp tissue are documented. Conclusions - Differences in product yield, thermal decomposition rates and molecular species distribution among the feedstocks illustrate the potential of high-lignin endocarp feedstocks to generate valuable chemicals by thermochemical deconstruction
Disease Gene Characterization through Large-Scale Co-Expression Analysis
In the post-genome era, a major goal of biology is the identification of specific roles for individual genes. We report a new genomic tool for gene characterization, the UCLA Gene Expression Tool (UGET). Celsius, the largest co-normalized microarray dataset of Affymetrix-based gene expression, was used to calculate the correlation between all possible gene pairs on all platforms and to generate stored indexes in a web-searchable format. The size of Celsius makes UGET a powerful gene characterization tool. Using a small seed list of known cartilage-selective genes, UGET extended the list of known genes by identifying 32 new highly cartilage-selective genes. Of these, 7 of 10 tested were validated by qPCR, including the novel cartilage-specific genes SDK2 and FLJ41170. In addition, we retrospectively tested UGET and other gene-expression-based prioritization tools to identify disease-causing genes within known linkage intervals. We first demonstrated this utility with UGET using genetically heterogeneous disorders such as Joubert syndrome, microcephaly, neuropsychiatric disorders and type 2 limb girdle muscular dystrophy (LGMD2), and then compared UGET to other gene-expression-based prioritization programs which use small but discrete and well-annotated datasets. Finally, we observed a significantly higher gene correlation shared between genes in disease networks associated with similar complex or Mendelian disorders. UGET is an invaluable resource for geneticists that permits the rapid inclusion of expression criteria from one to hundreds of genes in genomic intervals linked to disease. By using thousands of arrays, UGET annotates and prioritizes genes better than other tools, especially with rare tissue disorders or complex multi-tissue biological processes. This information can be critical in prioritization of candidate genes for sequence analysis.
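The core UGET operation described above, correlating genome-wide expression and ranking candidates against a seed gene, can be sketched generically (hypothetical toy data; the real Celsius/UGET pipeline co-normalizes thousands of Affymetrix arrays per platform, which this sketch does not attempt):

```python
import numpy as np

def gene_correlation_matrix(expr):
    """Pearson correlation between every pair of genes.

    expr: genes x samples expression matrix; returns a symmetric
    genes x genes matrix whose (i, j) entry correlates genes i and j.
    """
    return np.corrcoef(expr)

def top_correlates(corr, seed, k=3):
    """Indices of the k genes most correlated with a seed gene (excluding itself)."""
    order = np.argsort(corr[seed])[::-1]
    return [g for g in order if g != seed][:k]

# Hypothetical data: gene 1 tracks gene 0, gene 2 is anti-correlated,
# gene 3 is independent noise.
rng = np.random.default_rng(0)
base = rng.normal(size=50)
expr = np.stack([
    base,
    base + 0.1 * rng.normal(size=50),
    -base + 0.1 * rng.normal(size=50),
    rng.normal(size=50),
])
corr = gene_correlation_matrix(expr)
print(top_correlates(corr, seed=0, k=1))  # gene 1 ranks first
```

Computing `np.corrcoef` on all genes at once mirrors the "all possible gene pairs" precomputation; a production index would store these matrices per platform rather than recompute them per query.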
Transforming Growth Factor-β Signaling Is Essential for Limb Regeneration in Axolotls
Axolotls (urodele amphibians) have the unique ability, among vertebrates, to perfectly regenerate many parts of their body including limbs, tail, jaw and spinal cord following injury or amputation. The axolotl limb is the most widely used structure as an experimental model to study tissue regeneration. The process is well characterized, requiring multiple cellular and molecular mechanisms. The preparation phase represents the first part of the regeneration process which includes wound healing, cellular migration, dedifferentiation and proliferation. The redevelopment phase represents the second part when dedifferentiated cells stop proliferating and redifferentiate to give rise to all missing structures. In the axolotl, when a limb is amputated, the missing or wounded part is regenerated perfectly without scar formation between the stump and the regenerated structure. Multiple authors have recently highlighted the similarities between the early phases of mammalian wound healing and urodele limb regeneration. In mammals, one very important family of growth factors implicated in the control of almost all aspects of wound healing is the transforming growth factor-beta family (TGF-β). In the present study, the full length sequence of the axolotl TGF-β1 cDNA was isolated. The spatio-temporal expression pattern of TGF-β1 in regenerating limbs shows that this gene is up-regulated during the preparation phase of regeneration. Our results also demonstrate the presence of multiple components of the TGF-β signaling machinery in axolotl cells. By using a specific pharmacological inhibitor of TGF-β type I receptor, SB-431542, we show that TGF-β signaling is required for axolotl limb regeneration. Treatment of regenerating limbs with SB-431542 reveals that cellular proliferation during limb regeneration as well as the expression of genes directly dependent on TGF-β signaling are down-regulated. 
These data directly implicate TGF-β signaling in the initiation and control of the regeneration process in axolotls
Metacarpal trabecular bone varies with distinct hand-positions used in hominid locomotion
Trabecular bone remodels during life in response to loading and thus should, at least in part, reflect potential variation in the magnitude, frequency and direction of joint loading across different hominid species. Here we analyse the trabecular structure across all non-pollical metacarpal distal heads (Mc2-5) in extant great apes, expanding on previous volume of interest and whole-epiphysis analyses that have largely focussed on only the first or third metacarpal. Specifically, we employ both a univariate statistical mapping and a multivariate approach to test for both inter-ray and interspecific differences in relative trabecular bone volume fraction (RBV/TV) and degree of anisotropy (DA) in Mc2-5 subchondral trabecular bone. Results demonstrate that while DA values only separate Pongo from African apes (Pan troglodytes, Pan paniscus, Gorilla gorilla), RBV/TV distribution varies with the predicted loading of the metacarpophalangeal (McP) joints during locomotor behaviours in each species. Gorilla exhibits a relatively dorsal distribution of RBV/TV consistent with habitual hyper-extension of the McP joints during knuckle-walking, whereas Pongo has a palmar distribution consistent with flexed McP joints used to grasp arboreal substrates. Both Pan species possess a disto-dorsal distribution of RBV/TV, compatible with multiple hand postures associated with a more varied locomotor regime. Further inter-ray comparisons reveal RBV/TV patterns consistent with varied knuckle-walking postures in Pan species in contrast to higher RBV/TV values toward the midline of the hand in Mc2 and Mc5 of Gorilla, consistent with habitual palm-back knuckle-walking. These patterns of trabecular bone distribution and structure reflect different behavioural signals that could be useful for determining the behaviours of fossil hominins
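The bone volume fraction (BV/TV) underlying the RBV/TV measure above is simply the bone-voxel proportion of a segmented volume of interest. A minimal sketch on a hypothetical binary micro-CT volume (the paper's relative measure additionally normalizes BV/TV across subregions of the epiphysis, which is omitted here):

```python
import numpy as np

def bone_volume_fraction(segmented):
    """BV/TV of a binary micro-CT volume of interest.

    segmented: boolean 3-D array, True where a voxel is classified as
    bone. BV/TV = bone voxels / total voxels in the volume of interest.
    """
    return float(segmented.mean())

# Hypothetical volume of interest: a 10x10x10 cube containing a
# 5x5x5 block of "bone" voxels -> 125 of 1000 voxels.
voi = np.zeros((10, 10, 10), dtype=bool)
voi[:5, :5, :5] = True
print(bone_volume_fraction(voi))  # 0.125
```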
Physiological Correlates of Volunteering
We review research on physiological correlates of volunteering, a neglected but promising research field. Some of these correlates seem to be causal factors influencing volunteering. Volunteers tend to have better physical health, both self-reported and expert-assessed, better mental health, and perform better on cognitive tasks. Research thus far has rarely examined neurological, neurochemical, hormonal, and genetic correlates of volunteering to any significant extent, especially controlling for other factors as potential confounds. Evolutionary theory and behavioral genetic research suggest the importance of such physiological factors in humans. Basically, many aspects of social relationships and social activities have effects on health (e.g., Newman and Roberts 2013; Uchino 2004), as the widely used biopsychosocial (BPS) model suggests (Institute of Medicine 2001). Studies of formal volunteering (FV), charitable giving, and altruistic behavior suggest that physiological characteristics are related to volunteering, including specific genes (such as oxytocin receptor [OXTR] genes, Arginine vasopressin receptor [AVPR] genes, dopamine D4 receptor [DRD4] genes, and 5-HTTLPR). We recommend that future research on physiological factors be extended to non-Western populations, focusing specifically on volunteering, and differentiating between different forms and types of volunteering and civic participation