Globular Cluster and Galaxy Formation: M31, the Milky Way and Implications for Globular Cluster Systems of Spiral Galaxies
The globular cluster (GC) systems of the Milky Way and of our neighboring
spiral galaxy, M31, comprise 2 distinct entities, differing in 3 respects. 1.
M31 has young GCs, with ages from ~100 Myr to 5 Gyr, as well as old globular
clusters. No such young GCs are known in the Milky Way. 2. We confirm that the
oldest M31 GCs have much higher nitrogen abundances than do Galactic GCs at
equivalent metallicities. 3. Morrison et al. found M31 has a subcomponent of
GCs that follow closely the disk rotation curve of M31. Such a GC system in our
own Galaxy has yet to be found. These data are interpreted in terms of the
hierarchical-clustering-merging (HCM) paradigm for galaxy formation. We infer
that M31 has absorbed more of its dwarf systems than has the Milky Way. This
inference has 3 implications: 1. All spiral galaxies likely differ in their GC
properties, depending on how many companions each galaxy has, and when the
parent galaxy absorbs them. The Milky Way ties down one end of this
spectrum, as almost all of its GCs were absorbed 10-12 Gyr ago. 2. It suggests
that young GCs are preferentially formed in the dwarf companions of parent
galaxies, and then absorbed by the parent galaxy during mergers. 3. Young GCs
seen in tidally-interacting galaxies might come from dwarf companions of these
galaxies, rather than be made anew in the tidal interaction. There is no ready
explanation for the marked difference in nitrogen abundance for old M31 GCs
relative to the oldest Galactic GCs. The predictions made by Li & Burstein
regarding the origin of nitrogen abundance in globular clusters are consistent
with what is found for the old M31 GCs compared to that for the two 5 Gyr-old
M31 GCs.
Comment: to be published in ApJ, Oct 2004; 13 pages of text, 2 tables, 7 postscript figures
Prospectus, November 26, 1986
Effects of donor cause of death, ischemia time, inotrope exposure, troponin values, cardiopulmonary resuscitation, electrocardiographic and echocardiographic data on recipient outcomes: A review of the literature
Background: Heart transplantation has become the standard of care for pediatric patients with either end-stage heart failure or inoperable congenital heart defects. Despite increasing surgical complexity and overall volume, however, annual transplant rates remain largely unchanged. Data demonstrating pediatric donor heart refusal rates of 50% suggest optimizing donor utilization is critical. This review evaluated the impact of donor characteristics surrounding the time of death on pediatric heart transplant recipient outcomes.
Methods: An extensive literature review was performed to identify articles focused on donor characteristics surrounding the time of death and their impact on pediatric heart transplant recipient outcomes.
Results: Potential pediatric heart transplant recipient institutions commonly receive data from seven different donor death-related categories with which to determine organ acceptance: cause of death, need for CPR, serum troponin, inotrope exposure, projected donor ischemia time (DIT), electrocardiographic, and echocardiographic results. Although DITs up to 8 hours have been reported with comparable recipient outcomes, most data support minimizing this period to <4 hours. CVA as a cause of death may be associated with decreased recipient survival but is rare in the pediatric population. Otherwise, however, in the setting of an acceptable donor heart with a normal echocardiogram, none of the other data categories surrounding donor death negatively impact pediatric heart transplant recipient survival.
Conclusions: Echocardiographic evaluation is the most important donor clinical information provided to potential recipient institutions following declaration of brain death. Considering its relative importance, every effort should be made to allow direct image visualization.
A Temporal Role Of Type I Interferon Signaling in CD8+ T Cell Maturation during Acute West Nile Virus Infection
A genetic absence of the common IFN-α/β signaling receptor (IFNAR) in mice is associated with enhanced viral replication and altered adaptive immune responses. However, analysis of IFNAR-/- mice is limited for studying the functions of type I IFN at discrete stages of viral infection. To define the temporal functions of type I IFN signaling in the context of infection by West Nile virus (WNV), we treated mice with MAR1-5A3, a neutralizing, non-cell-depleting anti-IFNAR antibody. Inhibition of type I IFN signaling at or before day 2 after infection was associated with markedly enhanced viral burden, whereas treatment at day 4 had substantially less effect on WNV dissemination. While antibody treatment prior to infection resulted in massive expansion of virus-specific CD8+ T cells, blockade of type I IFN signaling starting at day 4 induced dysfunctional CD8+ T cells with depressed cytokine responses and expression of phenotypic markers suggesting exhaustion. Thus, only the later maturation phase of anti-WNV CD8+ T cell development requires type I IFN signaling. WNV infection experiments in BATF3-/- mice, which lack CD8α+ dendritic cells and have impaired priming due to inefficient antigen cross-presentation, revealed a similar effect of blocking IFN signaling on CD8+ T cell maturation. Collectively, our results suggest that cell non-autonomous type I IFN signaling shapes maturation of the antiviral CD8+ T cell response at a stage distinct from the initial priming event
Sensory Communication
Contains table of contents for Section 2, an introduction, and reports on twelve research projects.
National Institutes of Health Grant 5 R01 DC00117
National Institutes of Health Contract 2 P01 DC00361
National Institutes of Health Grant 5 R01 DC00126
National Institutes of Health Grant R01-DC00270
U.S. Air Force - Office of Scientific Research Contract AFOSR-90-0200
National Institutes of Health Grant R29-DC00625
U.S. Navy - Office of Naval Research Grant N00014-88-K-0604
U.S. Navy - Office of Naval Research Grant N00014-91-J-1454
U.S. Navy - Office of Naval Research Grant N00014-92-J-1814
U.S. Navy - Naval Training Systems Center Contract N61339-93-M-1213
U.S. Navy - Naval Training Systems Center Contract N61339-93-C-0055
U.S. Navy - Naval Training Systems Center Contract N61339-93-C-0083
U.S. Navy - Office of Naval Research Grant N00014-92-J-4005
U.S. Navy - Office of Naval Research Grant N00014-93-1-119
Kinematic Plasticity during Flight in Fruit Bats: Individual Variability in Response to Loading
All bats experience daily and seasonal fluctuation in body mass. An increase in mass requires changes in flight kinematics to produce the extra lift necessary to compensate for increased weight. How bats modify their kinematics to increase lift, however, is not well understood. In this study, we investigated the effect of a 20% increase in mass on flight kinematics for Cynopterus brachyotis, the lesser dog-faced fruit bat. We reconstructed the 3D wing kinematics and how they changed with the additional mass. Bats showed a marked change in wing kinematics in response to loading, but changes varied among individuals. Each bat adjusted a different combination of kinematic parameters to increase lift, indicating that aerodynamic force generation can be modulated in multiple ways. Two main kinematic strategies were distinguished: bats either changed the motion of the wings by primarily increasing wingbeat frequency, or changed the configuration of the wings by increasing wing area and camber. The complex, individual-dependent response to increased loading in our bats points to an underappreciated aspect of locomotor control, in which the inherent complexity of the biomechanical system allows for kinematic plasticity. The kinematic plasticity and functional redundancy observed in bat flight can have evolutionary consequences, such as an increased potential for morphological and kinematic diversification due to weakened locomotor trade-offs
The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy
Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations.
Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves.
Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the predicted risk of an operation lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score.
Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care
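The two steps this abstract describes can be sketched minimally: deriving integer points from logistic-regression coefficients, and checking the score's discrimination with the area under the ROC curve. The coefficients, variable names, and reference value below are illustrative assumptions, not the CholeS model.

```python
# Hedged sketch, not the published CholeS tool: hypothetical coefficients
# are converted to integer points, and discrimination is measured by AUC.

def points(betas, base=0.5):
    # A common recipe: divide each coefficient by a reference value
    # (here an assumed base of 0.5) and round to whole-number points.
    return {name: round(b / base) for name, b in betas.items()}

def auc(scores, labels):
    # AUC equals the probability that a randomly chosen long operation
    # (label 1) outscores a randomly chosen short one (Mann-Whitney U),
    # counting ties as half a win.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.708, as reported on the external cohort, means a randomly chosen long operation outscores a randomly chosen short one about 71% of the time.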
Finishing the euchromatic sequence of the human genome
The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). 
Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (-1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions. Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
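The primary end point above uses an unusual coding (patients who died are assigned -1 day). A minimal sketch of that coding, with illustrative patient records rather than trial data:

```python
# Hedged sketch of the organ support-free days (OSFD) end point described
# above: days alive and free of ICU-based respiratory or cardiovascular
# support within 21 days, with death assigned -1. Records are illustrative.

def osfd(support_days, died, window=21):
    # Death dominates: -1 regardless of how much support was received.
    if died:
        return -1
    # Otherwise, days in the window not spent on organ support.
    return max(window - support_days, 0)

def median(values):
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
```

With this coding, a group median of 0 (as reported for all three arms) can mask differences that only appear in mortality and in support-free days among survivors, which is why the abstract reports those components separately.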
Mortality from gastrointestinal congenital anomalies at 264 hospitals in 74 low-income, middle-income, and high-income countries: a multicentre, international, prospective cohort study
Summary
Background Congenital anomalies are the fifth leading cause of mortality in children younger than 5 years globally.
Many gastrointestinal congenital anomalies are fatal without timely access to neonatal surgical care, but few studies
have been done on these conditions in low-income and middle-income countries (LMICs). We compared outcomes of
the seven most common gastrointestinal congenital anomalies in low-income, middle-income, and high-income
countries globally, and identified factors associated with mortality.
Methods We did a multicentre, international prospective cohort study of patients younger than 16 years, presenting to
hospital for the first time with oesophageal atresia, congenital diaphragmatic hernia, intestinal atresia, gastroschisis,
exomphalos, anorectal malformation, and Hirschsprung’s disease. Recruitment was of consecutive patients for a
minimum of 1 month between October, 2018, and April, 2019. We collected data on patient demographics, clinical
status, interventions, and outcomes using the REDCap platform. Patients were followed up for 30 days after primary
intervention, or 30 days after admission if they did not receive an intervention. The primary outcome was all-cause,
in-hospital mortality for all conditions combined and each condition individually, stratified by country income status.
We did a complete case analysis.
Findings We included 3849 patients with 3975 study conditions (560 with oesophageal atresia, 448 with congenital
diaphragmatic hernia, 681 with intestinal atresia, 453 with gastroschisis, 325 with exomphalos, 991 with anorectal
malformation, and 517 with Hirschsprung’s disease) from 264 hospitals (89 in high-income countries, 166 in middle-income countries, and nine in low-income countries) in 74 countries. Of the 3849 patients, 2231 (58·0%) were male.
Median gestational age at birth was 38 weeks (IQR 36–39) and median bodyweight at presentation was 2·8 kg (2·3–3·3).
Mortality among all patients was 37 (39·8%) of 93 in low-income countries, 583 (20·4%) of 2860 in middle-income
countries, and 50 (5·6%) of 896 in high-income countries (p<0·0001 between all country income groups).
Gastroschisis had the greatest difference in mortality between country income strata (nine [90·0%] of ten in low-income countries, 97 [31·9%] of 304 in middle-income countries, and two [1·4%] of 139 in high-income countries;
p≤0·0001 between all country income groups). Factors significantly associated with higher mortality for all patients
combined included country income status (low-income vs high-income countries, risk ratio 2·78 [95% CI 1·88–4·11],
p<0·0001; middle-income vs high-income countries, 2·11 [1·59–2·79], p<0·0001), sepsis at presentation (1·20
[1·04–1·40], p=0·016), higher American Society of Anesthesiologists (ASA) score at primary intervention
(ASA 4–5 vs ASA 1–2, 1·82 [1·40–2·35], p<0·0001; ASA 3 vs ASA 1–2, 1·58 [1·30–1·92], p<0·0001), surgical safety
checklist not used (1·39 [1·02–1·90], p=0·035), and ventilation or parenteral nutrition unavailable when needed
(ventilation 1·96, [1·41–2·71], p=0·0001; parenteral nutrition 1·35, [1·05–1·74], p=0·018). Administration of
parenteral nutrition (0·61, [0·47–0·79], p=0·0002) and use of a peripherally inserted central catheter (0·65
[0·50–0·86], p=0·0024) or percutaneous central line (0·69 [0·48–1·00], p=0·049) were associated with lower mortality.
Interpretation Unacceptable differences in mortality exist for gastrointestinal congenital anomalies between low-income, middle-income, and high-income countries. Improving access to quality neonatal surgical care in LMICs will
be vital to achieve Sustainable Development Goal 3.2 of ending preventable deaths in neonates and children younger
than 5 years by 2030
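The risk ratios quoted in the Findings are adjusted estimates from the study's model. As a sketch of the underlying statistic only, a crude (unadjusted) risk ratio with a Katz log-scale 95% confidence interval can be computed as follows; the counts in the example are illustrative, not the study's adjusted results.

```python
import math

def risk_ratio(a, n1, b, n0, z=1.96):
    # Crude risk ratio for a events among n1 exposed vs b among n0
    # unexposed, with the Katz log-interval for the 95% CI.
    rr = (a / n1) / (b / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)  # SE of log(RR)
    lo = rr * math.exp(-z * se)
    hi = rr * math.exp(z * se)
    return rr, lo, hi
```

A crude ratio computed this way will generally differ from the adjusted values above, which account for sepsis, ASA score, and the other covariates in the model.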