Impact of a hospice rapid response service on preferred place of death, and costs
Background: Many people with a terminal illness would prefer to die at home. A new palliative rapid response service (RRS) provided by a large hospice provider in South East England was evaluated in 2010 to provide evidence of its impact on achieving preferred place of death, and of its costs. The RRS was delivered by a team of trained health care assistants and was available 24/7. The purpose of this study was to (i) compare the characteristics of RRS users and non-users, (ii) explore differences in the proportions of users and non-users dying in the place of their choice, and (iii) monitor whole-system service utilisation by users and non-users, and compare costs. Methods: All hospice patients who died with a preferred place of death recorded during an 18-month period were included. Data (demographics, preferences for place of death) were obtained from hospice records. Dying in the preferred place was modelled using stepwise logistic regression analysis. Service use data (covering the period between referral to the hospice and death) were obtained from general practitioners, community providers, hospitals, social services and the hospice, and costs were calculated using validated national tariffs. Results: Of 688 patients referred to the hospice while the RRS was operational, 247 (35.9%) used it. A higher proportion of RRS users than non-users lived in their own homes with a co-resident carer (40.3% vs. 23.7%); more non-users than users lived alone or in residential care (76.3% vs. 58.8%). The chances of dying in the preferred place were enhanced 2.1 times by being an RRS user, compared to a non-user, and 1.5 times by having a co-resident carer, compared to living at home alone or in a care home. Total service costs did not differ between users and non-users, except for those referred to the hospice very close to death (users had higher costs). Conclusions: Use of the RRS was associated with an increased likelihood of dying in the preferred place. The RRS is cost neutral.
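The 2.1-fold and 1.5-fold figures above are odds ratios estimated by the stepwise logistic regression. As a minimal sketch of what an (unadjusted) odds ratio means, computed from a 2x2 table — the counts and the helper name below are hypothetical illustrations, not the study's data:

```python
def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Unadjusted odds ratio from a 2x2 table of counts."""
    odds_exposed = exposed_event / exposed_no_event
    odds_unexposed = unexposed_event / unexposed_no_event
    return odds_exposed / odds_unexposed

# Hypothetical counts: 40 of 100 RRS users died in their preferred place,
# vs. 25 of 100 non-users. (Illustrative only; not the study's figures.)
print(odds_ratio(40, 60, 25, 75))  # → 2.0
```

The study's 2.1 estimate is adjusted for the other covariates retained by the stepwise procedure, which a raw 2x2 ratio like this does not capture.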
Teosinte Inflorescence Phytolith Assemblages Mirror Zea Taxonomy
Molecular DNA analyses of the New World grass (Poaceae) genus Zea, comprising five species, have resolved taxonomic issues, including identifying the most likely teosinte progenitor (Zea mays ssp. parviglumis) of maize (Zea mays ssp. mays). However, archaeologically, little is known about the use of teosinte by humans either before or after the domestication of maize. One potential line of evidence for exploring these relationships is the opaline phytoliths produced in teosinte fruit cases. Here we use multidimensional scaling and multiple discriminant analyses to determine whether rondel phytolith assemblages from teosinte fruit cases reflect teosinte taxonomy. Our results indicate that rondel phytolith assemblages from the various taxa, including subspecies, can be statistically discriminated. This indicates that it will be possible to investigate the archaeological histories of teosinte use, pending the recovery of appropriate samples.
The persisting burden of invasive pneumococcal disease in HIV patients: an observational cohort study
<p>Abstract</p> <p>Background</p> <p>The increasing use of highly active antiretroviral therapy (HAART) and pneumococcal immunization, along with shifting community exposures, may have altered the burden of <it>Streptococcus pneumoniae </it>disease in HIV-infected persons. We describe the burden and risk factors for pneumococcal disease in the modern era of HIV care and evaluate the use of a 23-valent pneumococcal polysaccharide vaccine (PPV-23).</p> <p>Methods</p> <p>The incidence of invasive pneumococcal disease (IPD) between January 1<sup>st</sup>, 2000 and January 1<sup>st</sup>, 2010 in a regional HIV population in Southern Alberta, Canada was determined by linking comprehensive laboratory and hospital surveillance data. Clinical and epidemiologic data, including risk factors for <it>S. pneumoniae</it>, history of pneumococcal immunization, serotypes of infections, and length of any hospitalizations for pneumococcal disease, were evaluated with multivariate analysis. CD4 count and viral load at immunization were evaluated with a nested case-control analysis.</p> <p>Results</p> <p>Among 1946 HIV patients with 11,099 person-years of follow-up, there were 68 distinct episodes of pneumococcal disease occurring in 50 patients. Increased risk was associated with female sex, age >60, Aboriginal ethnicity, lower education, injection drug use, smoking, nadir CD4 <200/μL, chronic obstructive pulmonary disease, and hepatitis C. Overall, the incidence of IPD was 342/100,000 person-years and was reduced to 187/100,000 within three years of PPV-23 immunization (P < 0.01). Although 78% of patients received PPV-23, 74% of IPD episodes were caused by PPV-23 serotypes. In a case-control analysis, HIV viral load at immunization was significantly predictive of PPV-23 failure, while CD4 count was not.
80% of IPD episodes required hospitalization; the median length of stay was 7 days (range: 1-71), and four patients died.</p> <p>Conclusions</p> <p>Despite universal access to intensive measures to prevent pneumococcal disease, including the widespread use of HAART and PPV-23 immunization, the incidence of IPD remains high in HIV patients, with its associated morbidity and mortality.</p>
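The incidence figures in this abstract are events per person-time scaled to 100,000 person-years; a minimal sketch of that calculation (the function name is ours, and the event count below is back-calculated for illustration, not taken from the paper):

```python
def incidence_per_100k(events, person_years):
    """Incidence rate per 100,000 person-years of follow-up."""
    return events / person_years * 100_000

# Roughly 38 events over the cohort's 11,099 person-years would reproduce
# the reported overall IPD rate of ~342/100,000 person-years.
print(round(incidence_per_100k(38, 11_099)))  # → 342
```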
Effectiveness of an online curriculum for medical students on genetics, genetic testing and counseling
Background: It is increasingly important that physicians have a thorough understanding of the basic science of human genetics and the ethical, legal and social implications (ELSI) associated with genetic testing and counseling. Methods: The authors developed a series of web-based courses for medical students on these topics. The course modules are interactive, emphasize clinical case studies, and can easily be incorporated into existing medical school curricula. Results: Results of a 'real world' effectiveness trial indicate that the courses have a statistically significant effect on knowledge, attitude, intended behavior and self-efficacy related to genetic testing (p < 0.001; N varies between 163 and 596 across courses). Conclusions: The results indicate that this curriculum is an effective tool for educating medical students on the ELSI associated with genetic testing and for promoting positive changes in students' confidence, counseling attitudes and behaviors.
A Phylogeny and Timescale for the Evolution of Pseudocheiridae (Marsupialia: Diprotodontia) in Australia and New Guinea
Pseudocheiridae (Marsupialia: Diprotodontia) is a family of endemic Australasian arboreal folivores, more commonly known as ringtail possums. Seventeen extant species are grouped into six genera (Pseudocheirus, Pseudochirulus, Hemibelideus, Petauroides, Pseudochirops, Petropseudes). Pseudochirops and Pseudochirulus are the only genera with representatives on New Guinea and surrounding western islands. Here, we examine phylogenetic relationships among 13 of the 17 extant pseudocheirid species based on protein-coding portions of the ApoB, BRCA1, ENAM, IRBP, Rag1, and vWF genes. Maximum parsimony, maximum likelihood, and Bayesian methods were used to estimate phylogenetic relationships. Two different relaxed molecular clock methods were used to estimate divergence times. Bayesian and maximum parsimony methods were used to reconstruct ancestral character states for geographic provenance and maximum elevation occupied. We find robust support for the monophyly of Pseudocheirinae (Pseudochirulus + Pseudocheirus), Hemibelidinae (Hemibelideus + Petauroides), and Pseudochiropsinae (Pseudochirops + Petropseudes), and for an association of Pseudocheirinae and Hemibelidinae to the exclusion of Pseudochiropsinae. Within Pseudochiropsinae, Petropseudes grouped more closely with the New Guinean Pseudochirops spp. than with the Australian Pseudochirops archeri, rendering Pseudochirops paraphyletic. New Guinean species belonging to Pseudochirops are monophyletic, as are New Guinean species belonging to Pseudochirulus. Molecular dates and ancestral reconstructions of geographic provenance combine to suggest that the ancestors of extant New Guinean Pseudochirops spp. and Pseudochirulus spp. dispersed from Australia to New Guinea ∼12.1–6.5 Ma (Pseudochirops) and ∼6.0–2.4 Ma (Pseudochirulus).
Ancestral state reconstructions support the hypothesis that occupation of high elevations (>3000 m) is a derived feature that evolved on the terminal branch leading to Pseudochirops cupreus, and either evolved in the ancestor of Pseudochirulus forbesi, Pseudochirulus mayeri, and Pseudochirulus caroli, with subsequent loss in P. caroli, or evolved independently in P. mayeri and P. forbesi. Divergence times within the New Guinean Pseudochirops clade are generally coincident with the uplift of the central cordillera and other highlands. Diversification within New Guinean Pseudochirulus occurred in the Plio-Pleistocene, after the establishment of the Central Range and other highlands.
Comparison of Plasmodium berghei challenge models for the evaluation of pre-erythrocytic malaria vaccines and their effect on perceived vaccine efficacy
<p>Abstract</p> <p>Background</p> <p>The immunological mechanisms responsible for protection against malaria infection vary among <it>Plasmodium </it>species, host species, and the developmental stage of the parasite, and are poorly understood. A challenge with live parasites is the most relevant approach to testing the efficacy of experimental malaria vaccines. Nevertheless, in the mouse models of <it>Plasmodium berghei </it>and <it>Plasmodium yoelii</it>, parasites are usually delivered by intravenous injection. This route is highly artificial and, particularly in the <it>P. berghei </it>model, produces inconsistent challenge results. The initial objective of this study was to compare an optimized intravenous (IV) delivery challenge model with an optimized single infectious mosquito bite challenge model. Having found shortcomings in both approaches, an alternative was explored, <it>i.e.</it>, the subcutaneous challenge.</p> <p>Methods</p> <p>Mice were infected with <it>P. berghei </it>sporozoites by intravenous (tail vein) injection, single mosquito bite, or subcutaneous injection of isolated parasites into the subcutaneous pouch at the base of the hind leg. Infection was determined in blood smears 7 and 14 days later. To determine the usefulness of the challenge models for vaccine testing, mice were immunized with circumsporozoite-based DNA vaccines by gene gun.</p> <p>Results</p> <p>Despite modifications that allowed infection with a much smaller number of parasites than previously reported, the IV challenge remained insufficiently reliable and reproducible. Variations in the virulence of the inoculum, if not properly monitored by the rigorous inclusion of sporozoite titration curves in each experiment, can lead to unacceptable variations in reported vaccine efficacies. In contrast, mice with different genetic backgrounds were consistently infected by a single mosquito bite, without overwhelming vaccine-induced protective immune responses.
Because of the logistical challenges associated with the mosquito bite model, the subcutaneous challenge route was optimized. This approach, too, yields reliable challenge results, albeit requiring a relatively large inoculum.</p> <p>Conclusions</p> <p>Although a single bite by <it>P. berghei</it>-infected <it>Anopheles </it>mosquitoes was superior to the IV challenge route, it is laborious. However, any conclusive evaluation of a pre-erythrocytic malaria vaccine candidate should require challenge through the natural anatomic target site of the parasite, the skin. The subcutaneous injection of isolated parasites represents an attractive compromise. Similar to the mosquito bite model, it allows vaccine-induced antibodies to exert their effect and is, therefore, not as prone to the artifacts of the IV challenge.</p>
A Review on the Oral Health Impacts of Acculturation
The impact of acculturation on systemic health has been extensively investigated and is regarded as an important explanatory factor for health disparities. However, information on the oral health implications of acculturation is limited and fragmented. This study aimed to review the current evidence on the oral health impact of acculturation. Papers were retrieved from five electronic databases. Twenty-seven studies were included in this review. Their scientific quality was rated and key findings were summarized. Seventeen studies investigated the impacts of acculturation on the utilization of dental services; among them, 16 reported positive associations between at least one acculturation indicator and use of dental services. All 15 studies relating acculturation to oral diseases (dental caries and periodontal disease) suggested better oral health among acculturated individuals. Evidence is lacking to support the idea that the better oral health of acculturated immigrants is attributable to their improved dental attendance. Further research involving other oral health behaviors and diseases and incorporating refined acculturation scales is needed. Prospective studies will facilitate the understanding of the trajectory of immigrants' oral health along the acculturation continuum.
The Evolution of Compact Binary Star Systems
We review the formation and evolution of compact binary stars consisting of white dwarfs (WDs), neutron stars (NSs), and black holes (BHs). Binary NSs and BHs are thought to be the primary astrophysical sources of gravitational waves (GWs) within the frequency band of ground-based detectors, while compact binaries of WDs are important sources of GWs at lower frequencies, to be covered by space interferometers (LISA). Major uncertainties in the current understanding of the properties of NSs and BHs most relevant to GW studies are discussed, including the treatment of the natal kicks which compact stellar remnants acquire during the core collapse of massive stars, and the common envelope phase of binary evolution. We discuss the coalescence rates of binary NSs and BHs and prospects for their detection, and the formation and evolution of binary WDs and their observational manifestations. Special attention is given to AM CVn stars -- compact binaries in which the Roche lobe is filled by another WD or a low-mass, partially degenerate helium star -- as these stars are thought to be the best LISA verification binary GW sources.
Comment: 105 pages, 18 figures
Pneumonia and poverty: a prospective population-based study among children in Brazil
<p>Abstract</p> <p>Background</p> <p>Children in developing countries suffer the highest burden of pneumonia. However, few studies have evaluated associations between poverty and pneumonia.</p> <p>Methods</p> <p>A prospective population-based study on pneumonia was carried out as part of the Latin America Epidemiological Assessment of Pneumococcus (LEAP study). Chest x-rays were obtained for children one to 35 months old with suspected pneumonia presenting to emergency care centers and hospital emergency rooms in Goiania, Brazil. Chest radiographs were evaluated according to WHO guidelines. Clustering of radiologically-confirmed pneumonia was evaluated using a Poisson-based spatial scan statistic. Associations between census socioeconomic indicators and pneumonia incidence rates were analyzed using generalized linear models.</p> <p>Results</p> <p>From May 2007 to May 2009, chest radiographs were obtained from 11 521 children with clinical pneumonia; 3955 episodes were classified as radiologically-confirmed. Incidence rates were significantly higher in very low income areas (4825.2 per 10<sup>5</sup>) compared to high income areas (1637.3 per 10<sup>5</sup>). Spatial analysis identified clustering of confirmed pneumonia in the Western (RR 1.78; p = 0.001) and Southeast (RR 1.46; p = 0.001) regions of the city, and clustering of hospitalized pneumonia in the Western region (RR 1.69; p = 0.001). Lower household income and illiteracy were associated with pneumonia incidence.</p> <p>Conclusions</p> <p>In infants, the risk of developing pneumonia is inversely associated with the income of the head of household and with maternal educational level. Areas with deprived socioeconomic conditions had a higher incidence of pneumonia and should be targeted for high vaccination coverage.</p>
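The relative risks reported by the spatial scan analysis above compare incidence inside a candidate cluster window with incidence outside it; a minimal sketch of that ratio with hypothetical counts (not the study's underlying data):

```python
def rate_ratio(cases_in, pop_in, cases_out, pop_out):
    """Ratio of incidence inside a candidate cluster to incidence outside it."""
    return (cases_in / pop_in) / (cases_out / pop_out)

# Hypothetical: 89 confirmed cases among 50,000 children inside the window
# vs. 100 cases among 100,000 children outside. (Illustrative only.)
print(round(rate_ratio(89, 50_000, 100, 100_000), 2))  # → 1.78
```

The full Poisson scan statistic additionally evaluates a likelihood ratio over many candidate windows and assesses significance by Monte Carlo simulation, which this ratio alone does not show.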
Long-term survival in patients with non-small cell lung cancer and synchronous brain metastasis treated with whole-brain radiotherapy and thoracic chemoradiation
<p>Abstract</p> <p>Background</p> <p>Brain metastases occur in 30-50% of Non-small cell lung cancer (NSCLC) patients and confer a worse prognosis and quality of life. These patients are usually treated with Whole-brain radiotherapy (WBRT) followed by systemic therapy. Few studies have evaluated the role of chemoradiotherapy to the primary tumor after WBRT as definitive treatment in the management of these patients.</p> <p>Methods</p> <p>We reviewed the outcome of 30 patients with primary NSCLC and brain metastasis at diagnosis without evidence of other metastatic sites. Patients were treated with WBRT, followed by induction chemotherapy with paclitaxel and cisplatin for two cycles. In the absence of progression, concurrent chemoradiotherapy for the primary tumor with weekly paclitaxel and carboplatin was indicated, with a total effective dose of 60 Gy. If disease progression was ruled out, four chemotherapy cycles followed.</p> <p>Results</p> <p>Median Progression-free survival (PFS) and Overall survival (OS) were 8.43 ± 1.5 and 31.8 ± 15.8 months, respectively. PFS was 39.5% at 1 year and 24.7% at 2 years. The 1- and 2-year OS rates were 71.1% and 60.2%, respectively. Three-year OS was significantly superior for patients with N0-N1 stage disease vs. N2-N3 (60% vs. 24%, respectively; RR, 0.03; <it>p</it>= 0.038).</p> <p>Conclusions</p> <p>Patients with NSCLC and brain metastasis might benefit from treatment with WBRT and concurrent thoracic chemoradiotherapy. The subgroup of N0-N1 patients appears to achieve the greatest benefit. The results of this study warrant a prospective trial to confirm the benefit of this treatment.</p>
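Survival fractions such as the 1- and 2-year PFS and OS rates above are conventionally read off Kaplan-Meier curves; a self-contained sketch of the product-limit estimator on toy data (not the study's records):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  -- follow-up time for each patient
    events -- 1 if the event (progression/death) occurred, 0 if censored
    Returns a list of (event_time, survival_probability) steps.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        # group all patients whose follow-up ends at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            survival *= (at_risk - deaths) / at_risk
            curve.append((t, survival))
        at_risk -= deaths + censored
    return curve

# Toy cohort: events at months 1 and 2, one patient censored at month 3;
# survival steps to 2/3 and then 1/3.
print(kaplan_meier([1, 2, 3], [1, 1, 0]))
```

Reading the estimated curve at 12 and 24 months would yield rates analogous to the 1- and 2-year figures reported in the abstract.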