
    Hearing and cognitive impairments increase the risk of long-term care admissions

    Background and Objectives: The objective of the study was to understand how sensory impairments, alone or in combination with cognitive impairment (CI), relate to long-term care (LTC) admissions. Research Design and Methods: This retrospective cohort study used existing information from two interRAI assessments: the Resident Assessment Instrument for Home Care (RAI-HC) and the Minimum Data Set 2.0 (MDS 2.0), which were linked at the individual level for 371,696 unique individuals aged 65+ years. The exposure variables of interest included hearing impairment (HI), vision impairment (VI) and dual sensory impairment (DSI) ascertained at participants’ most recent RAI-HC assessment. The main outcome was admission to LTC. Survival analysis, using Cox proportional hazards regression models and Kaplan–Meier curves, was used to identify risk factors associated with LTC admissions. Observations were censored if they remained in home care, died, or were discharged somewhere other than to LTC. Results: In this sample, 12.7% of clients were admitted to LTC, with a mean time to admission of 49.6 months (SE = 0.20). The main risk factor for LTC admission was a diagnosis of Alzheimer’s dementia (HR = 1.87; CI: 1.83, 1.90). A significant interaction between HI and CI was found, whereby individuals with HI but no CI had a slightly faster time to admission (40.5 months; HR = 1.14) versus clients with both HI and CI (44.9 months; HR = 2.11). Discussion and Implications: Although CI increases the risk of LTC admission, HI is also important, making it imperative to continue to screen for sensory issues among older home care clients.
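
    The hazard ratios reported above come from a Cox proportional hazards model with an interaction between hearing and cognitive impairment. A minimal sketch of that kind of fit is shown below; the file name and column names are hypothetical stand-ins, not the interRAI variables themselves.

        # Hedged sketch: Cox PH regression for time to LTC admission with an
        # HI x CI interaction term. Data layout and names are illustrative only.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("home_care_clients.csv")                # hypothetical extract
        df["hi_x_ci"] = df["hearing_impair"] * df["cog_impair"]  # interaction term

        cols = ["months_to_event", "admitted_ltc", "hearing_impair",
                "vision_impair", "cog_impair", "hi_x_ci", "alzheimers_dx", "age"]
        cph = CoxPHFitter()
        cph.fit(df[cols],
                duration_col="months_to_event",  # censored if still in home care,
                event_col="admitted_ltc")        # died, or discharged elsewhere
        cph.print_summary()                      # exp(coef) column gives hazard ratios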

    The Foraging Tunnel System of the Namibian Desert Termite, Baucaliotermes hainesi

    The harvester termite, Baucaliotermes hainesi (Fuller) (Termitidae: Nasutitermitinae), is endemic to southern Namibia, where it collects and eats dry grass. At the eastern, landward edge of the Namib Desert, the nests of these termites are sometimes visible above the ground surface, and extend at least 60 cm below ground. The termites gain access to foraging areas through underground foraging tunnels that emanate from the nest. The looseness of the desert sand, combined with the hardness of the cemented sand tunnels, allowed the use of a gasoline-powered blower and soft brushes to expose tunnels lying 5 to 15 cm below the surface. The tunnels form a complex system that radiates at least 10 to 15 m from the nest, with cross-connections between major tunnels. At 50 to 75 cm intervals, the tunnels are connected to the surface by vertical risers that can be opened to gain foraging access to the surrounding area. Foraging termites rarely need to travel more than a meter on the ground surface. The tunnels swoop up and down, forming high points at riser locations, and they have a complex architecture. In the center runs a smooth, raised walkway along which termites travel, and along the sides lie pockets that act as depots where foragers deposit grass pieces harvested from the surface. Presumably, these pieces are transported to the nest by a second group of termites. There are also several structures that seem to act as vertical highways to greater depths, possibly even to moist soil. A census of a single nest revealed about 45,000 termites, of which 71% were workers, 9% soldiers and 6% neotenic supplementary reproductives. The nest consisted of a hard outer “carapace” of cemented sand, with a central living space of smooth, sweeping arches and surfaces. A second species of termite, Promirotermes sp., nested in the outer carapace.

    Sri Lankan tsunami refugees: a cross sectional study of the relationships between housing conditions and self-reported health

    BACKGROUND: On 26 December 2004, the Asian tsunami devastated the Sri Lankan coastline. More than two years later, over 14,500 families were still living in transitional shelters. This study compares the health of internally displaced people (IDP) living in transitional camps with that of those in permanent housing projects provided by government and non-government organisations in Sri Lanka. METHODS: This study was conducted in seven transitional camps and five permanent housing projects in the south west of Sri Lanka. Using an interviewer-led questionnaire, data on the IDPs' self-reported health and housing conditions were collected from 154 participants from transitional camps and 147 participants from permanent housing projects. Simple tabulation with non-parametric tests and logistic regression were used to identify and analyse relationships between housing conditions and the reported prevalence of specific symptoms. RESULTS: Analysis showed that living conditions were significantly worse in transitional camps than in permanent housing projects for all factors investigated, except 'having a leaking roof'. Transitional camp participants scored significantly lower on self-perceived overall health scores than those living in housing projects. After controlling for gender, age and marital status, living in a transitional camp compared to a housing project was found to be a significant risk factor for the following symptoms: coughs OR: 3.53 (CI: 2.11-5.89), stomach ache 4.82 (2.19-10.82), headache 5.20 (3.09-8.76), general aches and pains 6.44 (3.67-11.33) and feeling generally unwell 2.28 (2.51-7.29). Within the transitional camp data, the only condition shown to be a significant risk factor for any symptom was household population density, which increased the risk of stomach aches 1.40 (1.09-1.79) and headaches 1.33 (1.01-1.77). CONCLUSION: Internally displaced people living in transitional camps are a vulnerable population, and specific interventions need to be targeted at this population to address the health inequalities that they report experiencing. Further studies need to be conducted to establish which aspects of their housing environment predispose them to poorer health.
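
    A hedged sketch of the adjusted odds-ratio calculation described above is given below, using logistic regression on one binary symptom outcome; the file and variable names are illustrative assumptions, not the study's actual dataset.

        # Sketch: odds ratio for one symptom (headache) for transitional-camp vs.
        # permanent-housing residence, adjusted for gender, age and marital status.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        df = pd.read_csv("idp_survey.csv")        # hypothetical survey extract
        X = sm.add_constant(df[["transitional_camp", "gender", "age", "married"]])
        fit = sm.Logit(df["headache"], X).fit()

        ci = fit.conf_int()                       # 95% CI on the log-odds scale
        print(pd.DataFrame({"OR": np.exp(fit.params),
                            "CI_low": np.exp(ci[0]),
                            "CI_high": np.exp(ci[1])}))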

    The Evolution of Host Specialization in the Vertebrate Gut Symbiont Lactobacillus reuteri

    Recent research has provided mechanistic insight into the important contributions of the gut microbiota to vertebrate biology, but questions remain about the evolutionary processes that have shaped this symbiosis. In the present study, we showed in experiments with gnotobiotic mice that the evolution of Lactobacillus reuteri with rodents resulted in the emergence of host specialization. To identify genomic events marking adaptations to the murine host, we compared the genome of the rodent isolate L. reuteri 100-23 with that of the human isolate L. reuteri F275, and we identified hundreds of genes that were specific to each strain. In order to differentiate true host-specific genome content from strain-level differences, comparative genome hybridizations were performed to query 57 L. reuteri strains originating from six different vertebrate hosts, in combination with genome sequence comparisons of nine strains encompassing five phylogenetic lineages of the species. This approach revealed that rodent strains, although showing a high degree of genomic plasticity, possessed a specific genome inventory that was rare or absent in strains from other vertebrate hosts. The distinct genome content of L. reuteri lineages reflected the niche characteristics in the gastrointestinal tracts of their respective hosts, and inactivation of seven out of eight representative rodent-specific genes in L. reuteri 100-23 resulted in impaired ecological performance in the gut of mice. The comparative genomic analyses suggested fundamentally different trends of genome evolution in rodent and human L. reuteri populations, with the former possessing a large and adaptable pan-genome and the latter undergoing a process of reductive evolution. In conclusion, this study provided experimental evidence and a molecular basis for the evolution of host specificity in a vertebrate gut symbiont, and it identified genomic events that have shaped this process.
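
    The host-specific gene inventory described above is essentially a filtering problem over a strain-by-gene presence/absence matrix. The sketch below shows one plausible way to flag candidate rodent-specific genes; the input files, thresholds and host labels are assumptions for illustration, not the study's actual pipeline.

        # Sketch: flag genes common in rodent isolates but rare in isolates from
        # other vertebrate hosts. Files, labels and cut-offs are hypothetical.
        import pandas as pd

        pa = pd.read_csv("gene_presence_absence.csv", index_col=0)   # strains x genes, 0/1
        host = pd.read_csv("strain_hosts.csv", index_col=0)["host"]  # e.g. "rodent", "human"

        rodent = pa.loc[host == "rodent"]
        other = pa.loc[host != "rodent"]

        candidates = pa.columns[(rodent.mean() >= 0.8) & (other.mean() <= 0.1)]
        print(len(candidates), "candidate rodent-specific genes")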

    Control of star formation by supersonic turbulence

    Understanding the formation of stars in galaxies is central to much of modern astrophysics. For several decades it has been thought that stellar birth is primarily controlled by the interplay between gravity and magnetostatic support, modulated by ambipolar diffusion. Recently, however, both observational and numerical work has begun to suggest that support by supersonic turbulence rather than magnetic fields controls star formation. In this review we outline a new theory of star formation relying on the control by turbulence. We demonstrate that although supersonic turbulence can provide global support, it nevertheless produces density enhancements that allow local collapse. Inefficient, isolated star formation is a hallmark of turbulent support, while efficient, clustered star formation occurs in its absence. The consequences of this theory are then explored for both local star formation and galactic-scale star formation. (Abstract abbreviated.) Comment: Invited review for "Reviews of Modern Physics", 87 pages including 28 figures, in press.

    The case for a 'sub-millimeter SDSS' : a 3D map of galaxy evolution to z~10

    Science white paper submitted to the Astro2020 Decadal Survey. The Sloan Digital Sky Survey (SDSS) was revolutionary because of the extraordinary breadth and ambition of its optical imaging and spectroscopy. We argue that a 'sub-millimeter SDSS' - a sensitive large-area imaging+spectroscopic survey in the sub-mm window - will revolutionize our understanding of galaxy evolution in the early Universe. By detecting the thermal dust continuum emission and atomic and molecular line emission of galaxies out to z~10, it will be possible to measure the redshifts, star formation rates, and dust and gas content of hundreds of thousands of high-z galaxies down to ~L*. Many of these galaxies will have counterparts visible in the deep optical imaging of the Large Synoptic Survey Telescope. This 3D map of galaxy evolution will span the peak epoch of galaxy formation all the way back to cosmic dawn, measuring the co-evolution of the star formation rate density and molecular gas content of galaxies, tracking the production of metals and charting the growth of large-scale structure. Non peer reviewed.

    Conformational Dynamics of Single pre-mRNA Molecules During In Vitro Splicing

    The spliceosome is a complex small nuclear RNA (snRNA)-protein machine that removes introns from pre-mRNAs via two successive phosphoryl transfer reactions. The chemical steps are isoenergetic, yet splicing requires at least eight RNA-dependent ATPases responsible for substantial conformational rearrangements. To comprehensively monitor pre-mRNA conformational dynamics, we developed a strategy for single-molecule FRET (smFRET) that uses a small, efficiently spliced yeast pre-mRNA, Ubc4, in which donor and acceptor fluorophores are placed in the exons adjacent to the 5′ and 3′ splice sites. During splicing in vitro, we observed a multitude of generally reversible time- and ATP-dependent conformational transitions of individual pre-mRNAs. The conformational dynamics of branchpoint and 3′ splice site mutants differ from one another and from wild type. Because all transitions are reversible, spliceosome assembly appears to be occurring close to thermal equilibrium.
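
    For context, the quantity tracked in such an experiment is the apparent FRET efficiency computed from donor and acceptor intensity traces, using the standard ratiometric relation E = IA / (IA + gamma * ID). The snippet below is a minimal sketch of that calculation; the trace values and gamma factor are illustrative, not data from the paper.

        # Sketch: apparent FRET efficiency from donor/acceptor photon counts.
        import numpy as np

        def fret_efficiency(donor, acceptor, gamma=1.0):
            """E = IA / (IA + gamma * ID), computed per time point."""
            donor = np.asarray(donor, dtype=float)
            acceptor = np.asarray(acceptor, dtype=float)
            return acceptor / (acceptor + gamma * donor)

        # A trace whose efficiency jumps as the molecule changes conformation
        donor_trace = np.array([800, 790, 400, 390, 770])
        acceptor_trace = np.array([200, 210, 600, 610, 230])
        print(fret_efficiency(donor_trace, acceptor_trace))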