
    Caring for continence in stroke care settings: a qualitative study of patients’ and staff perspectives on the implementation of a new continence care intervention

    Objectives: To investigate the perspectives of patients and nursing staff on the implementation of an augmented continence care intervention after stroke. Design: Qualitative data were elicited during semi-structured interviews with patients (n = 15) and staff (14 nurses; nine nursing assistants) and analysed using thematic analysis. Setting: Mixed acute and rehabilitation stroke ward. Participants: Stroke patients and nursing staff who experienced an enhanced continence care intervention. Results: Four themes emerged from patients’ interviews describing: (a) challenges communicating about continence (initiating conversations and information exchange); (b) mixed perceptions of continence care; (c) ambiguity of focus between mobility and continence issues; and (d) inconsistent involvement in continence care decision making. Patients’ perceptions reflected the severity of their urinary incontinence. Staff described changes in: (i) knowledge as a consequence of specialist training; (ii) continence interventions (including the development of nurse-led initiatives to reduce the incidence of unnecessary catheterisation among patients admitted to their ward); (iii) attitudes towards continence, shifting from containment approaches to continence rehabilitation; and (iv) the challenges of providing continence care within a stroke care context, including limited access to continence care equipment or products and institutional attitudes towards continence. Conclusion: Patients (particularly those with severe urinary incontinence) described challenges in communicating about continence and in being involved in continence care decisions. In contrast, nurses described improved continence knowledge, attitudes and confidence alongside a shift from containment to rehabilitative approaches. Contextual components, including care from the point of hospital admission, equipment accessibility and interdisciplinary approaches, were perceived as important factors in enhancing continence care.

    Aspirated capacitor measurements of air conductivity and ion mobility spectra

    Measurements of ions in atmospheric air are used to investigate atmospheric electricity and particulate pollution. Commonly studied ion parameters are (1) air conductivity, related to the total ion number concentration, and (2) the ion mobility spectrum, which varies with atmospheric composition. The physical principles of air ion instrumentation are long-established. A recent development is the computerised aspirated capacitor, which measures ions from (a) the current of charged particles at a sensing electrode, and (b) the rate of charge exchange with an electrode at a known initial potential, relaxing to a lower potential. As the voltage decays, only ions of higher and higher mobility are collected by the central electrode and contribute to the further decay of the voltage. This enables extension of the classical theory to calculate ion mobility spectra by inverting voltage decay time series. In indoor air, ion mobility spectra determined from both the novel voltage decay inversion and an established voltage switching technique were compared and shown to be of similar shape. Air conductivities calculated by integration were 5.3 ± 2.5 fS/m and 2.7 ± 1.1 fS/m respectively, with conductivity determined to be 3 fS/m by direct measurement at a constant voltage. Applications of the new Relaxation Potential Inversion Method (RPIM) include air ion mobility spectrum retrieval from historical data, and computation of ion mobility spectra in planetary atmospheres. (To be published in Review of Scientific Instruments.)
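The voltage-decay measurement described above rests on the classical relaxation relation for a capacitor discharging through conducting air, V(t) = V0·exp(−t/τ) with τ = ε0/σ, so σ = ε0/τ. The sketch below recovers conductivity from a synthetic decay time series; the function name and the synthetic data are illustrative assumptions, not code from the paper, and the full RPIM inversion for mobility spectra is more involved than this single-exponential fit.

```python
import numpy as np

EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m


def conductivity_from_decay(t, v):
    """Estimate air conductivity (S/m) from a voltage-decay time series.

    Assumes classical exponential relaxation V(t) = V0 * exp(-t / tau),
    where tau = epsilon_0 / sigma, hence sigma = epsilon_0 / tau.
    """
    # Fit log(V) = log(V0) - t / tau by linear least squares
    slope, _ = np.polyfit(t, np.log(v), 1)
    tau = -1.0 / slope
    return EPSILON_0 / tau


# Synthetic decay for sigma = 3 fS/m (the paper's directly measured value)
sigma_true = 3e-15
t = np.linspace(0.0, 5000.0, 200)  # seconds
v = 10.0 * np.exp(-t * sigma_true / EPSILON_0)  # volts

print(conductivity_from_decay(t, v))  # close to 3e-15 S/m
```

At σ = 3 fS/m the time constant is roughly ε0/σ ≈ 2950 s, which is why practical instruments track the decay over tens of minutes.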

    Illusions of gunk

    The possibility of gunk has been used to argue against mereological nihilism. This paper explores two responses on the part of the microphysical mereological nihilist: (1) the contingency defence, which maintains that nihilism is true of the actual world but that, at other worlds, composition occurs; (2) the impossibility defence, which maintains that nihilism is necessarily true, and so gunk worlds are impossible. The former is argued to be ultimately unstable; the latter faces the explanatory burden of accounting for the illusion that gunk is possible. It is argued that we can discharge this burden by focussing on the contingency of the microphysicalist aspect of microphysical mereological nihilism. The upshot is that gunk-based arguments against microphysical mereological nihilism can be resisted.

    Time trends in survival and readmission following coronary artery bypass grafting in Scotland, 1981-96: retrospective observational study

    Improvements in coronary revascularisation techniques and an increase in the use of percutaneous interventions [1] have led to a rise in the number of coronary artery bypass grafting operations in older patients with more severe cardiac disease and worse comorbidity and who have previously undergone revascularisation procedures [2, 3]. Advances in surgical and anaesthetic techniques have prevented a worsening risk profile from being translated into an increase in perioperative deaths [2, 3]. The aim of our study was to examine time trends in major outcomes up to two years after coronary artery bypass grafting.

    AAAI: an Argument Against Artificial Intelligence

    The ethical concerns regarding the successful development of an Artificial Intelligence have received a lot of attention lately. The idea is that even if we have good reason to believe that it is very unlikely, the mere possibility of an AI causing extreme human suffering is important enough to warrant serious consideration. Others look at this problem from the opposite perspective, namely that of the AI itself. Here the idea is that even if we have good reason to believe that it is very unlikely, the mere possibility of humanity causing extreme suffering to an AI is important enough to warrant serious consideration. This paper starts from the observation that both concerns rely on problematic philosophical assumptions. Rather than tackling these assumptions directly, it proceeds to present an argument that if one takes these assumptions seriously, then one has a moral obligation to advocate for a ban on the development of a conscious AI

    Urea treatment affects safe rates of seed placed nitrogen in Saskatchewan

    Non-Peer Reviewed
    Placing urea in close proximity to seed can cause seedling damage resulting in poor crop establishment. Plant densities are often well below the optimum, and plants that do emerge can exhibit poor vigor. Several strategies have been developed to reduce the risk of seed damage from urea. Restricting the amount that is seed placed, placing urea at a safe distance, and placement before or after seeding are effective but may not allow for application of adequate N, or may increase equipment and operating costs. Recently, treatments applied to the urea granule, such as Agrotain and polymer coating, have been developed to slow the conversion to ammonium. Research suggests that the safe rate of N can be increased by 50% where Agrotain is used; results are less clear when polymer coatings are used. To demonstrate how Agrotain and polymer treated urea affect crop establishment and yield, rates of 0, 1, 1.5, 2 and 4 times the recommended safe rate were seed placed at Scott, Swift Current, Canora and Redvers, Saskatchewan. Trials were conducted with wheat at all locations, and canola at Scott. Seed placed untreated urea was used as a check. As well, an alternate option using seed placed untreated urea followed by liquid urea ammonium nitrate dribble banded 20 to 35 days after seeding was investigated. The impact of treatments on plant density varied with rainfall across locations. Sites with lower precipitation after seeding showed more severe damage to seedlings. Untreated urea placed with the seed had the greatest impact on plant density, but Agrotain and polymer treatments also led to decreases at high N rates. The improvement of Agrotain over untreated urea generally confirmed manufacturer recommendations that safe rates of seed placed urea can be increased by about 50%. The polymer was very effective at reducing damage from seed placed urea, but still generally resulted in fewer plants than side banding at 4 times the recommended rate of N.
    Grain yield responses were also variable across locations. At most sites where plant stand reductions were high, yield was also affected. Differences between all treatments were small at N rates up to 2 times the recommended rate, but at 4 times, yield was reduced for Agrotain treated and untreated seed placed N. For treatments where liquid dribble band was compared to side banding, little difference in yield was observed when soil residual N was high and precipitation was low. A reduction in yield was found when soil N and precipitation were both low. Where the N supply from soil was large and precipitation higher, yield of the dribble banded crop continued to respond after side banded crops had peaked.

    Can Modal Skepticism Defeat Humean Skepticism?

    My topic is moderate modal skepticism in the spirit of Peter van Inwagen. Here understood, this is a conservative version of modal empiricism that severely limits the extent to which an ordinary agent can reasonably believe “exotic” possibility claims. I offer a novel argument in support of this brand of skepticism: modal skepticism grounds an attractive (and novel) reply to Humean skepticism. Thus, I propose that modal skepticism be accepted on the basis of its theoretical utility as a tool for dissolving philosophical paradox

    Bayes and health care research.

    Bayes’ rule shows how one might rationally change one’s beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates but the dominant statistical method is frequentism. There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly inductive (i.e. it shows how we may induce views about the world based on partial data from it) whereas frequentism is at least compatible with non-inductive views of scientific method, particularly the critical realism of Popper. Popper and others detail significant problems with induction. Frequentism’s apparent ability to avoid these, plus its ability to give a seemingly more scientific and objective take on probability, lies behind its philosophical appeal to health care researchers. However, there are also significant problems with frequentism, particularly its inability to assign probability scores to single events. Popper thus proposed an alternative objectivist view of probability, called propensity theory, which he allies to a theory of corroboration; but this too has significant problems, in particular, it may not successfully avoid induction. If this is so then Bayesianism might be philosophically the strongest of the statistical approaches. The article sets out a number of its philosophical and methodological attractions. Finally, it outlines a way in which critical realism and Bayesianism might work together.
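The belief update that Bayes' rule formalises can be made concrete with a standard diagnostic-testing example. The sketch below is illustrative only; the prevalence, sensitivity and false-positive figures are hypothetical numbers chosen for the example, not values from the article.

```python
def bayes_update(prior, sensitivity, false_positive_rate):
    """Posterior P(disease | positive test) via Bayes' rule.

    P(H | E) = P(E | H) * P(H) / P(E), where the evidence term
    P(E) sums over both ways a positive test can occur.
    """
    p_positive = (sensitivity * prior
                  + false_positive_rate * (1.0 - prior))
    return sensitivity * prior / p_positive


# Hypothetical screening test: 1% prevalence, 90% sensitivity,
# 5% false-positive rate
posterior = bayes_update(prior=0.01, sensitivity=0.90,
                         false_positive_rate=0.05)
print(round(posterior, 3))  # 0.154
```

Even a fairly accurate test yields only about a 15% posterior when the prior (prevalence) is low, which is exactly the kind of rational belief revision in the light of evidence that the abstract describes.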

    Single-inhaler triple therapy in patients with chronic obstructive pulmonary disease:a systematic review

    Background: Guidelines recommend that treatment with a long-acting β2 agonist (LABA), a long-acting muscarinic antagonist (LAMA), and inhaled corticosteroids (ICS), i.e. triple therapy, is reserved for a select group of symptomatic patients with chronic obstructive pulmonary disease (COPD) who continue to exacerbate despite treatment with dual therapy (LABA/LAMA). A number of single-inhaler triple therapies are now available and important clinical questions remain over their role in the patient pathway. We compared the efficacy and safety of single-inhaler triple therapy to assess the magnitude of benefit and to identify patients with the best risk-benefit profile for treatment. We also evaluated and compared study designs and population characteristics to assess the strength of the evidence base. Methods: We conducted a systematic search, from inception to December 2018, of randomised controlled trials (RCTs) of single-inhaler triple therapy in patients with COPD. The primary outcome was the annual rate of moderate and severe exacerbations. Results: We identified 523 records, of which 15 reports/abstracts from six RCTs were included. Triple therapy reduced the annual rate of moderate or severe exacerbations by 15–52% compared with LAMA/LABA, 15–35% compared with LABA/ICS and 20% compared with LAMA. The patient-based number needed to treat for the moderate or severe exacerbation outcome ranged between approximately 25 and 50 (preventing one patient from having an event), and the event-based number needed to treat was around 3–11 (preventing one event). The absolute benefit appeared to be greater in patients with higher eosinophil counts or a historical frequency of exacerbations, and in ex-smokers. In the largest study, there was a significantly higher incidence of pneumonia in the triple therapy arm.
    There were important differences in study designs and populations impacting the interpretation of the results and indicating there would be significant heterogeneity in cross-trial comparisons. Conclusion: The decision to prescribe triple therapy should consider patient phenotype, magnitude of benefit and increased risk of adverse events. Future research on specific patient phenotype thresholds that can support treatment and funding decisions is now required from well-designed, robust, clinical trials. Trial registration: PROSPERO #CRD42018102125.
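The patient-based number needed to treat (NNT) quoted in the abstract is the reciprocal of the absolute risk reduction between treatment arms. A minimal sketch of that arithmetic is below; the exacerbation rates used are hypothetical values chosen so the result falls at the low end of the paper's 25–50 range, not data from any of the included trials.

```python
def number_needed_to_treat(rate_control, rate_treatment):
    """Patient-based NNT = 1 / absolute risk reduction (ARR).

    rate_control / rate_treatment are the proportions of patients
    with at least one moderate-or-severe exacerbation in each arm.
    """
    arr = rate_control - rate_treatment
    if arr <= 0:
        raise ValueError("treatment shows no absolute benefit")
    return 1.0 / arr


# Hypothetical: 30% of patients exacerbate on dual therapy
# vs 26% on single-inhaler triple therapy -> ARR = 4 percentage points
print(round(number_needed_to_treat(0.30, 0.26)))  # 25
```

The same formula explains why the event-based NNT (around 3–11) is much smaller: it divides one by the reduction in the annual event rate, and patients can have several exacerbations per year.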

    Can biological quantum networks solve NP-hard problems?

    There is a widespread view that the human brain is so complex that it cannot be efficiently simulated by universal Turing machines. During the last decades the question has therefore been raised whether we need to consider quantum effects to explain the imagined cognitive power of a conscious mind. This paper presents a personal view of several fields of philosophy and computational neurobiology in an attempt to suggest a realistic picture of how the brain might work as a basis for perception, consciousness and cognition. The purpose is to be able to identify and evaluate instances where quantum effects might play a significant role in cognitive processes. Not surprisingly, the conclusion is that quantum-enhanced cognition and intelligence are very unlikely to be found in biological brains. Quantum effects may certainly influence the functionality of various components and signalling pathways at the molecular level in the brain network, such as ion channels, synapses, sensors, and enzymes. This might influence the functionality of some nodes and perhaps even the overall intelligence of the brain network, but hardly give it any dramatically enhanced functionality. So, the conclusion is that biological quantum networks can only approximately solve small instances of NP-hard problems. On the other hand, artificial intelligence and machine learning implemented in complex dynamical systems based on genuine quantum networks can certainly be expected to show enhanced performance and quantum advantage compared with classical networks. Nevertheless, even quantum networks can only be expected to efficiently solve NP-hard problems approximately. In the end it is a question of precision: Nature is approximate.
