
    Ethical and methodological issues in engaging young people living in poverty with participatory research methods

    This paper discusses the methodological and ethical issues arising from a qualitative study that used participatory techniques with children and young people living in disadvantage. The main aim of the study was to explore the impact of poverty on children and young people's access to public and private services. The paper is based on the author's perspective of the first stage of fieldwork in the project. It discusses the ethical implications of involving children and young people in the research process, in particular issues relating to access and recruitment, the role of young people's advisory groups, the use of visual data and the collection of data in young people's homes. The paper also identifies strategies for addressing the difficulties encountered in each of these areas, and considers the benefits of adopting participatory methods when conducting research with children and young people

    Non-stationary covariance function modelling in 2D least-squares collocation

    Standard least-squares collocation (LSC) assumes 2D stationarity and 3D isotropy, and relies on a covariance function to account for spatial dependence in the observed data. However, the assumption that the spatial dependence is constant throughout the region of interest may sometimes be violated. Assuming a stationary covariance structure can result in over-smoothing of, e.g., the gravity field in mountains and under-smoothing in great plains. We introduce the kernel convolution method from spatial statistics for non-stationary covariance structures, and demonstrate its advantage for dealing with non-stationarity in geodetic data. We then compare stationary and non-stationary covariance functions in 2D LSC using the empirical example of gravity anomaly interpolation near the Darling Fault, Western Australia, where the field is anisotropic and non-stationary. The results with non-stationary covariance functions are better than standard LSC in terms of formal errors and cross-validation against data not used in the interpolation, demonstrating that the use of non-stationary covariance functions can improve upon standard (stationary) LSC
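    As an aside on mechanics: LSC prediction with a given covariance function is formally the same as simple kriging, s_hat = C_pn (C_nn + D)^-1 y. A minimal sketch in Python, where the Gaussian covariance model, its parameters and the toy data are illustrative assumptions rather than the paper's setup:

```python
import numpy as np

def gaussian_cov(d, c0=1.0, corr_len=10.0):
    """Stationary, isotropic Gaussian covariance: depends only on distance d."""
    return c0 * np.exp(-(d / corr_len) ** 2)

def lsc_predict(obs_xy, obs_val, pred_xy, noise_var=0.01, cov=gaussian_cov):
    """Standard LSC / simple kriging: s_hat = C_pn (C_nn + D)^-1 y."""
    d_nn = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    d_pn = np.linalg.norm(pred_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    C_nn = cov(d_nn) + noise_var * np.eye(len(obs_xy))  # D = observation noise
    return cov(d_pn) @ np.linalg.solve(C_nn, obs_val)

# Toy 2D example: five observations of sin(x/10) along a profile.
obs = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0], [30.0, 0.0], [40.0, 0.0]])
vals = np.sin(obs[:, 0] / 10.0)
est = lsc_predict(obs, vals, np.array([[15.0, 0.0]]))
```

    A non-stationary covariance in the kernel-convolution sense would let corr_len (and the anisotropy) vary with position, so that the covariance between two points is built from locally varying kernels rather than from their separation alone.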

    Reactive community-based self-administered treatment against residual malaria transmission: study protocol for a randomized controlled trial

    Background: Systematic treatment of all individuals living in the same compound as a clinical malaria case may clear asymptomatic infections and possibly reduce malaria transmission where this is focal. High and sustained coverage is extremely important and requires active community engagement. This study explores a community-based approach to treating malaria case contacts. Methods/design: This is a cluster-randomized trial to determine whether, in low-transmission areas, treating individuals living in the same compound as a clinical malaria case with dihydroartemisinin-piperaquine can reduce parasite carriage and thus residual malaria transmission. Treatment will be administered through the local health system, with the approach to encouraging community participation designed and monitored through formative research. The trial's goal is to show that this approach can reduce the prevalence of Plasmodium falciparum infection in intervention villages toward the end of the malaria transmission season. Discussion: Adherence and cooperation of the local communities are critical for the success of mass treatment campaigns aimed at reducing malaria transmission. By exploring community perceptions of the changing trends in malaria burden, existing health systems, and reaction to self-administered treatment, this study will develop and adapt a model for community engagement toward malaria elimination that is cost-effective and fits within the existing health system. Trial registration: ClinicalTrials.gov, NCT02878200. Registered on 25 August 2016

    Dynamic Models of Language Evolution: The Linguistic Perspective

    Language is probably the key defining characteristic of humanity, an immensely powerful tool which provides its users with an infinitely expressive means of representing their complex thoughts and reflections, and of successfully communicating them to others. It is the foundation on which human societies have been built and the means through which humanity’s unparalleled intellectual and technological achievements have been realized. Although we have a natural intuitive understanding of what a language is, the specification of a particular language is nevertheless remarkably difficult, if not impossible, to pin down precisely. All languages contain many separate yet integral systems which work interdependently to allow the expression of our thoughts and the interpretation of others’ expressions: each has, for instance, a set of basic meaningless sounds (e.g. [e], [l], [s]) which can be combined to make different meaningful words and parts of words (e.g. else, less, sell, -less); these meaningful units can be combined to make complex words (e.g. spinelessness, selling), and the words themselves can then be combined in very many complex ways into phrases, clauses and an infinite number of meaningful sentences; finally, each of these sentences can be interpreted in dramatically different ways, depending on the context in which it is uttered and on who is doing the interpretation. Languages can be analysed at any of these different levels, which make up many of the sub-fields of linguistics, and the primary job of linguistic theorists is to formulate the rules which best explain these complex combinations

    A nanostructural view of the cell wall disassembly process during fruit ripening and postharvest storage by atomic force microscopy

    Background: The mechanical properties of parenchyma cell walls and the strength and extension of adhesion areas between adjacent cells, jointly with cell turgor, are the main determinants of firmness in fleshy fruits. These traits are modified during ripening, leading to fruit softening. Cell wall modifications involve the depolymerisation of matrix glycans and pectins, the solubilisation of pectins and the loss of neutral sugars from pectin side chains. These changes weaken the cell walls and increase cell separation, which, in combination with a reduction in cell turgor, bring about textural changes. Atomic force microscopy (AFM) has been used to characterize the nanostructure of cell wall polysaccharides during the ripening and postharvest storage of several fruits. This technique allows the imaging of individual polymers at high magnification with minimal sample preparation. Scope and approach: This paper reviews the main features of the cell wall disassembly process associated with fruit softening from a nanostructural point of view, as provided by AFM studies. Key findings and conclusions: AFM studies show that pectin size, ramification and complexity are reduced during fruit ripening and storage, and in most cases these changes correlate with softening. Postharvest treatments that improve fruit quality have been proven to preserve pectin structure, suggesting a clear link between softening and pectin metabolism. The nanostructural characterization of cellulose and hemicellulose during ripening has been poorly explored by AFM, and the scarce results available are not conclusive. Overall, AFM could be a powerful tool to gain insights into the basis of textural quality in fresh and stored fruits

    Kinematics and simulations of the stellar stream in the halo of the Umbrella Galaxy

    We study the dynamics of faint stellar substructures around the Umbrella Galaxy, NGC 4651, which hosts a dramatic system of streams and shells formed through the tidal disruption of a nucleated dwarf elliptical galaxy. We elucidate the basic characteristics of the system (colours, luminosities, stellar masses) using multiband Subaru/Suprime-Cam images. The implied stellar mass ratio of the ongoing merger event is ∼1:50. We identify candidate kinematic tracers (globular clusters, planetary nebulae, H ii regions) and follow up a subset with Keck/DEIMOS (DEep Imaging Multi-object Spectrograph) spectroscopy to obtain velocities. We find that 15 of the tracers are likely associated with halo substructures, including the probable stream progenitor nucleus. These objects delineate a kinematically cold feature in position–velocity phase space. We model the stream using single test particle orbits, plus a rescaled pre-existing N-body simulation. We infer a very eccentric orbit with a period of ∼0.35 Gyr and turning points at ∼2–4 and ∼40 kpc, implying a recent passage of the satellite through the disc, which may have provoked the visible disturbances in the host galaxy. This work confirms that the kinematics of low surface brightness substructures can be recovered and modelled using discrete tracers – a breakthrough that opens up a fresh avenue for unravelling the detailed physics of minor merging
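    The test-particle modelling mentioned above can be sketched with a symplectic integrator; the point-mass potential, code units and initial conditions below are illustrative assumptions, not the paper's model of NGC 4651:

```python
import numpy as np

GM = 1.0  # point-mass potential in code units (illustrative, not NGC 4651)

def accel(p):
    """Keplerian acceleration a = -GM r / |r|^3."""
    r = np.linalg.norm(p)
    return -GM * p / r**3

def leapfrog(pos, vel, dt, n_steps):
    """Kick-drift-kick leapfrog: symplectic, so orbital energy drifts very little."""
    traj = np.empty((n_steps + 1, 2))
    traj[0] = pos
    a = accel(pos)
    for i in range(n_steps):
        vel = vel + 0.5 * dt * a   # half kick
        pos = pos + dt * vel       # drift
        a = accel(pos)
        vel = vel + 0.5 * dt * a   # half kick
        traj[i + 1] = pos
    return traj, vel

# Eccentric test-particle orbit: start at pericentre, slightly above circular speed.
pos0, vel0 = np.array([1.0, 0.0]), np.array([0.0, 1.2])
traj, vel = leapfrog(pos0, vel0, dt=0.01, n_steps=5000)
```

    For this set-up the integrated orbit oscillates between a pericentre of 1 and an apocentre of about 2.6 code units; the same machinery, with a realistic galaxy potential, yields the ~2-4 and ~40 kpc turning points quoted above.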

    Irbesartan in Marfan syndrome (AIMS): a double-blind, placebo-controlled randomised trial

    BACKGROUND: Irbesartan, a long-acting selective angiotensin-1 receptor antagonist, might reduce aortic dilatation in Marfan syndrome; aortic dilatation is associated with dissection and rupture. We aimed to determine the effects of irbesartan on the rate of aortic dilatation in children and adults with Marfan syndrome. METHODS: We did a placebo-controlled, double-blind randomised trial at 22 centres in the UK. Individuals aged 6-40 years with clinically confirmed Marfan syndrome were eligible for inclusion. Study participants were all given 75 mg open-label irbesartan once daily, then randomly assigned to 150 mg of irbesartan (increased to 300 mg as tolerated) or matching placebo. Aortic diameter was measured by echocardiography at baseline and then annually. All images were analysed by a core laboratory blinded to treatment allocation. The primary endpoint was the rate of aortic root dilatation. This trial is registered with ISRCTN, number ISRCTN90011794. FINDINGS: Between March 14, 2012, and May 1, 2015, 192 participants were recruited and randomly assigned to irbesartan (n=104) or placebo (n=88), and all were followed for up to 5 years. Median age at recruitment was 18 years (IQR 12-28), 99 (52%) were female, mean blood pressure was 110/65 mm Hg (SDs 16 and 12), and 108 (56%) were taking β blockers. Mean baseline aortic root diameter was 34·4 mm in both the irbesartan group (SD 5·8) and the placebo group (SD 5·5). The mean rate of aortic root dilatation was 0·53 mm per year (95% CI 0·39 to 0·67) in the irbesartan group compared with 0·74 mm per year (0·60 to 0·89) in the placebo group, with a difference in means of -0·22 mm per year (-0·41 to -0·02, p=0·030). The rate of change in aortic Z score was also reduced by irbesartan (difference in means -0·10 per year, 95% CI -0·19 to -0·01, p=0·035). Irbesartan was well tolerated with no observed differences in rates of serious adverse events. INTERPRETATION: Irbesartan is associated with a reduction in the rate of aortic dilatation in children and young adults with Marfan syndrome and could reduce the incidence of aortic complications
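    The primary endpoint, the rate of aortic root dilatation, is essentially a per-participant slope fitted through the annual echocardiographic measurements. A minimal sketch with synthetic numbers (illustrative only, not trial data):

```python
import numpy as np

def dilatation_rate(years, diameters_mm):
    """Rate of aortic root dilatation: slope (mm/year) of a least-squares
    line through one participant's annual diameter measurements."""
    slope, _intercept = np.polyfit(years, diameters_mm, 1)
    return slope

# Synthetic participant: 34.4 mm at baseline, growing 0.5 mm/year over 5 years.
years = np.arange(6)
diameters = 34.4 + 0.5 * years
rate = dilatation_rate(years, diameters)
```

    The reported treatment effect is then a difference in means of such slopes between the two arms (here -0·22 mm per year, with its 95% CI).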

    From staff-mix to skill-mix and beyond: towards a systemic approach to health workforce management

    Throughout the world, countries are experiencing shortages of health care workers. Policy-makers and system managers have developed a range of methods and initiatives to optimise the available workforce and achieve the right number and mix of personnel needed to provide high-quality care. Our literature review found that such initiatives often focus more on staff types than on staff members' skills and the effective use of those skills. Our review describes evidence about the benefits and pitfalls of current approaches to human resources optimisation in health care. We conclude that in order to use human resources most effectively, health care organisations must consider a more systemic approach - one that accounts for factors beyond narrowly defined human resources management practices and includes organisational and institutional conditions

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US $95 004 million using approach 1 and $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially
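    The human capital approach referred to above prices premature death as the discounted stream of lost future output. A minimal sketch; the GDP-per-capita figure, productive years lost and discount rate are illustrative assumptions, not the study's inputs:

```python
def human_capital_burden(cases, excess_deaths_per_100k, gdp_per_capita,
                         productive_years_lost, discount_rate=0.03):
    """Economic burden of premature death: deaths x present value of lost output.

    The present value of one unit of annual output over n years at rate r is
    the annuity factor (1 - (1 + r)**-n) / r.
    """
    deaths = cases * excess_deaths_per_100k / 100_000
    annuity = (1 - (1 + discount_rate) ** -productive_years_lost) / discount_rate
    return deaths * gdp_per_capita * annuity

# Illustrative: 100 000 cases at the approach-1 excess mortality of 4185/100 000,
# with assumed $1000 GDP per capita and 30 productive years lost per death.
burden = human_capital_burden(100_000, 4185, gdp_per_capita=1_000,
                              productive_years_lost=30)
```

    Summing such per-country figures (plus an absenteeism term for survivors) gives burden totals of the kind reported in the Results.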

    Attention and binding in visual working memory: two forms of attention and two kinds of buffer storage

    We review our research on the episodic buffer in the multicomponent model of working memory (Baddeley, 2000), making explicit the influence of Anne Treisman’s work on the way our research has developed. The crucial linking theme concerns binding, whereby the individual features of an episode are combined as integrated representations. We summarize a series of experiments on visual working memory that investigated the retention of feature bindings and individual features. The effects of cognitive load, perceptual distraction, prioritization, serial position, and their interactions form a coherent pattern. We interpret our findings as demonstrating contrasting roles of externally driven and internally driven attentional processes, as well as a distinction between visual buffer storage and the focus of attention. Our account has strong links with Treisman’s concept of focused attention and aligns with a number of contemporary approaches to visual working memory