
    Attitudes and Family Farm Business Performance

    This study aimed to identify the top-performing and bottom-performing family farm businesses based on business performance, and then to compare and contrast their perceptions of and attitudes towards acquiring the management skills and attributes that lead to improved business performance. Using extensive interviews, data were obtained from 200 family farm businesses in New South Wales and Victoria covering a range of enterprises. The key findings were, firstly, that for all farmers maintaining a stable family relationship was of paramount importance in running the farm business. Secondly, the Top 20% of farmers had high levels of self-efficacy and thus possessed the capability and competence to perform tasks successfully. High performers were also more committed to the creation of long-term wealth and viewed business skills as a higher priority for training. Low performers were more committed to the farm's environmental health, placed a greater emphasis on production and sustainability in training, and were more likely to give a lower priority to business issues. Both groups agreed that formal training delivered by practical farmers with education skills, offering tailor-made modules, was best suited to their personal learning needs. The consistently high priority given to family and business issues suggests that an opportunity exists to integrate the training of attitudes and skills with family, sustainable business practices and community issues. The study was funded by the Rural Industries Research and Development Corporation.

    What impact did a Paediatric Early Warning system have on emergency admissions to the paediatric intensive care unit? An observational cohort study

    Summary: The ideology underpinning Paediatric Early Warning systems (PEWs) is that earlier recognition of deteriorating in-patients would improve clinical outcomes. Objective: To explore how the introduction of PEWs at a tertiary children's hospital affects emergency admissions to the Paediatric Intensive Care Unit (PICU) and the impact on service delivery, and to compare 'in-house' emergency admissions to PICU with 'external' admissions transferred from District General Hospitals (without PEWs). Method: A before-and-after observational study covering August 2005–July 2006 (pre) and August 2006–July 2007 (post) implementation of PEWs at the tertiary children's hospital. Results: The median Paediatric Index of Mortality (PIM2) reduced: 0.44 vs 0.60 (p < 0.001). Fewer admissions required invasive ventilation, 62.7% vs 75.2% (p = 0.015), and for a shorter median duration (four to two days). The median length of PICU stay reduced from five to three days (p = 0.002). There was a non-significant reduction in mortality (p = 0.47). There was no comparable improvement in outcome among external emergency admissions to PICU. A 39% reduction in total emergency admission bed days reduced cancellation of major elective surgical cases and refusal of external PICU referrals. Conclusions: Following the introduction of PEWs at a tertiary children's hospital, PIM2 was reduced, patients required fewer PICU interventions and length of stay was shorter. PICU service delivery improved.

    Finding Fault? Divorce Law and Practice in England and Wales

    This is the final version of the report, available from the Nuffield Foundation via the link in this record.

    1. Key messages
    The law of divorce in England and Wales has been subject to criticism for decades, most recently following the rare defended case of Owens v Owens. This major research study aimed to explore how the law is working in practice.

    The current law and use of fault
    The sole ground for divorce in England and Wales is the irretrievable breakdown of the marriage, but a divorce may be granted only if one of five 'Facts' is proved. Whilst many people might assume it is required, it is not necessary to prove that the 'Fact' relied upon was a cause of the breakdown. Three Facts are fault-based: adultery, behaviour, and desertion. Two Facts are based on separation: two years if the other spouse consents to divorce, five years if they do not. In 2015, 60% of English and Welsh divorces were granted on adultery or behaviour. In Scotland, where different procedural and related legal rules create different incentive structures, the figure was just 6%. Elsewhere, fault has been abolished or is just one option, and often a practically insignificant one, among several divorce grounds.

    The continuing problems of fault
    Academic research and Law Commission reviews from the 1970s onwards reported serious problems with the divorce law, including a lack of honesty in the system, with parties exaggerating behaviour allegations to get a quick divorce while the court could do little more than 'pretend' to inquire into the allegations. This study found that those problems continue and have worsened in some respects. Fault, especially behaviour, continues to be relied on to secure a faster divorce. The consequence is that parties often feel under pressure to exaggerate allegations or retro-fit the reasons for their separation into one of the legal Facts, even though the court's expectations of what is required to make out each Fact are now actually very low, particularly for behaviour. The court has a duty to inquire into allegations but in practice, in undefended cases, only has the capacity to take the petitioner's allegations at face value. That is procedurally unfair for the great majority of respondents, who cannot defend themselves against the allegations. Parties embarking on the process might reasonably assume that the law is underpinned by a fault-based logic: that petitions should reflect who and what was to blame for the relationship breakdown. Yet whilst the law invites parties to rely on fault-based Facts, it does not require the court to adjudicate on responsibility in that way, not least because it will very often be impossible to allocate blame accurately in this context. Respondents on the receiving end of fault-based petitions nevertheless inevitably feel cast as the 'guilty' party. The study found no evidence that fault prevents or slows down the decision to divorce, and some evidence that it may shorten the time from break-up to filing. We also found, as previously, that producing evidence of fault can create or exacerbate unnecessary conflict, with damaging consequences for children and contrary to the thrust of family law policy. The current divorce law is now nearly 50 years old. Its apparent rationale and operation are at odds with a modern, transparent, problem-solving family justice system that seeks to minimise the consequences of relationship breakdown for both adults and children.

    The need for law reform to finally remove fault
    The study shows that we already have something tantamount to immediate unilateral divorce 'on demand', but masked by an often painful, and sometimes destructive, legal ritual with no obvious benefits for the parties or the state. A clearer and more honest approach, which would also be fairer, more child-centred and cost-effective, would be to reform the law to remove fault entirely. We propose a notification system in which divorce would be available if one or both parties register that the marriage has broken down irretrievably and that intention is confirmed by one or both parties after a minimum period of six months.

    What people think about water: a lesson in communication and citizen involvement


    Prev Chronic Dis

    Introduction: The objective of this cross-sectional study was to examine the nutrition literacy status of adults in the Lower Mississippi Delta. Methods: Survey instruments included the Newest Vital Sign and an adapted version of the Health Information National Trends Survey. A proportional quota sampling plan was used to represent the educational achievement of residents in the Delta region. Participants included 177 adults, primarily African Americans (81%). Descriptive statistics, χ² analysis, analysis of variance, and multivariate analysis of covariance tests were used to examine survey data. Results: Results indicated that 24% of participants had a high likelihood of limited nutrition literacy, 28% had a possibility of limited nutrition literacy, and 48% had adequate nutrition literacy. Controlling for income and education level, the multivariate analysis of covariance models revealed that nutrition literacy was significantly associated with media use for general purposes (F = 2.79, P = .005), media use for nutrition information (F = 2.30, P = .04), and level of trust in nutrition sources (F = 2.29, P = .005). Overall, the Internet was the least trusted and least used source of nutrition information. Only 12% of participants correctly identified the 2005 MyPyramid graphic, and the majority (78%) rated their dietary knowledge as poor or fair. Conclusion: Compared with other national surveys, rates of limited health literacy among Delta adults were high. Nutrition literacy status has implications for how people seek nutrition information and how much they trust it. Understanding the causes and consequences of limited nutrition literacy may be a step toward reducing the burden of nutrition-related chronic diseases among disadvantaged rural communities.
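    To illustrate the kind of covariance analysis reported above, the sketch below fits a single ANCOVA relating a nutrition-literacy grouping to one media-use outcome while controlling for income and education, then reports F tests analogous to those in the abstract. It is a minimal sketch only: the variable names and the synthetic data are assumptions, not the survey's data or the authors' code.

    # Hedged illustration: one ANCOVA of media use on nutrition-literacy group,
    # controlling for income and education (synthetic data, assumed variable names).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 177  # sample size reported in the abstract
    df = pd.DataFrame({
        "literacy_group": rng.choice(["limited", "possibly_limited", "adequate"], size=n),
        "income": rng.integers(1, 6, size=n),      # ordinal income bracket (assumed coding)
        "education": rng.integers(1, 5, size=n),   # ordinal education level (assumed coding)
    })
    # Synthetic outcome: general-purpose media use score.
    df["media_use_general"] = 2.0 + 0.3 * df["income"] + rng.normal(0.0, 1.0, n)

    # Fit the outcome on literacy group with income and education as covariates,
    # then print Type II F tests for each term.
    fit = smf.ols("media_use_general ~ C(literacy_group) + income + education", data=df).fit()
    print(sm.stats.anova_lm(fit, typ=2))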

    A regional application of the MAGIC model in Wales: calibration and assessment of future recovery using a Monte-Carlo approach

    A survey and resurvey of 77 headwater streams in Wales provides an opportunity for assessing changes in streamwater chemistry in the region. The Model of Acidification of Groundwater In Catchments (MAGIC) has been calibrated to the second of the two surveys, taken in 1994-1995, using a Monte-Carlo methodology. The first survey, from 1983-1984, provides a basis for model validation. The model simulates a significant decline in water quality across the region since industrialisation. Agreed reductions in sulphur (S) emissions in Europe in accordance with the Second S Protocol will result in a 49% reduction in S deposition across Wales from 1996 to 2010. In response to these reductions, the proportion of streams in the region with mean annual acid neutralising capacity (ANC) > 0 is predicted to increase from 81% in 1995 to 90% by 2030. The greatest recovery, both between 1984 and 1995 and into the future, is at streams with low ANC. To ensure that streams in the most heavily acidified areas of Wales recover to an ANC of zero by 2030, a reduction in S deposition of 80-85% will be required.
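    To make the Monte-Carlo calibration idea concrete, the sketch below samples catchment parameter sets from broad prior ranges, runs a model for each set, and retains only the sets that reproduce an observed 1995 stream ANC within a tolerance; the retained ensemble would then be run forward under the deposition scenario to produce forecasts like those above. This is a minimal sketch under stated assumptions: the placeholder function is a stand-in, not the MAGIC model, and the parameter names, priors and tolerance are invented for illustration.

    # Hedged sketch of Monte-Carlo calibration: accept parameter sets whose simulated
    # 1995 ANC matches an observation within a tolerance (placeholder model, toy values).
    import numpy as np

    rng = np.random.default_rng(42)

    def run_placeholder_model(weathering_rate, base_saturation, s_deposition):
        """Stand-in for a catchment model run: returns a simulated 1995 mean ANC (ueq/L)."""
        return 50.0 * base_saturation + 0.4 * weathering_rate - 0.3 * s_deposition

    observed_anc_1995 = 12.0   # hypothetical observed stream ANC for one catchment
    tolerance = 5.0            # accept simulations within +/- 5 ueq/L of the observation
    accepted = []

    for _ in range(10_000):
        params = {
            "weathering_rate": rng.uniform(10, 100),    # meq/m2/yr (assumed prior range)
            "base_saturation": rng.uniform(0.05, 0.5),  # fraction (assumed prior range)
            "s_deposition": rng.uniform(20, 120),       # meq/m2/yr (assumed prior range)
        }
        simulated = run_placeholder_model(**params)
        if abs(simulated - observed_anc_1995) <= tolerance:
            accepted.append(params)  # retained for forward (scenario) runs

    print(f"{len(accepted)} parameter sets accepted out of 10,000 sampled")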

    Uptake of Direct Acting Antivirals for Hepatitis C Virus in a New England Medicaid Population, 2014-2017

    Introduction: Introduction of the direct acting antiviral (DAA) sofosbuvir (SOV) in 2013 offered significant improvement over previous options for hepatitis C virus (HCV) treatment. Initial uptake was low in Medicaid and other populations, perhaps in part due to high drug cost and prior authorization (PA) restrictions related to fibrosis stage, prescribing provider specialty, and sobriety. Both the subsequent introduction of ledipasvir/sofosbuvir (LDV/SOV), an all-oral regimen for most genotypes, and the lifting of PA restrictions were expected to increase overall uptake, but little is known about recent prescribing patterns. We examined trends in DAA uptake in a Medicaid population and identified the effect of these two events on treatment initiation. Study Design: An interrupted time series (ITS) design utilized enrollment, medical, and pharmacy claims from Medicaid enrollees in three New England states, 12/2013-12/2017. Trends in treatment uptake, defined as 1+ pharmacy claim for a DAA, were examined overall, by demographic characteristics, and before and after two time points: 10/2014 (LDV/SOV approval date) and 7/2016 (date PA restrictions affecting two-thirds of members were lifted). Chi-square tests evaluated demographic differences, and segmented regression models examined trends. Study Population: The population included members ages 18-64 years with HCV (2+ claims with an ICD-9/10 code for HCV or 1+ claim for chronic HCV). Eligible individuals remained in the sample until treatment initiation or Medicaid disenrollment. Findings: The analytic sample averaged 30,433 members with HCV per month, with a mean age of 42.9 years and 60% male. In 2014, 3.3% of eligible members initiated treatment, increasing to 7.7% in 2017 (p = ). Conclusion: While initial uptake of DAAs was low in this multi-state Medicaid population, treatment initiation among eligible members increased through 2017. Introduction of new medications and lifting of PA restrictions led to an immediate increase in uptake followed by relatively flat monthly utilization. Policy implications: Sharp increases in uptake after LDV/SOV introduction may indicate warehousing of members in anticipation of LDV/SOV approval; increases after PA restrictions were lifted indicate demand for treatment among those affected by restrictions. As a large percentage of the Medicaid HCV population remains untreated, planned provider interviews will help to understand barriers and facilitators of treatment for HCV.
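    As an illustration of the segmented regression used in an interrupted time series design like the one above, the sketch below fits a linear model with level-change and slope-change terms at the two interruption dates (10/2014 and 7/2016). It is a hedged sketch only: the monthly uptake series is synthetic and the variable names are assumptions, not the study's data or code.

    # Hedged ITS sketch: segmented regression with two interruptions (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    months = pd.period_range("2013-12", "2017-12", freq="M")
    n = len(months)
    time = np.arange(n)                                              # overall linear trend
    post_ldv = (months >= pd.Period("2014-10", freq="M")).astype(int)  # after LDV/SOV approval
    post_pa = (months >= pd.Period("2016-07", freq="M")).astype(int)   # after PA restrictions lifted
    time_ldv = np.clip(time - np.argmax(post_ldv), 0, None) * post_ldv  # months since first interruption
    time_pa = np.clip(time - np.argmax(post_pa), 0, None) * post_pa     # months since second interruption

    rng = np.random.default_rng(0)
    # Synthetic monthly initiation rate with level jumps at each interruption (illustration only).
    uptake = 1.0 + 0.02 * time + 1.5 * post_ldv + 0.8 * post_pa + rng.normal(0.0, 0.2, n)

    df = pd.DataFrame({"uptake": uptake, "time": time, "post_ldv": post_ldv,
                       "time_ldv": time_ldv, "post_pa": post_pa, "time_pa": time_pa})

    # Level terms estimate immediate changes in uptake; slope terms estimate changes in trend.
    fit = smf.ols("uptake ~ time + post_ldv + time_ldv + post_pa + time_pa", data=df).fit()
    print(fit.params)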

    Optimising paediatric afferent component early warning systems : a hermeneutic systematic literature review and model development

    Objective: To identify the core components of successful early warning systems for detecting and initiating action in response to clinical deterioration in paediatric inpatients. Methods: A hermeneutic systematic literature review informed by translational mobilisation theory and normalisation process theory was used to synthesise 82 studies of paediatric and adult early warning systems and interventions to support the detection of clinical deterioration and escalation of care. This method, which is designed to develop understanding, enabled the development of a propositional model of an optimal afferent component early warning system. Results: Detecting deterioration and initiating action in response to clinical deterioration in paediatric inpatients involves several challenges, and the potential failure points in early warning systems are well documented. Track and trigger tools (TTTs) are commonly used and have value in supporting key mechanisms of action but depend on certain preconditions for successful integration into practice. Several supplementary interventions have been proposed to improve the effectiveness of early warning systems, but there is limited evidence to recommend their wider use, due to the weight and quality of the evidence; the extent to which systems are conditioned by the local clinical context; and the need to attend to system component relationships, which do not work in isolation. While it was not possible to make empirical recommendations for practice, the review methodology generated theoretical inferences about the core components of an optimal early warning system. These are presented as a propositional model conceptualised as three subsystems: detection, planning and action. Conclusions: There is a growing consensus on the need to think beyond TTTs in improving the detection of and response to clinical deterioration. Clinical teams wishing to improve early warning systems can use the model to consider systematically the constellation of factors necessary to support detection, planning and action, and to consider how these arrangements can be implemented in their local context.

    Taraxerol abundance as a proxy for in situ Mangrove sediment

    Mangrove sediments are valuable archives of relative sea-level change if they can be distinguished in the stratigraphic record from other organic-rich depositional environments (e.g., freshwater swamps). Proxies for establishing environment of deposition can be poorly preserved (e.g., foraminifera) in mangrove sediment. Consequently, differentiating mangrove and freshwater sediment in the stratigraphic record is often subjective. We explore whether biomarkers can objectively identify mangrove sediment, with emphasis on their utility for reconstructing relative sea level. Our approach is specific to identifying in situ sediment, which has received less attention than identifying allochthonous mangrove organic matter. To characterize mangrove and non-mangrove (freshwater) environments, we measured n-alkane, sterol, and triterpenoid abundances in surface sediments at three sites in the Federated States of Micronesia. Elevated taraxerol abundance is diagnostic of sediment accumulating in mangroves, and taraxerol is particularly abundant beneath monospecific stands of Rhizophora spp. Taraxerol was undetectable in freshwater sediment. Other triterpenoids are also more abundant in mangrove sediment than in freshwater sediment. Using cores from Micronesian mangroves, we examine whether biomarkers in sediments are indicative of in situ deposition in a mangrove and have utility as a relative sea-level proxy. Taraxerol concentrations in cores are comparable to surface mangrove sediments, which indicates deposition in a mangrove. This interpretation is supported by pollen assemblages. Downcore taraxerol variability may reflect changing inputs from Rhizophora spp. rather than diagenesis. We propose that taraxerol is a proxy that differentiates between organic sediment that accumulated in mangrove vs. freshwater environments, lending it utility for reconstructing relative sea level.

    The feasibility of domestic raintanks contributing to community-oriented urban flood resilience

    This interdisciplinary study investigates the technical and social feasibility of developing a domestic raintank programme to increase urban flood resilience. Hydrological modelling of different types of tank was used to determine the advantages and disadvantages of each design in controlling runoff. Qualitative socio-cultural interviews with local people revealed that raintanks were broadly acceptable to the local community. However, interviews with representatives from flood authorities suggest that resource constraints and technocratic industry norms focused on physical flood risk militate against consideration of a raintank programme. Our research suggests that there are transformative advantages to a more community-oriented approach to flood resilience, particularly the potential to change the relationship between the public and flood authorities away from a traditional model that pictures the former as passive, towards a process of mutual learning and two-way communication. Our research illustrates that this is not merely a matter of 'good practice', but a shift that can produce new practical solutions that a technical perspective alone cannot reveal.
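    A minimal sketch of the kind of tank-scale water balance that such hydrological modelling rests on is given below: roof runoff fills the tank, household demand draws it down, and anything above capacity spills to drainage. The tank size, roof area, demand and rainfall series are invented values for illustration, not the study's models or parameters.

    # Hedged sketch: daily "yield after spillage" water balance for a domestic raintank.
    import numpy as np

    rng = np.random.default_rng(1)
    rainfall_mm = rng.gamma(shape=0.4, scale=10.0, size=365)  # synthetic daily rainfall (mm)

    roof_area_m2 = 60.0       # assumed connected roof area
    runoff_coeff = 0.9        # fraction of roof rainfall reaching the tank
    tank_capacity_l = 1000.0  # assumed domestic raintank size
    daily_demand_l = 80.0     # assumed household use of harvested water

    storage = 0.0
    overflow_total = 0.0
    for rain in rainfall_mm:
        inflow = rain * roof_area_m2 * runoff_coeff        # 1 mm over 1 m2 = 1 litre
        storage += inflow
        overflow = max(0.0, storage - tank_capacity_l)     # water spilled to the drain
        storage -= overflow
        storage = max(0.0, storage - daily_demand_l)       # water used by the household
        overflow_total += overflow

    print(f"Runoff passed to drainage over the year: {overflow_total:.0f} L")

    Comparing overflow totals across tank sizes and demand patterns is one simple way such a model can show how much runoff different tank designs keep out of the drainage network.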