
    Children, Food and Poverty: Food Insecurity Among Primary School Students in the Wellington Region

    This research uses a questionnaire, completed by 115 primary school principals in the Wellington region, to explore the link between poverty and food insecurity among children. Principals recorded the number of children estimated to be regularly hungry at school during May 2004, the number who regularly came to school without having eaten breakfast, and the number who regularly had no lunch during that month. Principals also described how their school responds to hungry children and gave their opinions on whether schools are responsible for solving food insecurity among children. Their responses were analysed within a critical realist sociological perspective. The research confirms that a small but significant number of primary school children in the Wellington region experience serious food insecurity, and that food insecurity is strongly correlated with poverty. Two-thirds of the children estimated to be regularly hungry attend schools in low socio-economic areas (Decile 1 to 4 schools), and nearly three-quarters of the children who regularly have no lunch come from these schools. Ten percent of children in Decile 1 and 2 schools were estimated by their principal to be regularly hungry throughout the school day during May 2004. Schools' responses to hungry children were in most instances inadequate and often ad hoc, showing little consideration of the outcomes for children. However, the small number of schools in the Wellington region that have developed detailed policies and procedures for responding to hungry children appear to be successful in limiting the stigmatisation of children and their families. This thesis argues that responses to food-insecure children must consider the causes of food insecurity and, to prevent stigmatisation, should be founded on the principle of social justice rather than charity.

    21st century fisheries management: a spatio-temporally explicit tariff-based approach combining multiple drivers and incentivising responsible fishing

    Abstract: Kraak, S. B. M., Reid, D. G., Gerritsen, H. D., Kelly, C. J., Fitzpatrick, M., Codling, E. A., and Rogan, E. 2012. 21st century fisheries management: a spatio-temporally explicit tariff-based approach combining multiple drivers and incentivising responsible fishing. – ICES Journal of Marine Science, 69: 590–601. Traditionally, fisheries management has focused on biomass and mortality, expressed annually and across large management units. However, because fish abundance varies at much smaller spatio-temporal scales, fishing mortality can potentially be controlled more effectively if managed at a finer scale. The ecosystem approach likewise requires more indicators at finer scales. Incorporating ecosystem targets would require additional management tools with potentially conflicting results. We present a simple, integrated management approach that provides incentives for "good behaviour". Fishers would be given a number of fishing-impact credits, called real-time incentives (RTIs), to spend according to spatio-temporally varying tariffs per fishing day. RTI quotas and tariffs could be based on commercial stocks and ecosystem targets. Fishers could choose how to spend their RTIs, e.g. by fishing for limited periods in high-catch or sensitive areas or by fishing longer in lower-catch or less sensitive areas. The RTI system does not prescribe and forbid, but instead allows fishers to fish wherever and whenever they want; ecosystem costs are internalized, and fishers have to take them into account in their business decisions. We envisage no need for traditional landings or catch quotas for fleets operating under the scheme. The approach could facilitate further devolution of responsibility to industry.
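    To make the credit-and-tariff mechanics concrete, here is a minimal Python sketch of a fisher spending an RTI quota against tariffs that vary by spatial cell and month. The tariff values, cell names, and the `Fisher` class are hypothetical illustrations, not data or code from the paper.

```python
# Minimal sketch of the real-time incentive (RTI) scheme described above.
# Tariff values and cell names are invented placeholders.

# Tariff = RTI cost per fishing day, varying by spatial cell and month.
tariffs = {
    ("cell_A", "May"): 10,   # high-catch / sensitive area: expensive
    ("cell_A", "June"): 8,
    ("cell_B", "May"): 2,    # lower-catch / less sensitive area: cheap
    ("cell_B", "June"): 3,
}

class Fisher:
    """Tracks a fisher's remaining RTI credits over a season."""

    def __init__(self, rti_quota: int):
        self.credits = rti_quota

    def fish(self, cell: str, month: str, days: int) -> bool:
        """Spend credits to fish for `days` days in `cell` during `month`.

        Returns False (and spends nothing) if the remaining credits
        cannot cover the tariff, mimicking the quota constraint.
        """
        cost = tariffs[(cell, month)] * days
        if cost > self.credits:
            return False
        self.credits -= cost
        return True

# With 100 credits, a fisher can afford 10 days in the expensive cell
# or 50 days in the cheap one -- the trade-off the scheme incentivises.
f = Fisher(rti_quota=100)
print(f.fish("cell_A", "May", days=5), f.credits)    # True 50
print(f.fish("cell_B", "June", days=10), f.credits)  # True 20
```

    Under this toy model no landings quota is needed: the spatial pattern of effort is steered entirely by how the tariff map prices each cell-month combination.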

    Systematic review on barriers and facilitators of complex interventions for residents with dementia in long-term care

    Objectives: Psychotropic drugs are frequently, and sometimes inappropriately, used for the treatment of neuropsychiatric symptoms of people with dementia, despite their limited efficacy and their side effects. Interventions to address neuropsychiatric symptoms and psychotropic drug use are multifactorial and often multidisciplinary. Suboptimal implementation of these complex interventions often limits their effectiveness. This systematic review provides an overview of barriers and facilitators influencing the implementation of complex interventions targeting neuropsychiatric symptoms and psychotropic drug use in long-term care. Design: To identify relevant studies, the following electronic databases were searched between 28 May and 4 June: PubMed, Web of Science, PsycINFO, Cochrane, and CINAHL. Two reviewers systematically reviewed the literature, and the quality of the included studies was assessed using the Critical Appraisal Skills Programme qualitative checklist. The frequency of barriers and facilitators was addressed, followed by a deductive thematic analysis describing their positive or negative influence. The Consolidated Framework for Implementation Research guided data synthesis. Results: Fifteen studies were included, most using a combination of intervention types and care programs, as well as different implementation strategies. Key factors for successful implementation included strong leadership and the support of champions. Communication and coordination between disciplines, management support, sufficient resources, and culture (e.g. openness to change) also influenced implementation positively. Barriers related mostly to unstable organizations, such as renovations to the facility, changes toward self-directed teams, high staff turnover, and perceived work and time pressures. Conclusions: Implementation is complex and needs to be tailored to the specific needs and characteristics of the organization in question. Champions should be chosen carefully, and the application of learned actions and knowledge in practice is expected to further improve implementation.

    Time trends in psychotropic drug prescriptions in Dutch nursing home residents with dementia between 2003 and 2018

    Objective: Several European studies have investigated trends in psychotropic drug prescriptions (PDPs) among nursing home (NH) residents and reported a decline in antipsychotics prescriptions. Since the Dutch long-term care system differs from other European systems (e.g. a higher threshold for NH admission and trained elderly care physicians), this study explores the trends in PDPs in Dutch NH residents with dementia. Methods: The study used data from nine studies, comprising two cross-sectional studies, one cohort study, and six cluster-randomized controlled trials, collected in Dutch NHs between 2003 and 2018. Using multilevel logistic regression analysis with NHs as a random effect, we estimated the trends in PDPs overall and for five specific psychotropic drug groups (antipsychotics, antidepressants, anxiolytics, hypnotics, and anti-dementia drugs), adjusting for the confounders age, gender, severity of dementia, severity of neuropsychiatric symptoms, and length of stay in the NH. Results: The absolute prescription rate of antipsychotics was 37.5% in 2003 and decreased every year (OR = 0.947, 95% CI [0.926, 0.970]). The absolute prescription rate of anti-dementia drugs was 0.8% in 2003 and increased every year (OR = 1.162, 95% CI [1.105, 1.223]). The absolute rate of overall PDPs declined from 62.7% in 2003 to 40.4% in 2018. Conclusions: Among Dutch NH residents with dementia, the odds of antipsychotics prescriptions decreased by 5.3% per year, while the odds of anti-dementia drug prescriptions increased by 16.2% per year. There were no distinct trends in antidepressants, anxiolytics, and hypnotics prescriptions. However, the overall PDP rate was still high; psychotropic drug prescribing in NH residents remains an issue of concern.
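    To make the reported effect sizes easier to interpret, the short Python snippet below applies the per-year odds ratio for antipsychotics (OR = 0.947) to the 2003 baseline rate of 37.5% on the odds scale. It is only an interpretation aid under the assumption of a constant per-year OR, not the authors' multilevel analysis.

```python
# Projecting a prescription rate forward with a constant per-year odds ratio.

def project_rate(baseline_rate: float, or_per_year: float, years: int) -> float:
    """Apply a constant per-year odds ratio to a baseline prevalence."""
    odds = baseline_rate / (1 - baseline_rate)  # convert rate to odds
    odds *= or_per_year ** years                # compound the yearly OR
    return odds / (1 + odds)                    # convert back to a rate

# Antipsychotics: 37.5% in 2003, OR = 0.947 per year, projected to 2018.
rate_2018 = project_rate(0.375, 0.947, years=15)
print(f"Projected 2018 antipsychotics rate: {rate_2018:.1%}")  # about 21%
```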

    Outcomes and risk score for distal pancreatectomy with celiac axis resection (DP-CAR) : an international multicenter analysis

    Background: Distal pancreatectomy with celiac axis resection (DP-CAR) is a treatment option for selected patients with pancreatic cancer involving the celiac axis. A recent multicenter European study reported a 90-day mortality rate of 16%, highlighting the importance of patient selection. The authors constructed a risk score to predict 90-day mortality and assessed oncologic outcomes. Methods: This multicenter retrospective cohort study investigated patients undergoing DP-CAR at 20 European centers from 12 countries (model design, 2000-2016) and three very-high-volume international centers in the United States and Japan (model validation, 2004-2017). The area under the receiver operating characteristic curve (AUC) and calibration plots were used for validation of the 90-day mortality risk model. Secondary outcomes included resection margin status, adjuvant therapy, and survival. Results: For 191 DP-CAR patients, the 90-day mortality rate was 5.5% (95% confidence interval [CI], 2.2-11%) at 5 high-volume (≥1 DP-CAR/year) centers and 18% (95% CI, 9-30%) at 18 low-volume DP-CAR centers (P = 0.015). A risk score with age, sex, body mass index (BMI), American Society of Anesthesiologists (ASA) score, multivisceral resection, open versus minimally invasive surgery, and low- versus high-volume center performed well in both the design and validation cohorts (AUC, 0.79 vs 0.74; P = 0.642). For 174 patients with pancreatic ductal adenocarcinoma, the R0 resection rate was 60%, neoadjuvant and adjuvant therapies were applied for 69% and 67% of the patients, respectively, and the median overall survival period was 19 months (95% CI, 15-25 months). Conclusions: When performed for selected patients at high-volume centers, DP-CAR is associated with acceptable 90-day mortality and overall survival. The authors propose a 90-day mortality risk score to improve patient selection and outcomes, with DP-CAR volume as the dominant predictor.
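    As a rough sketch of the validation step, the snippet below computes an AUC for a risk score against observed 90-day mortality using scikit-learn. The scores and outcomes are fabricated placeholders for illustration only; the study's actual model and data are not reproduced here.

```python
# Sketch: validating a 90-day mortality risk score with the area under
# the ROC curve (AUC). All values below are invented placeholders.
from sklearn.metrics import roc_auc_score

risk_scores = [0.05, 0.30, 0.10, 0.60, 0.20, 0.80, 0.15, 0.40]  # predicted risk
died_90d    = [0,    0,    0,    1,    0,    1,    0,    1]     # observed outcome

auc = roc_auc_score(died_90d, risk_scores)
print(f"AUC = {auc:.2f}")  # 1.00 for this toy data; the study reports 0.79/0.74
```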

    Comparison of the efficacy of a neutral wrist splint and wrist splint with lumbrical unit for the treatment of patients with carpal tunnel syndrome

    Purpose: The purpose of this study was to compare the effect of a neutral wrist splint and a wrist splint with an additional metacarpophalangeal (MCP) unit on pain, function, and grip and pinch strength in patients with mild-to-moderate carpal tunnel syndrome (CTS). Methods: Twenty-four patients received conservative treatment using either the neutral wrist splint or the wrist splint with the MCP unit for a period of 6 weeks. Primary outcome measures were pain, function, and grip and pinch strength. Data were collected immediately before and after use of the two types of splints, at baseline (0 weeks) and at 6 weeks. Statistical analysis was performed using the paired and independent t-tests. Results: Compared to baseline, both the neutral wrist splint and the wrist splint with an MCP unit significantly decreased pain and increased function, pinch strength, and grip strength. Comparisons of the two types of splints for grip strength (P = 0.675) and pinch strength (P = 0.650) revealed no significant differences between the two after 6 weeks of wear. However, there were significant differences in pain levels (P = 0.022) and the Disabilities of the Arm, Shoulder and Hand (DASH) score (P = 0.027) between the two types of splints from baseline to 6 weeks. Conclusion: The wrist splint with an MCP unit was more effective than the neutral wrist splint in reducing pain and improving function.
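    The comparisons described above rest on two standard tests, which the hypothetical SciPy sketch below illustrates: a paired t-test for the within-group change from baseline to 6 weeks, and an independent t-test for the between-group comparison. The pain and change scores are invented placeholders, not the study's data.

```python
# Sketch of the statistical tests described above, using SciPy.
# All scores below are invented placeholders, not the study's data.
from scipy import stats

# Within-group change (baseline vs. 6 weeks): paired t-test.
pain_baseline = [7, 6, 8, 5, 7, 6]
pain_6weeks   = [4, 3, 5, 3, 4, 4]
t_paired, p_paired = stats.ttest_rel(pain_baseline, pain_6weeks)

# Between-group comparison (neutral splint vs. splint with MCP unit):
# independent t-test on the two groups' change scores.
change_neutral = [2, 3, 2, 1, 3, 2]
change_mcp     = [3, 4, 4, 3, 5, 4]
t_ind, p_ind = stats.ttest_ind(change_neutral, change_mcp)

print(f"paired: t = {t_paired:.2f}, p = {p_paired:.3f}")
print(f"independent: t = {t_ind:.2f}, p = {p_ind:.3f}")
```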

    Characterizing benthic macroinvertebrate and algal biological condition gradient models for California wadeable Streams, USA

    The Biological Condition Gradient (BCG) is a conceptual model that describes changes in aquatic communities under increasing levels of anthropogenic stress. The BCG helps decision-makers connect narrative water quality goals (e.g., maintenance of natural structure and function) to quantitative measures of ecological condition by linking index thresholds based on statistical distributions (e.g., percentiles of reference distributions) to expert descriptions of changes in biological condition along disturbance gradients. As a result, the BCG may be more meaningful to managers and the public than indices alone. To develop a BCG model, the biological response to stress is divided into 6 levels of condition, represented as changes in biological structure (abundance and diversity of pollution-sensitive versus pollution-tolerant taxa) and function. We developed benthic macroinvertebrate (BMI) and algal BCG models for California perennial wadeable streams to support interpretation of thresholds based on percentiles of reference distributions for bioassessment indices (i.e., the California Stream Condition Index [CSCI] for BMI and the Algal Stream Condition Index [ASCI] for diatoms and soft-bodied algae). Two panels (one of BMI ecologists and the other of algal ecologists) each calibrated a general BCG model to California wadeable streams by first assigning taxa to specific tolerance and sensitivity attributes, and then independently assigning test samples (264 BMI and 248 algae samples) to BCG Levels 1–6. Consensus on the assignments was developed within each assemblage panel using a modified Delphi method. Panels then developed detailed narratives of the changes in BMI and algal taxa that correspond to the 6 BCG levels. Consensus among experts was high, with 81% and 82% of expert ratings within 0.5 units of the assigned BCG level for BMIs and algae, respectively. According to both BCG models, the 10th percentile of index scores at reference sites corresponded to BCG Level 3, suggesting that this type of threshold would protect against moderate changes in structure and function while allowing the loss of some sensitive taxa. The BCG provides a framework for interpreting changes in aquatic biological condition along a gradient of stress. The resulting relationship between index scores, BCG levels, and narratives can help decision-makers select thresholds and communicate how these values protect aquatic life use goals.
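    The agreement statistic quoted above (the share of expert ratings within 0.5 units of the assigned BCG level) is simple to compute; the NumPy sketch below shows one plausible form of the calculation with invented ratings, not the panels' actual assignments.

```python
# Share of expert ratings within 0.5 units of the consensus BCG level.
# Ratings below are invented placeholders.
import numpy as np

expert_levels    = np.array([2.0, 3.5, 4.0, 3.0, 5.0, 2.5, 4.5, 3.0])
consensus_levels = np.array([2.0, 3.0, 4.0, 3.5, 5.0, 3.5, 4.0, 3.0])

within_half = np.abs(expert_levels - consensus_levels) <= 0.5
print(f"Agreement within 0.5 units: {within_half.mean():.0%}")  # 88% here
```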