Climate and weather. Climate factors and elements. Climate classification
Purple nutsedge (Cyperus rotundus L.) control through climbing legumes such as Mucuna pruriens L. and Lablab purpureus L.
The adoption of Conservation Agriculture in Mozambique poses new challenges for smallholder farmers. One of these is the control of perennial weeds without herbicides, which is beyond the reach of farmers in Cabo Delgado because: a) herbicide prices are high relative to these farmers' low incomes, and b) Cabo Delgado is a remote area where affordable access to herbicides and other inputs is not yet possible. The aim of the on-farm research was to find sustainable solutions suited to the local agro-ecological and socio-economic conditions of the region. The present study tested the efficiency of two cover crops, Mucuna pruriens L. and Lablab purpureus L., in the control of purple nutsedge (Cyperus rotundus L.) in Conservation Agriculture systems. The trials were conducted in the village of Nangua, Cabo Delgado province, during the rainy seasons of the 2014/15 and 2015/16 crop years, in a field that had been abandoned due to purple nutsedge infestation. The two cover crops, mucuna and lablab, were established in 12 m² plots with three replications.
Purple nutsedge plants were counted three times in these plots: the 1st count 1 day before sowing, the 2nd count 30 days after germination, and the 3rd count 60 days after germination. Before the cover crops were sown, purple nutsedge counts were made in a 1 m² area at 2 sites in each plot, during both seasons. In the first year, the number of purple nutsedge plants decreased in the plots where either legume was grown. Both legumes controlled purple nutsedge more effectively the longer they remained in the field, mainly between 30 and 60 days after sowing. The results show that mucuna and lablab can replace each other in the control of purple nutsedge, as their effects did not differ. Using mucuna or lablab as a cover crop in Conservation Agriculture systems favors dormancy of the bulbs, creates unfavorable conditions for the viability of purple nutsedge seeds, and thus reduces its capacity to proliferate in field crops.
No-Tillage in Europe - State of the Art: Constraints and Perspectives
No-tillage in Europe contains a review of developments over the last three decades beginning in the late 1960s. Reasons for attempts to introduce this soil conserving production method are outlined and obstacles affecting the uptake of no-tillage throughout Europe are identified. Updated data are provided for the uptake of both conservation tillage and no-tillage in the member countries of the European Conservation Agriculture Federation.
Further explanations are explored for the low uptake of no-tillage, and even of conservation tillage, when compared to other regions of the world. The specificities of European conditions, whether natural, human or political, are used to provide arguments against the successful adoption of no-tillage in Europe.
However, increasing awareness among farmers, politicians and society as a whole that soils are a non-renewable resource is leading to gradual changes in the overall approach to soil conservation. The implementation of a European Soil Framework Directive is considered an important step towards the recognition that conservation tillage and no-tillage are economically and ecologically sustainable methods of crop production. It is anticipated that this will promote the concept of Conservation Agriculture and increase adoption levels throughout Europe.
Feeding the soil AND feeding the cow – Conservation Agriculture in Kenya
One of the main obstacles to the implementation of Conservation Agriculture (CA) in sub-Saharan Africa is the priority given to using crop residues as cattle feed rather than as mulching material. As documented in past projects (e.g. CA-SARD, CA2Africa, ABACO), this competition prevents the CA approach from reaching its full potential, particularly in countries where climatic conditions limit biomass production. To identify pathways for enabling an implementation
of CA that is not in conflict with other goals of farmers’ livelihoods (e.g. livestock
farming), we used a transformative learning approach with farmers and other stakeholders in Laikipia County (Kenya). The learning elements comprised: a timeline that encompasses the past promotion activities; stakeholder mapping which highlights the various stakeholders involved and their influence; non-scripted participatory videos filmed by the stakeholders themselves
that show the farming system from different perspectives; and focus group discussions structured by the Qualitative expert Assessment Tool for CA adoption in Africa (QAToCA). Challenges to CA adoption that were jointly identified include competition for fodder and a lack of financial resources to get started with CA. Knowledge gaps remain on the proper use of CA equipment, on fodder production and conservation options and, lastly, on sustainable crop-livestock production systems. Furthermore, farmers feel disconnected from existing governmental support. However, our findings highlight solutions which enable feeding the soil
“and” feeding the cow. Some farmers already have started to grow forages on their farms in order to reduce dependence on crop residues as a feeding source – an approach which had not been promoted during past projects. This shows the importance of an enabling environment provided by government programs which supports long-term extension efforts combined with farmers’ willingness to jointly learn towards a more sustainable agriculture. On farms where both systems (CA and conventional) are practised, women play an important role by experimenting with CA practices, thereby realising promising results in terms of yield and drought
resilience. Furthermore, our findings underline the need for long-term monitoring of innovation processes, which is often not possible within short-term research projects and promotion programs.
Validation study of a web-based assessment of functional recovery after radical prostatectomy
Background: Good clinical care of prostate cancer patients after radical prostatectomy depends on careful assessment of post-operative morbidities, yet physicians do not always judge patient symptoms accurately. Logistical problems associated with using paper questionnaires limit their use in the clinic. We have implemented a web interface ("STAR") for patient-reported outcomes after radical prostatectomy.
Methods: We analyzed data from the first 9 months of clinical implementation to evaluate the validity of the STAR questionnaire for assessing functional outcomes following radical prostatectomy. We assessed response rate, internal consistency within domains, and the association between survey responses and known predictors of sexual and urinary function, including age, time from surgery, nerve-sparing status and co-morbidities.
Results: Of 1581 men sent an invitation to complete the instrument online, 1235 responded, for a response rate of 78%. Cronbach's alpha was 0.84, 0.86 and 0.97 for bowel, urinary and sexual function respectively. All known predictors of sexual and urinary function were significantly associated with survey responses in the hypothesized direction.
Conclusions: We found that web-based assessment of functional recovery after radical prostatectomy is practical and feasible. The instrument demonstrated excellent psychometric properties, suggesting that validity is maintained when questions are transferred from paper to electronic format and when patients give responses that they know will be seen by their doctor and added to their clinic record. As such, our system allows ready implementation of patient-reported outcomes into routine clinical practice.
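The internal-consistency statistic reported per domain above is Cronbach's alpha. A minimal sketch of how it is computed, using made-up item scores (not the study's data):

```python
# Illustrative sketch: Cronbach's alpha from per-item response scores.
# The data below are hypothetical, not taken from the STAR study.
def cronbach_alpha(items):
    """items: one inner list per questionnaire item, scores per respondent."""
    k = len(items)                       # number of items in the domain
    n = len(items[0])                    # number of respondents
    def var(xs):                         # sample variance (n-1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    sum_item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]  # per-respondent sums
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Three hypothetical 5-point items answered by five respondents:
scores = [[4, 5, 3, 4, 2],
          [4, 4, 3, 5, 2],
          [5, 5, 2, 4, 3]]
print(round(cronbach_alpha(scores), 2))  # → 0.89
```

Values near or above 0.8, like those reported, indicate that the items within a domain measure the same underlying construct consistently.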
Split dose and MiraLAX-based purgatives to enhance bowel preparation quality becoming common recommendations in the US
Objectives: Rates of suboptimal bowel preparation up to 30% have been reported. Liberalized precolonoscopy diet, split dose purgative, and the use of MiraLAX-based bowel preparation (MBBP) prior to colonoscopy are recently developed measures to improve bowel preparation quality but little is known about the utilization prevalence of these measures. We examined the patterns of utilization of these newer approaches to improve precolonoscopy bowel preparation quality among American gastroenterologists.
Methods: Surveys were distributed to a random sample of members of the American College of Gastroenterology. Participants were queried regarding demographics, practice characteristics, and bowel preparation recommendations, including recommendations for liberal dietary restrictions, split dose purgative, and the use of MBBP. Approaches were evaluated individually and in combination.
Results: Of the 999 eligible participants, 288 responded; 15.2% recommended a liberal diet, 60.0% split dose purgative, and 37.4% MBBP. Diet recommendations varied geographically, with gastroenterologists in the West more likely to recommend a restrictive diet (odds ratio [OR] 2.98, 95% confidence interval [CI] 1.16–7.67) and physicians in the Northeast more likely to recommend a liberal diet. Older physicians more often recommended split dosing (OR 1.04, 95% CI 1.04–2.97). Use of MBBP was more common in suburban settings (OR 2.14, 95% CI 1.23–3.73). Evidence suggests that physicians in private practice were more likely to prescribe split dosing (p = 0.03) and less often recommended MBBP (p = 0.02). Likelihood of prescribing MBBP increased as weekly volume of colonoscopy increased (p = 0.03).
Conclusions: To enhance bowel preparation quality, American gastroenterologists commonly use purgative split dosing. The use of MBBP is becoming more prevalent, while a liberalized diet is infrequently recommended. Utilization of these newer approaches to improve bowel preparation quality varies by physician and practice characteristics. Further evaluation of the patterns of usage of these measures is indicated.
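The odds ratios with 95% confidence intervals reported above (e.g. OR 2.98, 95% CI 1.16–7.67) are typically derived from 2×2 contingency tables. A minimal sketch using the standard Woolf logit method, with made-up counts for illustration only:

```python
# Hypothetical sketch: odds ratio and 95% CI from a 2x2 table.
# Counts are invented for illustration, not taken from the survey.
import math

def odds_ratio_ci(a, b, c, d):
    """a,b = exposed with/without outcome; c,d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)    # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_) - 1.96 * se_log)  # lower 95% bound
    hi = math.exp(math.log(or_) + 1.96 * se_log)  # upper 95% bound
    return or_, lo, hi

# e.g. 20 of 50 physicians in one region recommend a practice,
# versus 30 of 150 elsewhere:
or_, lo, hi = odds_ratio_ci(20, 30, 30, 120)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 2.67 1.33 5.33
```

A CI that excludes 1.0, as in this sketch, corresponds to a statistically significant association at the 5% level.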
Shortened surveillance intervals following suboptimal bowel preparation for colonoscopy: Results of a national survey
Purpose: Suboptimal bowel preparation can result in decreased neoplasia detection, shortened surveillance intervals, and increased costs. We assessed bowel preparation recommendations and their relationship to the self-reported proportion of suboptimal bowel preparations in practice, and evaluated the impact of suboptimal bowel preparation on colonoscopy surveillance practices. A random sample of a national organization of gastroenterologists in the U.S. was surveyed.
Methods: Demographic and practice characteristics, bowel preparation regimens, and proportion of suboptimal bowel preparations in practice were ascertained. Recommended follow-up colonoscopy intervals were evaluated for optimal and suboptimal bowel preparation and select clinical scenarios.
Results: We identified 6,777 physicians, of whom 1,354 were randomly selected; 999 were eligible, and 288 completed the survey. A higher proportion of suboptimal bowel preparations per week (≥10%) was associated with hospital/university practice, teaching hospital affiliation, more than 25% Medicaid-insured patients, and recommendation of PEG alone or sulfate-free preparation. Reporting more than 25% Medicare-insured and privately insured patients, recommending split dosing, and using MoviPrep® were associated with fewer than 10% suboptimal bowel preparations per week. Shorter surveillance intervals for three clinical scenarios were reported for suboptimal preparations and were shortest among participants in the Northeast, who more often recommended early follow-up for normal findings and small adenomas. Those who recommended 4-l PEG alone more often advised a surveillance interval of less than 1 year for a large adenoma.
Conclusions: Our study demonstrates significantly shortened surveillance interval recommendations following suboptimal bowel preparation, and shows that these recommendations vary regionally in the United States. The findings suggest an interrelationship between dietary restriction, purgative type, and practice and patient characteristics that warrants additional research.
Gastroenterologists' Perceived Barriers to Optimal Pre-Colonoscopy Bowel Preparation: Results of a National Survey
Poor-quality bowel preparation has been reported in almost one third of all colonoscopies. To better understand the factors associated with poor bowel preparation, we explored perceived patient barriers to optimal pre-colonoscopy bowel preparation from the perspective of the gastroenterologist. A random sample of physician members of the American College of Gastroenterology was surveyed via the internet and postal mailing. Demographic and practice characteristics, as well as practice-related and perceived patient barriers to optimal bowel preparation, were assessed among 288 respondents. Lack of time, lack of reimbursement for patient education, and volume of information were not associated with the physician's level of suboptimal bowel preparation. Those reporting 10% or more suboptimal bowel preparations were more likely to believe that patients lack understanding of the importance of following instructions, have problems with diet, and have trouble tolerating the purgative. Communication of bowel preparation instructions and unmet patient educational needs contribute to suboptimal bowel preparation. Educational interventions should address both practice-related and patient-related factors.