    Influences of exocrine pancreatic insufficiency on nutrient digestibility, growth parameters as well as anatomical and histological morphology of the intestine in a juvenile pig model

    In a pig model, pancreatic duct ligation (PL) leads to a complete loss of exocrine function, causing exocrine pancreatic insufficiency (EPI) without affecting endocrine function, which allows research into clinical effects and therapy options. This study aimed to investigate the effects of experimentally induced EPI in juvenile pigs on digestion and intestinal morphology. Eight female juvenile cross-bred pigs (BW 54.8 kg at the start of the study) were included. Three animals served as controls (CON group), and in five animals the ductus pancreaticus accessorius was ligated (PL group). During the 10-week trial period, body weight and body measurements were recorded regularly. At the end of the trial, the gastrointestinal tract (GIT) was examined macroscopically and histologically, and weight and digesta samples of individual segments were obtained. The CON pigs showed a significantly higher apparent total tract digestibility of crude protein and crude fat (87.8 and 79.9%, respectively) compared to PL (52.4 and 16.6%, respectively). Significant differences were noted in the relative weights of the duodenum, jejunum and colon (with and without digesta) and also in the absolute weights of the jejunum and colon. The mean number of nuclei in the transverse section of the stratum circulare was significantly higher in all intestinal segments in CON compared to PL. Overall, EPI results in impaired nutrient digestibility with a greater filling of the GIT with digesta. The elongation of the small intestine does not represent “stretching” of the intestine, but rather increased synthesis of intestinal tissue.
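
    The apparent total tract digestibility figures quoted above are conventionally derived from nutrient intake and faecal excretion. The abstract does not state how digestibility was determined, so the following is only a minimal sketch of the total-collection form of the calculation, with hypothetical figures.

```python
def apparent_total_tract_digestibility(intake_g: float, faecal_output_g: float) -> float:
    """Apparent total tract digestibility (ATTD, %) by total collection:
    the fraction of the ingested nutrient that does not reappear in faeces."""
    return (intake_g - faecal_output_g) / intake_g * 100.0

# Hypothetical example: 500 g crude protein ingested, 61 g recovered in faeces
print(f"ATTD of crude protein: {apparent_total_tract_digestibility(500, 61):.1f}%")
```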

    Cost of porcine reproductive and respiratory syndrome virus at individual farm level – An economic disease model

    Porcine reproductive and respiratory syndrome (PRRS) is reported to be among the diseases with the highest economic impact in modern pig production worldwide. Yet, the economic impact of the disease at farm level is not well understood as, especially in endemically infected pig herds, losses are often not obvious. It is therefore difficult for farmers and veterinarians to appraise whether control measures such as virus elimination or vaccination will be economically beneficial for their farm. Thus, the aim of this study was to develop an epidemiological and economic model to determine the costs of PRRS for an individual pig farm. An epidemiological model, in which the impact of PRRS infection on health and productivity was estimated, was integrated into a production model that simulates farm outputs depending on farm type, farrowing rhythm and length of suckling period. Financial losses were calculated in a gross margin analysis and a partial budget analysis based on the changes in health and production parameters assumed for different PRRS disease severities. Data on the effects of endemic infection on reproductive performance, morbidity and mortality, daily weight gain, feed efficiency and treatment costs were obtained from literature and expert opinion. Nine different disease scenarios were calculated, in which a farrow-to-finish farm (1000 sows) was slightly, moderately or severely affected by PRRS, based on changes in health and production parameters, either in breeding, in nursery and fattening, or in all three stages together. Annual losses ranged from a median of € 75′724 (90% confidence interval (C.I.): € 78′885–€ 122′946), if the farm was slightly affected in nursery and fattening, to a median of € 650′090 (90% C.I.: € 603′585–€ 698′379), if the farm was severely affected in all stages. Overall losses were slightly higher if breeding was affected than if nursery and fattening were affected. In a herd moderately affected in all stages, median losses in breeding were € 46′021 and € 422′387 in fattening, whereas costs were € 25′435 lower in nursery, compared with a PRRSV-negative farm. The model is a valuable decision-support tool for farmers and veterinarians if a farm is proven to be affected by PRRS (confirmed by laboratory diagnosis). The output can help them understand the need for interventions in case of significant impact on the profitability of their enterprise. The model can support veterinarians in their communication with farmers in cases where costly disease control measures are justified.
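
    As a rough illustration of the partial budget logic described above, the following sketch nets forgone revenue and additional costs of an endemic PRRS scenario against any costs saved. All parameter names and values are hypothetical and are not taken from the study's model.

```python
from dataclasses import dataclass

@dataclass
class PrrsScenario:
    """Hypothetical per-year changes on a farrow-to-finish farm caused by endemic PRRS."""
    extra_piglet_losses: int      # additional pigs lost (pre-weaning + nursery)
    margin_per_pig: float         # gross margin forgone per pig not sold (EUR)
    extra_treatment_costs: float  # additional medication/veterinary costs (EUR)
    extra_feed_costs: float       # cost of poorer feed efficiency / slower gain (EUR)
    saved_costs: float            # costs avoided, e.g. feed for animals that died early (EUR)

def annual_partial_budget_loss(s: PrrsScenario) -> float:
    """Partial budget: (revenue forgone + extra costs) - costs saved."""
    revenue_forgone = s.extra_piglet_losses * s.margin_per_pig
    extra_costs = s.extra_treatment_costs + s.extra_feed_costs
    return revenue_forgone + extra_costs - s.saved_costs

# Hypothetical moderately affected scenario
scenario = PrrsScenario(extra_piglet_losses=900, margin_per_pig=80.0,
                        extra_treatment_costs=15_000.0, extra_feed_costs=40_000.0,
                        saved_costs=8_000.0)
print(f"Estimated annual loss: EUR {annual_partial_budget_loss(scenario):,.0f}")
```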

    Simulating future salinity dynamics in a coastal marshland under different climate scenarios

    Salinization is a well‐known problem in agricultural areas worldwide. In the last 20–30 yr, rising salinity in the upper, unconfined aquifer has been observed in the Freepsumer Meer, a grassland near the German North Sea coast. For investigating the long‐term development of salinity and water balance during 1961–2099, the one‐dimensional Soil–Water–Atmosphere–Plant (SWAP) model was set up and calibrated for a soil column in the area. The model setup involves a deep aquifer as the source of salt through upward seepage. In the vertical salt transport equation, dispersion and advection are included. Six different regional outputs of statistical downscaling methods were used as climate scenarios. These comprise different rates of increasing surface temperature and different trends in seasonal rainfall. The simulation results exhibit opposing salinity trends for topsoil and deeper layers. Although projections of some scenarios entail decreasing salinities near the surface, most of them project a rise in subsoil salinity, with the strongest trends of up to +0.9 mg cm⁻³ per 100 yr at −65 cm. The results suggest that topsoil salinity trends in the study area are affected by the magnitude of winter rainfall trends, whereas high subsoil salinities correspond to low winter rainfall and high summer temperature. How these projected trends affect the vegetation and thereby future land use will depend on the future management of groundwater levels in the area.
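
    The vertical salt transport described above combines dispersion with advection driven by upward seepage. The study itself uses the SWAP model; the following is only a schematic finite-difference sketch of a one-dimensional advection–dispersion equation with constant, hypothetical flow and dispersion parameters, not a reproduction of the SWAP setup.

```python
import numpy as np

# Hypothetical parameters (not from the study); z is depth, positive downward
depth_cm = 200.0            # soil column depth
n = 201                     # grid points (index 0 = surface, index n-1 = bottom)
dz = depth_cm / (n - 1)     # cm
D = 0.5                     # dispersion coefficient, cm^2/day
v = -0.05                   # pore-water velocity, cm/day (negative = upward seepage)
dt = 0.5                    # days; small enough for the explicit scheme to stay stable

c = np.zeros(n)             # salt concentration, mg/cm^3
c[-1] = 10.0                # saline deep aquifer at the bottom boundary
c[0] = 0.1                  # nearly fresh, rainfall-dominated topsoil boundary

# Explicit scheme for dC/dt = D * d2C/dz2 - v * dC/dz on interior nodes
for _ in range(int(50 * 365 / dt)):          # simulate roughly 50 years
    dispersion = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2
    advection = -v * (c[2:] - c[:-2]) / (2 * dz)
    c[1:-1] += dt * (dispersion + advection)

print(f"Simulated salinity at -65 cm: {c[int(65 / dz)]:.2f} mg/cm^3")
```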

    The optimal cut‐off values for tumor size, number of lesions, and CEA levels in patients with surgically treated colorectal cancer liver metastases: An international, multi‐institutional study

    Background and Objectives: Despite the long-standing consensus on the importance of tumor size, tumor number and carcinoembryonic antigen (CEA) levels as predictors of long-term outcomes among patients with colorectal liver metastases (CRLM), optimal prognostic cut-offs for these variables have not been established. Methods: Patients who underwent curative-intent resection of CRLM and had available data on at least one of the three variables of interest above were selected from a multi-institutional dataset of patients with known KRAS mutational status. The resulting cohort was randomly split into training and testing datasets, and recursive partitioning analysis was employed to determine optimal cut-offs. The concordance probability estimates (CPEs) for these optimal cut-offs were calculated and compared to CPEs for the most widely used cut-offs in the surgical literature. Results: A total of 1643 patients who met eligibility criteria were identified. Following recursive partitioning analysis in the training dataset, the following cut-offs were identified: 2.95 cm for tumor size, 1.5 for tumor number and 6.15 ng/ml for CEA levels. In the entire dataset, the calculated CPEs for the new tumor size (0.52), tumor number (0.56) and CEA (0.53) cut-offs exceeded CPEs for other commonly employed cut-offs. Conclusion: The current study was able to identify optimal cut-offs for the three most commonly employed prognostic factors in CRLM. While the per-variable gains in discriminatory power are modest, these novel cut-offs may help produce appreciable increases in prognostic performance when combined in the context of future risk scores.
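
    Recursive partitioning, as used above to derive the cut-offs, essentially searches for the split of a continuous variable that best separates outcomes. The following minimal sketch finds such a cut-off by minimizing the size-weighted Gini impurity of a binary outcome on entirely synthetic data, not the study cohort.

```python
import numpy as np

def gini(y: np.ndarray) -> float:
    """Gini impurity of a binary outcome vector."""
    if y.size == 0:
        return 0.0
    p = y.mean()
    return 2.0 * p * (1.0 - p)

def best_cutoff(x: np.ndarray, y: np.ndarray) -> float:
    """First split of a CART-style recursive partitioning: the threshold on x
    that minimizes the size-weighted Gini impurity of the two resulting groups."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    # candidate thresholds halfway between consecutive sorted values
    candidates = (xs[:-1] + xs[1:]) / 2.0
    best_t, best_score = candidates[0], np.inf
    for t in np.unique(candidates):
        left, right = ys[xs <= t], ys[xs > t]
        score = (left.size * gini(left) + right.size * gini(right)) / ys.size
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Synthetic example: tumor size (cm) vs. a binary long-term outcome
rng = np.random.default_rng(0)
size = rng.uniform(0.5, 8.0, 500)
event = (rng.random(500) < 0.2 + 0.08 * size).astype(int)  # risk rises with size
print(f"Estimated size cut-off: {best_cutoff(size, event):.2f} cm")
```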

    Safety of liver resection and effect on quality of life in patients with benign hepatic disease: Single center experience

    Background: Although liver resection has long been established for selected patients with benign hepatic disease, the success of surgical treatment of these patients cannot be evaluated exclusively through postoperative morbidity and mortality. Therefore, the aim of the study was to prove the safety of liver resection in the treatment of benign liver tumors and to evaluate the effect of surgical treatment on the patients' quality of life. Methods: A total of 146 patients who underwent liver resection because of benign liver tumors were included in this study. Postoperative outcome was assessed, and patients evaluated their quality of life before surgery and at the present time using the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire Core-30 (QLQ C-30). Results: The rate of serious (> grade 2) complications was 4.1%, with no postoperative death. The quality of life assessment revealed an overall improvement of general health status after resection (0.7 vs. 0.56, p < 0.001) and additionally a significant reduction of 6 out of 9 symptoms. Furthermore, compelling benefits in the patients' social and emotional coping could be detected after surgery. Conclusions: Liver resection for benign liver disease is a safe procedure and leads to a significant improvement of quality of life in selected patients.

    Who Is at Risk for Diagnostic Discrepancies? Comparison of Pre- and Postmortal Diagnoses in 1800 Patients of 3 Medical Decades in East and West Berlin

    Background: Autopsy rates in Western countries have consistently declined to an average below 5%, although clinical autopsies represent a reasonable tool for quality control in hospitals, both medically and economically. By comparing pre- and postmortal diagnoses, the diagnostic discrepancies uncovered by clinical autopsies supply crucial information on how to improve clinical treatment. The study aimed at analyzing current diagnostic discrepancy rates, investigating their influencing factors and identifying risk profiles of patients who could be affected by a diagnostic discrepancy. Methods and Findings: For all adult autopsy cases of the Charité Institute of Pathology from the years 1988, 1993, 1998, 2003 and 2008, the pre- and postmortal diagnoses and all demographic data were analyzed retrospectively. Based on power analysis, 1,800 cases were randomly selected to perform discrepancy classification (class I–VI) according to modified Goldman criteria. The rate of discrepancies in major diagnoses (class I) was 10.7% (95% CI: 7.7%–14.7%) in 2008, representing a reduction by 15.1%. Subgroup analysis revealed several influencing factors that significantly correlate with the discrepancy rate. Cardiovascular diseases had the highest frequency among class I discrepancies. Comparing the 1988 data of East and West Berlin, no significant differences were found in diagnostic discrepancies despite an autopsy rate differing by nearly 50%. A risk profile analysis visualized by intuitive heatmaps revealed a significantly high discrepancy rate in patients treated in low or intermediate care units at community hospitals. In this collective, patients with genitourinary/renal or infectious diseases were at particularly high risk. Conclusions: This is currently the largest and most comprehensive study on diagnostic discrepancies worldwide. Our well-powered analysis revealed a significant rate of class I discrepancies, indicating that autopsies are still of value. The identified risk profiles may aid both pathologists and clinicians in identifying patients at increased risk for a discrepant diagnosis and possibly suboptimal treatment intra vitam.
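
    The class I discrepancy rate above is reported with a 95% confidence interval; the abstract does not state which interval method was used. The snippet below merely illustrates a Wilson score interval for a proportion, with a hypothetical sample size.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion (z=1.96 gives ~95%)."""
    p = successes / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical: 27 class I discrepancies among 252 autopsies from one sampled year
low, high = wilson_interval(27, 252)
print(f"Rate: {27/252:.1%}, 95% CI: {low:.1%}-{high:.1%}")
```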