464 research outputs found
Escapement of the Cape rock lobster (Jasus lalandii) through the mesh and entrance of commercial traps
Metal-framed traps covered with polyethylene mesh used in the fishery for the South African Cape rock lobster (Jasus lalandii) incidentally capture large numbers of undersize (<75 mm CL) specimens. Air-exposure, handling, and release
procedures affect captured rock lobsters and reduce the productivity of the stock, which is heavily fished.
Optimally, traps should retain legal-size rock lobsters and allow sublegal animals to escape before traps are hauled. Escapement, based on lobster morphometric measurements, through meshes of 62 mm, 75 mm, and 100 mm was investigated theoretically, under controlled conditions in an aquarium, and during field trials. SELECT models were used to model escapement wherever appropriate. Size-selectivity curves based on the logistic model fitted the aquarium and field data better than asymmetrical Richards curves. The lobster length at 50% retention (L50) on the escapement curve for 100-mm mesh in the aquarium (75.5 mm CL) approximated the minimum legal size (75 mm CL); however, estimates of L50 increased to 77.4 mm in field trials where trap entrances were sealed, and to 82.2 mm where trap entrances were open.
Therefore, rock lobsters that cannot escape through the mesh of sealed field traps do so through the trap entrance of open traps. By contrast, the wider selection range and lower L25 of field trials compared to aquarium trials (SR = 8.2 mm vs. 2.6 mm; L25 = 73.4 mm vs. 74.1 mm) indicate that small lobsters that should be able to escape from 100-mm mesh traps do not always do so. Escapement from 62-mm mesh traps with open entrance funnels increased by 40−60% over sealed traps. The findings of this study, obtained with a known size distribution, are related to those of a recent indirect (comparative) study of the same species, and implications for trap surveys, commercial catch rates, and ghost fishing are discussed
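The L50 and SR figures above come from a symmetric logistic size-selectivity curve. As an illustrative reconstruction only (this is the standard logistic retention formula with SR defined as L75 − L25, not the authors' actual SELECT fitting code), the curve can be sketched as:

```python
import math

def logistic_retention(length_mm, l50, sr):
    """Probability that a lobster of the given carapace length is retained.

    Symmetric logistic curve: r(L50) = 0.5, and the selection range
    SR = L75 - L25, so r(L50 + SR/2) = 0.75 and r(L50 - SR/2) = 0.25.
    """
    return 1.0 / (1.0 + math.exp(-2.0 * math.log(3.0) * (length_mm - l50) / sr))

# Aquarium estimates for 100-mm mesh: L50 = 75.5 mm CL, SR = 2.6 mm
print(logistic_retention(75.5, 75.5, 2.6))  # 0.5 by construction
```

A steep curve (small SR, as in the aquarium) means escapement is sharply knife-edged around L50; the much wider field SR of 8.2 mm reflects the retention of some sublegal animals.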
Changes in Circulating Procalcitonin Versus C-Reactive Protein in Predicting Evolution of Infectious Disease in Febrile, Critically Ill Patients
Objective:Although absolute values for C-reactive protein (CRP) and procalcitonin (PCT) are well known to predict sepsis in the critically ill, it remains unclear how changes in CRP and PCT compare in predicting the evolution of infectious disease, its invasiveness and severity (e.g. development of septic shock, organ failure and non-survival) in response to treatment. The current study attempts to clarify these aspects.Methods:In 72 critically ill patients with new-onset fever, CRP and PCT were measured on Days 0, 1, 2 and 7 after inclusion, and clinical courses were documented over a week with follow-up to Day 28. Infection was microbiologically defined, while septic shock was defined as infection plus shock. The sequential organ failure assessment (SOFA) score was assessed.Results:From the peak at Day 0-2 to Day 7, CRP decreased when (bloodstream) infection and septic shock (Day 0-2) resolved and increased when complications such as a new (bloodstream) infection or septic shock (Day 3-7) supervened. PCT decreased when septic shock resolved and increased when a new bloodstream infection or septic shock supervened. Increased or unchanged SOFA scores were best predicted by PCT increases, and Day 7 PCT, in turn, was predictive of 28-day outcome.Conclusion:The data, obtained during ICU-acquired fever and infections, suggest that CRP courses may be favoured over PCT courses in judging the response to antibiotic treatment. PCT, however, may better indicate the risk of complications, such as bloodstream infection, septic shock, organ failure and mortality, and might therefore help in deciding on safe discontinuation of antibiotics. The analysis may thus help in interpreting the current literature and in designing future studies on guiding antibiotic therapy in the ICU
Neutrophil Gelatinase-Associated Lipocalin as a Diagnostic Marker for Acute Kidney Injury in Oliguric Critically Ill Patients: A Post-Hoc Analysis
__Background:__ Oliguria occurs frequently in critically ill patients, challenging clinicians to distinguish functional adaptation from serum-creatinine-defined acute kidney injury (AKIsCr). We investigated neutrophil gelatinase-associated lipocalin (NGAL)'s ability to differentiate between these 2 conditions.
__Methods:__ This is a post-hoc analysis of a prospective cohort of adult critically ill patients. Patients without oliguria within the first 6 h of admission were excluded. Plasma and urinary NGAL were measured at 4 h after admission. AKIsCr was defined using the AKI network criteria with pre-admission serum creatinine or lowest serum creatinine value during the admission as the baseline value. Hazard ratios for AKIsCr occurrence within 72 h were calculated using Cox regression and adjusted for risk factors such as sepsis, pre-admission serum creatinine, and urinary output. Positive predictive values (PPV) and negative predictive values (NPV) were calculated for the optimal cutoffs for NGAL.
__Results:__ Oliguria occurred in 176 patients, and 61 (35%) patients developed AKIsCr. NGAL was a predictor for AKIsCr in univariate and multivariate analysis. When NGAL was added to a multivariate model including sepsis, pre-admission serum creatinine and lowest hourly urine output, it outperformed the latter model (plasma p = 0.001; urinary p = 0.048). Cutoff values for AKIsCr were 280 ng/ml for plasma (PPV 80%; NPV 79%), and 250 ng/ml for urinary NGAL (PPV 58%; NPV 78%).
__Conclusions:__ NGAL can be used to distinguish oliguria due to functional adaptation from AKIsCr, directing resources to patients more likely to develop AKIsCr
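The PPV and NPV quoted for the NGAL cutoffs follow directly from the 2×2 confusion counts at each threshold. A minimal sketch, using hypothetical counts chosen for illustration rather than the study's actual data:

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive value from a 2x2 confusion table."""
    ppv = tp / (tp + fp)  # fraction of positives above cutoff who develop AKI
    npv = tn / (tn + fn)  # fraction of negatives below cutoff who do not
    return ppv, npv

# Hypothetical counts for illustration only
ppv, npv = predictive_values(tp=40, fp=10, tn=90, fn=24)
print(round(ppv, 2), round(npv, 2))  # 0.8 0.79
```

Note that, unlike sensitivity and specificity, PPV and NPV depend on the prevalence of AKIsCr in the cohort (35% here), so the cutoffs may not transfer to populations with a different case mix.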
Global end-diastolic volume increases to maintain fluid responsiveness in sepsis-induced systolic dysfunction
Background: Sepsis-induced cardiac dysfunction may limit fluid responsiveness, and the mechanism thereof remains unclear. Since cardiac function may affect the relative value of cardiac filling pressures, such as the recommended central venous pressure (CVP), versus filling volumes in guiding fluid loading, we studied these parameters as determinants of fluid responsiveness, according to cardiac function.Methods: A delta-CVP-guided, 90-min colloid fluid loading protocol was performed in 16 mechanically ventilated patients with sepsis-induced hypotension, and three consecutive 30-min fluid loading steps of about 450 mL per patient were evaluated. Global end-diastolic volume index (GEDVI), cardiac index (CI) and global ejection fraction (GEF) were assessed from transpulmonary dilution. Baseline values and changes in CVP and GEDVI were compared among responding (CI increase ≥10% and ≥15%) and non-responding fluid loading steps, in patients with low (<20%, n = 9) and near-normal (≥20%, n = 7) GEF at baseline.Results: A low GEF was in line with other indices of impaired cardiac (left ventricular) function, prior to and after fluid loading. Of 48 fluid loading steps, 9 (of 27) were responding when GEF was <20% and 6 (of 21) when GEF was ≥20%. Prior to fluid loading, CVP did not differ between responding and non-responding steps, and the levels attained were 2–3 mmHg higher in the latter, regardless of GEF (P = 0.004). Prior to fluid loading, GEDVI (and CI) was higher in responding (1007 ± 306 mL/m2) than in non-responding steps (870 ± 236 mL/m2) when GEF was low (P = 0.002), but did not differ when GEF was near-normal. Increases in GEDVI were associated with increases in CI and fluid responsiveness, regardless of GEF (P < 0.001).Conclusions: As estimated from transpulmonary dilution, about half of patients with sepsis-induced hypotension have systolic cardiac dysfunction.
During dysfunction, cardiac dilation with a relatively high baseline GEDVI maintains fluid responsiveness through further dilatation (an increase in GEDVI rather than in CVP), as in patients without dysfunction. Absence of fluid responsiveness during systolic cardiac dysfunction may be caused by diastolic dysfunction and/or right ventricular dysfunction
Targeting urine output and 30-day mortality in goal-directed therapy: A systematic review with meta-analysis and meta-regression.
Background: Oliguria is associated with a decreased kidney- and organ perfusion, leading to orga
Procalcitonin to guide taking blood cultures in the intensive care unit; a cluster-randomized controlled trial
Objectives: We aimed to study the safety and efficacy of procalcitonin in guiding the taking of blood cultures in critically ill patients with suspected infection. Methods: We performed a cluster-randomized, multi-centre, single-blinded, cross-over trial. Patients suspected of infection in whom taking blood for culture was indicated were included. The participating intensive care units were stratified and randomized by treatment regimen into a control group and a procalcitonin-guided group. All patients included in this trial followed the regimen that was allocated to the intensive care unit for that period. In both groups, blood was drawn at the same moment for a procalcitonin measurement and blood cultures. In the procalcitonin-guided group, blood cultures were sent to the department of medical microbiology only when the procalcitonin was >0.25 ng/mL. The main outcome was safety, expressed as mortality at day 28 and day 90. Results: The control group included 288 patients and the procalcitonin-guided group included 276 patients. The 28- and 90-day mortality rates in the procalcitonin-guided group were 29% (80/276) and 38% (105/276), respectively. The mortality rates in the control group were 32% (92/288) at day 28 and 40% (115/288) at day 90. The intention-to-treat analysis showed hazard ratios of 0.85 (95% CI 0.62-1.17) and 0.89 (95% CI 0.67-1.17) for 28-day and 90-day mortality, respectively. The results were deemed non-inferior because the upper limit of the 95% CI was below the margin of 1.20. Conclusion: Applying procalcitonin to guide blood cultures in critically ill patients with suspected infection seems to be safe, but the benefits may be limited. Trial registration: ClinicalTrials.gov identifier NCT01847079. Registered on 24 April 2013, retrospectively registered
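The non-inferiority conclusion above reduces to comparing the upper confidence bound of each hazard ratio against the pre-specified margin of 1.20. A minimal sketch (the function name is ours, not from the trial protocol):

```python
def is_non_inferior(hr_upper_ci, margin=1.20):
    """Non-inferiority is declared when the upper bound of the 95% CI
    for the hazard ratio stays below the pre-specified margin."""
    return hr_upper_ci < margin

# Trial results: HR 0.85 (95% CI 0.62-1.17) and HR 0.89 (95% CI 0.67-1.17)
print(is_non_inferior(1.17), is_non_inferior(1.17))  # True True
```

Note that this tests only whether the intervention is not meaningfully worse than control; it says nothing about superiority, which is why the authors conclude safety but limited benefit.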
Vitamin D deficiency as a risk factor for infection, sepsis and mortality in the critically ill: systematic review and meta-analysis
INTRODUCTION: In Europe, vitamin D deficiency is highly prevalent, varying between 40% and 60% in the healthy general adult population. The consequences of vitamin D deficiency for sepsis and outcome in critically ill patients remain controversial. We therefore systematically reviewed observational cohort studies on vitamin D deficiency in the intensive care unit.METHODS: Fourteen observational reports published from January 2000 to March 2014, retrieved from Pubmed and Embase, involving 9,715 critically ill patients with serum 25-hydroxyvitamin D3 (25(OH)-D) concentrations, were meta-analysed.RESULTS: Levels of 25(OH)-D less than 50 nmol/L were associated with increased rates of infection (risk ratio (RR) 1.49, 95% confidence interval (CI) 1.12 to 1.99, P = 0.007), sepsis (RR 1.46, 95% CI 1.27 to 1.68, P <0.001), 30-day mortality (RR 1.42, 95% CI 1.00 to 2.02, P = 0.05), and in-hospital mortality (RR 1.79, 95% CI 1.49 to 2.16, P <0.001). In a subgroup analysis of adjusted data including vitamin D deficiency as a risk factor for 30-day mortality, the pooled RR was 1.76 (95% CI 1.37 to 2.26, P <0.001).CONCLUSIONS: This meta-analysis suggests that vitamin D deficiency increases susceptibility to severe infections and mortality in the critically ill
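Pooled risk ratios of the kind reported above are typically computed by inverse-variance weighting on the log scale. A minimal fixed-effect sketch with made-up study inputs (the meta-analysis itself may well have used a random-effects model, which adds a between-study variance term to the weights):

```python
import math

def pool_log_rr(log_rrs, ses):
    """Fixed-effect inverse-variance pooling of log risk ratios.

    log_rrs: per-study log risk ratios; ses: their standard errors.
    Returns the pooled RR and its 95% confidence interval.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    return rr, ci

# Two hypothetical studies: RR 1.5 (SE 0.2) and RR 1.4 (SE 0.3) on the log scale
rr, (lo, hi) = pool_log_rr([math.log(1.5), math.log(1.4)], [0.2, 0.3])
```

Pooling on the log scale keeps the ratio symmetric (RR 2.0 and RR 0.5 are equidistant from the null), and the smaller-SE study dominates the weighted average, which is why large cohorts drive pooled estimates like those above.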
Funding models in palliative care: lessons from international experience
Background:Funding models influence provision and development of palliative care services. As palliative care integrates into mainstream health care provision, opportunities to develop funding mechanisms arise. However, little has been reported on what funding models exist or how we can learn from them.Aim:To assess national models and methods for financing and reimbursing palliative care.Design:Initial literature scoping yielded limited evidence on the subject, as national policy documents are difficult to identify, access and interpret. We undertook expert consultations to appraise national models of palliative care financing in England, Germany, Hungary, Republic of Ireland, New Zealand, The Netherlands, Norway, Poland, Spain, Sweden, Switzerland, the United States and Wales. These represent different levels of service development and a variety of funding mechanisms.Results:Funding mechanisms reflect country-specific context and local variations in care provision. Patterns emerging include the following: provider payment is rarely linked to population need and often perpetuates existing inequitable patterns in service provision; funding is frequently characterised as a mixed system of charitable, public and private payers; and the basis on which providers are paid for services rarely reflects individual care input or patient needs.Conclusion:Funding mechanisms need to be well understood and used with caution to ensure best practice and minimise perverse incentives. Before we can conduct cross-national comparisons of the costs and impact of palliative care, we need to understand the funding and policy context for palliative care in each country of interest
The adrenocorticotropic hormone-induced cortisol response in acute pancreatitis
Evidence is accumulating that severe acute pancreatitis can result in critical illness-related corticosteroid insufficiency following impaired adrenal secretion. The study by Peng and coworkers in Critical Care certainly contributes to that idea, even though the question whether relative adrenal insufficiency should prompt treatment with substitution doses of corticosteroids remains unresolved. The study is discussed in terms of the risk factors, circumstances and significance of impaired corticosteroid secretion by the adrenals in severe acute pancreatitis
