16 research outputs found

    Cholestasis impaired spatial and non-spatial novelty detection in mice

    Bile duct ligation (BDL) has been shown to induce cholestasis-related liver function impairment as well as consequent cognitive dysfunction (i.e. impaired learning and memory formation). This study investigates the effects of cholestasis (14, 21, and 28 days post bile duct ligation) on spatial and non-spatial novelty detection, using a non-associative task. Male mice weighing 30-35 g were used. Cholestasis was induced by ligating the main bile duct with two ligatures and transecting the duct at the midpoint between them. An open field paradigm was employed to assess retention of spatial and non-spatial memories. Our data showed that cholestasis (28 days after bile duct ligation) decreased and increased the exploration time of displaced and non-displaced objects, respectively, indicating a spatial memory deficit. Moreover, this intervention (28 days after bile duct ligation) decreased the exploration time of substituted objects while leaving that of non-substituted objects unchanged, suggesting a non-spatial memory deficit. At 14 and 21 days post bile duct ligation, neither spatial nor non-spatial memory was altered. Our results suggest that cholestasis (28 but not 14 or 21 days post bile duct ligation) impairs spatial and non-spatial memory in mice.

    A Scheduling Algorithm to Maximize Storm Throughput in Heterogeneous Cluster

    In the most popular distributed stream processing frameworks (DSPFs), programs are modeled as a directed acyclic graph. This model allows a DSPF to exploit the parallelism of distributed clusters. However, choosing the proper number of vertices for each operator and finding an appropriate mapping between these vertices and processing resources have a decisive effect on overall throughput and resource utilization, while the simplicity of current DSPFs' schedulers causes these frameworks to perform poorly on large-scale clusters. In this paper, we present the design and implementation of a heterogeneity-aware scheduling algorithm that finds the proper number of vertices for an application graph and maps them to the most suitable cluster nodes. We gradually scale the application graph up over a given cluster by increasing the topology input rate and adding new instances of bottlenecked vertices. Our experimental results on the Storm Micro-Benchmark show that 1) the prediction model estimates CPU utilization with 92% accuracy; 2) compared to the default Storm scheduler, our scheduler provides a 7% to 44% throughput improvement; and 3) the proposed method finds a solution within 4% (worst case) of the optimal scheduler, which obtains the best scheduling scenario through an exhaustive search of the problem design space.
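
    The scheduling approach described above is essentially an iterative scale-up loop: raise the topology input rate, detect bottlenecked operators with a CPU-utilization prediction model, add instances of those operators, and re-map instances to heterogeneous nodes. The sketch below illustrates that loop under stated assumptions; the names and values used (Vertex, Node, estimate_cpu_utilization, the 80% bottleneck threshold) are illustrative placeholders, not the paper's actual implementation or Storm's API.

```python
# Illustrative sketch of a gradual scale-up scheduling loop; all names and the
# utilization model are placeholder assumptions, not the paper's implementation.
from dataclasses import dataclass

@dataclass
class Vertex:
    name: str
    instances: int = 1            # current parallelism of this operator

@dataclass
class Node:
    name: str
    capacity: float               # relative CPU capacity (heterogeneous cluster)
    load: float = 0.0

def estimate_cpu_utilization(vertex: Vertex, input_rate: float) -> float:
    """Placeholder for the paper's CPU-utilization prediction model."""
    return 0.1 * input_rate / vertex.instances

def schedule(vertices: list[Vertex], nodes: list[Node],
             max_rate: float, step: float) -> dict[str, list[str]]:
    placement: dict[str, list[str]] = {}
    rate = step
    while rate <= max_rate:
        # Add an instance for every vertex predicted to be a bottleneck.
        for v in vertices:
            if estimate_cpu_utilization(v, rate) > 0.8:
                v.instances += 1
        # Re-map all instances, always picking the relatively least-loaded node.
        for n in nodes:
            n.load = 0.0
        placement = {}
        for v in vertices:
            per_instance_load = estimate_cpu_utilization(v, rate)
            for _ in range(v.instances):
                target = min(nodes, key=lambda n: n.load / n.capacity)
                target.load += per_instance_load
                placement.setdefault(v.name, []).append(target.name)
        rate += step
    return placement
```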

    Assessment of neuropsychiatric indicators in children and adolescents with primary brain tumors and other brain space occupying lesions before and after surgery

    Objective: Cognitive abilities may be impaired due to brain lesions in children and adolescents. This study aimed to investigate neuropsychiatric indicators in children and adolescents with primary brain tumors and other brain space occupying lesions (SOL) before and after surgery. Methods and materials: This pre- and post-intervention study was conducted on 81 patients younger than 18 years with brain space occupying lesions. Patients with metastatic brain tumors were excluded. The study was performed between 20 December 2016 and 20 December 2017 on patients hospitalized in the neurosurgery ward of Imam Reza University Hospital, Tabriz, Iran. Before and after surgery, the forward and backward Digit Span task (to assess working memory), the Stroop task and Trail Making Task A & B (to assess attention), and the Rey Osterrieth Complex Figure Test (to assess visuospatial memory) were administered. Test scores were then compared with normative values as well as with post-surgery scores. Results: The most prevalent type of space occupying brain lesion was medulloblastoma, and the most common site of involvement was the posterior fossa. Scores on all tests improved significantly after surgery compared with before surgery (P<0.05). For the forward and backward Digit Span task, there was no significant difference between patients' pre-surgery scores and the standard values (P>0.05). Among the stages of the Rey Osterrieth Complex Figure Test, the immediate recall score was significantly low (P<0.05). Of the Trail Making Task A & B and the Stroop task, only the Trail Making Task A & B scores were significantly increased before surgery (P<0.05). Trail Making Task A scores were significantly higher in patients with medulloblastoma and, anatomically, in left temporal tumors, indicating greater impairment of attention (P<0.05). In addition, in cerebellar tumors, the immediate recall score of the Rey Osterrieth Complex Figure Test was significantly lower (P<0.05). Conclusions: Visuospatial memory and attention were significantly impaired in pre-surgery assessments compared with the general population (P<0.05). Working memory, visuospatial memory, and attention improved compared with pre-surgery. Deficits in the attention domain were greatest in medulloblastoma.

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and an increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international cohort study that enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade at least III). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases the risk of SSI.
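
    The analysis pipeline described above (propensity score matching, then multivariable logistic regression for binary outcomes and Cox proportional hazards regression for time-to-event outcomes) can be sketched as follows. This is an illustrative sketch only, not the COMPASS study code: the DataFrame column names (drain, ssi, days_to_discharge, discharged, and the covariate list) are assumptions, and the matching shown is simple 1:1 nearest-neighbour matching on the propensity score.

```python
# Illustrative propensity-matched analysis (hypothetical column names);
# not the COMPASS study code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

def matched_analysis(df: pd.DataFrame, covariates: list[str]) -> None:
    # 1) Propensity score: probability of drain placement given the covariates.
    X = sm.add_constant(df[covariates])
    ps_model = sm.Logit(df["drain"], X).fit(disp=0)
    df = df.assign(ps=ps_model.predict(X))

    # 2) 1:1 nearest-neighbour matching on the propensity score.
    treated = df[df["drain"] == 1]
    control = df[df["drain"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # 3) Logistic regression for a binary outcome (e.g. SSI) in the matched cohort.
    Xm = sm.add_constant(matched[["drain"]])
    ssi_model = sm.Logit(matched["ssi"], Xm).fit(disp=0)
    print("SSI odds ratio:", np.exp(ssi_model.params["drain"]))

    # 4) Cox model for time to discharge (HR < 1 indicates delayed discharge).
    cph = CoxPHFitter()
    cph.fit(matched[["days_to_discharge", "discharged", "drain"]],
            duration_col="days_to_discharge", event_col="discharged")
    print("Discharge hazard ratio:", cph.summary.loc["drain", "exp(coef)"])
```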

    Vermicompost quality and earthworm reproduction in different organic waste substrates

    Purpose: The present study aims to evaluate the changes in parameters affecting the quality of vermicompost produced by the earthworm Eisenia fetida on different organic waste substrates, using multivariate analysis, analysis of variance, factor analysis, and principal component analysis (PCA). Method: A completely randomized experiment was conducted with a 2 × 8 factorial arrangement of experimental and control treatments, in triplicate per treatment. We investigated the growth and reproduction of earthworms and the characteristics of the vermicompost produced on different organic wastes and residues, represented by carrot pulp (C), potato peel (P), vegetables (V), and sawdust (S) blended with cattle manure (as the main substrate) at two levels, 100 and 150 grams of each waste per two kilograms of manure. Chemical parameters including pH, electrical conductivity (EC), carbon/nitrogen ratio (C/N), phosphorus (P), potassium (K), calcium (Ca), magnesium (Mg), iron (Fe), and copper (Cu) were measured in the vermicompost produced. Results: Cluster analysis and PCA grouped the nine substrate combinations into three categories with similar qualitative characteristics. The first two principal components of the PCA revealed that the major parameters responsible for the qualitative changes in the produced vermicompost were iron, copper, calcium, magnesium, potassium, phosphorus, and nitrogen. Conclusion: The results suggest that the CPVS treatment and the S (sawdust) treatment provided the optimal conditions for the growth and reproduction of earthworms and the production of high-quality vermicompost.
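
    The multivariate workflow described above (standardising the measured chemical parameters, extracting principal components, and clustering substrate combinations into quality groups) can be sketched as follows. The file name and column labels are assumptions for illustration; this is not the study's analysis script.

```python
# Illustrative sketch: PCA + hierarchical clustering on vermicompost chemistry.
# The CSV file and column names are assumptions, not the study's data.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

params = ["pH", "EC", "C_N", "P", "K", "Ca", "Mg", "Fe", "Cu"]
df = pd.read_csv("vermicompost_samples.csv")   # one row per treatment/replicate

# Standardise so parameters on very different scales contribute equally.
X = StandardScaler().fit_transform(df[params])

# PCA: the first two components summarise most of the qualitative variation.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
loadings = pd.DataFrame(pca.components_.T, index=params, columns=["PC1", "PC2"])
print("Explained variance ratio:", pca.explained_variance_ratio_)
print(loadings.sort_values("PC1", key=abs, ascending=False))

# Hierarchical clustering into three groups of substrates with similar quality.
df["cluster"] = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
```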

    Defining the at risk patients for contrast induced nephropathy after coronary angiography; 24-h urine creatinine versus Cockcroft-Gault equation or serum creatinine level

    Background: Definitions of chronic kidney disease (CKD) in many catheterization laboratories have relied on serum creatinine (Scr) rather than the glomerular filtration rate (GFR). Given that CKD is the primary predisposing factor for contrast induced nephropathy (CIN), we compared the sensitivity of GFR calculated from 24-h urine creatinine with the Cockcroft-Gault (CG) equation and the Scr level for defining patients at risk of CIN who underwent coronary angiography (CAG). Materials and Methods: Two hundred fifty-four subjects who were candidates for CAG and had normal creatinine levels were enrolled. Before CAG, GFR was calculated from a 24-h urine collection, the CG equation, and a single Scr sample according to a previously described protocol. The contrast volume used for each case was <100 ml. CIN was defined as a 0.5 mg/dL or 25% rise in Scr. Results: CIN occurred in 10.6% of patients. Baseline GFR, the volume of contrast agent, and diabetes were independent risk factors for CIN. GFR was less than 60 ml/min/1.73 m2 in 28% and 23.2% of patients according to 24-h urine creatinine and the CG equation, respectively. For CIN prediction, GFR estimated from 24-h urine creatinine had 85.2% sensitivity and 59.3% specificity, and GFR estimated by the CG equation had 78.9% sensitivity and 81.1% specificity. Conclusion: Although GFR estimated by the CG equation is less sensitive than GFR calculated from 24-h urine creatinine for predicting CIN, it is better than Scr alone; given the cost-effectiveness and convenience of this method, we suggest at least using the CG equation for GFR calculation before contrast administration, especially in diabetic patients and/or those older than 60 years.
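
    For reference, the Cockcroft-Gault (CG) equation mentioned above estimates creatinine clearance (commonly used as a GFR surrogate) from age, weight, sex, and serum creatinine. A minimal sketch with an illustrative example follows; the function interface and the example values are assumptions, while the 60 ml/min threshold follows the abstract.

```python
# Cockcroft-Gault creatinine clearance (ml/min). The formula is the standard one;
# the function interface and the worked example are illustrative assumptions.
def cockcroft_gault(age_years: float, weight_kg: float,
                    scr_mg_dl: float, female: bool) -> float:
    crcl = ((140 - age_years) * weight_kg) / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

# Example: a 65-year-old, 70 kg woman with Scr 1.0 mg/dL.
crcl = cockcroft_gault(65, 70, 1.0, female=True)
at_risk = crcl < 60   # threshold used in the abstract to define reduced GFR
print(f"CrCl = {crcl:.1f} ml/min, at risk: {at_risk}")
```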

    Prevalence of latent tuberculosis infection among tuberculosis laboratory workers in Iran

    OBJECTIVES: The risk of transmission of Mycobacterium tuberculosis from patients to health care workers (HCWs) is a neglected problem in many countries, including Iran. The aim of this study was to estimate the prevalence of latent tuberculosis (TB) infection (LTBI) among TB laboratory staff in Iran, and to elucidate the risk factors associated with LTBI. METHODS: All TB laboratory staff (689 individuals) employed in the TB laboratories of 50 Iranian universities of medical sciences and a random sample of 317 low-risk HCWs were included in this cross-sectional study. Participants with tuberculin skin test indurations of 10 mm or more were considered to have an LTBI. RESULTS: The prevalence of LTBI among TB laboratory staff and low-risk HCWs was 24.83% (95% confidence interval [CI], 21.31 to 27.74%) and 14.82% (95% CI, 11.31 to 19.20%), respectively. No active TB cases were found in either group. After adjusting for potential confounders, TB laboratory staff were more likely to have an LTBI than low-risk HCWs (prevalence odds ratio, 2.06; 95% CI, 1.35 to 3.17). CONCLUSIONS: This study showed that LTBI is an occupational health problem among TB laboratory staff in Iran and reinforces the need to design and implement simple, effective, and affordable TB infection control programs in TB laboratories in Iran.

    Comparison of the tuberculin skin test and the QuantiFERON-TB Gold test in detecting latent tuberculosis in health care workers in Iran

    OBJECTIVES: The tuberculin skin test (TST) and the QuantiFERON-TB Gold test (QFT) are used to identify latent tuberculosis infections (LTBIs). The aim of this study was to determine the agreement between these two tests among health care workers in Iran. METHODS: This cross-sectional study included 177 tuberculosis (TB) laboratory staff and 67 non-TB staff. TST indurations of 10 mm or more were considered positive. Student's t-test and the chi-square test were used to compare the mean scores and proportions of variables between the TB laboratory staff and the non-TB laboratory staff. The kappa statistic was used to evaluate agreement between the tests, and logistic regression was used to assess the risk factors associated with a positive result on each test. RESULTS: The prevalence of LTBIs according to the QFT and the TST was 17% (95% confidence interval [CI], 12% to 21%) and 16% (95% CI, 11% to 21%), respectively. The agreement between the QFT and the TST was 77.46%, with a kappa of 0.19 (95% CI, 0.04 to 0.34). CONCLUSIONS: Although the prevalence of LTBIs based on the QFT and the TST was not significantly different, the kappa statistic for agreement between the two tests in detecting LTBIs was low.
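
    The agreement measures reported above (percentage agreement and Cohen's kappa from a 2 × 2 cross-classification of QFT and TST results) can be computed as sketched below. The cell counts in the example are placeholders chosen only to roughly reproduce the reported marginals; they are not the study's actual table.

```python
# Percentage agreement and Cohen's kappa for two binary tests (QFT vs TST).
# The example counts are placeholders, not the study's actual 2x2 table.
def agreement_and_kappa(both_pos: int, qft_only: int,
                        tst_only: int, both_neg: int) -> tuple[float, float]:
    n = both_pos + qft_only + tst_only + both_neg
    observed = (both_pos + both_neg) / n
    # Chance agreement from the marginal positive/negative rates of each test.
    qft_pos = (both_pos + qft_only) / n
    tst_pos = (both_pos + tst_only) / n
    expected = qft_pos * tst_pos + (1 - qft_pos) * (1 - tst_pos)
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

obs, kappa = agreement_and_kappa(both_pos=13, qft_only=28, tst_only=26, both_neg=177)
print(f"Observed agreement: {obs:.1%}, kappa: {kappa:.2f}")
```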