
    Anti-inflammatory and Analgesic Properties of Salvigenin, a Flavonoid Extracted from Salvia officinalis

    Background and aims: Inflammation is one of the body's defense mechanisms, and the unpleasant sensation of pain is caused by tissue damage. Inflammation mostly occurs through the release of inflammatory mediators. Salvia officinalis is one of the most valuable medicinal plants of the mint order, and salvigenin is one of the active flavonoids present in it. The aim of this study was to evaluate the anti-inflammatory and analgesic effects of salvigenin, a flavonoid extracted from Salvia officinalis. Methods: In this laboratory experimental study, the plant was extracted and column chromatography was used to purify the prepared extracts. One hundred male albino mice and 48 male Wistar rats were selected. In the hot plate and writhing tests, animals were divided randomly into 5 groups: group 1 received 10 mg/kg normal saline; groups 2, 3 and 4 received salvigenin at 25, 50 and 100 mg/kg intraperitoneally, respectively; and group 5 received 10 mg/kg morphine in the hot plate test and 10 mg/kg indomethacin in the writhing test. In the inflammation test, animals were divided into 6 groups: group 1 was assigned as the control group and received 0.05 ml of carrageenin; groups 2, 3 and 4 received salvigenin at doses of 25, 50 and 100 mg/kg; and group 5 received 10 mg/kg indomethacin. The volume changes of all groups were then measured. Data were analyzed using ANOVA and the Tukey test, with P < 0.05 considered significant. Results: In the writhing test, salvigenin reduced the number of abdominal contractions at doses of 50 and 100 mg/kg. Increasing the dose of salvigenin reduced abdominal cramps and increased pain inhibition, and the percentage of this inhibition was statistically significant (P < 0.001). In the hot plate test, measurements at 30, 45 and 60 minutes after injection of salvigenin and morphine likewise showed significant differences compared to the control group (P < 0.001), and salvigenin increased the maximum percentage of analgesia compared to the control group (P < 0.001). Salvigenin also reduced inflammation: in the group that received salvigenin at 100 mg/kg, inflammation was significantly lower than in the control group (P < 0.05). Discussion: Our findings showed that salvigenin has a dose-dependent analgesic effect, so it may be useful in controlling inflammation and acute and chronic pain.

    The anti-infarct, antistunning and antiarrhythmic effects of oleuropein in isolated rat heart

    Previous studies have reported that oleuropein, the major constituent of olive leaves, has cardioprotective effects. However, there is no report relating oleuropein to ischemia-reperfusion injuries (cardiac dysfunction and myocardial infarction) or to preconditioning in rat hearts. Fifty-six male Wistar rats were divided into 7 groups (n=8): group 1 as the control group and groups 2 to 7 as the treatment groups, which received a single dose of oleuropein (100 mg/kg, i.p.) 1, 3, 6, 12, 24 and 48 hours before excision of the heart, respectively. After these times, the hearts were excised and subjected to 30 min of regional ischemia and 120 min of reperfusion on a Langendorff apparatus. The electrocardiogram and intraventricular pressures were monitored and recorded throughout the procedure. Finally, infarct size was measured by triphenyltetrazolium chloride staining. Compared to the control group, oleuropein significantly reduced infarct size and reperfusion-induced cardiac dysfunction in groups 2 and 3, and markedly attenuated both ischemic and reperfusion arrhythmias in those groups. There was no significant difference between the other groups (4 to 7) and the control group, and heart rate did not differ significantly among the groups. These results indicate that pretreatment of rats with a single intraperitoneal dose of oleuropein can protect the heart against ischemia-reperfusion injury for at least 3 hours. However, it has no preconditioning effect, since oleuropein had no cardioprotective effect 24 hours later.

    Microtensile Bond Strength of Three Restorative Core Materials with IPS E.max Press Ceramic by Two Resin Cements

    Objectives: The aim of this study was to compare the microtensile bond strengths (µTBS) of three core materials with one lithium disilicate reinforced ceramic using two resin cements. Methods: Three core materials (Nulite F® (Biodental Technologies), Filtek Z250® (3M ESPE), Prettau Anterior® (Zirkonzahn, Germany)) were prepared as blocks (10×10×4 mm³) according to the manufacturers' instructions. Lithium disilicate ceramic blocks were also constructed and bonded to the core specimens with two dual-curing luting resin cements (Duo-Link® (Bisco, Schaumburg, IL), Bifix QM® (VOCO, Cuxhaven, Germany)). Micro-bar specimens were prepared and loaded in tension to determine the µTBS. Failure modes were classified by scanning electron microscopy (SEM). Data were analysed using two-way ANOVA and the Tukey HSD test. Results: The µTBS varied significantly depending on the core materials and resin cements used (P < 0.05). The µTBS of Bifix QM was significantly higher than that of Duo-Link with all core materials, and the µTBS of the zirconia core was significantly higher than that of both composite cores with both resin cements. There was no statistically significant difference between Nulite F and Filtek Z250 (P > 0.05). The highest bond strength was obtained between the zirconia core and Bifix QM (45.3 ± 6.7 MPa). Conclusion: In vitro, glass ceramic blocks bonded to the zirconia core material showed higher µTBS values than those bonded to the resin-based core materials, regardless of the resin cement used.

    Probabilistic Proximity-aware Resource Location in Peer-to-Peer Networks Using Resource Replication

    Nowadays, content distribution has received remarkable attention in distributed computing research, and its applications typically allow personal computers, called peers, to cooperate with each other in order to accomplish distributed operations such as query search and acquiring digital content. In a very large network, it is impossible to resolve a query by visiting all peers. Some works try to find the location of resources probabilistically (i.e. non-deterministically), but they all use inefficient protocols for finding the probable location of the peers that manage the resources. This paper presents a more efficient protocol that is proximity-aware in the sense that it caches and replicates popular queries in proportion to distance latency. The protocol dictates that the farther the resources are located from the origin of a query, the higher the probability of their replication in the caches of intermediate peers, as sketched below. We validated the proposed distributed caching scheme by running it on a simulated peer-to-peer network using the well-known Gnutella system parameters. The simulation results show that proximity-aware distributed caching can improve the efficiency of peer-to-peer resource location services in terms of the probability of finding objects, the overall miss rate of the system, the fraction of peers involved in the search process, and the system load.
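    A minimal sketch of the distance-proportional replication rule described above, assuming hop count as the proxy for distance latency and a tunable max_hops normalizer (all names and parameters are illustrative, not taken from the paper):

```python
import random

def replication_probability(path_length: int, max_hops: int) -> float:
    """The farther the resource is from the query origin, the higher the caching probability."""
    return min(1.0, path_length / max_hops)

def cache_along_return_path(reverse_path, resource, path_length, max_hops, caches):
    """Walk the response back toward the query origin, replicating en route."""
    p = replication_probability(path_length, max_hops)
    for peer in reverse_path:
        if random.random() < p:
            caches.setdefault(peer, set()).add(resource)

# Example: a result found 5 hops away returns through three intermediate peers.
caches = {}
cache_along_return_path(["peer_C", "peer_B", "peer_A"], "file_42",
                        path_length=5, max_hops=7, caches=caches)
print(caches)
```

    In this form, results fetched from far away are replicated at almost every intermediate cache, while nearby results are rarely replicated, matching the protocol's stated bias.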

    Machine learning methods for service placement: a systematic review

    With the growth of real-time and latency-sensitive applications in the Internet of Everything (IoE), service placement cannot rely on cloud computing alone. In response to this need, several computing paradigms, such as Mobile Edge Computing (MEC), Ultra-dense Edge Computing (UDEC), and Fog Computing (FC), have emerged. These paradigms aim to bring computing resources closer to the end user, reducing delay and wasted backhaul bandwidth. One of the major challenges of these new paradigms is the limitation of edge resources and the dependencies between different service parts. Some solutions, such as the microservice architecture, allow different parts of an application to be processed simultaneously. However, due to the ever-increasing number of devices and incoming tasks, the service placement problem can no longer be solved by rule-based deterministic solutions; in such a dynamic and complex environment, many factors can influence the solution. Optimization and Machine Learning (ML) are the two tools most often used for service placement. Both typically rely on a cost function: in optimization, the cost function directly encodes the placement objective, while in ML it usually measures the difference between predicted and actual values. In simpler terms, instead of relying on explicit rules, ML aims to minimize the gap between prediction and reality based on historical data. Due to the NP-hard nature of the service placement problem, classical optimization methods are not sufficient; instead, metaheuristic and heuristic methods are widely used. In addition, the ever-changing big data in IoE environments requires specialized ML methods. In this systematic review, we present a taxonomy of ML methods for the service placement problem. Our findings show that 96% of the reviewed approaches use a distributed microservice architecture, 51% of the studies are based on on-demand resource estimation methods, and 81% are multi-objective. This article also outlines open questions and future research trends. Our literature review shows that one of the most important trends in ML for this problem is reinforcement learning, with a 56% share of the research.
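    As a minimal illustration of the cost-function view described above (node names, capacities, weights, and the latency model are hypothetical, not drawn from any surveyed paper), the sketch below scores candidate nodes for a service and greedily picks the cheapest feasible one; an ML-based placer would instead learn to predict such costs from historical placements:

```python
# Minimal sketch of cost-driven service placement on edge nodes.
# Nodes, capacities, and weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    free_cpu: float      # available CPU cores
    latency_ms: float    # round-trip latency to the requesting user

def placement_cost(node: Node, demand_cpu: float,
                   w_latency: float = 1.0, w_load: float = 10.0) -> float:
    """Weighted cost: prefer nearby nodes with spare capacity."""
    if node.free_cpu < demand_cpu:
        return float("inf")                  # infeasible placement
    load_after = demand_cpu / node.free_cpu  # how tight the node becomes
    return w_latency * node.latency_ms + w_load * load_after

def place(service_demand: float, nodes: list) -> Node:
    """Rule-based baseline: greedily minimize the cost function."""
    return min(nodes, key=lambda n: placement_cost(n, service_demand))

nodes = [Node("edge-1", 2.0, 5.0), Node("edge-2", 8.0, 12.0), Node("cloud", 64.0, 40.0)]
print(place(1.5, nodes).name)  # picks the low-latency edge node here
```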

    Sesame extraction gel as an agent for prevention of dental caries: An in-vitro study

    BACKGROUND AND AIM: Sesame has a high calcium content. Given the lack of adequate data about its remineralizing potential, we conducted this study to evaluate the surface hardness of enamel exposed to a sesame extraction gel in comparison to artificial saliva and fluoride. METHODS: After mounting and polishing twenty-four caries-free human premolars, the baseline microhardness was recorded. Subsequently, decalcification was accomplished by immersion in cola, after which the surface hardness was recorded again. Ultimately, the samples were divided into three groups, which were treated with either the sesame gel (SG), artificial saliva (AS) or the fluoride gel (FG), and the final microhardness was assessed again. Repeated measures analysis of variance (ANOVA) was employed for comparison of baseline (B), decalcified (D) and remineralized (R) hardness, while one-way ANOVA followed by the least significant difference test was used for comparison of the different remineralizing agents. RESULTS: There was a significant difference among the teeth at baseline, after decalcification and after treatment with the experimental solutions (P < 0.001 and P = 0.002 for pairwise comparisons of B/D and D/R, respectively). Moreover, after the remineralizing treatment, there was no significant difference between the solutions (P = 0.350, P = 0.150 and P = 0.610 for pairwise comparisons of SG-FG, SG-AS and FG-AS, respectively). However, the mean microhardness value increased in that order. CONCLUSION: Although treating the decalcified enamel with sesame extraction enhanced its microhardness, there was no significant difference between sesame, fluoride and artificial saliva when they were applied for just 15 min. KEYWORDS: Sesame; Dental Enamel; Hardness

    ResBCDU-Net: A Deep Learning Framework for Lung CT Image Segmentation

    Lung CT image segmentation is a key process in many applications such as lung cancer detection. It is considered a challenging problem due to the similar image densities of the pulmonary structures and the variety of scanner types and scanning protocols. Most current semi-automatic segmentation methods rely on human input and may therefore suffer from a lack of accuracy; another shortcoming of these methods is their high false-positive rate. In recent years, several approaches based on deep learning frameworks have been applied effectively in medical image segmentation, and among existing deep neural networks, the U-Net has been particularly successful in this field. In this paper, we propose a deep neural network architecture that performs automatic lung CT image segmentation. In the proposed method, several extensive preprocessing techniques are applied to the raw CT images. Then, ground truths corresponding to these images are extracted via morphological operations and manual corrections. Finally, all the prepared images with their corresponding ground truths are fed into a modified U-Net in which the encoder is replaced with a pre-trained ResNet-34 network (referred to as ResBCDU-Net). In this architecture, we employ BConvLSTM (Bidirectional Convolutional Long Short-Term Memory) as an advanced integrator module instead of simple traditional concatenation, in order to merge the feature maps extracted from the corresponding contracting path into the previous expansion of the up-convolutional layer. Finally, a densely connected convolutional layer is utilized for the contracting path. The results of our extensive experiments on lung CT images (LIDC-IDRI database) confirm the effectiveness of the proposed method, which achieves a Dice coefficient index of 97.31%.
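    For reference, the Dice coefficient reported above is a standard overlap measure between a predicted segmentation mask and its ground truth. A minimal NumPy sketch (toy masks and illustrative names, not the paper's code):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

# Toy 4×4 masks: a score of 97.31% as in the paper means near-perfect overlap.
pred = np.array([[0,1,1,0],[1,1,1,1],[1,1,1,1],[0,1,1,0]])
truth = np.array([[0,1,1,0],[1,1,1,1],[1,1,1,0],[0,1,1,0]])
print(round(dice_coefficient(pred, truth), 4))  # ~0.9565 for these masks
```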

    Severity of post-Roux-en-Y gastric bypass dumping syndrome and weight loss outcomes: is there any correlation?

    Purpose: The present research was conducted to evaluate the effect of the severity of dumping syndrome (DS) on weight loss outcomes after Roux-en-Y gastric bypass (RYGB) in patients with class III obesity. Methods: This retrospective cohort study used the Dumping Symptom Rating Scale (DSRS) to evaluate the severity of DS and its correlation with weight loss outcomes in 207 patients 1 year after their RYGB. The patients were assigned to group A, with mild-to-moderate DS, or group B, with severe DS. Results: The mean age of the patients was 42.18 ± 10.46 years, and their mean preoperative BMI was 42.74 ± 5.59 kg/m². The total weight loss percentage (%TWL) in group B was higher than that in group A, but the difference was not significant, and the two groups did not otherwise differ significantly. Conclusion: The present findings suggest no significant relationship between the presence and severity of DS after RYGB and adequate postoperative weight loss.

    Opium use and risk of lung cancer: a multicenter case-control study in Iran

    Opium use was recently classified as a human carcinogen for lung cancer by the International Agency for Research on Cancer. We conducted a large, multicenter case-control study evaluating the association between opium use and the risk of lung cancer. We recruited 627 cases and 3477 controls from May 2017 to July 2020 and used unconditional logistic regression to estimate odds ratios (OR) and 95% confidence intervals (CI) for the association between opium use and lung cancer risk. The ORs were adjusted for place of residence, age, gender, socioeconomic status, and cigarette and water pipe smoking. We found a 3.6-fold risk of lung cancer for regular opium users compared to never users (95% CI: 2.9, 4.6), and a strong dose-response association between cumulative opium use and lung cancer risk. The OR for regular opium use was higher for small cell carcinoma than for other histologies (8.3; 95% CI: 4.8, 14.4). The OR of developing lung cancer among opium users was higher in females (7.4; 95% CI: 3.8, 14.5) than in males (3.3; 95% CI: 2.6, 4.2). The OR for users of both opium and tobacco was 13.4 (95% CI: 10.2, 17.7) compared to users of neither. The risk of developing lung cancer is higher in regular opium users, and these results strengthen the conclusions on the carcinogenicity of opium; the association is stronger for small cell carcinoma than for other histologies.
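    For readers unfamiliar with the method, adjusted odds ratios such as those above are obtained by exponentiating logistic regression coefficients. A minimal sketch on synthetic data (entirely made up for illustration; not the study's data, variables, or full adjustment set):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic case-control data: binary outcome, binary exposure, one confounder.
rng = np.random.default_rng(0)
n = 2000
opium = rng.binomial(1, 0.2, n)
smoking = rng.binomial(1, 0.3, n)
logit_p = -2.0 + 1.3 * opium + 0.8 * smoking
cancer = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([opium, smoking]))
fit = sm.Logit(cancer, X).fit(disp=0)

# Adjusted OR and 95% CI for opium = exp(coefficient) and exp(CI bounds).
or_opium = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR = {or_opium:.2f} (95% CI: {ci_low:.2f}, {ci_high:.2f})")
```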

    Towards strategic bandwidth sharing in overlay multicast networks based on mechanism design theory

    The selfish behavior of users in overlay multicast networks can degrade performance. In this paper, we target mechanism design for overlay networks based on monopoly auction economies. In our proposed auction mechanism, the bandwidth of the service offered by the origin servers is treated as the commodity. The sellers are either the origin servers or the peers who forward the content to their downstream peers, and the corresponding downstream peers of each seller play the role of buyers, referred to as bidders. Each bidder submits a sealed bid to its corresponding seller; the highest bidder wins and pays its bid for the service. Through theoretical and experimental analysis, we prove that the proposed auction mechanism achieves performance improvements in the overlay network.
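    As a toy illustration of the sealed-bid exchange described above (peer names and bid values are invented, and the paper's actual allocation and payment rules may differ), a seller awarding its forwarding bandwidth to the highest sealed bid, first-price style, could look like:

```python
# Toy first-price sealed-bid auction for a seller's forwarding bandwidth.
# Bidders are the seller's downstream peers; values are illustrative.

def run_sealed_bid_auction(bids: dict) -> tuple:
    """Award the bandwidth to the highest bidder, who pays its own bid."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

downstream_bids = {"peer_A": 3.5, "peer_B": 5.0, "peer_C": 4.2}
winner, payment = run_sealed_bid_auction(downstream_bids)
print(f"{winner} wins the upstream slot and pays {payment}")
```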