    Whole plant cannabis extracts in the treatment of spasticity in multiple sclerosis: a systematic review

    Background: Cannabis therapy has been considered an effective treatment for spasticity, although clinical reports of symptom reduction in multiple sclerosis (MS) describe mixed outcomes. Recently introduced therapies of combined Δ⁹-tetrahydrocannabinol (THC) and cannabidiol (CBD) extracts have potential for symptom relief with the possibility of reducing intoxication and other side effects. Although several past reviews have suggested that cannabinoid therapy provides a therapeutic benefit for symptoms of MS, none have presented a methodical investigation of newer cannabinoid treatments in MS-related spasticity. The purpose of the present review was to systematically evaluate the effectiveness of combined THC and CBD extracts on MS-related spasticity in order to increase understanding of the treatment's potential effectiveness, safety and limitations. Methods: We searched the MEDLINE/PubMed, Ovid, and CENTRAL electronic databases for relevant randomized controlled trials. Studies were included only if a combination of THC and CBD extracts was used, and if pre- and post-treatment assessments of spasticity were reported. Results: Six studies were systematically reviewed for treatment dosage and duration, objective and subjective measures of spasticity, and reports of adverse events. Although there was variation in the outcome measures reported in these studies, a trend of reduced spasticity in treated patients was noted. Adverse events were reported in each study; however, combined THC and CBD extracts were generally considered to be well-tolerated. Conclusion: We found evidence that combined THC and CBD extracts may provide therapeutic benefit for MS spasticity symptoms. Although some objective measures of spasticity showed improvement trends, no post-treatment changes were statistically significant. However, subjective assessments of symptom relief did often show significant improvement post-treatment. Differences in assessment measures, reports of adverse events, and dosage levels are discussed.

    Detecting autozygosity through runs of homozygosity: A comparison of three autozygosity detection algorithms

    Background: A central aim of studying runs of homozygosity (ROHs) in genome-wide SNP data is to detect the effects of autozygosity (stretches of the two homologous chromosomes within the same individual that are identical by descent) on phenotypes. However, it is unknown which current ROH detection program, and which set of parameters within a given program, is optimal for differentiating ROHs that are truly autozygous from ROHs that are homozygous at the marker level but vary at unmeasured variants between the markers. Methods: We simulated 120 Mb of sequence data in order to know the true state of autozygosity. We then extracted common variants from this sequence to mimic the properties of SNP platforms and performed ROH analyses using three popular ROH detection programs: PLINK, GERMLINE, and BEAGLE. We varied detection thresholds for each program (e.g., prior probabilities, lengths of ROHs) to understand their effects on detecting known autozygosity. Results: At the optimal thresholds for each program, PLINK outperformed GERMLINE and BEAGLE in detecting autozygosity from distant common ancestors. PLINK's sliding-window algorithm worked best when using SNP data pruned for linkage disequilibrium (LD). Conclusion: Our results provide both general and specific recommendations for maximizing autozygosity detection in genome-wide SNP data, and should apply equally well to research on whole-genome autozygosity burden or to research on whether specific autozygous regions are predictive using association mapping methods.
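    The sliding-window idea behind ROH callers such as PLINK's scan can be sketched in a few lines. This is a simplified illustration, not the study's code: the function name, the 0/1/2 genotype encoding, and the thresholds are our assumptions, and real callers also weigh window length in kb, missing calls, and SNP density.

    ```python
    def roh_segments(genos, min_len=50, max_het=1):
        """Greedy scan for runs of homozygosity (ROHs).

        genos: per-SNP minor-allele counts (0/2 = homozygous, 1 = heterozygous).
        A run ends once it would contain more than `max_het` heterozygous
        calls; runs shorter than `min_len` SNPs are discarded.
        Returns half-open (start, end) index pairs.
        """
        runs, start, het_count = [], 0, 0
        for i, g in enumerate(genos):
            if g == 1:                      # heterozygous call
                het_count += 1
                if het_count > max_het:     # budget exceeded: close the run
                    if i - start >= min_len:
                        runs.append((start, i))
                    start, het_count = i + 1, 0
        if len(genos) - start >= min_len:   # close the trailing run
            runs.append((start, len(genos)))
        return runs

    # Toy chromosome: two homozygous stretches separated by consecutive
    # heterozygous calls (min_len lowered for the tiny example)
    chrom = [0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0]
    print(roh_segments(chrom, min_len=5, max_het=1))  # [(0, 7), (8, 13)]
    ```

    The heterozygote budget (`max_het`) is the key tuning knob the review varies: too strict and genotyping errors break up true autozygous runs, too lenient and ordinary homozygous stretches are merged into spurious ROHs.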

    Comparison of the Ki-67 score and S-phase fraction as prognostic variables in soft-tissue sarcoma

    Immunohistochemically determined Ki-67 scores and flow cytometrically determined S-phase fractions were successfully evaluated from the primary tumours of 123 patients with soft-tissue sarcoma. All patients had either limb or superficial trunk tumours. The Ki-67 score correlated strongly with ploidy, S-phase fraction and grade. Ki-67 did not correlate with the size of the primary tumour. When analysed as a continuous variable, Ki-67 was a stronger predictor of both metastasis-free survival and disease-specific overall survival (P = 0.003 and 0.04 respectively) than was the S-phase fraction (P = 0.06 and 0.07 respectively). We tested the relevance of different cut-point values by dividing the whole material into two parts at every 10% (e.g. 10% of patients vs. the remaining 90%, 20% vs. 80%, etc.) and calculating the relative risk and confidence interval at each of these cut-points. Ki-67 had good prognostic discriminating power irrespective of the cut-point value, whereas the S-phase fraction lost its prognostic power at higher cut-point values. In conclusion, we found that Ki-67 is a useful prognostic tool in the treatment of soft-tissue sarcoma patients irrespective of the cut-point value. The S-phase fraction can be used at lower cut-point values. © 1999 Cancer Research Campaign
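    The cut-point analysis described above (splitting the cohort at every 10th percentile of the marker and computing a relative risk at each split) can be sketched as follows. The function name and the toy data are ours, purely for illustration; the study additionally reported confidence intervals, which are omitted here.

    ```python
    def rr_at_cutpoints(scores, events):
        """Relative risk of an event (high-marker vs low-marker group)
        when the cohort is split at every 10th percentile of `scores`.

        scores: marker value per patient (e.g. Ki-67 score).
        events: 1 if the patient had the event (e.g. metastasis), else 0.
        Returns {percentile: relative risk}; inf when the low group
        has no events.
        """
        n = len(scores)
        order = sorted(range(n), key=lambda i: scores[i])  # rank by marker
        rr = {}
        for pct in range(10, 100, 10):
            k = n * pct // 100                 # size of the low-marker group
            low, high = order[:k], order[k:]
            risk_low = sum(events[i] for i in low) / len(low)
            risk_high = sum(events[i] for i in high) / len(high)
            rr[pct] = risk_high / risk_low if risk_low > 0 else float("inf")
        return rr
    ```

    Scanning the whole table of cut-points, as the study does, shows whether a marker's prognostic discrimination is stable (Ki-67) or collapses at some thresholds (S-phase fraction), rather than cherry-picking a single "optimal" cut-off.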

    Effects of edible bird's nest (EBN) on cultured rabbit corneal keratocytes

    Background: No effective treatment or agent is currently available to promote corneal wound healing after corneal injury. Previous studies on edible bird's nest (EBN) extract reported the presence of a hormone-like substance, avian epidermal growth factor, that could stimulate cell division and enhance regeneration. This study aimed to investigate the effects of EBN on the proliferative capacity and phenotypic changes of corneal keratocytes. Methods: Corneal keratocytes from six New Zealand White rabbits were isolated and cultured until Passage 1. The proliferative effects of EBN on corneal keratocytes were determined by MTT assay in serum-containing medium (FDS) and serum-free medium (FD). Phenotypic changes in keratocytes were assessed morphologically, and gene expression of aldehyde dehydrogenase (ALDH), collagen type I and lumican was determined by RT-PCR. Results: The highest cell proliferation was observed when both media were supplemented with 0.05% and 0.1% EBN. Cell proliferation was also consistently higher in FDS than in FD. Both phase-contrast micrographs and gene expression analysis confirmed that the corneal keratocytes retained their phenotypes with the addition of EBN. Conclusions: These results suggest that low concentrations of EBN could synergistically induce cell proliferation, especially in serum-containing medium. This could be a novel breakthrough, as both cell proliferation and functional maintenance are important during corneal wound healing. This in vitro test is considered a crucial first step for nutri-pharmaceutical formulation of EBN-based eye drops before in vivo application.

    Use of an innovative model to evaluate mobility in seniors with lower-limb amputations of vascular origin: a pilot study

    Background: The mobility of older individuals has often been only partially assessed, without considering all important aspects such as potential (available) versus effective (used) mobility and the physical and psychosocial factors that modulate them. This study proposes a new model for evaluating mobility that considers all of these aspects, applied here to lower-limb amputees of vascular origin. The model integrates the concepts of potential mobility (e.g. balance, speed of movement), effective mobility (e.g. life habits, movements in living areas) and factors that modulate these two types of mobility (e.g. strength, sensitivity, social support, depression). The main objective was to characterize potential and effective mobility as well as mobility modulators in a small sample of people with lower-limb amputations of vascular origin with different characteristics. The second objective of this pilot study was to assess the feasibility of measuring all variables in the model in a residential context. Methods: An observational, cross-sectional design was used with a heterogeneous sample of 10 participants with a lower-limb amputation of vascular origin, aged 51 to 83, assessed between 8 and 18 months after discharge from an acute care hospital. A questionnaire of participant characteristics and 16 reliable and valid measurements were used. Results: The potential mobility indicators did not accurately predict effective mobility; that is, participants who perform well on traditional measures in the laboratory or clinic are not always those who perform well in the real world. The model generated four different profiles (categories) of participants, ranging from reduced to excellent potential mobility and from low to excellent effective mobility, and characterized the modulating factors. The evaluations were acceptable in terms of the time taken (three hours) and the overall measurements, with a few exceptions, which were modified to optimize the data collected and the classification of the participants. For the population assessed, the results showed that some of the negative modulators (particularly living alone, no rehabilitation, pain, limited social support and poor muscle strength) played an important role in reducing effective mobility. Conclusion: This first use of the model revealed interesting data that add to our understanding of important aspects linked to potential and effective mobility as well as their modulators. The feasibility of measuring all variables in the model in a residential context was demonstrated. A study with a larger number of participants is now warranted to rigorously characterize the mobility levels of lower-limb amputees of vascular origin.

    Evaluation of the efficacy of a commercial inactivated genogroup 2b based porcine epidemic diarrhea virus (PEDV) vaccine and experimental live genogroup 1b exposure against 2b challenge

    Porcine epidemic diarrhea virus (PEDV) strains from the G1b cluster are considered less pathogenic than those from the G2b cluster. The aim of this study was to compare the ability of G1b-based live virus exposure and a commercial G2b-based inactivated vaccine to protect growing pigs against G2b challenge. Thirty-nine PEDV-naïve pigs were randomly divided into five groups: EXP-IM-1b (intramuscular G1b exposure; G2b challenge), EXP-ORAL-1b (oral G1b exposure; G2b challenge), VAC-IM-2b (intramuscular commercial inactivated G2b vaccination; G2b challenge), POS-CONTROL (sham vaccination; G2b challenge) and NEG-CONTROL (sham vaccination; sham challenge). Pigs were vaccinated/exposed at 3 weeks of age (day post-vaccination 0, dpv 0), VAC-IM-2b pigs were revaccinated at dpv 14, and the pigs were challenged at dpv 28. Among all groups, VAC-IM-2b pigs had significantly higher anti-PEDV IgG levels on dpv 21 and 28, while EXP-ORAL-1b pigs had significantly higher anti-PEDV IgA levels on dpv 14, 21, 28 and 35. EXP-ORAL-1b pigs also had detectable IgA in feces. Intramuscular PEDV exposure did not result in a detectable antibody response in EXP-IM-1b pigs. Fecal PEDV RNA levels in VAC-IM-2b pigs were significantly lower 5–7 days after challenge compared to the POS-CONTROL group. Under the study conditions, the commercial inactivated G2b-based vaccine protected pigs against G2b challenge, as evidenced by a 3–4 log reduction of PEDV RNA in feces during peak shedding and a shorter viral shedding duration. The oral, but not the intramuscular, experimental G1b-based live virus exposure induced a high anti-PEDV IgA response prior to challenge, which apparently did not impact PEDV shedding compared to POS-CONTROL pigs.

    Immunogenic Salivary Proteins of Triatoma infestans: Development of a Recombinant Antigen for the Detection of Low-Level Infestation of Triatomines

    Chagas disease, caused by Trypanosoma cruzi, is a neglected disease that puts 20 million people at risk in Latin America. The main control strategies are based on insecticide spraying to eliminate the domestic vectors, the most important of which is Triatoma infestans. This approach has been very successful in some areas. However, there is a constant risk of recrudescence in once-endemic regions resulting from the re-establishment of T. infestans and the invasion of other triatomine species. To detect low-level infestations of triatomines after insecticide spraying, we have developed a new epidemiological tool based on host responses against salivary antigens of T. infestans. We identified and synthesized a highly immunogenic salivary protein, which was used successfully to detect differences in the level of household infestation by T. infestans in Bolivia and in exposure to other triatomine species. The development of such an exposure marker to detect low-level infestation may also be a useful tool for other disease vectors.