
    Deir es-Salib, un monastère-ermitage rupestre de la Vallée Qadicha

    The study of Deir es-Salib, a major site of the Qadisha valley, allows us to outline the modes of establishment of cult sites in this part of the Lebanese mountains. The structural analysis made it possible to understand the spatial organization and the components of the monastery, as well as the evolution of the church’s construction. Observation of the various techniques and materials used in the ancient building guided the choice of intervention methods. Thus, the restoration of the north wall of the church was completed with mud bricks manufactured as they were in those days. After sealing this wall with layers of clay plaster and hydraulic lime, the foundations and the extrados of the two apses were strengthened. In the second part of this study, the stylistic analysis of the wall paintings of Deir es-Salib allowed the identification of the various influences observed, suggested a dating, and defined the successive stages of the site’s development.

    The Werner syndrome protein operates in base excision repair and cooperates with DNA polymerase β

    Genome instability is a characteristic of cancer and aging, and is a hallmark of the premature aging disorder Werner syndrome (WS). Evidence suggests that the Werner syndrome protein (WRN) contributes to the maintenance of genome integrity through its involvement in DNA repair. In particular, biochemical evidence indicates a role for WRN in base excision repair (BER). We have previously reported that WRN helicase activity stimulates DNA polymerase beta (pol β) strand displacement synthesis in vitro. In this report we demonstrate that WRN exonuclease activity can act cooperatively with pol β, a polymerase lacking 3′–5′ proofreading activity. Furthermore, using small interfering RNA technology, we demonstrate that WRN knockdown cells are hypersensitive to the alkylating agent methyl methanesulfonate, which creates DNA damage that is primarily repaired by the BER pathway. In addition, repair assays using whole cell extracts from WRN knockdown cells indicate a defect in long patch (LP) BER. These findings demonstrate that WRN plays a direct role in the repair of methylation-induced DNA damage, and suggest a role for both WRN helicase and exonuclease activities together with pol β during LP BER.

    A review of the sensitivity of metastatic colorectal cancer patients with deficient mismatch repair to standard-of-care chemotherapy and monoclonal antibodies, with recommendations for future research

    In 5% of metastatic colorectal cancer (mCRC) patients, tumours display a deficient mismatch repair (dMMR) system. Immunotherapy is beneficial in dMMR mCRC patients and has recently been approved by the Food and Drug Administration for patients with unresectable or metastatic dMMR CRC. Although dMMR and proficient MMR (pMMR) CRC tumours are biologically distinct, they are commonly treated with the same chemotherapy and monoclonal antibodies. This includes dMMR mCRC patients who did not respond to immunotherapy (20-30%). However, it is unclear if these treatments are equally beneficial in dMMR mCRC. Of note, dMMR mCRC patients have a worse prognosis compared to pMMR patients, which may in part be caused by a lower response to treatment. To avoid unnecessary exposure to ineffective treatments and their associated toxicity, it is important to identify which systemic treatments are most beneficial in dMMR mCRC patients, thus improving their outcome. Indeed, future treatment strategies are likely to involve combinations of immunotherapy, chemotherapy and monoclonal antibodies. In this evidence-based review, we summarize clinical trials reporting treatment efficacy of different types of chemotherapy and monoclonal antibodies in dMMR mCRC patients. We also review the biological rationale behind a potential differential benefit of chemotherapy with or without monoclonal antibodies in dMMR mCRC patients. A barrier in the interpretation of preclinical results is the choice of model systems. These largely comprise traditional models, including cell lines and xenografts, rather than more representative models, such as patient-derived organoids. We provide concrete recommendations for clinical investigators and fundamental researchers to accelerate research regarding which systemic therapy is most effective in dMMR mCRC patients.

    Effect of Observing Change from Comparison Mammograms on Performance of Screening Mammography in a Large Community-based Population

    To evaluate the effect of comparison mammograms on the accuracy, sensitivity, specificity, positive predictive value (PPV1), and cancer detection rate (CDR) of screening mammography, and to determine the role played by identification of change on comparison mammograms.

    Opt-out HIV testing in prison: informed and voluntary?

    HIV testing in prison settings has been identified as an important mechanism to detect cases among high-risk, underserved populations. Several public health organizations recommend that testing across healthcare settings, including prisons, be delivered in an opt-out manner. However, implementation of opt-out testing within prisons may pose challenges in delivering testing that is informed and understood to be voluntary. In a large state prison system with a policy of voluntary opt-out HIV testing, we randomly sampled adult prisoners in each of seven intake prisons within two weeks after their opportunity to be HIV tested. We surveyed prisoners’ perception of HIV testing as voluntary or mandatory and used multivariable statistical models to identify factors associated with their perception. We also linked survey responses to lab records to determine if prisoners’ test status (tested or not) matched their desired and perceived test status. Thirty-eight percent (359/936) perceived testing as voluntary. The perception that testing was mandatory was positively associated with age less than 25 years (adjusted relative risk [aRR]: 1.45, 95% CI: 1.24, 1.71) and preference that testing be mandatory (aRR: 1.81, 95% CI: 1.41, 2.31), but negatively associated with entry into one of the intake prisons (aRR: 0.41, 95% CI: 0.27, 0.63). Eighty-nine percent of prisoners wanted to be tested, 85% were tested according to their wishes, and 82% correctly understood whether or not they were tested. Most prisoners wanted to be HIV tested and were aware that they had been tested, but less than 40% understood testing to be voluntary. Prisoners’ understanding of the voluntary nature of testing varied by intake prison and by a few individual-level factors. Testing procedures should ensure that opt-out testing is informed and understood to be voluntary by prisoners and other vulnerable populations.

    Survival and patient-reported outcomes of real-world high-risk stage II and stage III colon cancer patients after reduction of adjuvant CAPOX duration from 6 to 3 months

    Aim: Adjuvant chemotherapy has been advised for high-risk stage II and III colon cancer since 2004. After the IDEA study showed no clinically relevant difference in outcome, reduction of adjuvant CAPOX duration from 6 to 3 months was rapidly adopted in the Dutch treatment guideline in 2017. This study investigates the real-world impact of the guideline change on overall survival (OS) and patient-reported outcomes (PROs). Methods: Patients with high-risk stage II (pT4+) and III (pN+) colon cancer were selected from the Netherlands Cancer Registry, based on surgical resection and adjuvant CAPOX before (2015–2016) versus after (2018–2019) the guideline change. Both groups were compared on OS, using multivariable Cox regression, and on PROs. Results: Patients treated before (n = 2330) and after (n = 2108) the guideline change showed similar OS (HR 1.02; 95% CI [0.89–1.16]), also in high-risk stage III (pT4/N2, HR 1.06 [0.89–1.26]). After the guideline change, 90% of patients were treated for 3 months, with OS not inferior to those still receiving 6 months (HR 0.89 [0.66–1.20]). PROs 2 years after CAPOX completion, available for a subset of patients, suggest lower neuropathy (n = 366; 26.2 [21.3–31.1] to 16.5 [14.4–18.6]) and better quality of life (n = 396; 80.9 [78.6–83.2] to 83.9 [82.8–84.9]), but no significant difference in workability (n = 120; 31.5 [27.9–35.1] to 35.3 [33.8–36.7]), with reduction from 6 to 3 months of CAPOX. Conclusion: This real-world study confirmed that shorter adjuvant CAPOX did not compromise OS and may improve PROs, complementing the IDEA study and supporting 3 months of adjuvant CAPOX in daily clinical practice.

    Community deworming alleviates geohelminth-induced immune hyporesponsiveness

    In cross-sectional studies, chronic helminth infections have been associated with immunological hyporesponsiveness that can affect responses to unrelated antigens. To study the immunological effects of deworming, we conducted a cluster-randomized, double-blind, placebo-controlled trial in Indonesia and assigned 954 households to receive albendazole or placebo once every 3 mo for 2 y. Helminth-specific and nonspecific whole-blood cytokine responses were assessed in 1,059 subjects of all ages, whereas phenotyping of regulatory molecules was undertaken in 121 school-aged children. All measurements were performed before and at 9 and 21 mo after initiation of treatment. Anthelmintic treatment resulted in significant increases in proinflammatory cytokine responses to Plasmodium falciparum-infected red blood cells (PfRBCs) and mitogen, with the largest effect on TNF responses to PfRBCs at 9 mo—estimate [95% confidence interval], 0.37 [0.21–0.53], P value over time (Ptime) < 0.0001. Although the frequency of regulatory T cells did not change after treatment, there was a significant decline in the expression of the inhibitory molecule cytotoxic T lymphocyte-associated antigen 4 (CTLA-4) on CD4+ T cells of albendazole-treated individuals, –0.060 [–0.107 to –0.013] and –0.057 [–0.105 to –0.008] at 9 and 21 mo, respectively; Ptime = 0.017. This trial shows the capacity of helminths to up-regulate inhibitory molecules and to suppress proinflammatory immune responses in humans. This could help to explain the inferior immunological responses to vaccines and lower prevalence of inflammatory diseases in low- compared with high-income countries

    Harnessing the Potential of Real-World Evidence in the Treatment of Colorectal Cancer: Where Do We Stand?

    Treatment guidelines for colorectal cancer (CRC) are primarily based on the results of randomized clinical trials (RCTs), the gold standard methodology to evaluate safety and efficacy of oncological treatments. However, generalizability of trial results is often limited due to stringent eligibility criteria, underrepresentation of specific populations, and greater heterogeneity in clinical practice. This may result in an efficacy-effectiveness gap and uncertainty regarding meaningful benefit versus treatment harm. Meanwhile, conduct of traditional RCTs has become increasingly challenging due to identification of a growing number of (small) molecular subtypes. These challenges, combined with the digitalization of health records, have led to growing interest in the use of real-world data (RWD) to complement evidence from RCTs. RWD are used to evaluate epidemiological trends, quality of care, treatment effectiveness, long-term (rare) safety, and quality of life (QoL) measures. In addition, RWD are increasingly considered in decision-making by clinicians, regulators, and payers. In this narrative review, we elaborate on these applications in CRC and provide illustrative examples. As long as the quality of RWD is safeguarded, ongoing developments, such as common data models, federated learning, and predictive modelling, will further unfold its potential. First, whenever possible, we recommend conducting pragmatic trials, such as registry-based RCTs, to optimize generalizability and answer clinical questions that are not addressed in registrational trials. Second, we argue that marketing approval should be conditional for patients who would have been ineligible for the registrational trial, awaiting planned (non-)randomized evaluation of outcomes in the real world. Third, high-quality effectiveness results should be incorporated in treatment guidelines to aid in patient counseling.
We believe that a coordinated effort from all stakeholders is essential to improve the quality of RWD, create a learning healthcare system with optimal use of trials and real-world evidence (RWE), and ultimately ensure personalized care for every CRC patient.

    Genetic, household and spatial clustering of leprosy on an island in Indonesia: a population-based study

    BACKGROUND: It is generally accepted that genetic factors play a role in susceptibility to both leprosy per se and leprosy type, but only a few studies have attempted to quantify this. Estimating the contribution of genetic factors to clustering of leprosy within families is difficult since these persons often share the same environment. The first aim of this study was to test which correlation structure (genetic, household or spatial) gives the best explanation for the distribution of leprosy patients and seropositive persons, and the second to quantify the role of genetic factors in the occurrence of leprosy and seropositivity. METHODS: The three correlation structures were proposed for population data (n = 560), collected on a geographically isolated island highly endemic for leprosy, to explain the distribution of leprosy per se, leprosy type and persons harbouring Mycobacterium leprae-specific antibodies. Heritability estimates and risk ratios for siblings were calculated to quantify the genetic effect. Leprosy was clinically diagnosed and specific anti-M. leprae antibodies were measured using ELISA. RESULTS: For leprosy per se in the total population the genetic correlation structure fitted best. In the population with relatively stable household status (persons under 21 years and above 39 years) all structures were significant. For multibacillary leprosy (MB) genetic factors seemed more important than for paucibacillary leprosy. Seropositivity could be explained best by the spatial model, but the genetic model was also significant. Heritability was 57% for leprosy per se and 31% for seropositivity. CONCLUSION: Genetic factors seem to play an important role in the clustering of patients with a more advanced form of leprosy, and they could explain more than half of the total phenotypic variance.

    European practice patterns and barriers to smoking cessation after a cancer diagnosis in the setting of curative versus palliative cancer treatment

    Background: Smoking cessation after a cancer diagnosis is associated with improved overall survival. Few studies have reported oncologists' cessation practice patterns, and differences between the curative and palliative settings have not been described. We aimed to study oncologists' perceptions of patients' tobacco use, current practices and barriers to providing smoking cessation support, while distinguishing between treatment with curative (C) and palliative (P) intent. Methods: In 2019, an online 34-item survey was sent to approximately 6235 oncologists from 16 European countries. Responses were descriptively reported and compared by treatment setting. Results: Responses from 544 oncologists were included. Oncologists appeared to favour addressing tobacco in the curative setting more than in the palliative setting. Oncologists believe that continued smoking impacts treatment outcomes (C: 94%, P: 74%) and that cessation support should be standard cancer care (C: 95%, P: 63%). Most routinely assess tobacco use (C: 93%, P: 78%) and advise patients to stop using tobacco (C: 88%, P: 54%), but only 24% (P)–39% (C) routinely discuss medication options, and only 18% (P)–31% (C) provide cessation support. Hesitation to remove a pleasurable habit (C: 13%, P: 43%) and disbelief that smoking affects outcomes (C: 3%, P: 14%) were disparate barriers between the curative and palliative settings. Conclusion: Oncologists appear to favour addressing tobacco use more in the curative setting; however, they discuss medication options and/or provide cessation support in a minority of cases. All patients who report current smoking should have access to evidence-based smoking cessation support, including patients treated with palliative intent, given their increasing survival.