12 research outputs found

    Antioxidant and Radioprotective Effects of Ocimum Flavonoids Orientin and Vicenin in Escherichia coli

    Antioxidant effect of the Ocimum flavonoids, orientin and vicenin (25-500 µM), was evaluated by the kat-sod assay in Escherichia coli mutants (DSH56, superoxide dismutase-deficient, and DSH19, catalase-deficient) treated with 50 mM menadione or H2O2 (1 mM). Protection by orientin (200 µM) and vicenin (200 µM) against H2O2-induced DNA damage in DSH19 cells (β-galactosidase test) and against radiation lethality in wild-type (DSH7) and DSH19 cells exposed to 0-150 Gy gamma radiation was also studied. Menadione and H2O2 reduced the surviving fraction to 0.2 and 0.4 in DSH56 and DSH19 cells, respectively. Even 25 µM of either flavonoid significantly increased the surviving fraction, with maximum protection at 200 µM. H2O2 increased the β-galactosidase activity in a concentration-dependent manner, which was significantly (P < 0.05–0.001) reduced by orientin and vicenin (200 µM). Radiation produced a dose-dependent decrease in the surviving fraction of both DSH7 and DSH19 cells. Pretreatment with 200 µM orientin or vicenin significantly increased the survival (DRF: DSH7 = 2.2; DSH19 = 1.8). Both compounds were equally effective in reducing the cytotoxicity of radiation and the chemical oxidants. The cytoprotective action of these plant flavonoids could be ascribed to their free radical scavenging activity.
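    For context on the dose reduction factor (DRF) quoted above: it is conventionally the ratio of the radiation dose required to produce a given level of lethality with the protector present to the dose producing the same effect without it. The sketch below shows one way surviving-fraction data could be turned into a DRF estimate; the single-hit exponential survival model, the fit_d0 helper and the example numbers are assumptions for illustration, not values or methods taken from this study.

```python
import numpy as np

# Minimal sketch, assuming a single-hit exponential survival model S(D) = exp(-D / D0),
# where D0 is the dose that reduces survival to 1/e. This is not the study's analysis;
# it only illustrates how a DRF can be read off as a ratio of D0 values.

def fit_d0(doses_gy, surviving_fractions):
    """Least-squares estimate of D0 from ln(S) = -D / D0 (fit through the origin)."""
    d = np.asarray(doses_gy, dtype=float)
    log_s = np.log(np.asarray(surviving_fractions, dtype=float))
    slope = np.sum(d * log_s) / np.sum(d ** 2)
    return -1.0 / slope

# Hypothetical example data (illustrative only, not taken from the abstract):
doses = [0, 50, 100, 150]                # Gy
control = [1.0, 0.37, 0.14, 0.05]        # surviving fraction, irradiation alone
pretreated = [1.0, 0.63, 0.40, 0.25]     # surviving fraction, with pretreatment

drf = fit_d0(doses, pretreated) / fit_d0(doses, control)
print(f"DRF ~ {drf:.1f}")
```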

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
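    The abstract states that the trained models are released publicly. As an illustrative aside, a small BLOOM checkpoint could be loaded and prompted as sketched below; the use of the Hugging Face transformers library and the bigscience/bloom-560m checkpoint are assumptions not specified in this listing.

```python
# Illustrative sketch only: loading a small publicly released BLOOM variant.
# The Hugging Face transformers API and the bigscience/bloom-560m checkpoint
# are assumptions; the listing above does not prescribe a loading path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"  # small variant; the full model has 176B parameters
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Translate to French: The weather is nice today."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```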

    Estimation of tuberculosis incidence at subnational level using three methods to monitor progress towards ending TB in India, 2015–2020

    Objectives: We verified subnational (state/union territory (UT)/district) claims of achievements in reducing tuberculosis (TB) incidence in 2020 compared with 2015 in India. Design: A community-based survey, analysis of programme data, and anti-TB drug sales and utilisation data. Setting: National TB Elimination Program and private TB treatment settings in 73 districts that had filed a claim to the Central TB Division of India for progress towards TB-free status. Participants: Each district was divided into survey units (SU) and one village/ward was randomly selected from each SU. All household members in the selected village were interviewed. Sputum samples from participants with a history of anti-TB therapy (ATT), those currently experiencing chest symptoms, or those currently on ATT were tested using Xpert/Rif/TrueNat. The survey continued until 30 Mycobacterium tuberculosis cases were identified in a district. Outcome measures: We calculated a direct estimate of TB incidence based on incident cases identified in the survey. We calculated an under-reporting factor by matching these cases within the TB notification system; the TB notification count adjusted by this factor was the estimate by the indirect method. We also calculated TB incidence from drug sale data in the private sector and drug utilisation data in the public sector. We compared the three estimates of TB incidence in 2020 with TB incidence in 2015. Results: The estimated direct incidence ranged from 19 (Purba Medinipur, West Bengal) to 1457 (Jaintia Hills, Meghalaya) per 100 000 population. Indirect estimates of incidence ranged between 19 (Diu, Dadra and Nagar Haveli) and 788 (Dumka, Jharkhand) per 100 000 population. Incidence estimated from drug sale data ranged from 19 per 100 000 population in Diu, Dadra and Nagar Haveli to 651 per 100 000 population in Centenary, Maharashtra. Conclusion: TB incidence in 1 state, 2 UTs and 35 districts had declined by at least 20% since 2015. Two districts in India were declared TB free in 2020.
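    The indirect method described in the outcome measures reduces to simple arithmetic: survey-detected cases are matched against the notification system to obtain an under-reporting factor, and notifications scaled by that factor give the incidence per 100 000 population. The sketch below, with hypothetical function names and example figures, only mirrors that formula; it is not code from the study.

```python
# Minimal sketch of the indirect estimate described above. Function names and
# example numbers are hypothetical; only the formula structure
# (notifications x under-reporting factor, per 100 000 population) follows the abstract.

def under_reporting_factor(survey_cases: int, survey_cases_notified: int) -> float:
    """Ratio of all survey-detected cases to those also found in the notification system."""
    return survey_cases / survey_cases_notified

def indirect_incidence(notified_cases: int, factor: float, population: int) -> float:
    """TB incidence per 100 000 population, adjusted for under-reporting."""
    return notified_cases * factor / population * 100_000

factor = under_reporting_factor(survey_cases=30, survey_cases_notified=24)             # 1.25
print(indirect_incidence(notified_cases=1_600, factor=factor, population=1_000_000))  # 200.0
```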
