36 research outputs found

    The melon fruit fly, Bactrocera cucurbitae: A review of its biology and management

    Get PDF
    The melon fruit fly, Bactrocera cucurbitae (Coquillett) (Diptera: Tephritidae), is widely distributed in temperate, tropical, and sub-tropical regions of the world. It has been reported to damage 81 host plants and is a major pest of cucurbitaceous vegetables, particularly bitter gourd (Momordica charantia), muskmelon (Cucumis melo), snap melon (C. melo var. momordica), and snake gourd (Trichosanthes anguina). The extent of losses varies between 30 and 100%, depending on the cucurbit species and the season. Its abundance increases when temperatures fall below 32 °C and relative humidity ranges between 60 and 70%. It prefers to infest young, green, soft-skinned fruits, inserting its eggs 2 to 4 mm deep in the fruit tissues; the maggots feed inside the fruit. Pupation occurs in the soil at 0.5 to 15 cm below the surface. Given the importance of the pest and the crop, melon fruit fly management can follow either a local area or a wide area approach. The melon fruit fly can be managed successfully over a local area by bagging fruits, field sanitation, protein baits, cue-lure traps, growing fruit fly-resistant genotypes, augmentation of biocontrol agents, and soft insecticides. A wide area management programme coordinates the different components of an insect eradication programme (including local area options) over an entire area within a defensible perimeter, which is subsequently protected against reinvasion by quarantine controls. Although the sterile insect technique has been used successfully in wide area approaches, eradication programmes would benefit from more sophisticated and powerful technologies, such as insect transgenesis and geographical information systems, that can be deployed over a wide area. Various other options for fruit fly management are also discussed in relation to their bio-efficacy and economics for effective management of this pest.

    Seed coat mediated resistance against Aspergillus flavus infection in peanut

    Get PDF
    Toxic metabolites known as aflatoxins are produced by certain species of the Aspergillus genus, specifically A. flavus, A. parasiticus, A. nomius, and A. tamarii. Although various pre- and post-harvest strategies have been employed, aflatoxin contamination remains a major problem in the peanut crop, especially in subtropical environments. Aflatoxins are the most well-known and researched mycotoxins produced within the Aspergillus genus (namely Aspergillus flavus) and are classified as group 1 carcinogens. Their effects and etiology have been extensively researched, and aflatoxins are commonly linked to growth defects and liver diseases in humans and livestock. Despite the known importance of seed coats in plant defense against pathogens, peanut seed coat-mediated defenses against Aspergillus flavus have not received considerable attention. The peanut seed coat (testa) is primarily composed of a complex cell wall matrix consisting of cellulose, lignin, hemicellulose, phenolic compounds, and structural proteins. Because of cell wall desiccation during seed coat maturation, postharvest A. flavus infection proceeds without the pathogen encountering any active genetic resistance from living cells; the testa acts only as a physical and biochemical barrier against infection. The structure of peanut seed coat cell walls and the presence of polyphenolic compounds have been reported to inhibit the growth of A. flavus and aflatoxin contamination; however, no comprehensive information is available on peanut seed coat-mediated resistance. We have recently reviewed various plant breeding, genomic, and molecular mechanisms, as well as management practices, for reducing A. flavus infection and aflatoxin contamination, and we have also shown that the seed coat acts as a physical and biochemical barrier against A. flavus infection. The current review focuses specifically on peanut seed coat cell wall-mediated disease resistance, which will enable researchers to understand the mechanism and design efficient strategies for seed coat cell wall-mediated resistance against A. flavus infection and aflatoxin contamination.

    Clustering Algorithms: Their Application to Gene Expression Data

    Get PDF
    Gene expression data hide vital information required to understand the biological processes that take place in a particular organism in relation to its environment. Deciphering the hidden patterns in gene expression data offers a tremendous opportunity to strengthen the understanding of functional genomics. The complexity of biological networks and the number of genes involved increase the challenge of comprehending and interpreting the resulting mass of data, which consists of millions of measurements; these data also exhibit vagueness, imprecision, and noise. The use of clustering techniques is therefore a first step toward addressing these challenges, and is essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. Clustering of gene expression data has proven useful in uncovering the natural structure inherent in gene expression data, understanding gene functions, cellular processes, and subtypes of cells, mining useful information from noisy data, and understanding gene regulation. A further benefit of clustering gene expression data is the identification of homology, which is very important in vaccine design. This review examines the various clustering algorithms applicable to gene expression data in order to identify the appropriate clustering technique that will guarantee stability and a high degree of accuracy in the analysis procedure.
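    As a concrete illustration of the first step this review describes, the minimal sketch below clusters a synthetic genes-by-samples expression matrix with k-means and selects the number of clusters by silhouette score. The data, the choice of k-means, and all parameters are illustrative assumptions (scikit-learn is assumed available); none of it is taken from the review itself.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    expression = rng.normal(size=(500, 12))  # 500 genes x 12 samples (synthetic)

    # Standardise each gene's profile so clustering reflects expression
    # pattern rather than absolute magnitude.
    scaled = StandardScaler().fit_transform(expression)

    # Choose k by silhouette score, a common heuristic for cluster quality.
    best_k, best_score = 2, -1.0
    for k in range(2, 8):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(scaled)
        score = silhouette_score(scaled, labels)
        if score > best_score:
            best_k, best_score = k, score

    labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(scaled)
    print(f"k={best_k}, silhouette={best_score:.2f}")

    On real data the rows would be measured expression profiles, and the same loop could compare other algorithms (hierarchical, model-based) that the review surveys.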

    Mapping geographical inequalities in childhood diarrhoeal morbidity and mortality in low-income and middle-income countries, 2000–17: analysis for the Global Burden of Disease Study 2017

    Get PDF
    Background Across low-income and middle-income countries (LMICs), one in ten deaths in children younger than 5 years is attributable to diarrhoea. The substantial between-country variation in both diarrhoea incidence and mortality is attributable to interventions that protect children, prevent infection, and treat disease. Identifying subnational regions with the highest burden and mapping associated risk factors can aid in reducing preventable childhood diarrhoea. Methods We used Bayesian model-based geostatistics and a geolocated dataset comprising 15 072 746 children younger than 5 years from 466 surveys in 94 LMICs, in combination with findings of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017, to estimate posterior distributions of diarrhoea prevalence, incidence, and mortality from 2000 to 2017. From these data, we estimated the burden of diarrhoea at varying subnational levels (termed units) by spatially aggregating draws, and we investigated the drivers of subnational patterns by creating aggregated risk factor estimates. Findings The greatest declines in diarrhoeal mortality were seen in south and southeast Asia and South America, where 54·0% (95% uncertainty interval [UI] 38·1–65·8), 17·4% (7·7–28·4), and 59·5% (34·2–86·9) of units, respectively, recorded decreases in deaths from diarrhoea greater than 10%. Although children in much of Africa remain at high risk of death due to diarrhoea, regions with the most deaths were outside Africa, with the highest mortality units located in Pakistan. Indonesia showed the greatest within-country geographical inequality; some regions had mortality rates nearly four times the average country rate. Reductions in mortality were correlated with improvements in water, sanitation, and hygiene (WASH) or reductions in child growth failure (CGF). Similarly, most high-risk areas had poor WASH, high CGF, or low oral rehydration therapy coverage. Interpretation By co-analysing geospatial trends in diarrhoeal burden and its key risk factors, we could assess candidate drivers of subnational death reduction. Further, by doing a counterfactual analysis of the remaining disease burden using key risk factors, we identified potential intervention strategies for vulnerable populations. In view of the demands for limited resources in LMICs, accurately quantifying the burden of diarrhoea and its drivers is important for precision public health.
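    To make the "spatially aggregating draws" step concrete, the sketch below performs a population-weighted aggregation of pixel-level posterior prevalence draws to administrative units and summarises each unit with a mean and 95% uncertainty interval. All array names, shapes, and distributions here are illustrative assumptions, not the study's actual code or data.

    import numpy as np

    rng = np.random.default_rng(1)
    n_pixels, n_draws, n_units = 1000, 250, 10

    prev_draws = rng.beta(2, 30, size=(n_pixels, n_draws))  # prevalence per pixel, per draw
    population = rng.gamma(2.0, 500.0, size=n_pixels)       # children <5 per pixel (synthetic)
    unit_id = rng.integers(0, n_units, size=n_pixels)       # admin unit of each pixel

    # Population-weighted mean prevalence per unit, computed draw by draw
    # so that pixel-level uncertainty propagates to the unit level.
    unit_draws = np.zeros((n_units, n_draws))
    for u in range(n_units):
        mask = unit_id == u
        w = population[mask] / population[mask].sum()
        unit_draws[u] = w @ prev_draws[mask]

    # Posterior summaries per unit: mean and 95% uncertainty interval.
    mean = unit_draws.mean(axis=1)
    lo, hi = np.percentile(unit_draws, [2.5, 97.5], axis=1)
    print(mean[:3], lo[:3], hi[:3])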

    Mapping local patterns of childhood overweight and wasting in low- and middle-income countries between 2000 and 2017

    Get PDF
    A double burden of malnutrition occurs when individuals, household members or communities experience both undernutrition and overweight. Here, we show geospatial estimates of overweight and wasting prevalence among children under 5 years of age in 105 low- and middle-income countries (LMICs) from 2000 to 2017 and aggregate these to policy-relevant administrative units. Wasting decreased overall across LMICs between 2000 and 2017, from 8.4% (62.3 (55.1–70.8) million) to 6.4% (58.3 (47.6–70.7) million), but is predicted to remain above the World Health Organization's Global Nutrition Target of <5% in over half of LMICs by 2025. Prevalence of overweight increased from 5.2% (30 (22.8–38.5) million) in 2000 to 6.0% (55.5 (44.8–67.9) million) children aged under 5 years in 2017. Areas most affected by the double burden of malnutrition were located in Indonesia, Thailand, southeastern China, Botswana, Cameroon and central Nigeria. Our estimates provide a new perspective to researchers, policy makers and public health agencies in their efforts to address this global childhood syndemic. © 2020, The Author(s)

    Author Correction: Mapping local patterns of childhood overweight and wasting in low- and middle-income countries between 2000 and 2017 (Nature Medicine, (2020), 26, 5, (750-759), 10.1038/s41591-020-0807-6)

    Get PDF
    An amendment to this paper has been published and can be accessed via a link at the top of the paper. © 2020, The Author(s)

    Pharmacokinetics of Bupropion and Its Metabolites in Haemodialysis Patients Who Smoke

    No full text
    Original article can be found at: http://content.karger.com/ [Full text of this article is not available in the UHRA]. To date, no study has investigated the effects of bupropion (BP) in renally impaired humans. This study aims to identify the pharmacokinetics of BP and its metabolites in haemodialysis patients who smoke, to determine whether haemodialysis affects BP and metabolite clearance, and to suggest an appropriate BP dose in haemodialysis. The pharmacokinetics of BP and two of its major metabolites, hydroxybupropion (HB) and threohydrobupropion (TB), were studied in 8 smokers with end-stage renal disease (ESRD) receiving haemodialysis. Following a single oral dose of 150 mg bupropion hydrochloride sustained-release, blood samples were taken over 7 days and assayed using HPLC-mass spectrometry. Pharmacokinetic analysis was undertaken by non-linear regression using MWPharm. The BP results were similar to those for individuals with normal renal function. The metabolites demonstrated increased areas under the curve, indicating accumulation. Dialysis clearance of HB is unlikely. The results suggest significant accumulation of the metabolites in renal failure. Clarification of the clinical importance of the metabolites and their toxic plasma levels is required. The effects of haemodialysis on BP and its metabolites require further study. A dose of 150 mg bupropion every 3 days in patients receiving haemodialysis is more appropriate than the current manufacturer's recommendation (in renally impaired patients) of 150 mg daily. A multi-dose study is required. Peer reviewed
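    As a rough illustration of the kind of analysis summarised above, the sketch below computes an area under the curve (AUC) by the trapezoidal rule and a terminal half-life by log-linear regression from a concentration-time profile. The time points and concentrations are synthetic, and this noncompartmental approach is a generic stand-in; the study itself used non-linear regression in MWPharm.

    import numpy as np

    # Synthetic concentration-time profile after a single oral dose
    # (illustrative values only, not data from the study).
    t = np.array([0.5, 1, 2, 4, 8, 24, 48, 72, 120, 168])  # hours post-dose
    c = np.array([18, 42, 75, 90, 70, 35, 16, 8, 2, 0.5])  # ng/mL

    # AUC(0-168 h) by the linear trapezoidal rule.
    auc = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))

    # Terminal half-life from a log-linear fit over the last four points.
    slope, _ = np.polyfit(t[-4:], np.log(c[-4:]), 1)
    t_half = np.log(2) / -slope

    print(f"AUC(0-168h) = {auc:.0f} ng*h/mL, terminal t1/2 = {t_half:.1f} h")

    An increased AUC at the same dose, as reported for the metabolites here, directly indicates reduced clearance and hence accumulation.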

    Avoiding acyclovir neurotoxicity in patients with chronic renal failure undergoing haemodialysis

    No full text
    Acute neurotoxicity following the administration of the recommended oral dose of acyclovir (800 mg twice daily) to dialysis-dependent patients is increasingly recognised, suggesting that the recommended dose is too high. Little is known of the pharmacokinetics of oral acyclovir in dialysis patients. We studied 7 patients with oliguric end-stage renal failure receiving haemodialysis. Following haemodialysis, each patient received a single 800-mg tablet of acyclovir. Plasma acyclovir levels were monitored over the next 48 h as well as before and after the next routine dialysis. Peak plasma levels were achieved at 3 h (12.54 ± 1.76 μM, range 8.5–17.5 μM), with the half-life calculated to be 20.2 ± 4.6 h. Mean plasma levels at 18 h (6.29 ± 0.94 μM) were within the quoted range for inhibition of herpes zoster virus (4–8 μM). Haemodialysis (4–5 h) eliminated 51 ± 11.5% of the acyclovir that remained at 48 h. Computer modelling of various dose modifications suggests that a loading dose of 400 mg and a maintenance dose of 200 mg twice daily is sufficient to maintain a mean plasma acyclovir level of 6.4 ± 0.8 μM. A further loading dose (400 mg) after dialysis would raise the residual acyclovir concentration by 6.1 ± 1.0 μM. Such a dose modification should prevent neurotoxicity, whilst the rapid elimination of acyclovir by a single haemodialysis treatment provides both a diagnostic and a therapeutic tool when toxicity is suspected. Peer reviewed
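    The "computer modelling of various dose modifications" can be sketched with a simple one-compartment simulation of the proposed regimen (400 mg load, then 200 mg twice daily). Only the 20.2 h half-life and the ~51% dialysis removal are taken from the abstract; the assumed bioavailability and volume of distribution are illustrative values chosen so the simulated mean lands near the reported 6.4 μM, not fitted study parameters.

    import numpy as np

    t_half = 20.2                   # h (from the abstract)
    ke = np.log(2) / t_half         # first-order elimination rate constant
    F, V = 0.2, 68.0                # assumed oral bioavailability and volume (L)
    MW = 225.2                      # g/mol, acyclovir (to convert mg/L -> uM)

    doses = {0: 400.0}                                   # 400 mg loading dose
    doses.update({12 * i: 200.0 for i in range(1, 14)})  # 200 mg every 12 h
    dialysis_hour = 96                                   # one session mid-week

    amount, conc_um = 0.0, []
    for t in range(168):                    # one week, hourly steps
        amount += F * doses.get(t, 0.0)     # absorbed fraction of each dose
        if t == dialysis_hour:
            amount *= 1 - 0.51              # dialysis removes ~51% of drug
        amount *= np.exp(-ke)               # one hour of first-order decay
        conc_um.append(amount / V * 1000 / MW)

    conc_um = np.array(conc_um)
    print(f"pre-dialysis mean {conc_um[48:96].mean():.1f} uM, "
          f"post-dialysis {conc_um[dialysis_hour]:.1f} uM")

    The post-dialysis drop illustrates why a supplemental loading dose after each session is needed to keep levels in the inhibitory 4–8 μM window.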