78 research outputs found

    Biodiversity Loss and the Taxonomic Bottleneck: Emerging Biodiversity Science

    Human domination of the Earth has resulted in dramatic changes to global and local patterns of biodiversity. Biodiversity is critical to human sustainability because it drives the ecosystem services that provide the core of our life-support system. As we, the human species, are the primary factor driving the decline in biodiversity, we need detailed information about the biodiversity and species composition of specific locations in order to understand how different species contribute to ecosystem services and how humans can sustainably conserve and manage biodiversity. Taxonomy and ecology, the two fundamental sciences that generate our knowledge of biodiversity, face a number of limitations that prevent them from providing the information needed to fully understand the relevance of biodiversity to human sustainability: (1) biodiversity conservation strategies that tend to be overly focused on research and policy at a global scale, with little impact on local biodiversity; (2) the small knowledge base of extant global biodiversity; (3) a lack of much-needed site-specific data on the species composition of communities in human-dominated landscapes, which hinders ecosystem management and biodiversity conservation; (4) biodiversity studies that lack taxonomic precision; (5) a shortage of taxonomic expertise and trained taxonomists; (6) a taxonomic bottleneck in biodiversity inventory and assessment; and (7) neglect of taxonomic resources and a lack of taxonomic service infrastructure for biodiversity science. These limitations are directly related to contemporary trends in research, conservation strategies, environmental stewardship, environmental education, sustainable development, and local site-specific conservation. Today's biological knowledge is built on the known global biodiversity, which represents barely 20% of the commonly accepted estimate of 10 million species currently extant on planet Earth.
Much remains unexplored and unknown, particularly in hotspot regions of Africa, Southeast Asia, and South and Central America, including many developing or underdeveloped countries, where localized biodiversity is scarcely studied or described. "Backyard biodiversity", defined as local biodiversity near human habitation, refers to the natural resources and capital for ecosystem services at the grassroots level, which urgently need to be explored, documented, and conserved as the backbone of sustainable economic development in these countries. Beginning with the early identification and documentation of local flora and fauna, taxonomy has documented global biodiversity and natural history based on collections of "backyard biodiversity" specimens worldwide. However, this branch of science suffered a continuous decline in the latter half of the twentieth century and has now reached a point of potential demise. At present there are very few professional taxonomists and trained local parataxonomists worldwide, while the need for, and demands on, taxonomic services by conservation and resource-management communities are rapidly increasing. Systematic collections, the material basis of biodiversity information, have been neglected and abandoned, particularly at institutions of higher learning. Considering the rapid increase in the human population and in urbanization, human sustainability requires new conceptual and practical approaches to refocusing and energizing the study of biodiversity, which is the core of natural resources for sustainable development and the biotic capital sustaining our life-support system.
In this paper we aim to document the essence of biodiversity, discuss the state and nature of the taxonomic demise and the trends of recent biodiversity studies, and suggest practical approaches to a biodiversity science that would facilitate the expansion of global biodiversity knowledge and create useful data on backyard biodiversity worldwide in support of human sustainability.

    End-stage heart failure in congenitally corrected transposition of the great arteries: a multicentre study

    BACKGROUND AND AIMS: For patients with congenitally corrected transposition of the great arteries (ccTGA), factors associated with progression to end-stage congestive heart failure (CHF) remain largely unclear. METHODS: This multicentre, retrospective cohort study included adults with ccTGA seen at a congenital heart disease centre. Clinical data from initial and most recent visits were obtained. The composite primary outcome was mechanical circulatory support, heart transplantation, or death. RESULTS: Among 558 patients (48% female, age at first visit 36 ± 14.2 years, median follow-up 8.7 years), the event rate of the primary outcome was 15.4 per 1000 person-years (11 mechanical circulatory support implantations, 12 transplantations, and 52 deaths). Patients experiencing the primary outcome were older and more likely to have a history of atrial arrhythmia. The primary outcome rate was highest in those with both moderate/severe right ventricular (RV) dysfunction and tricuspid regurgitation (n = 110, 31 events) and uncommon in those with mild or less RV dysfunction and tricuspid regurgitation (n = 181, 13 events, P < .001). Outcomes did not differ by anatomic complexity or by history of tricuspid valve surgery or subpulmonic obstruction. New CHF admission or ventricular arrhythmia was associated with the primary outcome. Individuals who underwent childhood surgery had more adverse outcomes than age- and sex-matched controls. Multivariable Cox regression analysis identified older age, prior CHF admission, and severe RV dysfunction as independent predictors of the primary outcome. CONCLUSIONS: Patients with ccTGA deteriorate to end-stage heart failure or death at variable rates over time, commonly between their fifth and sixth decades. Predictors include arrhythmic and CHF events and severe RV dysfunction, but not anatomy or need for tricuspid valve surgery.
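The reported event rate can be sanity-checked from the abstract's own numbers. The sketch below is a back-of-envelope reconstruction, not the study's actual method: it assumes total person-years can be approximated by cohort size times the median follow-up (the study would have summed each patient's individual follow-up time).

```python
# Back-of-envelope check of the composite primary outcome rate,
# using only figures quoted in the abstract.
events = 11 + 12 + 52          # MCS implantations + transplantations + deaths
patients = 558
median_follow_up_years = 8.7   # assumption: treat median follow-up as the mean

person_years = patients * median_follow_up_years
rate_per_1000_py = events / person_years * 1000
print(round(rate_per_1000_py, 1))  # ≈ 15.4, matching the reported rate
```

That the approximation lands on the reported 15.4 per 1000 person-years suggests the quoted figures are internally consistent.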

    Shortened Telomere Length Is Associated with Increased Risk of Cancer: A Meta-Analysis

    BACKGROUND: Telomeres play a key role in the maintenance of chromosome integrity and stability, and telomere shortening is involved in the initiation and progression of malignancies. A series of epidemiological studies have examined the association between shortened telomeres and cancer risk, but the findings remain conflicting. METHODS: A dataset of 11,255 cases and 13,101 controls from 21 publications was included in a meta-analysis evaluating the association between relative telomere length and overall or cancer-specific risk. Heterogeneity among studies and publication bias were assessed by the χ²-based Q statistic test and Egger's test, respectively. RESULTS: Shorter telomeres were significantly associated with cancer risk (OR = 1.35, 95% CI = 1.14-1.60) compared with longer telomeres. In analyses stratified by tumor type, the association remained significant in subgroups of bladder cancer (OR = 1.84, 95% CI = 1.38-2.44), lung cancer (OR = 2.39, 95% CI = 1.18-4.88), smoking-related cancers (OR = 2.25, 95% CI = 1.83-2.78), cancers of the digestive system (OR = 1.69, 95% CI = 1.53-1.87), and cancers of the urogenital system (OR = 1.73, 95% CI = 1.12-2.67). The association between relative telomere length and overall cancer risk was also statistically significant in studies of Caucasian subjects, Asian subjects, retrospective designs, hospital-based controls, and smaller sample sizes. The funnel plot and Egger's test suggested no publication bias in the current meta-analysis (P = 0.532). CONCLUSIONS: These results suggest that shortened telomeres may be a marker of susceptibility to human cancer, but large, well-designed prospective studies are warranted to confirm these findings.
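The pooling and heterogeneity testing described in the methods follow a standard pattern: combine per-study log-odds ratios with inverse-variance weights, then compute Cochran's Q for heterogeneity. The sketch below illustrates that pattern with made-up study values; the ORs and CIs are purely hypothetical, not the 21 studies in this meta-analysis, and a fixed-effect model is shown for simplicity (the paper does not state which model it used).

```python
import math

# Hypothetical per-study odds ratios with 95% CIs (illustrative only).
studies = [(1.5, 1.1, 2.0),
           (1.2, 0.9, 1.6),
           (1.8, 1.3, 2.5)]

log_ors, weights = [], []
for or_, lo, hi in studies:
    # Recover the standard error of log(OR) from the CI width.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    log_ors.append(math.log(or_))
    weights.append(1 / se ** 2)          # inverse-variance weight

# Fixed-effect pooled estimate on the log scale.
pooled_log = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_or = math.exp(pooled_log)

# Cochran's Q statistic for heterogeneity (chi-squared, k - 1 df).
q = sum(w * (y - pooled_log) ** 2 for w, y in zip(weights, log_ors))

print(round(pooled_or, 2))  # pooled OR for the toy data, ≈ 1.46
```

The pooled OR necessarily falls between the smallest and largest study estimates; a large Q relative to its degrees of freedom would indicate heterogeneity, motivating subgroup analyses like those reported above.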

    Towards precision medicine in psychosis: benefits and challenges of multimodal multicenter studies—PSYSCAN: translating neuroimaging findings from research into clinical practice

    In the last two decades, several neuroimaging studies have investigated brain abnormalities associated with the early stages of psychosis, in the hope that these could aid the prediction of onset and clinical outcome. Despite advancements in the field, neuroimaging has yet to deliver. This is in part explained by the use of univariate analytical techniques, small samples and lack of statistical power, lack of external validation of potential biomarkers, and lack of integration of non-imaging measures (e.g., genetic, clinical, and cognitive data). PSYSCAN is an international, longitudinal, multicenter study of the early stages of psychosis which uses machine learning techniques to analyze imaging, clinical, cognitive, and biological data with the aim of facilitating the prediction of psychosis onset and outcome. In this article, we provide an overview of the PSYSCAN protocol and discuss the benefits and methodological challenges of large multicenter studies that employ neuroimaging measures.