
    Investigation of techniques for inventorying forested regions. Volume 2: Forestry information system requirements and joint use of remotely sensed and ancillary data

    The author has identified the following significant results. Effects of terrain topography in mountainous forested regions on LANDSAT signals and classifier training were found to be significant. The aspect of sloping terrain relative to the sun's azimuth was the major cause of variability. A relative insolation factor could be defined which, in a single variable, represents the joint effects of slope, aspect, and solar geometry on irradiance. Forest canopy reflectances were found, both through simulation and empirically, to have nondiffuse reflectance characteristics. Training procedures could be improved by stratifying in the space of ancillary variables and training in each stratum. Application of the Tasselled-Cap transformation to LANDSAT data acquired over forested terrain could provide a viable technique for data compression and convenient physical interpretation.
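The relative insolation factor described above can be sketched as follows. The formula is the standard terrain-illumination geometry (cosine of the local solar incidence angle); the function name and the normalization by the flat-terrain value are illustrative assumptions, not necessarily the report's exact definition.

```python
import math

def relative_insolation(slope_deg, aspect_deg, sun_zenith_deg, sun_azimuth_deg):
    """Cosine of the local solar incidence angle on sloping terrain,
    normalized by the flat-terrain value, so 1.0 means the slope
    receives the same direct irradiance as level ground."""
    s = math.radians(slope_deg)
    a = math.radians(aspect_deg)
    z = math.radians(sun_zenith_deg)
    phi = math.radians(sun_azimuth_deg)
    # Standard terrain-illumination geometry: cos(i) folds slope,
    # aspect, and solar position into a single number.
    cos_i = math.cos(z) * math.cos(s) + math.sin(z) * math.sin(s) * math.cos(phi - a)
    return max(cos_i, 0.0) / math.cos(z)
```

A slope facing the sun scores above 1.0, a slope facing away scores below 1.0, and level ground scores exactly 1.0, which matches the abstract's point that aspect relative to the solar azimuth drives the signal variability.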

    Biomarker-Drug and Liquid Biopsy Co-development for Disease Staging and Targeted Therapy: Cornerstones for Alzheimer's Precision Medicine and Pharmacology.

    Systems biology studies have demonstrated that different (epi)genetic and pathophysiological alterations may be mapped onto a single tumor's clinical phenotype, thereby revealing commonalities shared by cancers with divergent phenotypes. The success of this approach in cancer, based on analyses of traditional and emerging body fluid-based biomarkers, has given rise to the concept of liquid biopsy, enabling a non-invasive and widely accessible precision medicine approach and a significant paradigm shift in the management of cancer. Serial liquid biopsies offer clues about the evolution of cancer in individual patients across disease stages, enabling the application of individualized, genetically and biologically guided therapies. Moreover, liquid biopsy is contributing to the transformation of drug research and development strategies as well as supporting clinical practice, allowing identification of subsets of patients who may enter pathway-based targeted therapies not dictated by clinical phenotypes alone. A similar liquid biopsy concept is emerging for Alzheimer's disease, in which blood-based biomarkers, adaptable to each patient and stage of disease, may be used for positive and negative patient selection to facilitate establishment of high-value drug targets and countermeasures for drug resistance. Going beyond the "one marker, one drug" model, integrated applications of genomics, transcriptomics, proteomics, receptor expression, and receptor cell biology and conformational status assessments during biomarker-drug co-development may lead to a new successful era for Alzheimer's disease therapeutics. We argue that the time is now for implementing a liquid biopsy-guided strategy for the development of drugs that precisely target Alzheimer's disease pathophysiology in individual patients.

    Enhanced clustering analysis pipeline for performance analysis of parallel applications

    Clustering analysis is widely used to group similar data into the same cluster according to specific metrics. Cluster analysis can be used to group the CPU bursts of a parallel application, that is, the regions on each process in between communication calls or calls to the parallel runtime. The resulting clusters are the different computational trends or phases that appear in the application. These clusters are useful for understanding the behavior of the computation part of the application and for focusing the analyses on those that present performance issues. Although density-based clustering algorithms are a powerful and efficient tool to summarize this type of information, the traditional user-guided clustering methodology has many shortcomings in dealing with the complexity of data, the diversity of data structures, the high dimensionality of data, and the dramatic increase in the amount of data. Consequently, the majority of DBSCAN-like algorithms struggle to handle high-dimensional and/or multi-density data, and they are sensitive to their hyper-parameter configuration. Furthermore, extracting insight from the obtained clusters remains an intuitive, manual task. To mitigate these weaknesses, we propose a new unified approach that replaces user-guided clustering with an automated clustering analysis pipeline, called the Enhanced Cluster Identification and Interpretation (ECII) pipeline. To build the pipeline, we propose novel techniques, including Robust Independent Feature Selection, Feature Space Curvature Map, Organization Component Analysis, and hyper-parameter tuning, which address feature selection, density homogenization, cluster interpretation, and model selection, the main components of our machine learning pipeline. This thesis contributes four new techniques to the Machine Learning field with a particular use case in the Performance Analytics field.
The first contribution is a novel unsupervised approach for feature selection on noisy data, called Robust Independent Feature Selection (RIFS). Specifically, we choose a feature subset that contains most of the underlying information, using the same criteria as independent component analysis. Simultaneously, the noise is separated as an independent component. The second contribution of the thesis is a parametric multilinear transformation method to homogenize cluster densities while preserving the topological structure of the dataset, called Feature Space Curvature Map (FSCM). We present a new Gravitational Self-Organizing Map to model the feature space curvature by plugging the concepts of gravity and the fabric of space into the Self-Organizing Map algorithm to mathematically describe the density structure of the data. To homogenize the cluster density, we introduce a novel mapping mechanism that projects the data from the non-Euclidean curved space to a new Euclidean flat space. The third contribution is a novel topology-based method to study potentially complex high-dimensional categorized data by quantifying their shapes and extracting fine-grained insights from them to interpret the clustering result. We introduce our Organization Component Analysis (OCA) method for the automatic study of arbitrary cluster shapes without assumptions about the data distribution. Finally, to tune the DBSCAN hyper-parameters, we propose a new tuning mechanism that combines techniques from the machine learning and optimization domains, and we embed it in the ECII pipeline. Using this cluster analysis pipeline with the CPU burst data of a parallel application, we provide the developer/analyst with high-quality SPMD computation structure detection with the added value of reflecting the fine grain of the computation regions.
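The hyper-parameter sensitivity that motivates the ECII pipeline can be illustrated with a small sketch. This is not the thesis's actual pipeline: it runs scikit-learn's DBSCAN on synthetic two-dimensional stand-ins for per-burst features and performs a naive eps sweep scored by silhouette, purely to show the kind of automated model selection the abstract describes.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for per-burst features (e.g., instructions
# completed and IPC per CPU burst); real data would come from traces.
bursts = np.vstack([
    rng.normal(loc=[1.0, 1.0], scale=0.05, size=(200, 2)),
    rng.normal(loc=[3.0, 0.5], scale=0.05, size=(200, 2)),
])
X = StandardScaler().fit_transform(bursts)

# Naive eps sweep: keep the configuration whose clustering (noise
# points excluded) maximizes the silhouette score.
best = None
for eps in (0.05, 0.1, 0.2, 0.5):
    labels = DBSCAN(eps=eps, min_samples=10).fit_predict(X)
    mask = labels != -1
    if len(set(labels[mask])) > 1:
        score = silhouette_score(X[mask], labels[mask])
        if best is None or score > best[0]:
            best = (score, eps, labels)

score, eps, labels = best
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
```

On this toy data the sweep recovers the two computational phases; on real multi-density trace data a grid sweep like this breaks down, which is the gap the thesis's tuning mechanism targets.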

    Mismatches between objective parameters and measured perception assessment in room acoustics: a holistic approach

    Psychoacoustic research in the field of concert halls has revealed that many aspects concerning listening perception have yet to be fully understood. On the one hand, the objective room acoustics of performance spaces are reflected in parameters, some standardized and some not, but these are related to only a limited number of perceptual attributes of human response. In general, these objective parameters cannot accurately describe the acoustic details due to their inherent simplification. Under these premises, impulse responses (576 receivers) are measured in 16 concert halls according to standard procedures, and the perception and satisfaction of the occupants of the rooms are evaluated via a questionnaire completed during live concerts. Correlation analyses and multidimensional scaling (MDS) techniques have been applied to spatial and multi-band averaged values of the acoustic parameters studied (18) and to the average values of users' responses (1284) to the questionnaire items (26). As a first result, correlations between objective parameters and users' responses show that transversality exists between them. Secondly, hierarchical clustering produces a classification of the survey questions into 7 hierarchical classes. On the other hand, a lack of tuning between objective parameters and perceptual responses is observed on applying MDS analysis to the ordination of the venues from a subjective assessment and a subjective-objective assessment. Finally, although the results show the mismatch between objective parameters and subjective responses, a model of subjective global evaluation of the acoustics of the room from data of three orthogonal acoustic parameters is implemented, revealing a reasonably good fit.
The authors wish to express their gratitude to P. Bustamante for his help, to all those who participated as listeners in this study, and to the management and staff of each hall for facilitating acoustic measurements and allowing distribution of the questionnaires in their theatres. This work has been financially supported by FEDER funds and by the Ministry of Science and Technology with reference Nos. BIA2003-09306, BIA2008-05485, BIA 2010-20523, and BIA 2012-36896. Giménez Pérez, A.; Cibrián Ortíz De Anda, R.; Cerdá Jordá, S.; Girón, S.; Zamarreño García, T. (2014). Mismatches between objective parameters and measured perception assessment in room acoustics: a holistic approach. Building and Environment 74:119-131. doi:10.1016/j.buildenv.2013.12.022
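The MDS ordination step can be sketched with scikit-learn. The data below are random stand-ins for the 16 halls by 18 spatially and multi-band averaged parameters; the parameter names in the comment are illustrative, not the study's actual measurement matrix.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical hall-by-parameter matrix: 16 halls x 18 averaged
# acoustic parameters (e.g., T30, C80, G, IACC and similar).
halls = rng.normal(size=(16, 18))

# Standardize the parameters, build a Euclidean dissimilarity matrix
# between halls, and embed the venues in 2-D, as in an MDS ordination.
Z = StandardScaler().fit_transform(halls)
D = squareform(pdist(Z))
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
```

Running the same embedding twice, once on objective parameters and once on averaged questionnaire responses, and comparing the two venue orderings is the kind of analysis in which the paper reports the mismatch.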

    Networks of innovation: measuring, modelling and enhancing innovation in surgery

    The rate of innovation occurring in surgery is beyond our systemic capacity to quantify, with several methodological and practical challenges. The existing paucity of surgical innovation metrics presents a global healthcare problem, especially as surgical innovations become increasingly costly at a time when healthcare provision is experiencing a radical transformation driven by pressures to reduce costs, an ageing population with ever-increasing healthcare needs, and patients with growing expectations. This thesis aims to devise a novel, quantitative, network-based framework that will permit modelling and measuring surgical innovation to add the most value to patient care. It involves the systematic, graphical, and analytical assessment of surgical innovation in a way that has never been done before. This is based on successful models previously applied in industry, combined with advanced analytical techniques derived from social science (network analysis). In doing so, it offers an exciting new perspective and opportunity for understanding how the innovation process originates and evolves in surgery and how it can be measured in terms of value and virality, a priority for the NHS, RCS, Imperial, and the wider surgical community. The ability to measure value and rank innovations is expected to play a fundamental role in guiding policy, strategically directing surgical research funding, and uncovering innovation barriers and catalysts. This will ensure participation at the forefront of novel surgical technology and lay the scientific foundations for the development of improved healthcare models and services to enhance the quality of healthcare delivered.

    Identification Of Metabolite Biomarkers In Epilepsy Using 1h Mrs

    Epilepsy is a serious neurological disorder that affects 1 percent of the global population. Despite its status as one of the oldest neurological disorders known to man, its mechanisms remain poorly understood. Available medications are not curative but provide symptomatic management, and they do not work well for more than 30 percent of patients. Because it is nearly impossible to predict on an individual level who will eventually develop epilepsy, it is also a disorder that can only be diagnosed after the patient has experienced established seizure activity, eliminating any possibility of stopping the disorder in its prodromal phase, before patients are symptomatic. Availability of a reliable and non-invasive biomarker tool that can predict and identify the development of epilepsy would dramatically change how the disorder is detected, monitored, managed, and treated. In this project, we tested the potential of 1H MRS to provide metabolite biomarkers of epilepsy and epileptogenesis, both in human neocortical tissue obtained from intractable epilepsy patients who underwent resective surgery and in a longitudinal rat model of epileptogenesis, using interictal epileptiform discharges as a surrogate indicator of disease progression. Using 1H MRS, we found unique metabolite differences that are highly predictive of epileptic and non-epileptic neocortex in humans and that also partially overlap with findings from our rat model. These findings provide evidence that 1H MRS is capable of identifying metabolite changes specific to epilepsy and may lead to reliable and non-invasive biomarkers of epilepsy and epileptogenesis in the future.

    Proteome profiling in cerebrospinal fluid reveals novel biomarkers of Alzheimer's disease

    Neurodegenerative diseases are a growing burden, and there is an urgent need for better biomarkers for diagnosis, prognosis, and treatment efficacy. Structural and functional brain alterations are reflected in the protein composition of cerebrospinal fluid (CSF). Alzheimer's disease (AD) patients have higher CSF levels of tau, but we lack knowledge of systems-wide changes of CSF protein levels that accompany AD. Here, we present a highly reproducible mass spectrometry (MS)-based proteomics workflow for the in-depth analysis of CSF from minimal sample amounts. From three independent studies (197 individuals), we characterize differences in proteins by AD status (> 1,000 proteins, CV < 20%). Proteins with previous links to neurodegeneration such as tau, SOD1, and PARK7 differed most strongly by AD status, providing strong positive controls for our approach. CSF proteome changes in Alzheimer's disease prove to be widespread and often correlated with tau concentrations. Our unbiased screen also reveals a consistent glycolytic signature across our cohorts and a recent study. Machine learning suggests clinical utility of this proteomic signature.
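The CV < 20% reproducibility criterion mentioned above can be made concrete with a small sketch; the cv_filter function and the replicate matrix are illustrative assumptions, not the study's actual workflow.

```python
import numpy as np

def cv_filter(intensities, threshold=0.20):
    """Keep proteins whose coefficient of variation across replicate
    runs falls below the threshold (CV < 20% as in the abstract)."""
    mean = intensities.mean(axis=0)
    cv = intensities.std(axis=0, ddof=1) / mean
    return cv < threshold

# Toy replicate-by-protein intensity matrix: 6 runs x 4 proteins,
# with the last column made deliberately irreproducible.
data = np.array([
    [100.0, 200.0, 50.0,  10.0],
    [102.0, 198.0, 51.0, 200.0],
    [ 98.0, 202.0, 49.0,  10.0],
    [101.0, 199.0, 50.0, 200.0],
    [ 99.0, 201.0, 51.0,  10.0],
    [100.0, 200.0, 49.0, 200.0],
])
keep = cv_filter(data)
```

The first three proteins vary by a few percent across runs and pass, while the fourth swings by an order of magnitude and is filtered out, mirroring how a reproducibility cut trims a quantified proteome before differential analysis.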

    Effects of Chronic Sleep Restriction during Early Adolescence on the Adult Pattern of Connectivity of Mouse Secondary Motor Cortex

    Cortical circuits mature in stages, from early synaptogenesis and synaptic pruning to late synaptic refinement, resulting in the adult anatomical connection matrix. Because the mature matrix is largely fixed, genetic or environmental factors interfering with its establishment can have irreversible effects. Sleep disruption is rarely considered among those factors, and previous studies have focused on very young animals and the acute effects of sleep deprivation on neuronal morphology and cortical plasticity. Adolescence is a sensitive time for brain remodeling, yet whether chronic sleep restriction (CSR) during adolescence has long-term effects on brain connectivity remains unclear. We used viral-mediated axonal labeling and serial two-photon tomography to measure brain-wide projections from secondary motor cortex (MOs), a high-order area with diffuse projections. For each MOs target, we calculated the projection fraction, a combined measure of passing fibers and axonal terminals normalized for the size of each target. We found no homogeneous differences in MOs projection fraction between mice subjected to 5 days of CSR during early adolescence (P25–P30, ≥50% decrease in daily sleep, n=14) and siblings that slept undisturbed (n=14). Machine learning algorithms, however, classified animals at significantly above chance levels, indicating that differences between the two groups exist but are subtle and heterogeneous. Thus, sleep disruption in early adolescence may affect adult brain connectivity. However, because our method relies on a global measure of projection density and was not previously used to measure connectivity changes due to behavioral manipulations, definitive conclusions on the long-term structural effects of early CSR require additional experiments.
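The projection fraction can be illustrated with a toy computation. The function and the binarized toy volume are assumptions for illustration; the paper's measure combines passing fibers and terminals extracted from serial two-photon data, not a simple binary mask.

```python
import numpy as np

def projection_fraction(labeled, target_mask):
    """Fraction of a target region occupied by labeled axonal signal,
    normalized by the size of the target (its voxel count)."""
    signal = labeled[target_mask].sum()
    return float(signal) / target_mask.sum()

# Toy 10x10x10 voxel volume: the target occupies one octant, and
# alternating voxels inside it carry binarized axonal signal.
vol = np.zeros((10, 10, 10))
target = np.zeros((10, 10, 10), dtype=bool)
target[:5, :5, :5] = True
vol[:5, :5, :5] = (np.arange(125).reshape(5, 5, 5) % 2 == 0)
pf = projection_fraction(vol, target)
```

Normalizing by target size is what lets projection fractions be compared across targets of very different volumes, which in turn makes the per-target values usable as features for the classification analysis the abstract describes.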