Stability-based multivariate mapping using SCoRS
Recently we proposed a feature selection method based on stability theory (SCoRS - Survival Count on Random Subspaces) and showed that it improves classification accuracy across different datasets. In the present work we propose: (i) an extension of SCoRS that uses reproducibility instead of model accuracy as the parameter optimization criterion, and (ii) a procedure to estimate the rate of false positive selection associated with the set of features obtained. Using the proposed framework, we found that, as expected, the optimal parameter was more stable across the cross-validation folds, the spatial map displaying the selected features was less noisy, and there was no decrease in classification accuracy. In addition, our results suggest that the estimated false positive rate for the features selected by SCoRS is under 0.05 for both optimization approaches, and is lower when optimizing reproducibility than with the standard optimization approach.
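For readers unfamiliar with the survival-count idea, the following is a minimal sketch of stability-based selection over random subspaces: a classifier is repeatedly fitted on random feature subsets, and a counter records how often each feature "survives" as informative. This is an illustration only, not the authors' implementation; the classifier choice, the survival rule, and all parameter values are assumptions.

    import numpy as np
    from sklearn.svm import LinearSVC

    def survival_counts(X, y, n_subspaces=500, subspace_frac=0.1, seed=None):
        """Count how often each feature 'survives' across random subspaces."""
        rng = np.random.default_rng(seed)
        n_features = X.shape[1]
        size = max(2, int(subspace_frac * n_features))
        counts = np.zeros(n_features)
        for _ in range(n_subspaces):
            # draw a random subspace and fit a linear classifier on it
            idx = rng.choice(n_features, size=size, replace=False)
            clf = LinearSVC(C=0.1, max_iter=5000).fit(X[:, idx], y)
            w = np.abs(clf.coef_).ravel()
            # assumed survival rule: weight in the top half of the subspace
            counts[idx[w >= np.median(w)]] += 1
        return counts

    # toy usage on synthetic data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 200))
    y = np.repeat([0, 1], 20)
    counts = survival_counts(X, y, n_subspaces=100, seed=0)
    selected = np.flatnonzero(counts >= np.quantile(counts, 0.95))

In SCoRS the survival-count threshold is itself a parameter; the paper's contribution is to tune it by reproducibility rather than by model accuracy.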
A new feature selection method based on stability theory - Exploring the parameter space to evaluate classification accuracy in neuroimaging data
Recently we proposed a feature selection method based on stability theory. In the present work we evaluate its performance in different contexts through a grid search over a subset of its parameter space. The main contributions of this work are: we show that the method can improve classification accuracy relative to the whole brain in different functional datasets; we evaluate the influence of the parameters on the results, gaining insight into reasonable ranges of values; and we show that the combinations of parameters that yield the best accuracies are stable (i.e., they have low rates of false positive selections).
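A grid search of the kind described might look like the sketch below, which reuses the hypothetical survival_counts function (and the X, y toy data) from the previous sketch; the grid values and the cross-validated-accuracy criterion are assumptions, not the paper's grid.

    from itertools import product
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import LinearSVC

    # hypothetical parameter grid for the selector
    grid = {"n_subspaces": [100, 500], "subspace_frac": [0.05, 0.1, 0.2]}

    best = None
    for n_sub, frac in product(grid["n_subspaces"], grid["subspace_frac"]):
        counts = survival_counts(X, y, n_subspaces=n_sub,
                                 subspace_frac=frac, seed=0)
        selected = np.flatnonzero(counts >= np.quantile(counts, 0.95))
        # score each parameter combination by cross-validated accuracy
        acc = cross_val_score(LinearSVC(max_iter=5000),
                              X[:, selected], y, cv=5).mean()
        if best is None or acc > best[0]:
            best = (acc, n_sub, frac)   # keep the best-scoring combination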
Half a century of computer methods and programs in biomedicine: A bibliometric analysis from 1970 to 2017
Background and Objective: Computer Methods and Programs in Biomedicine (CMPB) is a leading international journal that presents developments in computing methods and their application in biomedical research. The journal published its first issue in 1970 and celebrates its 50th anniversary in 2020. Motivated by this event, this article presents a bibliometric analysis of the publications of the journal during this period (1970-2017). Methods: The objective is to identify the leading trends in the journal by analysing the most cited papers, keywords, authors, institutions and countries. To do so, the study uses the Web of Science Core Collection database. Additionally, the work presents a graphical mapping of the bibliographic information using the visualization of similarities (VOS) viewer software. This is done to analyze bibliographic coupling, co-citation and co-occurrence of keywords. Results: CMPB is identified as a leading and core journal for biomedical researchers. The journal is strongly connected to IEEE Transactions on Biomedical Engineering and IEEE Transactions on Medical Imaging. The paper by Wang, Jacques and Zheng (published in 1995) is its most cited document. The top author in this journal is James Geoffrey Chase and the top contributing institution is Uppsala U (Sweden). Most of the papers in CMPB are from the USA, followed by the UK and Italy. China and Taiwan are the only Asian countries to appear in the top 10 publishing in CMPB. A keyword co-occurrence analysis revealed strong co-occurrences for classification, picture archiving and communication system (PACS), heart rate variability, survival analysis and simulation. Keyword analysis for the last decade revealed that machine learning for a variety of healthcare problems (including image processing and analysis) dominated other research fields in CMPB. Conclusions: It can be concluded that CMPB is a world-renowned publication outlet for biomedical researchers whose publication output has grown steadily since 1970. The analysis also concludes that the journal is very international, with publications from all over the world, although today European countries are the most productive ones.
Predictive Modelling using Neuroimaging Data in the Presence of Confounds
When training predictive models from neuroimaging data, we typically have available non-imaging variables such as age and gender that affect the imaging data but which we may be uninterested in from a clinical perspective. Such variables are commonly referred to as 'confounds'. In this work, we first give a working definition of a confound in the context of training predictive models from samples of neuroimaging data. We define a confound as a variable which affects the imaging data and has an association with the target variable in the sample that differs from that in the population-of-interest, i.e., the population over which we intend to apply the estimated predictive model. The focus of this paper is the scenario in which the confound and target variable are independent in the population-of-interest, but the training sample is biased due to a sample association between the target and confound. We then discuss standard approaches for dealing with confounds in predictive modelling, such as image adjustment and including the confound as a predictor, before deriving and motivating an Instance Weighting scheme that attempts to account for confounds by focusing model training so that it is optimal for the population-of-interest. We evaluate the standard approaches and Instance Weighting in two regression problems with neuroimaging data in which we train models in the presence of confounding and predict samples that are representative of the population-of-interest. For comparison, these models are also evaluated when there is no confounding present. In the first experiment we predict the MMSE score using structural MRI from the ADNI database with gender as the confound, while in the second we predict age using structural MRI from the IXI database with acquisition site as the confound. Considered over both datasets, we find that none of the methods for dealing with confounding gives more accurate predictions than a baseline model which ignores confounding, and that including the confound as a predictor gives models that are less accurate than the baseline model. We do find, however, that different methods appear to focus their predictions on specific subsets of the population-of-interest, and that predictive accuracy is greater when there is no confounding present. We conclude with a discussion comparing the advantages and disadvantages of each approach, and the implications of our evaluation for building predictive models that can be used in clinical practice.
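As an illustration of the Instance Weighting idea, the sketch below reweights training samples so that the target and a discrete confound become approximately independent in the reweighted sample, using weights proportional to p(t)p(c)/p(t,c) estimated from the sample. The binning scheme and the helper name independence_weights are assumptions for illustration, not the paper's derivation.

    import numpy as np
    from sklearn.linear_model import Ridge

    def independence_weights(t, c, n_bins=5):
        """Weights ~ p(t)p(c) / p(t,c), making target t and a discrete
        confound c independent in the reweighted sample (t is binned)."""
        edges = np.quantile(t, np.linspace(0, 1, n_bins + 1)[1:-1])
        t_bin = np.digitize(t, edges)
        w = np.ones_like(t, dtype=float)
        for tb in np.unique(t_bin):
            for cv in np.unique(c):
                mask = (t_bin == tb) & (c == cv)
                p_joint = mask.mean()
                if p_joint > 0:
                    w[mask] = (t_bin == tb).mean() * (c == cv).mean() / p_joint
        return w

    # toy usage: a biased sample where the target depends on the confound
    rng = np.random.default_rng(0)
    c = rng.integers(0, 2, 300)                    # e.g. gender or site
    t = rng.normal(loc=2.0 * c, size=300)          # sample association
    X = t[:, None] + rng.normal(size=(300, 10))    # stand-in imaging features
    w = independence_weights(t, c)
    model = Ridge().fit(X, t, sample_weight=w)     # weighted model training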
A people-oriented paradigm for smart cities
Most works in the literature agree on considering the Internet of Things (IoT) as the base technology to collect information related to smart cities. This information is usually offered as open data for analysis, to compile statistics, or to provide services which improve the management of the city, making it more efficient and more comfortable to live in. However, it is not possible to actually improve the quality of life of smart cities' inhabitants if there is no direct information about them and their experiences. To address this problem, we propose using a social and mobile computation model, called the Internet of People (IoP), which empowers smartphones to collect information about their users, analyze it to obtain knowledge about their habits, and provide this knowledge as a service, creating a collaborative information network. By combining IoT and IoP, we allow the smart city to dynamically adapt its services to the needs of its citizens, promoting their welfare as the main objective of the city.
A multimodal multiple kernel learning approach to Alzheimer's disease detection
In neuroimaging-based diagnostic problems, the combination of different sources of information, such as MR images and clinical data, is a challenging task. Simply combining them usually does not provide an improvement over using the best source alone. In this paper, we address the well-known Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset, tackling the AD versus Control task. We use a recently proposed multiple kernel learning approach, called EasyMKL, to combine a large number of basic kernels in synergy with a feature selection methodology, pursuing an optimal and sparse solution to facilitate interpretability. Our new approach, called EasyMKLFS, outperforms baselines (e.g. SVM) and state-of-the-art methods such as recursive feature elimination and SimpleMKL.
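The kernel-combination form underlying multiple kernel learning, K = sum_q eta_q K_q with nonnegative weights, can be sketched as follows. Here a simple kernel-target alignment heuristic stands in for the weights EasyMKL would actually learn, so this illustrates the combination form, not the EasyMKL algorithm itself; the per-feature RBF kernels and all data are assumptions.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics.pairwise import rbf_kernel

    # one basic RBF kernel per feature; sparse nonnegative combinations of
    # such kernels are what make the selected features interpretable
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 20))
    y = np.sign(X[:, 0] + 0.5 * X[:, 3] + 0.1 * rng.normal(size=60))
    kernels = [rbf_kernel(X[:, [q]]) for q in range(X.shape[1])]

    # kernel-target alignment heuristic for the weights
    # (a stand-in for the weights a true MKL solver would learn)
    Y = np.outer(y, y)
    eta = np.array([max(0.0, (K * Y).sum() / np.linalg.norm(K))
                    for K in kernels])
    eta /= eta.sum()

    # combined kernel fed to an SVM with a precomputed Gram matrix
    K = sum(e * Kq for e, Kq in zip(eta, kernels))
    clf = SVC(kernel="precomputed").fit(K, y)
    train_acc = clf.score(K, y)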
Smart firefighters PPE: Impact of phase change materials
Considering the high level of heat and flame exposure firefighters encounter while performing their work activities, personal protective equipment (PPE) is of the utmost importance to enhance their safety. Phase change materials (PCMs) are advanced materials able to absorb high amounts of thermal energy, with the potential to increase the thermal performance of protective clothing. In this work, a PCM-vest was developed for the first time and its thermal performance was evaluated. A three-stage approach was followed: (1) at a small scale in the laboratory, the effect of different encapsulated PCMs on the performance of a multilayer assembly was evaluated; (2) in the laboratory, the essential requirements of heat and flame tests were assessed; and (3) in a simulated urban fire, the thermal performance of three different PCM-vests (different textiles and designs) was studied. As the main conclusions, the PCMs significantly affected the heating rate of the multilayer assembly, particularly when a PCM with higher latent heat was used. In some cases, the heat transfer index (HTI) doubled in comparison with the sample without PCMs. As a drawback, and as expected, the cooling time increased. The PCM-vest sample met the requirements of the heat and flame tests. Through this study, the positive impact of using PCMs to enhance the heat protection of conventional PPE can be highlighted.
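The mechanism the abstract describes, latent heat buying time before the assembly heats further, can be illustrated with a lumped-capacitance sketch: under a constant absorbed heat flux, the temperature plateaus at the melting point until the latent heat m*L is absorbed, so a PCM with higher latent heat yields a longer plateau. All values below are illustrative assumptions, not measurements from the study.

    # lumped-capacitance sketch of why a PCM delays heating
    def temperature(t,
                    q=500.0,        # absorbed heat flow, W (illustrative)
                    m_c=2000.0,     # total heat capacity m*c, J/K
                    m_pcm=0.2,      # PCM mass, kg
                    L=200e3,        # PCM latent heat, J/kg
                    T_melt=40.0, T0=20.0):
        t_reach = m_c * (T_melt - T0) / q   # time to reach melting point
        t_plateau = m_pcm * L / q           # plateau while the PCM melts
        if t < t_reach:
            return T0 + q * t / m_c         # sensible heating
        if t < t_reach + t_plateau:
            return T_melt                   # latent heat absorbs the flux
        return T_melt + q * (t - t_reach - t_plateau) / m_c

    # higher latent heat L -> longer plateau -> slower heating, consistent
    # with the observed effect on the heating rate of the assembly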
Long-term stochastic heave-induced dynamic buckling of a top-tensioned riser and its influence on the ultimate limit state reliability
A top-tensioned riser is a slender pipe that conveys fluids between a floater and a subsea system. High top-tension maintains its straight configuration and helps to prevent compressive loads. Because of the floater's heave motion, the tension on the riser fluctuates, giving rise to dynamic buckling. This paper examines the dynamic buckling characteristics of a top-tensioned riser by analyzing the governing equation with nonlinear damping. The equation is discretized in space by the finite difference method and then numerically integrated by the Runge-Kutta method. As the main objective, an ultimate limit state function for risers is used to investigate their reliability under parametric excitation. While the short-term stationary Gaussian random motion of a floater can be described by a response spectrum, the uncertainties of the long-term response are accounted for by Monte Carlo simulation. In an applied example, it is found that dynamic buckling occurs often and, although the probability of failure is acceptable, it can cause serious failure when axial excitation is significant in harsher sea states. This study aims to contribute to clarifying the role of parametric vibrations (dynamic buckling) in the reliability of risers at the ultimate limit state.
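A single-mode reduction of the kind of governing equation described, with tension fluctuating at the floater's heave frequency, yields a Mathieu-type oscillator with nonlinear damping, which the sketch below integrates with a Runge-Kutta scheme. The modal coefficients are illustrative assumptions, not the paper's model.

    import numpy as np
    from scipy.integrate import solve_ivp

    omega = 1.0           # modal natural frequency, rad/s
    zeta = 0.02           # linear damping ratio
    c_q = 0.05            # quadratic (drag-like) damping coefficient
    mu = 0.6              # tension fluctuation amplitude from heave
    Omega = 2.0 * omega   # heave frequency near the principal instability

    def rhs(t, s):
        x, v = s
        # nonlinear damping plus time-periodic (parametric) stiffness
        damping = 2 * zeta * omega * v + c_q * v * abs(v)
        stiffness = omega**2 * (1.0 + mu * np.cos(Omega * t)) * x
        return [v, -damping - stiffness]

    sol = solve_ivp(rhs, (0.0, 200.0), [1e-3, 0.0],
                    method="RK45", max_step=0.05)
    # growth of |x| over time indicates dynamic (parametric) buckling

In the paper, long-term reliability comes from repeating such response simulations across sea states sampled by Monte Carlo and evaluating an ultimate limit state function on each outcome.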
Clostridium difficile colitis in an internal medicine ward
Clostridium difficile colitis causes increasing morbidity and mortality in hospitalized patients. We retrospectively studied the epidemiological and clinical characteristics of the cases of Clostridium difficile colitis that occurred in an Internal Medicine ward over a given period of time, as a contribution to improving the management of this infection in Portugal. The incidence of the disease was at the level of the lowest figures reported since the beginning of this century. The large-scale use of broad-spectrum antibiotics appeared to be a determining risk factor, among others. Despite outbreaks of the disease caused by strains resistant to certain antibiotics, the results of treatment with metronidazole and/or vancomycin were apparently favourable, although the initial choice of therapy could be optimized. Complicated disease and recurrence occurred at levels close to the lowest values recently reported.
Sweet's Syndrome and Inflammatory Bowel Disease - an uncommon association
Sweet's syndrome (SSw) is a neutrophilic dermatosis characterized by fever, neutrophilia and skin lesions whose histology reveals a diffuse neutrophilic inflammatory infiltrate of the dermis. The pathophysiology of this syndrome has not yet been fully clarified. It may be idiopathic or associated with various pathologies (infectious, neoplastic, inflammatory), and should therefore be primarily considered a systemic manifestation of an underlying disease. The association between Sweet's syndrome and inflammatory bowel disease is not very frequent, and the former does not seem to reflect the activity of the latter; rather, the two share a common pathophysiological mechanism.