26 research outputs found

    Computing relative elasticity of materials by ultrasonic elastography

    This article addresses the problem of calculating the relative elasticity of materials using ultrasound elastography. It describes the procedure needed to compute a freehand elastogram, using algorithms reported in the literature for displacement estimation, strain computation, and normalization of the elastogram image. Using ultrasound frames from phantoms and from biological tissues available in online databases, the reliability of the relative elasticity information obtained with these algorithms is studied, based on the signal-to-noise ratio of the elastogram as a quality parameter. The analysis shows the need for new algorithms that provide semi-quantitative information about tissue hardness which is both reliable and easy to interpret, so that the technique can be used as a diagnostic tool in clinical practice.
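    As a rough sketch of the freehand pipeline summarized above (not the authors' implementation; the window size, search range, and SNR definition are assumptions), the following Python fragment estimates axial displacements between a pre- and post-compression RF frame by windowed cross-correlation, differentiates them into strain, and normalizes the elastogram.

```python
# Minimal elastogram sketch: displacement by windowed normalized
# cross-correlation, strain as the axial gradient, then normalization.
import numpy as np

def axial_displacement(pre, post, win=64, step=32, search=16):
    """Axial shift (in samples) of each window on each RF line, found by
    maximizing the normalized cross-correlation within +/- `search` samples."""
    n_samples, n_lines = pre.shape
    starts = list(range(search, n_samples - win - search, step))
    disp = np.zeros((len(starts), n_lines))
    for j in range(n_lines):
        for i, s in enumerate(starts):
            ref = pre[s:s + win, j]
            best, best_lag = -np.inf, 0
            for lag in range(-search, search + 1):
                seg = post[s + lag:s + lag + win, j]
                c = np.dot(ref - ref.mean(), seg - seg.mean())
                c /= (np.std(ref) * np.std(seg) * win + 1e-12)
                if c > best:
                    best, best_lag = c, lag
            disp[i, j] = best_lag
    return disp

def elastogram(disp):
    """Strain image: axial gradient of displacement, rescaled to [0, 1]."""
    strain = np.gradient(disp, axis=0)
    strain -= strain.min()
    return strain / (strain.max() + 1e-12)

def elastogram_snr(strain):
    """One common elastographic SNR figure (mean over std of strain);
    the article's exact definition may differ."""
    return strain.mean() / (strain.std() + 1e-12)
```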

    Anti-collision system for the visually impaired using evolutionary neural networks

    Introduction: This paper presents the design and implementation of an anti-collision system for the visually impaired using evolutionary artificial neural networks (EANNs). Objective: To present the implementation of evolutionary neural networks in a guide system that detects static and moving obstacles for visually impaired users. Methodology: The methodology is based on creating artificial neural networks with a cooperative coevolutionary genetic algorithm (CCGA), which structures, modifies, and trains the networks. It uses a network definition matrix (NDM) built from a chromosome that is part of the genetic algorithm; once the NDM is generated, an artificial neural network is created and then trained. Results: The program produced several neural networks, generating 10 chromosomes in each execution. When these were trained with the CCGA and cooperation was applied, the best anti-collision networks were obtained within a defined time, working effectively for the detection of both static and moving obstacles. Conclusions: In the anti-collision system for the visually impaired, the neural networks proved effective at responding to both static and moving objects, giving the user safety by avoiding collisions with them.
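    The CCGA and NDM are only described at a high level above; as a hedged illustration (the chromosome encoding, network size, and fitness function here are assumptions, not the paper's), a chromosome can be decoded into a small pair of weight matrices and scored on a toy obstacle-detection task:

```python
# Sketch: decode a chromosome into a network definition matrix (two weight
# matrices here), run a feed-forward pass, and score the resulting network.
import numpy as np

N_IN, N_HID, N_OUT = 4, 3, 1          # assumed toy network size

def decode_ndm(chrom):
    """Reshape a flat chromosome into the weight matrices of the network."""
    k = N_IN * N_HID
    return chrom[:k].reshape(N_IN, N_HID), chrom[k:].reshape(N_HID, N_OUT)

def forward(x, w1, w2):
    h = np.tanh(x @ w1)
    return 1.0 / (1.0 + np.exp(-(h @ w2)))        # collision probability

def fitness(chrom, sensors, labels):
    """Fraction of correct obstacle / no-obstacle decisions (higher is better)."""
    w1, w2 = decode_ndm(chrom)
    return ((forward(sensors, w1, w2).ravel() > 0.5) == labels).mean()

# One generation of 10 chromosomes, as in the abstract; selection, crossover,
# and the cooperative credit assignment of the full CCGA are omitted.
rng = np.random.default_rng(0)
pop = rng.normal(size=(10, N_IN * N_HID + N_HID * N_OUT))
sensors = rng.uniform(0.0, 1.0, size=(200, N_IN))   # toy distance readings
labels = sensors.min(axis=1) < 0.2                   # "obstacle close"
best = pop[np.argmax([fitness(c, sensors, labels) for c in pop])]
```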

    Recognition of human activities through feature extraction and artificial intelligence techniques: a review

    Context: In recent years, the recognition of human activities has become an area of constant exploration in different fields. This article presents a literature review covering the different types of human activities and the information acquisition devices used for activity recognition, and delves into elderly fall detection via computer vision using feature extraction methods and artificial intelligence techniques. Methodology: The manuscript was prepared following the criteria of the document review and analysis methodology (RAD), dividing the research process into the heuristics and hermeneutics of the information sources. In total, 102 research works were referenced, providing a picture of the current state of human activity recognition. Results: The analysis of the proposed techniques for human activity recognition shows the importance of efficient fall detection. Although positive results are currently obtained with the techniques described in this article, their study environments are controlled, which does not contribute to real progress in the research. Conclusions: Presenting results from studies in environments close to reality would have great impact, so it is essential to focus research on building databases with real falls of older adults, or falls in uncontrolled environments.
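    As a hedged illustration of the kind of feature extraction the review covers for vision-based fall detection (this is not a method taken from any specific reviewed work; the features, thresholds, and frame rate are assumptions), simple geometric cues can be computed from a tracked person bounding box:

```python
# Sketch: two common geometric cues for fall detection from a tracked
# bounding box per frame: aspect ratio and vertical speed of the box centre.
import numpy as np

def fall_features(boxes, fps=30.0):
    """boxes: (n_frames, 4) array of [x, y, w, h] in pixels.
    Returns the height/width aspect ratio and the downward speed (px/s)."""
    boxes = np.asarray(boxes, dtype=float)
    aspect = boxes[:, 3] / (boxes[:, 2] + 1e-6)
    centre_y = boxes[:, 1] + boxes[:, 3] / 2.0
    v_down = np.gradient(centre_y) * fps          # image y grows downwards
    return aspect, v_down

def looks_like_fall(aspect, v_down, aspect_thr=0.8, speed_thr=150.0):
    """Heuristic: the box becomes wider than tall while moving down quickly."""
    return bool(np.any((aspect < aspect_thr) & (v_down > speed_thr)))
```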

    Characterization of Microbubbles Generated in a Venturi Tube via Image Processing: Effect of Operating Parameters

    Context: This research developed a dissolved air flotation system that uses a Venturi tube to produce microbubbles. The Venturi tube replaces the saturation tank and the pressure-reducing valve of conventional systems. Method: The system has both suction and injection air inlets, regulates the recirculation flow of the liquid to the tank, and provides a high hydraulic load in a reduced size. Counting and measuring the microbubbles via digital image processing is used to characterize the system's performance. Results: The system with air suction produces smaller bubbles than the one with air injection. With air suction, a higher liquid recirculation pressure produces more bubbles and reduces their size. Conclusions: With air injection, the change in flow rate influences the size of the microbubbles. Air injection and recirculation pressure do not influence the number of bubbles generated.
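    As a hedged sketch of the image-processing step described above (the thresholding, minimum object size, and pixel calibration are assumptions, not the authors' processing chain), bright microbubbles in a grayscale frame can be segmented, counted, and sized with scikit-image:

```python
# Sketch: threshold a frame, label connected components, and report the
# bubble count together with their equivalent diameters.
import numpy as np
from skimage import filters, measure, morphology

def characterize_bubbles(frame, pixel_size_um=1.0):
    """Return the number of bubbles and their equivalent diameters (um)."""
    mask = frame > filters.threshold_otsu(frame)          # global threshold
    mask = morphology.remove_small_objects(mask, min_size=5)
    props = measure.regionprops(measure.label(mask))
    diameters = np.array([p.equivalent_diameter * pixel_size_um for p in props])
    return len(props), diameters

# Usage on a synthetic frame; in the study the frames show bubbles produced
# by the Venturi tube under different suction/injection settings.
rng = np.random.default_rng(1)
frame = rng.normal(0.1, 0.02, size=(256, 256))
frame[100:110, 100:110] += 0.8                             # one fake bubble
count, diam = characterize_bubbles(frame, pixel_size_um=5.0)
```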

    Spectral clustering and fuzzy similarity measure for image segmentation

    In image segmentation algorithms based on spectral clustering, the size of the images makes the computational load of building the similarity matrix and solving the eigenvalue problem for the Laplacian matrix high. Furthermore, the Gaussian kernel is the most widely used similarity measure, but it has problems with irregular data distributions. This work proposes a pre-segmentation, or decimation, into superpixels with the Simple Linear Iterative Clustering algorithm to reduce the computational cost, and the construction of the similarity matrix with a fuzzy measure based on the Fuzzy C-Means classifier, which gives the algorithm greater robustness against images with complex distributions; the final segmentation is then determined by spectral clustering. Experiments show that the proposed approach obtains adequate segmentations, good clustering results, and precision comparable to five reference algorithms, with performance measured under four validation metrics.
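    A hedged sketch of this pipeline (superpixel decimation, an FCM-style fuzzy similarity, and spectral clustering on the reduced graph) follows; the number of clusters, the fuzzifier, and the exact similarity formula are assumptions, not necessarily those of the article:

```python
# Sketch: SLIC superpixels -> fuzzy memberships -> fuzzy similarity matrix
# -> spectral clustering on the precomputed affinity -> pixel-level labels.
import numpy as np
from skimage import data, segmentation
from sklearn.cluster import KMeans, SpectralClustering

img = data.astronaut().astype(float) / 255.0
sp = segmentation.slic(img, n_segments=300, compactness=10, start_label=0)

# Mean colour of each superpixel is its feature vector.
n_sp = sp.max() + 1
feats = np.array([img[sp == i].mean(axis=0) for i in range(n_sp)])

# FCM-style memberships u[i, k] against c centres (centres from k-means here).
c, m = 5, 2.0
centres = KMeans(n_clusters=c, n_init=10, random_state=0).fit(feats).cluster_centers_
d = np.linalg.norm(feats[:, None, :] - centres[None, :, :], axis=2) + 1e-12
u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)

# Fuzzy similarity between superpixels: overlap of their membership vectors.
W = np.minimum(u[:, None, :], u[None, :, :]).sum(axis=2)

# Final segmentation by spectral clustering, mapped back to the pixel grid.
labels = SpectralClustering(n_clusters=c, affinity='precomputed',
                            random_state=0).fit_predict(W)
seg = labels[sp]
```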

    Measurement of activity produced by low energy proton beam in metals using off-line PET imaging

    Proceedings of the 2011 Nuclear Science Symposium and Medical Imaging Conference, Valencia, Spain, 23-29 October 2011. In this work, we investigate PET imaging with 68Ga and 66Ga after proton irradiation of a natural zinc foil. The nuclides 68Ga and 66Ga are ideally suited for off-line PET monitoring of proton radiotherapy due to their beta-decay half-lives of 67.71(9) minutes and 9.49(3) hours, respectively, and their suitable β+ end-point energies. The purpose of this work is to explore the feasibility of PET monitoring of hadrontherapy treatments and to study how the amount of activity and the positron range affect the PET image reconstruction. Profiting from the low-energy reaction threshold for production via (p,n) reactions, both gallium isotopes, 68Ga and 66Ga, were produced by activating a natural zinc target with a proton pencil beam. In this way it is possible to create detailed patterns, such as the Derenzo-inspired one employed here. The proton beam was produced by the 5 MV tandetron accelerator at CMAM in Madrid. The energy of this beam (up to 10 MeV) is similar to the residual energy of the protons used for therapy at the distal edge of their path. The activated target was imaged in an ARGUS small-animal PET/CT scanner and reconstructed with a fully 3D iterative algorithm, with and without positron range corrections. This work was supported in part by Comunidad de Madrid (ARTEMIS S2009/DPI-1802), the Spanish Ministry of Science and Innovation (grants FPA2010-17142 and ENTEPRASE, PSE-300000-2009-5), European Regional Funds, CDTI under the CENIT Programme (AMIT Project), UCM (grupos UCM, 910059), and CPAN (CSD-2007-00042).
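    The half-lives quoted above determine how the relative contribution of each isotope to the measured activity changes between acquisitions; the following worked sketch (the initial activities are arbitrary, for illustration only) makes that explicit:

```python
# Decay of a mixed 68Ga/66Ga source; half-lives taken from the abstract.
import numpy as np

T12_68 = 67.71 / 60.0                  # hours (67.71 min)
T12_66 = 9.49                          # hours
LAM_68, LAM_66 = np.log(2) / T12_68, np.log(2) / T12_66

def activities(a68_0, a66_0, t_hours):
    """Activities (arbitrary units) of each isotope after t_hours of decay."""
    return a68_0 * np.exp(-LAM_68 * t_hours), a66_0 * np.exp(-LAM_66 * t_hours)

# Example with equal initial activities: after a few hours the short-lived
# 68Ga has largely decayed and the image is dominated by 66Ga.
for t in (0.0, 1.0, 4.0, 12.0):
    a68, a66 = activities(1.0, 1.0, t)
    print(f"t = {t:4.1f} h   68Ga fraction of activity = {a68 / (a68 + a66):.2f}")
```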

    Experimental validation of gallium production and isotope-dependent positron range correction in PET

    Positron range (PR) is one of the important factors that limit the spatial resolution of preclinical positron emission tomography (PET) images. Its blurring effect can be corrected to a large extent if an appropriate method is used during image reconstruction. Nevertheless, this correction requires accurate modelling of the PR for the particular radionuclide and the materials in the sample under study. In this work we investigate PET imaging with the 68Ga and 66Ga radioisotopes, which have a large PR and are being used in many preclinical and clinical PET studies. We produced a 68Ga and 66Ga phantom on a natural zinc target through (p,n) reactions, using the 9-MeV proton beam delivered by the 5-MV CMAM tandetron accelerator. The phantom was imaged in an ARGUS small-animal PET/CT scanner and reconstructed with a fully 3D iterative algorithm, with and without PR corrections. The reconstructed images at different time frames show a significant improvement in spatial resolution when the appropriate PR correction is applied for each frame, taking into account the relative amount of each isotope in the sample. With these results we validate our previously proposed PR correction method for isotopes with large PR. Additionally, we explore the feasibility of PET imaging with 68Ga and 66Ga radioisotopes in proton therapy. We acknowledge support from the Spanish MINECO through projects FPA2010-17142, FPA2013-41267-P, and CSD-2007-00042 (CPAN), and through the RTC-2015-3772-1 grant. We also acknowledge support from Comunidad de Madrid via the TOPUS S2013/MIT-3024 project.
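    As a hedged illustration of frame-dependent positron range modelling (this is not the reconstruction code used in the paper; the Gaussian kernels and their widths are placeholder assumptions standing in for the true isotope-specific PR kernels), a blurring kernel can be mixed per time frame according to the activity fraction of each isotope:

```python
# Sketch: mix 68Ga- and 66Ga-specific blur kernels by the 68Ga activity
# fraction of a given time frame, and apply the result to a toy phantom.
import numpy as np
from scipy.ndimage import gaussian_filter

SIGMA_68, SIGMA_66 = 1.2, 2.0          # assumed effective PR blur (pixels)

def pr_blur(image, frac_68):
    """Blur with a mixture of the two isotope-specific kernels, weighted by
    the 68Ga activity fraction of the current time frame."""
    return (frac_68 * gaussian_filter(image, SIGMA_68)
            + (1.0 - frac_68) * gaussian_filter(image, SIGMA_66))

# In an iterative reconstruction the mixed kernel would sit inside the
# forward model of each frame; here we only blur a toy phantom.
phantom = np.zeros((64, 64)); phantom[30:34, 30:34] = 1.0
early_frame = pr_blur(phantom, frac_68=0.9)    # early frame: mostly 68Ga
late_frame = pr_blur(phantom, frac_68=0.2)     # late frame: mostly 66Ga
```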

    Treatment with tocilizumab or corticosteroids for COVID-19 patients with hyperinflammatory state: a multicentre cohort study (SAM-COVID-19)

    Objectives: The objective of this study was to estimate the association between tocilizumab or corticosteroids and the risk of intubation or death in patients with coronavirus disease 2019 (COVID-19) and a hyperinflammatory state defined by clinical and laboratory parameters. Methods: A cohort study was performed in 60 Spanish hospitals including 778 patients with COVID-19 and clinical and laboratory data indicative of a hyperinflammatory state. Treatment was mainly with tocilizumab, an intermediate-high dose of corticosteroids (IHDC), a pulse dose of corticosteroids (PDC), combination therapy, or no treatment. The primary outcome was intubation or death; follow-up was 21 days. Propensity-score-adjusted estimations using Cox regression (logistic regression when needed) were calculated. Propensity scores were used as confounders, as matching variables, and for inverse probability of treatment weighting (IPTW). Results: In all, 88, 117, 78 and 151 patients treated with tocilizumab, IHDC, PDC, and combination therapy, respectively, were compared with 344 untreated patients. The primary endpoint occurred in 10 (11.4%), 27 (23.1%), 12 (15.4%), 40 (25.6%) and 69 (21.1%) of them, respectively. The IPTW-based hazard ratios (odds ratio for combination therapy) for the primary endpoint were 0.32 (95% CI 0.22-0.47; p < 0.001) for tocilizumab, 0.82 (0.71-1.30; p = 0.82) for IHDC, 0.61 (0.43-0.86; p = 0.006) for PDC, and 1.17 (0.86-1.58; p = 0.30) for combination therapy. Other applications of the propensity score provided similar results, although they were not significant for PDC. Tocilizumab was also associated with a lower hazard of death alone in the IPTW analysis (0.07; 0.02-0.17; p < 0.001). Conclusions: Tocilizumab might be useful in COVID-19 patients with a hyperinflammatory state and should be prioritized for randomized trials in this situation.
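    As a hedged sketch of the propensity-score / IPTW analysis described above (hypothetical column names, a single binary treatment, and a simplified confounder set; the study compared several treatment arms and also used matching and adjustment), the weights and the weighted Cox model could be obtained as follows:

```python
# Sketch: propensity score by logistic regression, inverse-probability-of-
# treatment weights, and an IPTW-weighted Cox model for intubation/death.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def iptw_hazard_ratio(df: pd.DataFrame, confounders: list[str]) -> float:
    """df needs: 'treated' (0/1), 'time_to_event' (days, up to the 21-day
    follow-up), 'event' (0/1), and the confounder columns."""
    ps = (LogisticRegression(max_iter=1000)
          .fit(df[confounders], df["treated"])
          .predict_proba(df[confounders])[:, 1])
    df = df.assign(iptw=df["treated"] / ps + (1 - df["treated"]) / (1 - ps))

    cph = CoxPHFitter()
    cph.fit(df[["time_to_event", "event", "treated", "iptw"]],
            duration_col="time_to_event", event_col="event",
            weights_col="iptw", robust=True)
    return cph.hazard_ratios_["treated"]
```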

    Clonal chromosomal mosaicism and loss of chromosome Y in elderly men increase vulnerability for SARS-CoV-2

    The pandemic caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2, COVID-19) had an estimated overall case fatality ratio of 1.38% (pre-vaccination), 53% higher in males and increasing exponentially with age. Among 9578 individuals diagnosed with COVID-19 in the SCOURGE study, we found 133 cases (1.42%) with detectable clonal mosaicism for chromosome alterations (mCA) and 226 males (5.08%) with acquired loss of chromosome Y (LOY). Individuals with clonal mosaic events (mCA and/or LOY) showed a 54% increase in the risk of COVID-19 lethality. LOY is associated with transcriptomic biomarkers of immune dysfunction, pro-coagulation activity, and cardiovascular risk. Interferon-induced genes involved in the initial immune response to SARS-CoV-2 are also down-regulated in LOY. Thus, mCA and LOY underlie at least part of the sex-biased severity and mortality of COVID-19 in ageing patients. Given its potential therapeutic and prognostic relevance, the evaluation of clonal mosaicism should be implemented as a biomarker of COVID-19 severity in elderly people.