
    A Filtering technique based on a DLMS algorithm for ultrasonography video

    Ultrasonography is a diagnostic method for visualizing the inside of human tissues by emitting ultrasound waves and measuring their return time to the sensor. However, the interface between the human skin and the ultrasound transducer attenuates the received signal, and the medical image quality deteriorates significantly. In this paper we propose a filtering technique to compensate for this attenuation. A finite impulse response (FIR) filter based on a Delayed Least Mean Square (DLMS) algorithm was optimized and implemented. The main contribution of our work consists of finding the order and the coefficients of the filter that minimize the attenuation error. We validate our method first on simulated data and later on a reprogrammable FPGA device for real-time performance testing. Among other results, we show that increasing the order of the filter is not always the best way to reduce image quality errors.
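
    The core of the DLMS approach is an LMS coefficient update that uses the error from D samples in the past, which eases pipelined hardware implementations. A minimal NumPy sketch of such an adaptive FIR filter is given below; the signal names, step size and delay are illustrative assumptions, not values from the paper.

        import numpy as np

        def dlms_fir(x, d, order=16, mu=0.01, delay=4):
            """Adapt an FIR filter with the Delayed LMS (DLMS) rule.

            x     : 1-D NumPy array of input samples (e.g. the attenuated echo signal)
            d     : 1-D NumPy array of desired/reference samples, same length as x
            order : number of FIR taps
            mu    : adaptation step size
            delay : D, the update uses the error from D samples ago
            """
            w = np.zeros(order)            # filter coefficients
            y = np.zeros(len(x))           # filter output
            e = np.zeros(len(x))           # instantaneous error
            for n in range(order - 1, len(x)):
                u = x[n - order + 1:n + 1][::-1]   # most recent 'order' inputs
                y[n] = w @ u
                e[n] = d[n] - y[n]
                m = n - delay                       # DLMS: delayed error and regressor
                if m >= order - 1:
                    u_d = x[m - order + 1:m + 1][::-1]
                    w += mu * e[m] * u_d
            return w, y, e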

    A Sensor Network with Embedded Data Processing and Data-to-Cloud Capabilities for Vibration-Based Real-Time SHM

    This work describes a network of low-power/low-cost microelectromechanical- (MEMS-) based three-axial acceleration sensors with local data processing and data-to-cloud capabilities. In particular, the developed sensor nodes are capable of acquiring acceleration time series and extracting their frequency spectrum peaks, which are autonomously sent through an ad hoc developed gateway device to an online database using a dedicated transfer protocol. The developed network minimizes power consumption while monitoring the acceleration spectrum peaks at each sensor node remotely and in real time. An experimental setup in which a network of 5 sensor nodes is used to monitor a simply supported steel beam in free vibration conditions is considered to test the performance of the implemented circuitry. The total weight and energy consumption of the entire network are, respectively, less than 50 g and 300 mW in continuous monitoring conditions. Results show a very good agreement between the measured natural vibration frequencies of the beam and the theoretical values estimated according to the classical closed-form formula. As such, the proposed monitoring network can be considered ideal for the SHM of civil structures like long-span bridges.
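
    Reporting only the spectrum peaks, rather than raw acceleration records, is what keeps the radio traffic and power budget low. A minimal Python sketch of the kind of on-node peak extraction described above is shown below; the number of peaks and the signal handling are illustrative assumptions, not the firmware described in the paper.

        import numpy as np

        def spectrum_peaks(acc, fs, n_peaks=3):
            """Return the n_peaks largest spectral peaks of an acceleration record.

            acc : 1-D NumPy array of acceleration samples
            fs  : sampling frequency in Hz
            """
            acc = acc - np.mean(acc)                      # remove the static offset
            spec = np.abs(np.fft.rfft(acc))               # one-sided amplitude spectrum
            freqs = np.fft.rfftfreq(len(acc), d=1.0 / fs)
            # local maxima: bins larger than both neighbours
            peaks = np.where((spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:]))[0] + 1
            top = peaks[np.argsort(spec[peaks])[::-1][:n_peaks]]
            # (frequency, amplitude) pairs, i.e. the payload a node would send to the gateway
            return sorted(zip(freqs[top], spec[top]))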

    Cystic echinococcosis in cattle dairy farms: spatial distribution and epidemiological dynamics

    A study monitoring cystic echinococcosis (CE) in adult dairy cattle from intensive livestock farms located in the municipality of Arborea (Sardinia, Italy) was carried out between 2012 and 2015. A retrospective study of veterinary reporting forms of post-mortem inspections in 10 different Italian slaughterhouses was also performed. In addition, data on the viability and molecular characterisation of hydatid cysts removed from parasitised organs in cattle were collected. A geographical information system (GIS) with data layers of the study area and the geo-referenced points of 160 cattle farms was constructed. CE was found in 21.9% (35/160) of the surveyed farms. The retrospective study revealed that 0.05% (13/23,656) of adult slaughtered animals (over one year of age) from Arborea had tested positive for CE. The results stratified per year showed the following CE prevalences: 0.09% (5/5673) in 2012; 0.02% (1/5682) in 2013; 0.08% (5/6261) in 2014; and 0.03% (2/6040) in 2015 (χ2 with 3 degrees of freedom = 3.81; P = 0.282). E. granulosus sensu stricto (formerly called G1 or sheep strain) was detected in all cysts subjected to molecular analysis. The GIS analysis showed that CE is fairly persistent across the Arborea territory where most of the cattle farms are located, while a small cluster of cases was located in the southeastern part of Arborea, close to districts where sheep farms are situated. The present survey reports the presence of CE in Sardinian intensive dairy cattle farms and suggests that the parasitic pressure of CE on the island continues to be very strong.
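
    The year-on-year comparison above is a standard chi-square test of homogeneity on a 2 x 4 contingency table (positive/negative animals per year). A short Python check of the reported statistic, using only the counts quoted in the abstract, might look as follows.

        from scipy.stats import chi2_contingency

        # Positive and negative slaughtered animals per year (2012-2015),
        # taken from the prevalences quoted above.
        positive = [5, 1, 5, 2]
        total = [5673, 5682, 6261, 6040]
        negative = [t - p for t, p in zip(total, positive)]

        chi2, p, dof, expected = chi2_contingency([positive, negative])
        print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.3f}")
        # -> chi2 = 3.81, dof = 3, P ≈ 0.28, matching the values reported above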

    A Toolchain Architecture for Condition Monitoring Using the Eclipse Arrowhead Framework

    Condition Monitoring is one of the most critical applications of the Internet of Things (IoT) within the context of Industry 4.0. Current deployments typically present interoperability and management issues, requiring human intervention throughout the systems' engineering process; in addition, the fragmentation of the IoT landscape and the adoption of poor architectural solutions often make it difficult to integrate third-party devices in a seamless way. In this paper, we tackle these issues by proposing a tool-driven architecture that supports heterogeneous sensor management through well-established interoperability solutions for the IoT domain, i.e., the Eclipse Arrowhead framework and the recent Web of Things (WoT) standard released by the W3C working group. We deploy the architecture in a real Structural Health Monitoring (SHM) scenario, which validates each developed tool and demonstrates the increased automation derived from their combined usage.
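
    In the W3C WoT model, each sensor exposes its capabilities through a JSON-LD Thing Description that other services can discover and consume. A minimal sketch of such a description for an SHM sensor node is given below; the node name, property and endpoint are hypothetical, and only the overall structure follows the WoT Thing Description specification, not the paper's actual deployment.

        import json

        # Hypothetical node, property and endpoint names; structure per the W3C WoT TD spec.
        thing_description = {
            "@context": "https://www.w3.org/2019/wot/td/v1",
            "title": "shm-accelerometer-node-01",
            "securityDefinitions": {"nosec_sc": {"scheme": "nosec"}},
            "security": ["nosec_sc"],
            "properties": {
                "spectrumPeaks": {
                    "type": "array",
                    "readOnly": True,
                    "forms": [
                        {"href": "http://node-01.local/properties/spectrumPeaks"}
                    ],
                }
            },
        }

        print(json.dumps(thing_description, indent=2))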

    An IoT Toolchain Architecture for Planning, Running and Managing a Complete Condition Monitoring Scenario

    Condition Monitoring (CM) is an extremely critical application of the Internet of Things (IoT) within Industry 4.0 and Smart City scenarios, especially following the recent energy crisis. CM aims to monitor the status of a physical appliance over time and in real time in order to react promptly when anomalies are detected, as well as to perform predictive maintenance tasks. Current deployments suffer from both interoperability and management issues at all phases of their engineering process, from design to deployment to management, often requiring human intervention. Furthermore, the fragmentation of the IoT landscape and the heterogeneity of IoT solutions hinder the seamless onboarding of legacy devices and systems. In this paper, we tackle these problems by first proposing an architecture for CM based on both abstraction layers and toolchains, i.e., automated pipelines of engineering tools aimed at supporting the engineering process. In particular, we introduce four different toolchains, each dedicated to a well-defined task (e.g., energy monitoring). This orthogonal separation of concerns aims to simplify both the understanding of a complex ecosystem and the accomplishment of independent tasks. We then illustrate our implementation of a complete CM system that follows said architecture as a real Structural Health Monitoring (SHM) pilot of the Arrowhead Tools project, describing in detail every single tool that we developed. We finally show how our pilot achieves the main objectives of the project: the reduction of engineering costs, the integration of legacy systems, and interoperability with IoT frameworks.
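
    The toolchain idea can be pictured as a small pipeline abstraction in which each engineering tool handles exactly one step and hands its output to the next. The Python sketch below only illustrates that separation of concerns; the stage names are invented and do not correspond to the tools developed in the paper.

        from typing import Any, Callable, List

        class Toolchain:
            """An automated pipeline of engineering tools (illustrative only)."""

            def __init__(self, name: str, tools: List[Callable[[Any], Any]]):
                self.name = name
                self.tools = tools

            def run(self, artefact: Any) -> Any:
                # Each tool enriches the engineering artefact and passes it on.
                for tool in self.tools:
                    artefact = tool(artefact)
                return artefact

        # Invented stages for a hypothetical "energy monitoring" toolchain
        def plan(node):    return {**node, "planned": True}
        def deploy(node):  return {**node, "deployed": True}
        def monitor(node): return {**node, "metric": "energy"}

        energy_chain = Toolchain("energy-monitoring", [plan, deploy, monitor])
        print(energy_chain.run({"id": "node-01"}))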

    The baseline comorbidity burden affects survival in elderly patients with acute myeloid leukemia receiving hypomethylating agents: Results from a multicentric clinical study

    Background: In older patients with acute myeloid leukemia (AML), the definition of fitness, prognosis, and risk of death represents an open question. Methods: In the present study, we tested the impact on survival of disease- and patient-related parameters in a large cohort of elderly AML patients homogeneously assigned to treatment with hypomethylating agents (HMAs). Results: In 131 patients with a median age of 76 years, we confirmed that early response (p < 0.001) and biology-based risk classification (p = 0.003) can select patients with better-predicted survival. However, a fully disease-oriented model had limitations in stratifying our patients, prompting us to investigate the impact of baseline comorbidities on overall survival based on a comorbidity score. The albumin level (p = 0.001) and the presence of lung disease (p = 0.013) had a single-variable impact on prognosis. The baseline comorbidity burden was a powerful predictor of patients' frailty, correlating with an increased incidence of adverse events, especially infections, and predicted overall survival (p < 0.001). Conclusion: The comorbidity burden may affect prognosis in addition to disease biology. While the therapeutic armamentarium for elderly AML is improving, a comprehensive approach that combines AML biology with interventions tailored to patients' frailty is likely to fully exploit the anti-leukemia potential of novel drugs.

    Towards a muon collider

    A muon collider would enable the big jump ahead in energy reach that is needed for a fruitful exploration of fundamental interactions. The challenges of producing muon collisions at high luminosity and 10 TeV centre-of-mass energy are being investigated by the recently formed International Muon Collider Collaboration. This Review summarises the status of and recent advances in muon collider design, physics and detector studies. The aim is to provide a global perspective of the field and to outline directions for future work.
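
    One way to appreciate the luminosity challenge is the finite muon lifetime, which is not stated in the abstract but is standard physics: even at collider energies, the beams decay on a sub-second time scale. A back-of-the-envelope sketch, assuming a beam energy of half the 10 TeV centre-of-mass energy and textbook constants, is shown below.

        # Rough time-dilation estimate for a 10 TeV muon collider (5 TeV per beam).
        # Constants are standard textbook values, not figures from the Review.
        m_mu = 0.10566      # muon mass in GeV
        tau_mu = 2.197e-6   # muon proper lifetime in seconds
        e_beam = 5000.0     # beam energy in GeV

        gamma = e_beam / m_mu
        print(f"Lorentz factor     ~ {gamma:,.0f}")                  # ~47,000
        print(f"lab-frame lifetime ~ {gamma * tau_mu * 1e3:.0f} ms") # ~100 ms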

    Erratum: Towards a muon collider


    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH (RELISH) consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) articles. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for downloading the annotation data and for the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of powerful new techniques for title- and title/abstract-based search engines for relevant articles in biomedical research.
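
    As a flavour of what the TF-IDF baseline above does, the sketch below ranks a few toy title strings against a seed article by cosine similarity of their TF-IDF vectors; the texts are invented and are not RELISH data, and the real baselines operate on full PubMed title/abstract records.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        # Invented stand-ins for PubMed title/abstract records (not RELISH data)
        seed = "adaptive filtering for ultrasound image quality improvement"
        candidates = [
            "least mean square adaptive filters in medical ultrasound imaging",
            "vibration based structural health monitoring of long span bridges",
            "benchmarking document similarity for biomedical literature search",
        ]

        vectorizer = TfidfVectorizer(stop_words="english")
        matrix = vectorizer.fit_transform([seed] + candidates)

        # Rank candidate documents by cosine similarity to the seed article
        scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
        for score, text in sorted(zip(scores, candidates), reverse=True):
            print(f"{score:.3f}  {text}")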