
    Trends in monkeypox research: A sixty year bibliometric analysis

    Aim: The purpose of the study was to identify common descriptors and publication hotspots that may form reference themes for future monkeypox research. Method: A bibliometric analysis of monkeypox-related studies published between 1962 and 2022 was carried out to identify and describe this body of literature. Results and conclusion: A total of 1,134 documents were analysed for bibliometric indicators. The studies had 3,478 authors, an average of 5.72 co-authors per publication and a collaboration index of 3.73. Annual scientific production peaked in 2004 (5.5%) and 2020 (5.3%). Monkeypox research accumulated 128 grants, 68 policy documents, 9 clinical trials, and 50 patents. The United States placed first in number of documents and citations, followed by Germany with 73 documents, the United Kingdom with 53, and Russia and the Democratic Republic of the Congo (DRC) with 34 each; the DRC and Nigeria had the most documents among African countries. Text mining showed that researchers have concentrated on infectious-disease ‘epidemiology’: the ‘emergence’, ‘case diagnosis’ and ‘surveillance’ of ‘outbreaks’. The top keywords were ‘monkeypox’ (570 times), ‘monkeypox virus’ (411 times), ‘poxviridae infections’ (332 times), ‘smallpox’ (266 times), ‘orthopoxvirus’ (248 times), ‘vaccinia virus’ (203 times), and ‘disease outbreaks’ (179 times). The most cited treatment-related noun phrases were ‘tecovirimat’, ‘cidofovir’/‘CMX001’ (brincidofovir), ‘ACAM2000’, and vaccinia immune globulin (‘VIG’). These results will serve as a foundation for future work, guiding decision-making in monkeypox research and therapy.
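    As a quick illustration of the co-authorship indicators reported above, the sketch below computes the average number of co-authors per document and a collaboration index from per-document author counts. It assumes the common definition of the collaboration index (total authors of multi-authored documents divided by the number of multi-authored documents); the study itself may use a different formula, and the example data are hypothetical.

```python
# Illustrative sketch, not the study's code: co-authorship indicators from a
# list of per-document author counts.
def coauthorship_metrics(authors_per_doc):
    """authors_per_doc: list of author counts, one entry per document."""
    n_docs = len(authors_per_doc)
    avg_coauthors = sum(authors_per_doc) / n_docs
    # Collaboration index: assumed definition (authors of multi-authored
    # documents / number of multi-authored documents).
    multi = [a for a in authors_per_doc if a > 1]
    collaboration_index = sum(multi) / len(multi) if multi else 0.0
    return avg_coauthors, collaboration_index

# Hypothetical example: three documents with 1, 4 and 7 authors.
print(coauthorship_metrics([1, 4, 7]))  # (4.0, 5.5)
```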

    Multiple transmission optimization of medical images in resource-constrained mobile telemedicine systems

    Background and objective: In state-of-the-art image transmission methods, multiple large medical images are usually transmitted one by one, which is very inefficient. The objective of our study is to devise an effective and efficient multiple-transmission optimization scheme for medical images, called Mto, by analyzing the visual content of the images based on the characteristics of a resource-constrained mobile telemedicine system (MTS) and of the medical images. Methods: To facilitate efficient Mto processing, two enabling techniques are developed: 1) an NIB grouping scheme, and 2) adaptive RIB replica selection. Given a set of transmission images (Ω), the correlation of these images is first explored and the pixel resolutions of the corresponding MIBs are kept high; the NIBs are grouped into k clusters based on visual similarity, from which the k RIBs are obtained. An optimal pixel resolution for the RIBs is derived based on the current network bandwidth, their corresponding areas, and related factors. Then, the candidate MIBs and the k RIBs are transmitted to the receiver node according to their transmission priorities. Finally, the IBs are reconstructed and displayed at the receiver node for different users. Results: The experimental results show that our approach is about 45% more efficient than the state-of-the-art methods, significantly reducing the response time by decreasing the network communication cost while improving the transmission throughput. Conclusions: Our proposed Mto method can be seamlessly applied in a resource-constrained MTS environment, in which high transmission efficiency and acceptable image quality can be guaranteed. Keywords: Medical image; Multi-resolution; Mobile telemedicine system; Batch transmission
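    The sketch below shows one plausible reading of the block-grouping step: image blocks are clustered by visual similarity (here, grey-level histograms) into k groups, and the block closest to each cluster centre is kept as the group's representative (RIB). The features, distance measure, clustering algorithm and function names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch: cluster image blocks by histogram similarity and pick one
# representative block per cluster. Assumes 8-bit grey-level blocks.
import numpy as np
from sklearn.cluster import KMeans

def select_representative_blocks(blocks, k):
    """blocks: list of 2-D uint8 arrays (image blocks). Returns indices of representatives."""
    feats = np.stack([
        np.histogram(b, bins=32, range=(0, 256), density=True)[0] for b in blocks
    ])
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(feats)
    rib_indices = []
    for c in range(k):
        members = np.where(km.labels_ == c)[0]
        # Representative = member closest to the cluster centre.
        dists = np.linalg.norm(feats[members] - km.cluster_centers_[c], axis=1)
        rib_indices.append(int(members[np.argmin(dists)]))
    return rib_indices
```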

    Computer-aided diagnosis through medical image retrieval in radiology.

    Currently, radiologists face an excessive workload, which leads to high levels of fatigue and, consequently, to undesired diagnostic mistakes. Decision support systems can be used to prioritize cases and help radiologists make quicker decisions. In this sense, medical content-based image retrieval systems can be of great utility by providing well-curated similar examples. Nonetheless, most medical content-based image retrieval systems work by finding the most similar image overall, which is not equivalent to finding the most similar image in terms of disease and its severity. Here, we propose an interpretability-driven and an attention-driven medical image retrieval system. We conducted experiments on a large and publicly available dataset of chest radiographs with structured labels derived from free-text radiology reports (MIMIC-CXR-JPG). We evaluated the methods on two common conditions: pleural effusion and (potential) pneumonia. As ground truth for the evaluation, query/test and catalogue images were classified and ordered by an experienced board-certified radiologist. For a thorough and complete evaluation, additional radiologists also provided their rankings, which allowed us to estimate inter-rater variability and derive qualitative performance levels. Based on our ground-truth ranking, we also quantitatively evaluated the proposed approaches by computing the normalized Discounted Cumulative Gain (nDCG). We found that the interpretability-guided approach outperforms the other state-of-the-art approaches and shows the best agreement with the most experienced radiologist. Furthermore, its performance lies within the observed inter-rater variability.
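    For reference, the sketch below computes nDCG for a retrieval ranking scored against graded relevance labels such as a radiologist's ordering. It uses the standard gain and log-discount formulation; the paper may use a variant, and the example relevance values are hypothetical.

```python
# Sketch of normalized Discounted Cumulative Gain (nDCG) for a ranked list of
# retrieved items with graded relevance labels.
import math

def dcg(relevances):
    return sum((2 ** r - 1) / math.log2(i + 2) for i, r in enumerate(relevances))

def ndcg(retrieved_relevances, k=None):
    rel = retrieved_relevances[:k] if k else retrieved_relevances
    ideal = sorted(retrieved_relevances, reverse=True)[:len(rel)]
    return dcg(rel) / dcg(ideal) if dcg(ideal) > 0 else 0.0

# Hypothetical example: graded relevance (0-3) of the top-5 retrieved images.
print(round(ndcg([3, 2, 3, 0, 1], k=5), 3))
```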

    Mobile-cloud assisted video summarization framework for efficient management of remote sensing data generated by wireless capsule sensors

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially for remote-monitoring health services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. To facilitate efficient WCE data collection and browsing, we present a resource- and bandwidth-aware WCE video summarization framework that extracts representative keyframes of the WCE video content by removing redundant and non-informative frames. For redundancy elimination, we use the Jeffrey divergence between color histograms and an inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist classification with an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks on its own. To resolve these computational challenges, a mobile-cloud architecture is incorporated, which provides resizable computing capacity by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data. Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2013R1A1A2012904).
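    As a minimal sketch of the redundancy check mentioned above, the code below computes the (symmetric) Jeffrey divergence between the intensity histograms of two consecutive frames and flags them as near-duplicates when the divergence falls below a threshold. The histogram binning, the single-channel treatment and the threshold value are illustrative assumptions, not the paper's settings.

```python
# Jeffrey (symmetrized Kullback-Leibler) divergence between frame histograms.
import numpy as np

def jeffrey_divergence(p, q, eps=1e-10):
    p = p / p.sum() + eps   # normalize and avoid log(0)
    q = q / q.sum() + eps
    return float(np.sum((p - q) * np.log(p / q)))

def is_redundant(frame_a, frame_b, bins=64, threshold=0.05):
    """Treat two frames as redundant when their histograms are nearly identical."""
    ha = np.histogram(frame_a, bins=bins, range=(0, 256))[0].astype(float)
    hb = np.histogram(frame_b, bins=bins, range=(0, 256))[0].astype(float)
    return jeffrey_divergence(ha, hb) < threshold
```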

    Efficient and robust large medical image retrieval in mobile cloud computing environment

    This paper presents an efficient and robust content-based large medical image retrieval method for the mobile cloud computing environment, called Mirc. The query process of Mirc consists of three steps. First, when a clinical user submits a query image Iq, a parallel image set reduction process is conducted at a master node. Then the candidate images are transferred to the slave nodes for a refinement process that produces the answer set. The answer set is finally transferred to the query node. The proposed method, including a priority-based robust image block transmission scheme, is specifically designed to address the instability and heterogeneity of the mobile cloud environment, and an index-supported image set reduction algorithm is introduced to reduce the data transfer cost involved. We also propose a content-aware and bandwidth-conscious multi-resolution-based image data replica selection method and a correlated data caching algorithm to further improve query performance. The experimental results show that our approach is both efficient and effective, minimizing the response time by decreasing the network transfer cost while increasing the parallelism of I/O and CPU.
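    The sketch below illustrates one simple form a bandwidth-conscious replica choice could take: given several pre-computed resolutions of the same image and the currently measured bandwidth, pick the highest-quality replica whose estimated transfer time still fits a response-time budget. The names, the size model and the fallback rule are assumptions for illustration, not the Mirc implementation.

```python
# Illustrative replica selection under a response-time budget.
def choose_replica(replicas, bandwidth_bytes_per_s, time_budget_s):
    """replicas: list of (resolution_label, size_in_bytes). Returns a label."""
    # Try replicas from largest (highest quality) to smallest.
    for label, size in sorted(replicas, key=lambda r: r[1], reverse=True):
        if size / bandwidth_bytes_per_s <= time_budget_s:
            return label
    # Nothing fits the budget: fall back to the smallest replica.
    return min(replicas, key=lambda r: r[1])[0]

# Hypothetical example: 2 MB, 512 kB and 128 kB versions over a 1 Mbit/s link.
print(choose_replica([("full", 2_000_000), ("half", 512_000), ("quarter", 128_000)],
                     bandwidth_bytes_per_s=125_000, time_budget_s=2.0))  # "quarter"
```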

    Need for speed: Achieving fast image processing in acute stroke care

    This thesis investigates the use of high-performance computing (HPC) techniques in developing imaging biomarkers to support the clinical workflow for acute stroke patients. In the first part of the thesis, we evaluate different HPC technologies and how they can be leveraged by image analysis applications used in acute stroke care. More specifically, Chapter 2 evaluates how computers with multiple computing devices can be used to accelerate medical imaging applications. Chapter 3 proposes a novel data compression technique that allows CT perfusion (CTP) images to be processed efficiently on GPUs; without it, the size of CTP datasets makes data transfers to computing devices time-consuming and therefore unsuitable in acute situations. Chapter 4 further evaluates the usefulness of the algorithm proposed in Chapter 3 with two applications: a double-threshold segmentation and a time-intensity profile similarity (TIPS) bilateral filter for reducing noise in CTP scans. Finally, Chapter 5 presents a cloud platform for deploying high-performance medical applications for acute stroke patients. In Part 2 of the thesis, Chapter 6 presents a convolutional neural network (CNN) for the detection and volumetric segmentation of subarachnoid hemorrhages (SAH) in non-contrast CT scans. Chapter 7 proposes another CNN-based method to quantify final infarct volumes in follow-up non-contrast CT scans of ischemic stroke patients.
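    To make the double-threshold segmentation mentioned above concrete, the sketch below keeps voxels whose values lie between a lower and an upper bound. It is a hedged CPU/NumPy illustration with hypothetical threshold values; the thesis runs such operations on the GPU and its exact parameters may differ.

```python
# Double-threshold segmentation: boolean mask of voxels inside a value band.
import numpy as np

def double_threshold(volume, low, high):
    """Return a boolean mask of voxels with low <= value <= high."""
    return (volume >= low) & (volume <= high)

# Hypothetical example: keep a soft-tissue-like value band in a random test volume.
vol = np.random.randint(-1000, 3000, size=(4, 64, 64))
mask = double_threshold(vol, low=0, high=100)
print(mask.mean())  # fraction of voxels inside the band
```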

    Performance assessment of real-time data management on wireless sensor networks

    Technological advances in recent years have enabled the maturity of Wireless Sensor Networks (WSNs), which aim at performing environmental monitoring and data collection. This sort of network is composed of hundreds, thousands or potentially even millions of tiny smart computers known as wireless sensor nodes, which may be battery powered and equipped with sensors, a radio transceiver, a Central Processing Unit (CPU) and some memory. However, due to their small size and the requirement for low-cost nodes, sensor node resources such as processing power, storage and especially energy are very limited. Once the sensors perform their measurements of the environment, the problem of storing and querying the data arises: the sensors have restricted storage capacity, and the ongoing interaction between sensors and environment produces huge amounts of data. Techniques for data storage and querying in WSNs can be based on either external or local storage. External storage, called the warehousing approach, is a centralized scheme in which the data gathered by the sensors are periodically sent to a central database server where user queries are processed. Local storage, on the other hand, called the distributed approach, exploits the computational capabilities of the sensors, which act as local databases; data may be stored both in a central database server and in the devices themselves, enabling queries against both.

    WSNs are used in a wide variety of applications that perform operations on the collected sensor data. For certain applications, such as real-time applications, the sensor data must closely reflect the current state of the targeted environment. However, the environment changes constantly while the data is collected at discrete moments in time. As such, the collected data has a temporal validity and, as time advances, becomes less accurate, until it no longer reflects the state of the environment. Such applications, for example in industrial automation and aviation, must query and analyze the data within a bounded time in order to make decisions and react efficiently. In this context, the design of efficient real-time data management solutions is necessary to deal with both time constraints and energy consumption.

    This thesis studies real-time data management techniques for WSNs. In particular, it focuses on the challenges of real-time data storage and querying in WSNs and on efficient real-time data management solutions. First, the main requirements of real-time data management are identified and the real-time data management solutions for WSNs available in the literature are reviewed. Secondly, in order to provide an energy-efficient real-time data management solution, the techniques used to manage data and queries in WSNs based on the distributed paradigm are studied in depth; many research works argue that the distributed approach is more energy-efficient than warehousing for managing data and queries in WSNs, and it can provide quasi-real-time query processing because the most current data is retrieved from the network. Thirdly, based on these two studies and considering the complexity of developing, testing, and debugging this kind of system, a model for a simulation framework of real-time database management on WSNs that uses the distributed approach, together with its implementation, is proposed. This will help explore real-time database techniques for WSNs before deployment, saving money and time; the model could also be improved by adding protocol simulation or by porting part of the simulator to another available simulator. To validate the model, a case study considering both real-time and energy constraints is discussed. Fourth, a new architecture that combines statistical modeling techniques with the distributed approach, together with a query processing algorithm to optimize real-time user query processing, is proposed. This combination enables a query processing algorithm based on admission control that uses the error tolerance and the probabilistic confidence interval as admission parameters. Experiments on real-world as well as synthetic data sets demonstrate that the proposed solution optimizes real-time query processing, saving more energy while meeting low-latency requirements. Funded by the Fundação para a Ciência e a Tecnologia.
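    As a rough sketch of the admission-control idea described above, the code below decides whether a query can be answered from a local statistical model of the sensed value, given the query's error tolerance and required confidence; if not, it falls back to querying the network. The Gaussian model, the z-value table and the function names are illustrative assumptions, not the thesis design.

```python
# Query admission control based on error tolerance and a confidence interval.
def admit_query(model_std, error_tolerance, confidence=0.95):
    """Return True if the model's confidence-interval half-width fits the tolerance."""
    # Two-sided z-values for common confidence levels (assumes a Gaussian model).
    z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}.get(confidence, 1.960)
    half_width = z * model_std
    return half_width <= error_tolerance

def answer_query(model_mean, model_std, error_tolerance, query_network):
    if admit_query(model_std, error_tolerance):
        return model_mean          # answered locally, no radio traffic
    return query_network()         # otherwise, sample the sensors over the network
```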