19,360 research outputs found

    Segmentation of ultrasound images of thyroid nodule for assisting fine needle aspiration cytology

    The incidence of thyroid nodules is very high and generally increases with age. A thyroid nodule may presage the emergence of thyroid cancer, and thyroid nodules can be completely cured if detected early. Fine needle aspiration cytology is a recognized method for the early diagnosis of thyroid nodules, but it still has some limitations, and ultrasound has become the first choice for auxiliary examination of thyroid nodular disease. If medical imaging technology and fine needle aspiration cytology could be combined, the diagnostic rate for thyroid nodules would improve significantly. However, the physical properties of ultrasound degrade image quality, which makes it difficult for physicians to recognize edges. Image segmentation based on graph theory is currently a research hotspot; normalized cut (Ncut) is a representative method, well suited to segmenting feature parts of medical images. However, solving the normalized cut is itself a problem: it requires large memory capacity and heavy computation of the weight matrix, and it often produces over- or under-segmentation, which leads to inaccurate results. Speckle noise in B-mode ultrasound images of thyroid tumors further deteriorates image quality. In light of this characteristic, we combine an anisotropic diffusion model with the normalized cut in this paper. The anisotropic diffusion enhancement removes noise from the B-mode ultrasound image while preserving important edges and local details. This reduces the amount of computation needed to construct the weight matrix of the improved normalized cut and improves the accuracy of the final segmentation results. The feasibility of the method is demonstrated by the experimental results. Comment: 15 pages, 13 figures
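
The anisotropic diffusion step the abstract describes is, in its classic form, a Perona-Malik iteration. The following is a minimal pure-Python sketch of that idea, not the paper's implementation; the parameter values (`kappa`, `gamma`, `n_iters`) are illustrative assumptions:

```python
import math

def anisotropic_diffusion(img, n_iters=10, kappa=20.0, gamma=0.2):
    """Perona-Malik diffusion on a 2D grayscale image (list of lists of floats)."""
    h, w = len(img), len(img[0])
    cur = [row[:] for row in img]
    for _ in range(n_iters):
        nxt = [row[:] for row in cur]
        for y in range(h):
            for x in range(w):
                c = cur[y][x]
                # gradients toward the four neighbours (replicated border)
                grads = (cur[max(y - 1, 0)][x] - c, cur[min(y + 1, h - 1)][x] - c,
                         cur[y][max(x - 1, 0)] - c, cur[y][min(x + 1, w - 1)] - c)
                # conduction g(s) = exp(-(s/kappa)^2): near 1 in flat regions,
                # near 0 across strong edges, so edges are preserved
                nxt[y][x] = c + gamma * sum(g * math.exp(-(g / kappa) ** 2) for g in grads)
        cur = nxt
    return cur
```

Small gradients (speckle noise) diffuse freely, while a large step in intensity (a nodule boundary) passes almost no flux, which is exactly the noise-removal-with-edge-preservation behaviour the paper relies on before building the Ncut weight matrix.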

    Multisensor Fusion Remote Sensing Technology For Assessing Multitemporal Responses In Ecohydrological Systems

    Earth's ecosystems and environment have been changing rapidly with human development and advancing technology, and the impacts of human activities are difficult to evaluate because of the pace of these changes. Remote sensing (RS) technology has therefore been adopted for environmental management: it is widely used to measure and monitor the earth environment and its changes, it allows large-scale measurements over an extensive region within a very short period of time, and continuous, repeatable measurements are among its indispensable features. Soil moisture is a critical element in the hydrological cycle, especially in semiarid or arid regions. Comprehending the soil moisture distribution of a vast watershed contiguously from point measurements is difficult because soil moisture patterns may vary greatly both temporally and spatially. Space-borne radar imaging satellites have become popular because of their all-weather observation capability, yet methods for estimating soil moisture from active or passive satellite imagery remain uncertain. This study presents a systematic soil moisture estimation method for the Choke Canyon Reservoir Watershed (CCRW), a semiarid watershed with an area of over 14,200 km2 in south Texas. With the aid of five corner reflectors, the RADARSAT-1 Synthetic Aperture Radar (SAR) imagery of the study area acquired in April and September 2004 was first processed by both radiometric and geometric calibrations. New soil moisture estimation models derived by the genetic programming (GP) technique were then developed and applied to support the soil moisture distribution analysis. The GP-based nonlinear function derived in the evolutionary process uniquely links a series of crucial topographic and geographic features.
Included in this process are slope, aspect, vegetation cover, and soil permeability to complement the well-calibrated SAR data. The research indicates that this novel application of GP proved useful for generating a highly nonlinear regression structure that exhibits statistically very strong correlations between the model estimates and the ground truth measurements (volumetric water content) on unseen data sets. Producing the soil moisture distributions over the seasons eventually leads to characterizing local- to regional-scale soil moisture variability and estimating the water storage of the terrestrial hydrosphere. A new evolutionary computational, supervised classification scheme (Riparian Classification Algorithm, RICAL) was developed and used to identify temporal and spatial change in the riparian zones of a semi-arid watershed. The case study uniquely demonstrates an effort to incorporate both vegetation indices and soil moisture estimates, based on Landsat 5 TM and RADARSAT-1 imagery, to improve riparian classification in the Choke Canyon Reservoir Watershed (CCRW), South Texas. The CCRW, mostly agricultural and range land in a semi-arid coastal environment, was selected as the study area contributing to the reservoir. This makes the change detection of riparian buffers significant, owing to their capability to intercept non-point source impacts within the riparian buffer zones and to maintain ecosystem integrity region wide. The soil moisture estimates based on RADARSAT-1 Synthetic Aperture Radar (SAR) satellite imagery, as previously developed, were used. Eight commonly used vegetation indices were calculated from the reflectance obtained from Landsat 5 TM satellite images. The vegetation indices were individually used to classify vegetation cover in association with a genetic programming algorithm.
The soil moisture and vegetation indices were integrated into the Landsat TM images on a per-pixel channel basis for riparian classification. Two different classification algorithms were used: genetic programming, and a combination of ISODATA and maximum likelihood supervised classification. The white-box nature of genetic programming revealed the comparative advantage of all input parameters. The GP algorithm yielded more than 90% accuracy, based on unseen ground data, using a vegetation index and Landsat reflectance bands 1, 2, 3, and 4. Detecting changes in the buffer zone proved technically feasible with high accuracy. Overall, the development of the RICAL algorithm may lead to more effective management strategies for non-point source pollution control, bird habitat monitoring, and grazing and livestock management in the future. Soil properties, landscapes, channels, fault lines, erosion/deposition patches, and bedload transport history reflect the geologic and geomorphologic features of a variety of watersheds. In response to these unique watershed characteristics, the hydrology of large-scale watersheds is often very complex. Precipitation, infiltration and percolation, stream flow, plant transpiration, soil moisture changes, and groundwater recharge are intimately related to one another, forming the water balance dynamics on the surface of these watersheds. This chapter depicts an optimal site selection technology that uses a grey integer programming (GIP) model to assimilate remote sensing-based geo-environmental patterns in an uncertain environment subject to technical and resource constraints. It enables us to retrieve the hydrological trends and pinpoint the most critical locations for deploying monitoring stations in a vast watershed.
Geo-environmental information amassed in this study includes soil permeability, surface temperature, soil moisture, precipitation, leaf area index (LAI), and normalized difference vegetation index (NDVI). With the aid of the remote sensing-based GIP analysis, only five locations out of more than 800 candidate sites were selected by the spatial analysis and then confirmed by a field investigation. The methodology developed in this remote sensing-based GIP analysis will significantly advance the state of the art in the optimal arrangement and distribution of water sensor platforms for maximum sensing coverage and information-extraction capacity. Effective water resources management is a critically important priority across the globe. While water scarcity limits the uses of water in many ways, floods have also caused great damage and loss of life. To use the limited amount of water more efficiently, and to provide adequate lead time for flood warning, we have been led to seek advanced techniques for improving streamflow forecasting. The objective of this section of the research is to incorporate sea surface temperature (SST), Next Generation Radar (NEXRAD), and meteorological characteristics with historical stream data to forecast actual streamflow using genetic programming. This case study concerns forecasting the stream discharge of a complex-terrain, semi-arid watershed. It elicits the microclimatological factors and the resultant stream flow rate in the river system, given the influence of dynamic basin features such as soil moisture, soil temperature, ambient relative humidity, air temperature, sea surface temperature, and precipitation. The forecasting results are evaluated in terms of the percentage error (PE), the root-mean-square error (RMSE), and the square of the Pearson product moment correlation coefficient (r-squared value).
The developed models can predict streamflow with very good accuracy, with an r-squared of 0.84 and a PE of 1% for a 30-day prediction.
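
The three evaluation measures named above can be written down directly. This is a minimal Python sketch, not the dissertation's code; in particular, it assumes PE is the relative error of the predicted totals, which is one common convention and not necessarily the exact definition used in this work:

```python
import math

def rmse(obs, pred):
    """Root-mean-square error between observations and predictions."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def percentage_error(obs, pred):
    """Assumed convention: relative error of the predicted total, in percent."""
    return 100.0 * abs(sum(pred) - sum(obs)) / sum(obs)

def r_squared(obs, pred):
    """Square of the Pearson product moment correlation coefficient."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    vo = sum((o - mo) ** 2 for o in obs)
    vp = sum((p - mp) ** 2 for p in pred)
    return cov * cov / (vo * vp)
```

Note that r-squared measures correlation only: a forecast that is uniformly biased can still score r-squared of 1, which is why RMSE and PE are reported alongside it.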

    Full Issue


    A study case of Dynamic Motion Primitives as a motion planning method to automate the work of forestry cranes

    Dynamic motion primitives (DMPs) are a motion planning method based on the concept of teaching a robot how to move from human demonstration. To this end, DMPs use a machine learning framework that tunes stable non-linear differential equations according to data sets from demonstrated motions; the numerical solutions of these differential equations then represent the desired motions. The purpose of this article is to present the steps to apply the DMP framework and to analyse its application for automating the motions of forestry cranes. Our study considers the example of a forwarder crane that has been equipped with sensors to record motion data while expert operators performed standard work in the forest. The objective of our motion planner is to automatically retract the logs back into the machine once the operator has grabbed them manually using the joysticks. The results show that the final motion planner is able to reproduce the demonstrated action with over 95% accuracy. In addition, it also has the versatility to plan motions and perform a similar action from other positions around the workspace, different from those used during the training stage. Thus, this initial study concludes that DMPs give the means to develop a new generation of dynamic motion planners for forestry cranes that readily allow merging the operator’s experience into the development process.
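
The "stable non-linear differential equations tuned from demonstrated motions" mentioned above can be made concrete with the standard discrete DMP formulation: a critically damped spring toward the goal plus a learned forcing term driven by a decaying phase variable. The following pure-Python sketch handles one degree of freedom; the gains, decay rate, and basis count are textbook-style assumptions, not values from this article:

```python
import math

class DMP:
    """Minimal one-dimensional discrete dynamic motion primitive."""

    def __init__(self, n_basis=20, alpha=25.0):
        self.alpha, self.beta = alpha, alpha / 4.0   # critically damped spring
        self.ax = 3.0                                # canonical-system decay rate
        # Gaussian basis centres spread evenly in phase (x) space
        self.c = [math.exp(-self.ax * i / (n_basis - 1)) for i in range(n_basis)]
        self.h = [1.0 / (self.c[i + 1] - self.c[i]) ** 2 for i in range(n_basis - 1)]
        self.h.append(self.h[-1])
        self.w = [0.0] * n_basis

    def _forcing(self, x, scale):
        psi = [math.exp(-h * (x - c) ** 2) for h, c in zip(self.h, self.c)]
        return scale * x * sum(p * w for p, w in zip(psi, self.w)) / (sum(psi) + 1e-10)

    def fit(self, demo, dt):
        """Fit forcing-term weights to one demonstrated trajectory (LWR)."""
        y0, g, n = demo[0], demo[-1], len(demo)
        yd = [(demo[min(i + 1, n - 1)] - demo[max(i - 1, 0)]) / (2 * dt) for i in range(n)]
        ydd = [(yd[min(i + 1, n - 1)] - yd[max(i - 1, 0)]) / (2 * dt) for i in range(n)]
        xs = [math.exp(-self.ax * i * dt) for i in range(n)]
        # forcing the demo would have needed: f = ydd - alpha*(beta*(g-y) - yd)
        f_target = [ydd[i] - self.alpha * (self.beta * (g - demo[i]) - yd[i]) for i in range(n)]
        scale = g - y0
        for j in range(len(self.w)):  # one weight per basis function
            num = den = 0.0
            for i in range(n):
                psi = math.exp(-self.h[j] * (xs[i] - self.c[j]) ** 2)
                s = xs[i] * scale
                num += psi * s * f_target[i]
                den += psi * s * s
            self.w[j] = num / (den + 1e-10)
        self.y0, self.g = y0, g

    def rollout(self, dt, T):
        """Integrate the learned system numerically (Euler) to reproduce the motion."""
        y, z, x, traj = self.y0, 0.0, 1.0, [self.y0]
        for _ in range(round(T / dt)):
            f = self._forcing(x, self.g - self.y0)
            z += dt * (self.alpha * (self.beta * (self.g - y) - z) + f)
            y += dt * z
            x += dt * (-self.ax * x)
            traj.append(y)
        return traj
```

Because the forcing term is scaled by `g - y0` and vanishes as the phase decays, the same fitted primitive can be replayed toward a different goal or from a different start, which is the "versatility from other positions" the study reports.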

    Analysis of Decision Support Systems of Industrial Relevance: Application Potential of Fuzzy and Grey Set Theories

    The present work articulates a few empirical case studies on decision making in an industrial context. The development of a variety of Decision Support Systems (DSS) under uncertainty and vague information is attempted herein. The study emphasizes five important decision making domains where effective decision making may surely enhance the overall performance of the organization. The focus areas of this work are i) robot selection, ii) g-resilient supplier selection, iii) third party logistics (3PL) service provider selection, iv) assessment of a supply chain’s g-resilient index, and v) risk assessment in e-commerce exercises. Firstly, decision support systems for robot selection are conceptualized by adapting fuzzy set theory in integration with the TODIM and PROMETHEE approaches; grey set theory is also found useful in this regard and is combined with the TODIM approach to identify the best robot alternative. In this work, an attempt is also made to tackle subjective (qualitative) and objective (quantitative) evaluation information simultaneously, towards effective decision making. Supplier selection is a key strategic concern for large-scale organizations. In view of this, a novel decision support framework is proposed to address g-resilient (green and resilient) supplier selection issues. Green capability of suppliers ensures pollution-free operation, while resiliency deals with unexpected system disruptions. A comparative analysis of the results is also carried out by applying well-known decision making approaches like Fuzzy-TOPSIS and Fuzzy-VIKOR. In relation to 3PL service provider selection, this dissertation proposes a novel ‘Dominance-Based’ model in combination with grey set theory to deal with 3PL provider selection, considering the linguistic preferences of the Decision-Makers (DMs). An empirical case study is articulated to demonstrate the application potential of the proposed model.
The results obtained thereof have been compared to those of the grey-TOPSIS approach. Another part of this dissertation provides an integrated framework to assess the g-resilient (ecosilient) performance of the supply chain of a case automotive company. The overall g-resilient supply chain performance is determined by computing a unique ecosilient (g-resilient) index. The concepts of the Fuzzy Performance Importance Index (FPII) along with the Degree of Similarity (DOS) (obtained from fuzzy set theory) are applied to rank different g-resilient criteria in accordance with their current status of performance. The study is further extended to analyze, and thereby mitigate, various risk factors (risk sources) involved in e-commerce exercises. A total of forty-eight major e-commerce risks are recognized and evaluated from a decision making perspective by utilizing knowledge acquired from fuzzy set theory. Risk is evaluated as the product of two risk quantifying parameters, viz. (i) likelihood of occurrence and (ii) impact. These two parameters are assessed in a subjective manner (linguistic human judgment) rather than through a probabilistic approach to risk analysis. The ‘crisp risk extent’ corresponding to each risk factor is figured out through the proposed fuzzy risk analysis approach. A risk factor possessing a high ‘crisp risk extent’ score is said to be more critical for the current problem context (toward e-commerce success). Risks are then categorized into different levels of severity (adverse consequences), i.e. negligible, minor, marginal, critical, and catastrophic. Amongst the forty-eight risk sources, the top five that are expected to adversely affect the company’s e-commerce performance are recognized through this categorization. The overall risk extent is determined by aggregating the individual risks (under the ‘critical’ level of severity) using a Fuzzy Inference System (FIS).
Interpretive Structural Modeling (ISM) is then used to obtain the structural relationships amongst the aforementioned five risk sources. An appropriate action requirement plan is also suggested to control and minimize the risks associated with e-commerce exercises.
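
As a concrete illustration of the MCDM building block used throughout these case studies, here is a minimal crisp TOPSIS ranking in Python. The decision matrix and weights below are made-up examples, and the fuzzy/grey extensions developed in the thesis are omitted; this is only the underlying closeness-to-ideal idea:

```python
import math

def topsis(matrix, weights, benefit):
    """Score alternatives by relative closeness to the ideal solution.

    matrix  -- rows are alternatives, columns are criteria
    weights -- one weight per criterion
    benefit -- True for a benefit criterion, False for a cost criterion
    """
    n_alt, n_crit = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # ideal (best) and anti-ideal (worst) value per criterion
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*V))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*V))]
    scores = []
    for row in V:
        d_pos = math.sqrt(sum((v, a) and (v - a) ** 2 for v, a in zip(row, ideal)))
        d_neg = math.sqrt(sum((v - a) ** 2 for v, a in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores
```

An alternative that attains the best value on every criterion gets a closeness of exactly 1; ranking the scores in descending order gives the recommended order of alternatives (robots, suppliers, or 3PL providers in the case studies above).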

    Chance-constrained optimization of demand response to price signals


    Integrated multiple sequence alignment

    Sammeth M. Integrated multiple sequence alignment. Bielefeld (Germany): Bielefeld University; 2005. The thesis presents enhancements for automated and manual multiple sequence alignment: existing alignment algorithms are made more easily accessible, and new algorithms are designed for difficult cases. Firstly, we introduce the QAlign framework, a graphical user interface for multiple sequence alignment. It comprises several state-of-the-art algorithms and makes their parameters accessible through convenient dialogs. An alignment viewer with guided editing functionality can also highlight or print regions of the alignment. Phylogenetic features are also provided, e.g., distance-based tree reconstruction methods, corrections for multiple substitutions, and a tree viewer. The modular concept and the platform-independent implementation guarantee easy extensibility. Further, we develop a constrained version of divide-and-conquer alignment that can be restricted by anchors found beforehand with local alignments. This method can be shown to share attributes of both local and global aligners, in the quality of results as well as in computation time. We further modify the local alignment step to work on bipartite (or even multipartite) sequence sets where repeats overshadow valuable sequence information. In the end, a technique is established that can accurately align sequences containing possibly repeated motifs. Finally, another algorithm is presented that allows comparing tandem repeat sequences by aligning them with respect to their possible repeat histories. We describe an evolutionary model including tandem duplications and excisions, and give an exact algorithm to compare two sequences under this model.
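
The global aligners such frameworks build on are variants of the classic Needleman-Wunsch dynamic program. A minimal pairwise version in Python looks like this (the scoring values are illustrative, not QAlign's defaults):

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Global pairwise alignment; returns (score, aligned_a, aligned_b)."""
    n, m = len(a), len(b)
    # F[i][j] = best score aligning a[:i] with b[:j]
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap
    for j in range(1, m + 1):
        F[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,   # (mis)match
                          F[i - 1][j] + gap,     # gap in b
                          F[i][j - 1] + gap)     # gap in a
    # traceback from the bottom-right corner
    ai, bi, i, j = [], [], n, m
    while i or j:
        if i and j and F[i][j] == F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch):
            ai.append(a[i - 1]); bi.append(b[j - 1]); i -= 1; j -= 1
        elif i and F[i][j] == F[i - 1][j] + gap:
            ai.append(a[i - 1]); bi.append('-'); i -= 1
        else:
            ai.append('-'); bi.append(b[j - 1]); j -= 1
    return F[n][m], ''.join(reversed(ai)), ''.join(reversed(bi))
```

The anchoring idea described in the thesis constrains exactly this kind of dynamic program: high-confidence local matches fix parts of the path through the matrix, so only the regions between anchors need the full quadratic search.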

    Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function

    A new algorithm for the detection of longitudinal crack-like indications in radiographic images is developed in this work. Conventional local detection techniques give unsatisfactory results for this task due to the low signal-to-noise ratio (SNR ~ 1) of crack-like indications in radiographic images. Using global features of crack-like indications provides the necessary noise resistance, but it comes with prohibitive computational complexity of detection and difficulties in formally describing the indication shape. Conventionally, this excessive computational complexity is reduced by the use of heuristics. The heuristics are selected on a trial-and-error basis, are problem dependent, and do not guarantee an optimal solution. A distinctive feature of the algorithm developed here is that it avoids this path entirely. Instead, a global characteristic of a crack-like indication (the estimation function) is used, whose maximum over the space of all possible positions, lengths, and shapes can be found exactly, i.e. without any heuristics. The proposed estimation function is defined as the sum of the a posteriori information gains about the hypothesis of indication presence at each point along the whole hypothetical indication. The gain in information about the hypothesis of indication presence results from the analysis of the underlying image in the local area. Such an estimation function is theoretically justified and exhibits the desired behaviour on changing signals. The developed algorithm is implemented in the C++ programming language and tested on synthetic as well as real images. It delivers good results (a high correct detection rate at a given false alarm rate) that are comparable to the performance of trained human inspectors.
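
The exact global search can be illustrated, in a much-simplified form, by a dynamic program: if each pixel carries a local information gain (e.g. a log-likelihood ratio for "crack present here"), then the left-to-right path with a bounded vertical step whose summed gain is maximal can be found exactly, with no heuristics. This pure-Python sketch fixes the indication to span the full image width, a simplification the actual algorithm does not make:

```python
def best_path(gain, max_step=1):
    """Exactly find the left-to-right path maximizing the summed per-pixel gain.

    gain     -- 2D grid (list of lists) of local information gains
    max_step -- maximum vertical step between adjacent columns (path smoothness)
    Returns (total_gain, path) where path[x] is the row chosen in column x.
    """
    h, w = len(gain), len(gain[0])
    score = [row[:] for row in gain]          # column 0: score = gain
    back = [[0] * w for _ in range(h)]        # back-pointers for the traceback
    for x in range(1, w):
        for y in range(h):
            best_prev = None
            for dy in range(-max_step, max_step + 1):
                py = y + dy
                if 0 <= py < h and (best_prev is None or score[py][x - 1] > best_prev):
                    best_prev, back[y][x] = score[py][x - 1], py
            score[y][x] = gain[y][x] + best_prev
    # the global maximum over all end rows, then walk the back-pointers
    y = max(range(h), key=lambda r: score[r][w - 1])
    total, path = score[y][w - 1], [y]
    for x in range(w - 1, 0, -1):
        y = back[y][x]
        path.append(y)
    return total, path[::-1]
```

The cost is O(h·w·max_step), so the optimum over every admissible shape is obtained exactly rather than approximated, which mirrors the paper's central claim, albeit for a restricted shape space.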

    A novel combination of Case-Based Reasoning and Multi Criteria Decision Making approach to radiotherapy dose planning

    In this thesis, a set of novel approaches has been developed by integrating Case-Based Reasoning (CBR) and Multi-Criteria Decision Making (MCDM) techniques. Its purpose is to design a support system to assist oncologists with decision making about dose planning for radiotherapy treatment, with a focus on radiotherapy for prostate cancer. CBR, an artificial intelligence approach, is a general paradigm for reasoning from past experience. It retrieves previous cases similar to a new case and exploits the successful past solutions to provide a suggested solution for the new case. The case pool used in this research is a dataset consisting of features and details related to successfully treated patients at Nottingham University Hospital. In a typical run of simple CBR for prostate cancer radiotherapy, a new case is selected, and thereafter, based on the features available in our data set, the case most similar to the new case is retrieved and its solution is prescribed to the new case. However, there are a number of deficiencies associated with this approach. Firstly, in a real-life scenario, the medical team considers multiple factors rather than just the similarity between two cases, and the most similar case does not always provide the most appropriate solution. Thus, in this thesis, the cases with high similarity to a new case have been evaluated with the application of the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). This approach takes into account multiple criteria besides similarity to prescribe a final solution. Moreover, the obtained dose plans were optimised through a goal programming mathematical model to improve the results. By incorporating oncologists’ experience of violating the conventionally available dose limits, a system was devised to manage the trade-off between the treatment risk for sensitive organs and the actions necessary to effectively eradicate the cancer cells.
Additionally, the success rate of the treatment, the two-year cancer-free probability, has a vital role in the efficiency of the prescribed solutions. To consider the success rate, as well as the uncertainty involved in human judgment about the values of different radiotherapy features, Data Envelopment Analysis (DEA) based on grey numbers was used to assess the efficiency of different treatment plans in an input-output based approach. To deal with the limitations of DEA regarding the number of inputs and outputs, we presented an approach for Factor Analysis based on Principal Components to utilize the grey numbers. To improve the CBR base of the system, we applied Grey Relational Analysis and Gaussian-distance-based CBR, along with feature weight selection through a Genetic Algorithm, to better handle the non-linearity that exists within the problem features and the high number of features. Finally, the efficiency of each system has been validated through a leave-one-out strategy on the real dataset. The results demonstrated the efficiency of the proposed approaches and the capability of the system to assist the medical planning team. Furthermore, the integrated approaches developed within this thesis can also be applied to solve other real-life problems in domains other than healthcare, such as supply chain management, manufacturing, business success prediction, and performance evaluation.
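
The retrieval step of the underlying CBR cycle can be sketched as a weighted nearest-neighbour lookup with a Gaussian distance-based similarity, as mentioned above. The feature vectors, weights, and the "dose" field in this Python sketch are illustrative assumptions, not the thesis's actual case representation:

```python
import math

def gaussian_similarity(a, b, weights, sigma=1.0):
    """Distance-based similarity: 1.0 for identical cases, decaying with distance."""
    d2 = sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def retrieve(case_base, query, weights, k=3):
    """Return the k past cases most similar to the query feature vector."""
    ranked = sorted(case_base,
                    key=lambda c: gaussian_similarity(c["features"], query, weights),
                    reverse=True)
    return ranked[:k]
```

In the thesis's pipeline the per-feature weights would come from the Genetic Algorithm's feature-weight selection, and the retrieved top-k cases would then be re-ranked by TOPSIS on multiple criteria rather than prescribed directly.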