97 research outputs found

    Biomedical Knowledge Engineering Using a Computational Grid


    Optimising replenishment policy in an integrated supply chain with controllable lead time and backorders-lost sales mixture

    This paper aims to optimize the inventory replenishment policy in an integrated supply chain consisting of a single supplier and a single buyer. The system under consideration features a backorders-lost sales mixture, controllable lead time, stochastic demand, and stockout costs. The underlying problem has not previously been studied in the literature. We present a novel formulation of the optimization problem that satisfies a constraint on the number of admissible stockouts per time unit. To solve the optimization problem, we propose two algorithms: an exact algorithm and a heuristic. Both are built on analytical properties that we established by analysing the cost function in relation to the decision variables. The heuristic employs an approximation technique based on an ad hoc Taylor series expansion. Extensive numerical experiments demonstrate the effectiveness of the proposed algorithms.
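The cost structure described in the abstract can be sketched numerically. The following is a minimal illustration under invented parameters, not the paper's actual formulation: it assumes a textbook (Q, k) cost model in which a fraction beta of stockouts is backordered and the remainder is lost, and it optimises by coarse grid search rather than by the exact or Taylor-based heuristic algorithms the paper proposes.

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def loss(k):
    """Unit normal loss function E[(Z - k)^+]."""
    return phi(k) - k * (1 - Phi(k))

def annual_cost(Q, k, D=1000, A=50, h=2.0, L=0.25, sigma=60.0,
                beta=0.7, pi_b=15.0, pi_l=40.0):
    """Approximate annual cost of a (Q, k) policy with a backorders-lost
    sales mixture: a fraction beta of stockouts is backordered, the rest
    is lost.  All parameter values are illustrative."""
    s_L = sigma * math.sqrt(L)      # std dev of lead-time demand
    es = s_L * loss(k)              # expected shortage per cycle
    ordering = A * D / Q
    holding = h * (Q / 2 + k * s_L + (1 - beta) * es)
    shortage = (D / Q) * (beta * pi_b + (1 - beta) * pi_l) * es
    return ordering + holding + shortage

# coarse grid search over lot size Q and safety factor k
best = min(((annual_cost(Q, k / 10), Q, k / 10)
            for Q in range(50, 501, 10) for k in range(0, 41)),
           key=lambda t: t[0])
print(best)
```

An exact procedure or an ad hoc Taylor approximation of the loss term, as in the paper, would replace the grid search.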

    Efficient near-optimal procedures for some inventory models with backorders-lost sales mixture and controllable lead time, under continuous or periodic review

    This paper considers a number of inventory models with a backorders-lost sales mixture, stockout costs, and controllable lead time. The lead time is a linear function of the lot size and includes a constant term made up of several components; these lot-size-independent components are assumed to be controllable. Both single- and double-echelon inventory systems, under periodic or continuous review, are considered. To the authors' knowledge, these models have not previously been studied in the literature. The purpose of this paper is to analyse and optimize these novel inventory models. The optimization is carried out by means of heuristics that work on an ad hoc approximation of the cost functions. This permits the use of closed-form expressions that make the optimization procedure simpler and more readily applicable in practice than standard approaches. Finally, numerical experiments investigate the efficiency of the proposed heuristics and the sensitivity of the developed models.
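The lead-time structure described above, a linear function of the lot size plus constant, individually controllable components, can be sketched as follows. This is an illustrative model with invented parameters, not the paper's formulation: each constant component has a normal duration, a minimum duration, and a per-cycle crashing cost, and the best subset of components to crash is found by plain enumeration.

```python
import math
from itertools import combinations

def policy_cost(Q, crashed,
                p=0.01,                       # lead time per unit of lot size
                comps=((4.0, 0.5, 20.0),      # (normal, minimum, crash cost)
                       (3.0, 0.5, 35.0),
                       (2.0, 0.5, 50.0)),
                D=1000, A=50, h=2.0, sigma=60.0, k=1.64):
    """Cost of a continuous-review policy whose lead time is p*Q plus
    constant components; components whose index is in `crashed` are
    shortened to their minimum duration at a per-cycle crashing cost.
    All parameter values are illustrative."""
    L = p * Q + sum((c[1] if i in crashed else c[0])
                    for i, c in enumerate(comps))
    crash_cost = sum(comps[i][2] for i in crashed)
    safety_stock = k * sigma * math.sqrt(L)
    return A * D / Q + h * (Q / 2 + safety_stock) + (D / Q) * crash_cost

# enumerate every subset of crashable components over a grid of lot sizes
best = min(((policy_cost(Q, set(c)), Q, c)
            for Q in range(50, 401, 10)
            for r in range(4)
            for c in combinations(range(3), r)),
           key=lambda t: t[0])
print(best)
```

The closed-form heuristics in the paper would replace this exhaustive search with direct expressions for the near-optimal lot size.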

    Pattern matching in high energy physics by using neural network and genetic algorithm

    In this paper, two different approaches to extracting information from events in high-energy physics experiments are presented. The representations produced in such experiments are usually composed of spots, and the classical algorithms needed for data analysis are time-consuming. For this reason, the possibility of speeding up pattern recognition tasks through a soft-computing approach with parallel algorithms has been investigated. The first scheme is a two-layer neural network with forward connections; the second is an evolutionary algorithm with an elitist strategy and adaptive mutation and crossover probabilities. Both approaches were tested by analysing a set of images produced by an optical Ring Imaging Cherenkov (RICH) detector at CERN.
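The evolutionary scheme described above can be illustrated on a toy ring-finding task reminiscent of RICH images: spots lie on a circle, and a genetic algorithm with an elitist strategy and an adaptive mutation probability fits the ring's centre and radius. Every detail here (population size, rates, fitness function) is illustrative, not taken from the paper.

```python
import math
import random

random.seed(1)

# synthetic "event": 20 noisy spots on a ring of radius 3 centred at (5, 4)
TRUE_CX, TRUE_CY, TRUE_R = 5.0, 4.0, 3.0
spots = [(TRUE_CX + TRUE_R * math.cos(a) + random.gauss(0, 0.05),
          TRUE_CY + TRUE_R * math.sin(a) + random.gauss(0, 0.05))
         for a in (i * 2 * math.pi / 20 for i in range(20))]

def fitness(ind):
    """Negative squared deviation of spot distances from the candidate ring."""
    cx, cy, r = ind
    return -sum((math.hypot(x - cx, y - cy) - r) ** 2 for x, y in spots)

def evolve(pop_size=60, gens=80):
    pop = [[random.uniform(0, 10), random.uniform(0, 10), random.uniform(0.5, 6)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = [ind[:] for ind in pop[:4]]     # elitist strategy
        pm = min(0.5, 0.05 - fitness(pop[0]))   # adaptive mutation probability
        children = elite[:]
        while len(children) < pop_size:
            a, b = random.sample(pop[:20], 2)   # parents from the best 20
            cut = random.randrange(1, 3)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < pm:
                child[random.randrange(3)] += random.gauss(0, 0.3)
            children.append(child)
        pop = children
    return max(pop, key=fitness)

cx, cy, r = evolve()
print(cx, cy, r)
```

The mutation probability shrinks as the best fitness improves, a simple stand-in for the paper's adaptive mutation and crossover probabilities.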

    A bioinformatics knowledge discovery in text application for grid computing

    Background: A fundamental activity in biomedical research is knowledge discovery, which involves searching through large amounts of biomedical information such as documents and data. High-performance computational infrastructures, such as Grid technologies, are emerging as a possible means of tackling the intensive use of information and communication resources in the life sciences. The goal of this work was to develop a software middleware solution that allows knowledge discovery applications to run on scalable, distributed computing systems and thereby make intensive use of ICT resources.

    Methods: The development of a grid application for knowledge discovery in text, using a middleware-based methodology, is presented. The system must be able to represent a user application model and to process jobs by creating many parallel jobs for distribution across the computational nodes. Finally, the system must be aware of the available computational resources and their status, and must be able to monitor the execution of the parallel jobs. These operational requirements led to the design of a middleware to be specialised with user application modules. It includes a graphical user interface giving access to a node search system, a load-balancing system, and a transfer optimizer that reduces communication costs.

    Results: A middleware solution prototype and its performance evaluation in terms of the speed-up factor are presented. The prototype was written in Java on Globus Toolkit 4, with the grid infrastructure built from GNU/Linux compute nodes. A test was carried out on the named-entity recognition of symptoms and pathologies, applied to a collection of 5,000 scientific documents taken from PubMed.

    Conclusion: In this paper we discuss the development of a grid application based on a middleware solution. It has been tested on a knowledge-discovery-in-text process that extracts new and useful information about symptoms and pathologies from a large collection of unstructured scientific documents. As an example, a knowledge-discovery-in-database computation was applied to the output produced by the KDT user module to extract new knowledge about symptom and pathology bio-entities.
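The job-splitting idea described in the Methods, one recognition job per document, distributed over the available nodes, can be sketched with ordinary worker processes standing in for grid nodes. The dictionary-lookup "recogniser" below is a placeholder, not the system's actual named-entity recognition module, and no Globus API is involved.

```python
from multiprocessing import Pool

# toy term dictionaries (placeholders: the real system used trained
# named-entity recognition, not dictionary lookup)
SYMPTOMS = {"fever", "cough", "headache"}
PATHOLOGIES = {"influenza", "pneumonia"}

def recognise(doc):
    """One parallel job: tag symptom and pathology entities in one document."""
    words = {w.strip(".,;").lower() for w in doc.split()}
    return {"symptoms": sorted(words & SYMPTOMS),
            "pathologies": sorted(words & PATHOLOGIES)}

def run_grid(documents, workers=4):
    """The middleware's splitting step: one job per document, distributed
    over the available nodes (here, local worker processes)."""
    with Pool(workers) as pool:
        return pool.map(recognise, documents)

if __name__ == "__main__":
    docs = ["Patient reports fever and cough, consistent with influenza.",
            "No headache; pneumonia was excluded."]
    for result in run_grid(docs):
        print(result)
```

In the real system the map step is replaced by job submission to grid nodes, with the load balancer and transfer optimizer deciding where each job runs.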

    Comprehensive Brain Tumour Characterisation with VERDICT-MRI: Evaluation of Cellular and Vascular Measures Validated by Histology

    The aim of this work was to extend the VERDICT-MRI framework for modelling brain tumours, enabling comprehensive characterisation of both intra- and peritumoural areas with a particular focus on cellular and vascular features. Diffusion MRI data were acquired with multiple b-values (ranging from 50 to 3500 s/mm2), diffusion times, and echo times in 21 patients with brain tumours of different types and with a wide range of cellular and vascular features. We fitted a selection of diffusion models that resulted from the combination of different types of intracellular, extracellular, and vascular compartments to the signal. We compared the models using criteria for parsimony while aiming at good characterisation of all of the key histological brain tumour components. Finally, we evaluated the parameters of the best-performing model in the differentiation of tumour histotypes, using ADC (Apparent Diffusion Coefficient) as a clinical standard reference, and compared them to histopathology and relevant perfusion MRI metrics. The best-performing model for VERDICT in brain tumours was a three-compartment model accounting for anisotropically hindered and isotropically restricted diffusion and isotropic pseudo-diffusion. VERDICT metrics were compatible with the histological appearance of low-grade gliomas and metastases and reflected differences found by histopathology between multiple biopsy samples within tumours. The comparison between histotypes showed that both the intracellular and vascular fractions tended to be higher in tumours with high cellularity (glioblastoma and metastasis), and quantitative analysis showed a trend toward higher values of the intracellular fraction (fic) within the tumour core with increasing glioma grade. We also observed a trend towards a higher free water fraction in vasogenic oedemas around metastases compared to infiltrative oedemas around glioblastomas and WHO 3 gliomas as well as the periphery of low-grade gliomas. 
In conclusion, we developed and evaluated a multi-compartment diffusion MRI model for brain tumours based on the VERDICT framework, which showed agreement between non-invasive microstructural estimates and histology, and encouraging trends for the differentiation of tumour types and sub-regions.
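The compartment-summation idea behind such models can be sketched with a deliberately simplified signal equation: three volume fractions weighting three exponential decays with fixed diffusivities. The real VERDICT compartments use richer geometric models (restricted, hindered, and pseudo-diffusion terms), and the diffusivity values below are illustrative; a grid search then recovers the fractions from synthetic multi-b data.

```python
import math

def signal(b, f_ic, f_v, d_ic=0.4e-3, d_ees=1.2e-3, d_p=8.0e-3):
    """Simplified three-compartment signal (b in s/mm^2): intracellular,
    extracellular and vascular fractions weight three exponential decays.
    Diffusivity values are illustrative, not fitted constants."""
    f_ees = 1.0 - f_ic - f_v
    return (f_ic * math.exp(-b * d_ic)
            + f_ees * math.exp(-b * d_ees)
            + f_v * math.exp(-b * d_p))

# synthetic multi-b acquisition spanning the study's 50-3500 s/mm^2 range
bvals = [50, 500, 1000, 2000, 3500]
truth = {"f_ic": 0.55, "f_v": 0.10}
data = [signal(b, **truth) for b in bvals]

# grid-search fit of the two volume fractions (step 0.05, f_ees >= 0)
best = min(((sum((signal(b, fi / 100, fv / 100) - s) ** 2
                 for b, s in zip(bvals, data)), fi / 100, fv / 100)
            for fi in range(0, 101, 5) for fv in range(0, 101 - fi, 5)),
           key=lambda t: t[0])
print(best)
```

Because the true fractions lie on the search grid, the fit recovers them exactly here; real noisy data would require finer or gradient-based fitting across the full set of b-values, diffusion times, and echo times.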

    Geophysical monitoring of Stromboli volcano: insight into recent volcanic activity

    Stromboli is an open-conduit stratovolcano of the Aeolian archipelago (Italy), characterized by typical Strombolian explosive activity, lasting for several centuries, and by the emission of huge amounts of gas. The normal activity of Stromboli consists of some hundreds of moderate explosions per day. Major explosions, which launch scoria up to hundreds of meters from the craters, lava flows, and paroxysmal explosions, which produce large ballistic blocks, sometimes take place. During the effusive eruption in 2002-2003, which caused a tsunami with waves about 10 meters high along the coasts of the island, the monitoring system was enhanced. In 2006, INGV added two Sacks-Evertson borehole volumetric dilatometers to the surveillance system, in order to monitor changes in the local strain field by measuring areal strain. Today we have a large amount of geophysical data and observations that allow us to better understand how this volcano works. After a period of low explosive activity that started in mid-2014, Stromboli has shown more intense explosive activity in the last few months. During this recent phase of increased activity, the geophysical monitoring system detected four major explosions, on 26 July, 23 October, 1 November and 1 December 2017. The current phase of reawakening of Stromboli volcano has led the Italian civil protection authorities to declare the "attention" alert level (yellow) on the island. Published: Vienna, Austria.