
    Genetic improvement of GPU software

    We survey genetic improvement (GI) of general-purpose computing on graphics cards. We summarise several experiments which demonstrate four themes. Experiments with the gzip program show that genetic programming can automatically port sequential C code to parallel code. Experiments with the StereoCamera program show that GI can upgrade legacy parallel code for new hardware and software. Experiments with NiftyReg and BarraCUDA show that GI can make substantial improvements to current parallel CUDA applications. Finally, experiments with the pknotsRG program show that, with semi-automated approaches, enormous speed-ups can sometimes be achieved by growing and grafting new code with genetic programming in combination with human input.
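    As a rough illustration of the GI loop such work builds on, the sketch below evolves a population of code patches under a runtime-based fitness. `mutate`, `crossover`, and `fitness` are hypothetical problem-specific callables, not the authors' tooling.

```python
import random

def genetic_improvement(seed_patches, mutate, crossover, fitness,
                        generations=50, pop_size=20):
    """Minimal generational GI loop: evolve code patches that keep the
    test suite passing while reducing runtime. `fitness` returns a score
    where lower is better (e.g. runtime plus a large penalty for any
    failed test); `mutate` and `crossover` operate on patch objects."""
    population = list(seed_patches)[:pop_size]
    for _ in range(generations):
        ranked = sorted(population, key=fitness)
        parents = ranked[:pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        population = parents + children
    return min(population, key=fitness)
```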

    Multiobjective Reliable Cloud Storage with Its Particle Swarm Optimization Algorithm

    Information abounds in all fields of real life and is often recorded as digital data in computer systems, where it is treated as an increasingly important resource. Its rapid growth in volume causes great difficulties in both storage and analysis. Massive data storage in cloud environments has a significant impact on the quality of service (QoS) of such systems, which is becoming an increasingly challenging problem. In this paper, we propose a multiobjective optimization model for reliable data storage in clouds that considers both the cost and the reliability of the storage service simultaneously. In the proposed model, the total cost comprises storage space occupation cost, data migration cost, and communication cost. Based on an analysis of the storage process, transmission reliability, equipment stability, and software reliability are taken into account in the storage reliability evaluation. To solve the proposed multiobjective model, a Constrained Multiobjective Particle Swarm Optimization (CMPSO) algorithm is designed. Finally, experiments are designed to validate the proposed model and its PSO-based solution algorithm. In the experiments, the proposed model is tested with three storage strategies. Experimental results show that the proposed model is effective, and that it performs much better when combined with proper file-splitting methods.
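    The paper's CMPSO algorithm (with its constraint handling and Pareto machinery) is not reproduced here; the minimal sketch below is a generic PSO over a weighted-sum scalarization of cost and unreliability, conveying only the core mechanics. The objective and all constants are illustrative assumptions.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, bounds=(0.0, 1.0), seed=0):
    """Generic particle swarm optimizer (minimization) with standard
    constriction coefficients; returns the best position and score."""
    lo, hi = bounds
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.72 * v + 1.49 * r1 * (pbest - x) + 1.49 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

def cost_plus_unreliability(x, w=0.5):
    """Illustrative scalarized objective: stand-ins for the paper's cost
    terms (occupation, migration, communication) and reliability model."""
    cost = x.sum()
    reliability = float(np.prod(1.0 - 0.5 * x))
    return w * cost + (1 - w) * (1.0 - reliability)
```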

    Cheetah: a computational toolkit for cybergenetic control

    Advances in microscopy, microfluidics, and optogenetics enable single-cell monitoring and environmental regulation and offer the means to control cellular phenotypes. The development of such systems is challenging and often results in bespoke setups that hinder reproducibility. To address this, we introduce Cheetah, a flexible computational toolkit that simplifies the integration of real-time microscopy analysis with algorithms for cellular control. Central to the platform is an image segmentation system based on the versatile U-Net convolutional neural network. This is supplemented with functionality to robustly count, characterize, and control cells over time. We demonstrate Cheetah’s core capabilities by analyzing long-term bacterial and mammalian cell growth and by dynamically controlling protein expression in mammalian cells. In all cases, Cheetah’s segmentation accuracy exceeds that of a commonly used thresholding-based method, allowing for more accurate control signals to be generated. Availability of this easy-to-use platform will make control engineering techniques more accessible and offer new ways to probe and manipulate living cells.
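    Cheetah's actual API is not shown in the abstract; the following is a hypothetical sketch of the segment-count-control pattern it describes, with `segment` standing in for a U-Net based segmentation call.

```python
import numpy as np

def control_step(frame, segment, target, gain=0.1):
    """One iteration of a segmentation-driven control loop in the style
    the abstract describes: segment cells (e.g. with a U-Net), measure
    mean fluorescence inside the cell mask, and compute a bounded
    proportional correction to the stimulus. `segment` is a hypothetical
    callable returning a boolean mask; this is not Cheetah's actual API."""
    mask = segment(frame)                        # True inside detected cells
    measured = frame[mask].mean() if mask.any() else 0.0
    stimulus = np.clip(gain * (target - measured), 0.0, 1.0)
    return stimulus, measured
```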

    Combined tumour treatment by coupling conventional radiotherapy to an additional dose contribution from thermal neutrons

    Aim: To exploit the thermal neutron background present in conventional X-ray radiotherapy treatments in order to add a localized neutron dose boost to the patient, enhancing treatment effectiveness. Background: Conventional linear accelerators for radiotherapy produce fast secondary neutrons with a mean energy of about 1 MeV through (γ, n) reactions. This neutron field, isotropically distributed, is considered an extra, unaccounted dose during the treatment. Moreover, owing to the moderating effect of the human body, a thermal neutron field is localized in the tumour area: this neutron background could be employed for Boron Neutron Capture Therapy (BNCT) by previously administering a boron (¹⁰B-enriched) carrier to the patient, acting as a localized radiosensitizer. Thermal neutron absorption in the ¹⁰B-enriched tissue would improve radiotherapy effectiveness. Materials and Methods: The feasibility of the proposed method was investigated using simplified tissue-equivalent phantoms with cavities corresponding to relevant tissues or organs, suited for dosimetric measurements. A 10 cm × 10 cm square photon field with different energies was delivered to the phantoms. Additional exposures were performed using a compact neutron photo-converter-moderator assembly, with the purpose of modifying the mixed photon-neutron field in the treatment region. Photon and neutron doses were measured using radiochromic films and superheated bubble detectors, respectively, and simulated with Monte Carlo codes. Results: For a 10 cm × 10 cm square photon field with accelerating potentials of 6 MV, 10 MV and 15 MV, the neutron dose equivalent in the phantom was measured as 0.07 mGy/Gy (neutron dose equivalent per photon absorbed dose at the isocentre), 0.99 mGy/Gy and 2.22 mGy/Gy, respectively. For an 18 MV treatment, simulations and measurements quantified the thermal neutron fluence in the treatment zone at 1.55 × 10⁷ cm⁻² Gy⁻¹. Assuming a BNCT-standard ¹⁰B concentration in tumour tissue, the calculated additional BNCT dose at 4 cm depth in the phantom would be 1.5 mGy-eq/Gy. This ratio would reach 43 mGy-eq/Gy for an intensity-modulated radiotherapy (IMRT) treatment. When a specifically designed compact neutron photo-converter-moderator assembly is applied to the LINAC to enhance the thermal neutron field, the photon field is modified: in particular, a 15 MV photon field produces a dose profile very similar to that which would be produced by a 6 MV field in the absence of the photo-converter-moderator assembly. As for the thermal neutron field, more thermal neutrons are present, with thermal neutrons per photon increasing by a factor of 3 to 12 depending on the depth in the phantom and on the photo-converter geometry. By contrast, the photo-converter-moderator assembly was found to reduce fast neutrons by a factor of 16 in the direction of the incident beam. Conclusions: The parasitic thermal neutron component during conventional high-energy radiotherapy could be exploited to deliver an additional therapeutic dose if a ¹⁰B carrier were administered to the patient. This radiosensitization effect could be increased by modifying the treatment field with the specifically designed neutron photo-converter-moderator assembly.
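    For orientation, a textbook first-order estimate of the boron capture dose (not taken from the paper) connects a thermal fluence such as the one quoted above to the added dose:

```latex
% First-order kerma estimate for the 10B(n,alpha)7Li capture dose,
% given a thermal neutron fluence \Phi (standard BNCT relation).
\[
  D_{B} \approx \Phi \cdot \frac{N_A\, w_B}{A_B} \cdot \sigma_{th} \cdot \bar{E},
  \qquad \sigma_{th} \approx 3837\ \mathrm{b}, \quad \bar{E} \approx 2.3\ \mathrm{MeV},
\]
% where w_B is the 10B mass fraction in tissue (of order 1e-5 for a
% typical 10-40 ppm loading) and N_A/A_B converts it to atoms per gram.
```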

    Relational Database Design and Multi-Objective Database Queries for Position Navigation and Timing Data

    Performing flight tests is a natural part of researching cutting-edge sensors and filters for sensor integration. Unfortunately, tests are expensive and typically take many months of planning. A sensible goal is to make previously collected data readily available to researchers for future development. The Air Force Institute of Technology (AFIT) has hundreds of data logs potentially available to facilitate further research in the area of navigation. A database would provide a common location where older and newer data sets are available. Such a database must be able to store the sensor data, metadata about the sensors, and affiliated metadata of interest. This thesis proposes a standard approach for sensor and metadata schemas and three different design approaches that organize this data in relational databases. Queries proposed by members of the Autonomy and Navigation Technology (ANT) Center at AFIT form the foundation of the testing experiments. These tests fall into two categories: downloaded data and queries that return a list of missions. Test databases of 100 and 1000 missions were created for the three design approaches to simulate AFIT's present and future volume of data logs. After testing, this thesis recommends one specific approach to the ANT Center as its database solution. To enable more complex queries, a genetic algorithm and a hill-climbing algorithm were developed as solutions to queries in the combined knapsack/set-covering problem domain. These algorithms were tested against the two test databases for the recommended database approach. Each algorithm returned solutions in under two minutes and may be a valuable tool for researchers when the database becomes operational.
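    As an illustration of the kind of heuristic the thesis describes, the sketch below is a minimal hill climber for a combined knapsack/set-covering mission query. The item fields, the coverage bonus, and the scoring are assumptions for illustration, not the thesis's formulation.

```python
import random

def hill_climb(items, capacity, covers, universe, iters=10000, seed=0):
    """Select missions (items, each a dict with "weight" and "value")
    to maximize value within a capacity budget (knapsack), with a bonus
    for covering required data types (set covering). `covers[i]` is the
    set of data types mission i provides; `universe` is the target set."""
    rng = random.Random(seed)
    n = len(items)
    current = [False] * n

    def score(sel):
        weight = sum(it["weight"] for it, s in zip(items, sel) if s)
        if weight > capacity:
            return float("-inf")                 # infeasible selection
        value = sum(it["value"] for it, s in zip(items, sel) if s)
        chosen = [covers[i] for i, s in enumerate(sel) if s]
        covered = set().union(*chosen) if chosen else set()
        return value + 10 * len(covered & universe)

    best = score(current)
    for _ in range(iters):
        i = rng.randrange(n)
        current[i] = not current[i]              # flip one mission in/out
        s = score(current)
        if s >= best:
            best = s
        else:
            current[i] = not current[i]          # revert worsening move
    return current, best
```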

    Oil and Gas flow Anomaly Detection on offshore naturally flowing wells using Deep Neural Networks

    Dissertation presented as a partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Data Science. The Oil and Gas industry faces multiple challenges as never before. It is criticized for being dirty and polluting, hence the growing demand for green alternatives. Nevertheless, the world still relies heavily on hydrocarbons, since they remain the most traditional and stable source of energy compared with extensively promoted hydro, solar, or wind power. Major operators are challenged to produce oil more efficiently to counteract newly arising energy sources, with a smaller climate footprint and more scrutinized expenditure, while facing high skepticism regarding the industry's future. It has to become greener and hence act in a manner not required previously. While most of the tools used by the hydrocarbon E&P industry are expensive and have been in use for many years, it is paramount for the industry's survival and prosperity to apply predictive maintenance technologies that can foresee potential failures, making production safer, lowering downtime, increasing productivity, and diminishing maintenance costs. Many efforts have been made to define the most accurate and effective predictive methods; however, data scarcity limits the speed and capacity for further experimentation. While it would be highly beneficial for the industry to invest in Artificial Intelligence, this research explores, in depth, the subject of Anomaly Detection, using the open public data from Petrobras, which was developed by experts. In this research, deep recurrent neural networks with LSTM and GRU backbones were implemented for multi-class classification of undesirable events on naturally flowing wells. Several hyperparameter optimization tools were also explored, focusing mainly on Genetic Algorithms as among the most advanced methods for such tasks. The research concluded with the best-performing algorithm using two stacked GRU layers and the hyperparameter vector [1, 47, 40, 14], which stands for a timestep of 1, 47 hidden units, 40 epochs, and a batch size of 14, producing an F1 score of 0.97. As the world faces many issues, one of which is the detrimental effect of heavy industries on the environment and the resulting adverse global climate change, this project is an attempt to contribute to the field of applying Artificial Intelligence in the Oil and Gas industry, with the intention of making it more efficient, transparent, and sustainable.
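    A minimal sketch of the reported architecture (two stacked GRU layers, timestep 1, 47 hidden units, 40 epochs, batch size 14), assuming a generic Keras setup; the feature and class counts are placeholders, not taken from the dissertation.

```python
import tensorflow as tf

def build_model(n_features, n_classes, hidden=47, timesteps=1):
    """Two stacked GRU layers with the hyperparameters reported above.
    n_features and n_classes are placeholders for the well-sensor
    channels and the undesirable-event classes."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(timesteps, n_features)),
        tf.keras.layers.GRU(hidden, return_sequences=True),  # first GRU
        tf.keras.layers.GRU(hidden),                         # second GRU
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Training with the reported schedule would then be, e.g.:
# model.fit(X_train, y_train, epochs=40, batch_size=14)
```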

    A Review

    Ovarian cancer is the most common cause of death among gynecological malignancies. We discuss the different types of clinical and nonclinical features used to study and analyze the differences between benign and malignant ovarian tumors. Computer-aided diagnostic (CAD) systems of high accuracy are being developed as an initial test for ovarian tumor classification, instead of biopsy, the current gold-standard diagnostic test. We also discuss different aspects of developing a reliable CAD system for the automated classification of ovarian cancer into benign and malignant types. A brief description of the commonly used classifiers in ultrasound-based CAD systems is also given.
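    As a generic example of the classifiers such CAD systems employ (not code from the review), a standardized-feature SVM pipeline might look like:

```python
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def evaluate(features, labels):
    """Benign-vs-malignant classification from precomputed ultrasound
    features: standardize, fit an RBF-kernel SVM, and score it by
    5-fold cross-validation. Feature extraction is out of scope here."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    return cross_val_score(clf, features, labels, cv=5).mean()
```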

    All-Optical 4D In Vivo Monitoring And Manipulation Of Zebrafish Cardiac Conduction

    The cardiac conduction system is vital for the initiation and maintenance of the heartbeat. In recent years, the zebrafish (Danio rerio) has emerged as a promising model organism for studying this specialized system. The embryonic zebrafish heart’s unique accessibility for light microscopy has put it in the focus of many cardiac researchers. However, imaging cardiac conduction in vivo has remained a challenge: typically, hearts had to be removed from the animal to make them accessible to fluorescent dyes and electrophysiology. Furthermore, no technique provided enough spatial and temporal resolution to study the importance of individual cells in the myocardial network. With the advent of light-sheet microscopy, better camera technology, new fluorescent reporters, and advanced image analysis tools, all-optical in vivo mapping of cardiac conduction is now within reach. In the course of this thesis, I developed new methods to image and manipulate cardiac conduction in 4D with cellular resolution in the unperturbed zebrafish heart. Using these newly developed methods, I detected the first calcium sparks and revealed the onset of cardiac automaticity in the early heart tube. Furthermore, I visualized the 4D cardiac conduction pattern in the embryonic heart and used it to study component-specific calcium transients. In addition, I tested the robustness of embryonic cardiac conduction under aggravated conditions and found new evidence for the presence of an early ventricular pacemaker system. My results lay the foundation for novel, non-invasive in vivo studies of cardiac function and performance.
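    As an illustrative fragment of the kind of analysis such optical mapping involves (not the thesis's actual pipeline), calcium transients in a single-cell fluorescence trace can be timestamped with simple peak detection:

```python
import numpy as np
from scipy.signal import find_peaks

def activation_times(trace, fps, prominence=0.2):
    """Timestamp calcium transients in a single-cell fluorescence trace:
    normalize to dF/F against the median baseline, detect prominent
    peaks, and return activation times in seconds."""
    baseline = np.median(trace)
    dff = (trace - baseline) / baseline          # dF/F normalization
    peaks, _ = find_peaks(dff, prominence=prominence)
    return peaks / fps
```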