
    Finite element analysis applied to redesign of submerged entry nozzles for steelmaking

    The production of steel by continuous casting is facilitated by the use of refractory hollow-ware components. A critical component in this process is the submerged entry nozzle (SEN). The normal operating conditions of the SEN are arduous, involving large temperature gradients and exposure to mechanical forces arising from the flow of molten steel; experimental development of the components is challenging in so hazardous an environment. The thermal stress conditions in a well-tried design were therefore simulated using a finite element analysis approach. It was concluded from the analyses that failures of the type being experienced are caused by the large temperature gradient within the nozzle. The analyses identified the supported shoulder area of the nozzle as the region most vulnerable to failure, and practical in-service experience confirmed this. As a direct consequence of the investigation, design modifications incorporating changes both to the internal geometry and to the nature of the intermediate support material were implemented, substantially reducing the stresses within the Al2O3/graphite ceramic liner. Industrial trials of the modified design established that component reliability would be significantly improved, and the design has now been implemented in series production.
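    The failure mechanism described above follows from standard thermoelasticity: a constrained ceramic wall subjected to a temperature difference develops stress that scales with E·α·ΔT. The sketch below illustrates that back-of-envelope estimate in Python; the material values are illustrative assumptions, not figures from the paper, and a real analysis would of course use the full finite element model.

```python
# Minimal sketch: thermoelastic stress estimate for a constrained ceramic
# liner under a wall temperature difference. Material values are
# illustrative placeholders, NOT taken from the paper.

E = 40e9        # Young's modulus of an Al2O3/graphite composite, Pa (assumed)
alpha = 4e-6    # coefficient of thermal expansion, 1/K (assumed)
nu = 0.2        # Poisson's ratio (assumed)
dT = 900.0      # temperature difference across the nozzle wall, K (assumed)

# For a fully constrained wall, the thermal stress scales as
# sigma = E * alpha * dT / (1 - nu)  (standard thermoelasticity result)
sigma = E * alpha * dT / (1 - nu)
print(f"Estimated thermal stress: {sigma / 1e6:.1f} MPa")
```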

    Metaheuristics for solving a multimodal home-healthcare scheduling problem

    We present a general framework for solving a real-world multimodal home-healthcare scheduling (MHS) problem from a major Austrian home-healthcare provider. The goal of MHS is to assign home-care staff to customers and to determine efficient multimodal tours while considering staff and customer satisfaction. Our approach is designed to be as problem-independent as possible, so that the resulting methods can easily be adapted to the MHS setups of other home-healthcare providers. We chose a two-stage approach: in the first stage, we generate initial solutions either via constraint programming techniques or by a random procedure. During the second stage, the initial solutions are (iteratively) improved by applying one of four metaheuristics: variable neighborhood search, a memetic algorithm, scatter search, and a simulated annealing hyper-heuristic. An extensive computational comparison shows that the approach is capable of solving real-world instances in reasonable time and produces valid solutions within only a few seconds.
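    The abstract does not give implementation details, but the two-stage structure can be sketched as follows: a random stage-one initializer followed by one of the four stage-two metaheuristics (here simulated annealing). The objective function, cooling schedule, and all names below are hypothetical stand-ins for the real MHS model.

```python
import math
import random

def random_initial_solution(n_customers, n_staff):
    """Stage 1 (simplified): random assignment of customers to staff."""
    return [random.randrange(n_staff) for _ in range(n_customers)]

def cost(solution):
    """Toy stand-in for tour length plus satisfaction penalties."""
    # A real MHS objective would evaluate multimodal travel times and
    # staff/customer satisfaction; here we just penalise imbalance.
    counts = {}
    for s in solution:
        counts[s] = counts.get(s, 0) + 1
    return sum(c * c for c in counts.values())

def simulated_annealing(solution, n_staff, iters=10_000, t0=10.0):
    """Stage 2: improve the initial solution (one of the four metaheuristics)."""
    best = cur = solution[:]
    best_c = cur_c = cost(cur)
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9          # linear cooling (assumed)
        cand = cur[:]
        cand[random.randrange(len(cand))] = random.randrange(n_staff)
        cand_c = cost(cand)
        if cand_c < cur_c or random.random() < math.exp((cur_c - cand_c) / t):
            cur, cur_c = cand, cand_c
            if cur_c < best_c:
                best, best_c = cur[:], cur_c
    return best, best_c

sol = random_initial_solution(n_customers=30, n_staff=5)
best, best_c = simulated_annealing(sol, n_staff=5)
print("improved cost:", best_c)
```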

    A cognitive framework for object recognition with application to autonomous vehicles

    Autonomous vehicles, or self-driving cars, are capable of sensing the surrounding environment so that they can navigate roads without human input. Decisions on sensing, mapping, and driving policy are made constantly using machine learning techniques. Deep Learning, that is, massive neural networks that exploit parallel processing, has become a popular choice for addressing the complexities of real-time decision making. This method of machine learning has been shown to outperform alternative solutions in multiple domains, and its architecture can be adapted to new problems with relative ease. Harnessing the power of Deep Learning requires large amounts of training data representative of all possible situations the system will face; for situational awareness in driverless vehicles, however, it is not possible to enumerate every such situation in the training examples. An alternative is to apply cognitive approaches to perception. These work by mimicking the processes of human intelligence, thereby permitting a machine to react to situations it has not previously experienced. This paper proposes a novel cognitive approach to object recognition. The proposed algorithm, referred to as Recognition by Components, is inspired by psychological studies of early childhood development. It works by breaking images down into a series of primitive forms, such as squares, triangles, circles, and rectangles, and then identifying objects by memory-based aggregation of these primitives. Experimental results suggest that the Recognition by Components algorithm performs significantly better than algorithms that require large amounts of training data.
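    As a rough illustration of the decompose-then-aggregate idea (not the paper's actual implementation), the sketch below labels contours in a binary image as primitive forms by vertex count and matches the resulting multiset of primitives against a small "memory" of known objects. It assumes OpenCV 4; the synthetic image and the memory contents are invented for the example.

```python
import cv2
import numpy as np

# Synthetic test image containing two primitives (stand-in for a camera frame).
img = np.zeros((200, 300), dtype=np.uint8)
cv2.rectangle(img, (20, 40), (120, 140), 255, -1)
cv2.circle(img, (220, 90), 45, 255, -1)

def primitive_label(contour):
    """Classify a contour as one of the primitive forms by vertex count."""
    eps = 0.02 * cv2.arcLength(contour, True)
    approx = cv2.approxPolyDP(contour, eps, True)
    n = len(approx)
    if n == 3:
        return "triangle"
    if n == 4:
        x, y, w, h = cv2.boundingRect(approx)
        return "square" if abs(w - h) < 0.1 * max(w, h) else "rectangle"
    return "circle" if n > 6 else "polygon"

contours, _ = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
primitives = sorted(primitive_label(c) for c in contours)

# "Memory": previously seen objects described as multisets of primitives
# (hypothetical entry, purely for illustration).
memory = {("circle", "square"): "known object A"}
print(primitives, "->", memory.get(tuple(primitives), "unknown object"))
```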

    Estimating the Fractal Dimension, K_2-entropy, and the Predictability of the Atmosphere

    The series of mean daily air temperatures recorded over a period of 215 years is used to analyse the dimensionality and the predictability of the atmospheric system. The total number of data points in the series is 78527. Thirty-seven other versions of the original series are generated, including "seasonally adjusted" data, a smoothed series, series without the annual course, etc. Modified methods of Grassberger and Procaccia are applied. A procedure for selecting the "meaningful" scaling region is proposed. Several scaling regions are revealed in the ln C(r) versus ln r diagram. The first, in the range of larger ln r, has a shallow slope, and the second, in the range of intermediate ln r, has a steep slope. Two further regions lie in the range of small ln r. The results lead us to claim that the series arises from the activity of at least two subsystems. The first subsystem is low-dimensional (d_f = 1.6) and possesses a potential predictability of several weeks. We suggest that this subsystem is connected with the seasonal variability of weather. The second subsystem is high-dimensional (d_f > 17) and its error-doubling time is about 4-7 days. The predictability is found to differ by season. The predictability time for summer, winter, and the entire year (T_2 approx. 4.7 days) is longer than for the transition seasons (T_2 approx. 4.0 days for spring, T_2 approx. 3.6 days for autumn). The role of random noise and the number of data points are discussed. It is shown that a 15-year-long daily temperature series is not sufficient for reliable estimations based on the Grassberger and Procaccia algorithms.
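    The core quantity in the Grassberger-Procaccia method is the correlation integral C(r), the fraction of pairs of delay-embedded state vectors closer than r; the slope of ln C(r) versus ln r inside a scaling region estimates the correlation dimension d_f. Below is a minimal sketch, run here on synthetic data rather than the temperature record, with embedding dimension and delay chosen arbitrarily.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_integral(x, m, tau, rs):
    """Grassberger-Procaccia correlation integral C(r) of a scalar series x,
    computed on a time-delay embedding of dimension m and delay tau."""
    n = len(x) - (m - 1) * tau
    # Delay vectors: X[i] = (x[i], x[i+tau], ..., x[i+(m-1)*tau])
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    d = pdist(X)  # Euclidean distances over all pairs i < j
    return np.array([(d < r).mean() for r in rs])

# Illustrative run on synthetic data (NOT the 215-year temperature record).
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0.0, 200.0, 2000)) + 0.05 * rng.standard_normal(2000)
rs = np.logspace(-2, 0.5, 20)
C = correlation_integral(x, m=5, tau=10, rs=rs)

# The slope of ln C(r) vs ln r in a scaling region estimates d_f.
slope = np.polyfit(np.log(rs[8:16]), np.log(C[8:16] + 1e-12), 1)[0]
print(f"estimated correlation dimension: {slope:.2f}")
```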

    New cellular tools reveal complex epithelial–mesenchymal interactions in hepatocarcinogenesis

    To enable detailed analyses of cell interactions in tumour development, new epithelial and mesenchymal cell lines were established from human hepatocellular carcinoma by spontaneous outgrowth in culture. We obtained several hepatocarcinoma (HCC)-, B-lymphoblastoid (BLC)-, and myofibroblastoid (MF)-lines from seven cases. In-depth characterisation included cell kinetics, genotype, tumourigenicity, expression of cell-type-specific markers, and proteome patterns. Many functions of the cells of origin were found to be preserved. We studied the impact of the mesenchymal lines on hepatocarcinogenesis by in vitro assays. BLC- and MF-supernatants strongly increased the DNA replication of premalignant hepatocytes. The stimulation by MF-lines was mainly attributable to HGF secretion. In HCC-cells, MF-supernatant had only minor effects on cell growth but enhanced migration. MF-lines also stimulated neoangiogenesis through VEGF release. BLC-supernatant dramatically induced death of HCC-cells, which could be largely abrogated by preincubating the supernatant with TNFβ-antiserum. Thus, the new cell lines reveal stage-specific stimulatory and inhibitory interactions between mesenchymal and epithelial tumour cells. In conclusion, the new cell lines provide unique tools for analysing essential components of the complex interplay between the microenvironment and the developing liver cancer, and for identifying factors affecting proliferation, migration and death of tumour cells, neoangiogenesis, and the outgrowth of additional malignancy.

    On Empirical Memory Design, Faster Selection of Bayesian Factorizations and Parameter-Free Gaussian EDAs

    Estimation-of-Distribution Algorithms (EDAs) are often praised for their ability to optimize a broad class of problems, yet EDA applications are still limited. An often-heard criticism is that a large population size is required and that distribution estimation takes long. Here we look at possibilities for improvement in these areas. We first discuss the use of a memory to aggregate information over multiple generations and reduce the population size. The approach we take, empirical risk minimization to perform non-linear regression of the memory parameters, may well be generalizable to other EDAs. We design such a memory for a Gaussian EDA and observe smaller population-size requirements and fewer evaluations used. We also speed up the selection of Bayesian factorizations for Gaussian EDAs by sorting the entries in the covariance matrix. Finally, we discuss parameter-free Gaussian EDAs for real-valued single-objective optimization. We propose not only to increase the population size in subsequent runs, but also to divide it over parallel runs across the search space. On some multimodal problems, improvements are thereby obtained.
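    For reference, a baseline maximum-likelihood Gaussian EDA, without the memory, factorization, or parameter-free schemes contributed by the paper, looks roughly like the sketch below; the population size, selection fraction, and test function are arbitrary choices.

```python
import numpy as np

def gaussian_eda(f, dim, pop_size=100, sel_frac=0.35, iters=200, seed=0):
    """Minimal Gaussian EDA: sample, select, re-estimate a full covariance.
    (Baseline only; none of the paper's speed-ups are included.)"""
    rng = np.random.default_rng(seed)
    mu = rng.uniform(-5.0, 5.0, dim)
    cov = np.eye(dim) * 4.0
    for _ in range(iters):
        pop = rng.multivariate_normal(mu, cov, pop_size)
        fit = np.array([f(x) for x in pop])
        # Truncation selection: keep the best sel_frac of the population
        elite = pop[np.argsort(fit)[: int(sel_frac * pop_size)]]
        mu = elite.mean(axis=0)
        # Maximum-likelihood covariance of the selected individuals,
        # regularised slightly to keep it positive definite
        cov = np.cov(elite, rowvar=False) + 1e-10 * np.eye(dim)
    return mu, f(mu)

sphere = lambda x: float(np.dot(x, x))   # toy minimisation benchmark
best, val = gaussian_eda(sphere, dim=10)
print(f"best value: {val:.3e}")
```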

    Solving the post enrolment course timetabling problem by ant colony optimization.

    In this work we present a new approach to the problem of post enrolment course timetabling as specified for the International Timetabling Competition 2007 (ITC2007), competition track 2. The heuristic procedure is based on Ant Colony Optimization (ACO), where artificial ants successively construct solutions based on pheromones (stigmergy) and local information. The key feature of our algorithm is the use of two distinct but simplified pheromone matrices, which improve convergence while still providing enough flexibility to effectively guide the solution construction process. We show that by parallelizing the algorithm we can improve the solution quality significantly. We applied our algorithm to the instances used for the ITC2007. The results document that our approach is among the leading algorithms for this problem; in all cases the optimal solution could be found. Furthermore, we discuss the characteristics of the instances on which the algorithm performs especially well.
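    The abstract does not specify what the two pheromone matrices index; a plausible reading for post-enrolment timetabling is one matrix over event-timeslot choices and one over event-room choices. The sketch below is built on that assumption, with toy dimensions and without the ITC2007 hard and soft constraints.

```python
import random

# Two simplified pheromone matrices (assumed semantics, see lead-in):
# tau_slot guides event -> timeslot choices, tau_room event -> room choices.
N_EVENTS, N_SLOTS, N_ROOMS = 8, 5, 3
tau_slot = [[1.0] * N_SLOTS for _ in range(N_EVENTS)]
tau_room = [[1.0] * N_ROOMS for _ in range(N_EVENTS)]

def roulette(weights):
    """Pheromone-proportional (roulette-wheel) selection."""
    r = random.uniform(0.0, sum(weights))
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if acc >= r:
            return i
    return len(weights) - 1

def construct_solution():
    """One ant builds a timetable, drawing slot and room from the pheromones."""
    return [(roulette(tau_slot[e]), roulette(tau_room[e]))
            for e in range(N_EVENTS)]

def deposit(solution, quality, rho=0.1):
    """Evaporate both matrices, then reinforce the choices of a good solution."""
    for e, (s, r) in enumerate(solution):
        for t in range(N_SLOTS):
            tau_slot[e][t] *= (1 - rho)
        for t in range(N_ROOMS):
            tau_room[e][t] *= (1 - rho)
        tau_slot[e][s] += rho * quality
        tau_room[e][r] += rho * quality

sol = construct_solution()
deposit(sol, quality=1.0)
print(sol)
```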

    Preface to Hybrid Metaheuristics - 7th International Workshop, HM 2010

    Research in hybrid metaheuristics is now established as a reference field in the areas of optimization and problem solving. Hybrid metaheuristics have a strong impact on applications because they provide efficient and powerful problem-solving techniques for optimization problems in industry. Furthermore, the related interdisciplinary research community provides a fertile environment in which innovative techniques are presented and discussed. The International Workshop on Hybrid Metaheuristics pursues the direction of combining application-oriented and foundational research, as demonstrated by the papers in the proceedings of this 7th HM event. The contributions selected for this volume represent an important sample of current research in hybrid metaheuristics. It is worth emphasizing that the selected papers cover both theoretical and applied results, including applications to logistics and bioinformatics as well as new paradigmatic hybrid solvers.