
    An empirical learning-based validation procedure for simulation workflow

    A simulation workflow is a top-level model for the design and control of a simulation process. It connects multiple simulation components with timing and interaction constraints to form a complete simulation system. Before the component models are constructed and evaluated, validating the upper-layer simulation workflow is of the greatest importance in a simulation system. However, methods specifically for validating simulation workflows are very limited. Many existing validation techniques are domain-dependent and rely on cumbersome questionnaire design and expert scoring. Therefore, this paper presents an empirical learning-based validation procedure that implements a semi-automated evaluation of simulation workflows. First, representative features of a general simulation workflow and their relations to validation indices are proposed. The calculation of workflow credibility based on the Analytic Hierarchy Process (AHP) is then introduced. To make full use of historical data and implement more efficient validation, four learning algorithms, including the back-propagation neural network (BPNN), the extreme learning machine (ELM), the evolving neo-fuzzy neuron (eNFN) and the fast incremental Gaussian mixture model (FIGMN), are introduced to construct an empirical relation between workflow credibility and its features. A case study on a landing-process simulation workflow is established to test the feasibility of the proposed procedure. The experimental results also provide a useful overview of state-of-the-art learning algorithms for the credibility evaluation of simulation models.
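The AHP step described above derives index weights from pairwise expert judgments. The sketch below shows the common row geometric-mean approximation and the consistency-ratio check; the 3x3 comparison matrix and the choice of three validation indices are invented for illustration, as the abstract does not publish the paper's actual judgments.

```python
import math

# Hypothetical pairwise-comparison matrix over three validation indices
# (values are invented; A[i][j] encodes how much more important index i
# is judged to be than index j on the standard 1-9 AHP scale).
A = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_weights(matrix):
    """Priority weights via the row geometric-mean approximation."""
    n = len(matrix)
    gm = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(matrix, w, random_index=0.58):
    """CR = CI / RI; RI = 0.58 is the standard random index for n = 3.
    Judgments are conventionally accepted when CR < 0.1."""
    n = len(matrix)
    lam = sum(sum(a * wj for a, wj in zip(row, w)) / wi
              for row, wi in zip(matrix, w)) / n
    ci = (lam - n) / (n - 1)
    return ci / random_index

w = ahp_weights(A)
cr = consistency_ratio(A, w)  # well below 0.1 for this matrix
```

The resulting weights would then combine index scores into a single workflow-credibility value, the target the four learning algorithms are trained to reproduce.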

    Robust Multi-Objective Sustainable Reverse Supply Chain Planning: An Application in the Steel Industry

    In supply chain design, the use of returned products and their recycling in the production and consumption network is called reverse logistics. The proposed model aims to optimize the flow of materials in the supply chain network (SCN) and to determine the number and location of facilities and the transportation plan under demand uncertainty. Thus, maximizing the total operating profit, minimizing adverse environmental effects, and maximizing customer and supplier service levels are the main objectives. Accordingly, finding a balance among operating profit, environmental effects, and customer and supplier service levels is the focus of this research. To deal with the uncertainty of the model, scenario-based robust planning is employed alongside a meta-heuristic algorithm (NSGA-II) to solve the model with actual data from a case study of the steel industry in Iran. The results of solving and validating the model, compared with actual data, indicated that the model could optimize the objectives seamlessly and determine the number and location of the necessary facilities for the steel industry more appropriately. This article belongs to the Special Issue Uncertain Multi-Criteria Optimization Problem.
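NSGA-II, the meta-heuristic mentioned above, ranks candidate solutions by Pareto dominance before applying crowding-based selection. A minimal sketch of the dominance test and non-dominated sorting (the toy objective vectors are invented, not data from the steel-industry case study):

```python
def dominates(a, b):
    """True when objective vector a Pareto-dominates b (all objectives
    minimised): no worse everywhere and strictly better somewhere."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(points):
    """Group point indices into Pareto fronts, best front first."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Toy (cost, environmental impact) vectors, both minimised:
points = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
fronts = non_dominated_sort(points)  # [[0, 1, 2], [3], [4]]
```

In a full NSGA-II these fronts, plus a crowding-distance tiebreaker within each front, drive survivor selection each generation.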

    A survey on utilization of data mining approaches for dermatological (skin) diseases prediction

    Due to recent technological advances, large volumes of medical data are obtained. These data contain valuable information, so data mining techniques can be used to extract useful patterns. This paper introduces data mining and its various techniques and surveys the available literature on medical data mining, with an emphasis on the application of data mining to skin diseases. A categorization is provided based on the different data mining techniques, and the utility of the various methodologies is highlighted. Generally, association mining is suitable for extracting rules and has been used especially in cancer diagnosis. Classification is a robust method in medical mining; in this paper, we summarize its different uses in dermatology, where it is one of the most important methods for the diagnosis of erythemato-squamous diseases, with approaches such as neural networks, genetic algorithms and fuzzy classification. Clustering is a useful method in medical image mining. The purpose of clustering techniques is to find a structure in the given data by finding similarities according to the data characteristics; clustering also has some applications in dermatology. Besides introducing the different mining methods, we investigate some challenges that exist in mining skin data.
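As a concrete instance of classification in dermatology, the sketch below applies a k-nearest-neighbour vote to toy clinical feature vectors. Everything here is invented for illustration (the real erythemato-squamous dataset has 34 attributes, not three), and k-NN stands in for the neural, genetic and fuzzy classifiers the survey covers.

```python
from collections import Counter
import math

# Toy cases: (erythema, scaling, itching) severities on a 0-3 scale,
# with invented values and labels.
train = [
    ((2, 3, 1), "psoriasis"),
    ((3, 3, 0), "psoriasis"),
    ((2, 1, 3), "seborrheic dermatitis"),
    ((1, 1, 2), "seborrheic dermatitis"),
]

def knn_predict(train, query, k=3):
    """Majority vote among the k labelled cases nearest to the query."""
    nearest = sorted(train, key=lambda case: math.dist(case[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

A new patient presenting strong erythema and scaling, `knn_predict(train, (3, 3, 1))`, would be labelled "psoriasis" by this toy model.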

    Expert systems and finite element structural analysis - a review

    Finite element analysis of many engineering systems is practised more as an art than as a science. It involves high-level expertise (analytical as well as heuristic) regarding problem modelling (e.g. problem specification, choosing the appropriate type of elements, etc.), optimal mesh design for achieving the specified accuracy (e.g. initial mesh selection, adaptive mesh refinement), selection of the appropriate type of analysis and solution routines and, finally, diagnosis of the finite element solutions. Very often such expertise is highly dispersed and is not available at a single place with a single expert. The design of an expert system, such that the necessary expertise is available to a novice to perform the same job even in the absence of trained experts, becomes an attractive proposition. In this paper, the areas of finite element structural analysis which require experience and decision-making capabilities are explored. A simple expert system, with a feasible knowledge base for problem modelling, optimal mesh design, type of analysis and solution routines, and diagnosis, is outlined. Several efforts in these directions, reported in the open literature, are also reviewed.
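The element-selection heuristics mentioned above are a natural fit for a rule base. A minimal first-match sketch, with rules invented for illustration (a real FEA knowledge base would be far richer and would chain rules rather than fire one):

```python
def recommend_element(facts):
    """Return an element type for the given problem facts.
    First matching rule wins; illustrative rules only."""
    rules = [
        (lambda f: f.get("geometry") == "thin_walled", "shell"),
        (lambda f: f.get("geometry") == "slender", "beam"),
        (lambda f: f.get("stress_state") == "plane_stress", "2d_plane_stress"),
    ]
    for condition, element in rules:
        if condition(facts):
            return element
    return "3d_solid"  # conservative default when no rule applies
```

For example, `recommend_element({"geometry": "thin_walled"})` yields "shell", while an unrecognised problem falls through to the solid-element default.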

    QUANTIFYING THE EFFECT OF CONSTRUCTION SITE FACTORS ON CONCRETE QUALITY, COSTS AND PRODUCTION RATES

    Factors affecting concrete can be categorized as structured factors or unstructured factors. The first group of factors consists of those related to the production process of concrete, including water-cement ratio, properties of raw materials and mix proportions. Unstructured factors, or construction site factors, are related to labor skills and local conditions during the construction process of a project. Concrete compressive strength as a quality metric, costs and production rates may be affected significantly by such factors while performing concrete operations at the jobsite. Several prior studies have investigated the effect of structured factors on concrete. However, the literature is limited regarding the effects of unstructured factors during the construction phase of a facility. This study proposes a systematic methodology to identify and quantify the effects of construction site factors, including crew experience, compaction method, mixing time, curing humidity and curing temperature, on concrete quality, costs and production rates using fuzzy inference systems. First, the perceived importance of construction-related factors is identified and evaluated through a literature review and a survey deployed to construction experts. Then, the theory of design of experiments (DOE) is used to conduct a full 2^5 factorial experiment consisting of 32 runs and 192 compressive strength tests to identify statistically significant unstructured factors. Fuzzy inference systems (FISs) are proposed for predicting concrete compressive strength, cost and production rate effects through the use of an adaptive network-based fuzzy inference system (ANFIS). Finally, an optimization model is formulated and tested for managing concrete during the construction process of a facility.
Literature review and survey results showed that curing humidity, crew experience and compaction method are the top three factors perceived by construction experts to affect concrete compressive strength, whereas crew experience, mixing time and compaction method are the top three factors affecting concrete costs and production rates. Additionally, crew experience, compaction and mixing time were found to dominate the global ranking of perceived affecting factors through the application of the relative importance index (RII). When conducting designed experiments and analysis of variance (ANOVA), compaction method, mixing time, curing humidity and curing temperature were identified as statistically significant construction site factors for concrete compressive strength, whereas crew experience, compaction method and mixing time were statistically significant factors for cost and production rates. A Sugeno-type fuzzy inference system (FIS) for quantifying compressive strength, cost and production rate effects was created using ANFIS, with correlation coefficients (R-squared values) greater than 93%, indicating that the resulting models predict new observations well. Curing temperature (i.e., on-site curing temperature) was identified as the most influential condition for concrete compressive strength, while mixing time had the biggest impact on concrete cost and production rates. The developed FISs can be used as a decision-support tool for determining desired operating conditions that ensure the specified compressive strength, save resources and maximize profits when fabricating, placing and curing concrete.
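The 2^5 full factorial design above enumerates every low/high combination of the five site factors, which is where the 32 runs come from. A minimal sketch of generating that run matrix (factor names are taken from the abstract; the -1/+1 level coding is the standard DOE convention, not something the abstract specifies):

```python
from itertools import product

factors = ["crew_experience", "compaction_method", "mixing_time",
           "curing_humidity", "curing_temperature"]

def full_factorial(names):
    """Every low/high (-1/+1) combination: 2**k runs for k factors."""
    return [dict(zip(names, levels))
            for levels in product((-1, +1), repeat=len(names))]

runs = full_factorial(factors)  # 2^5 = 32 runs
```

With 6 compressive-strength tests per run this yields the 192 tests reported; the ANOVA on these runs is what separates the statistically significant factors from the rest.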

    An Empirical Study on the Procedure to Derive Software Quality Estimation Models

    Software quality assurance has been a hot topic for several decades. If the factors that influence software quality can be identified, they may provide more insight for better software development management. More precise quality assurance can be achieved by employing resources according to accurate quality estimates at the early stages of a project. In this paper, a general procedure is proposed to derive software quality estimation models, and various techniques are presented to accomplish the tasks in the respective steps. Several statistical techniques, together with machine learning methods, are utilized to verify the effectiveness of software metrics. Moreover, a neuro-fuzzy approach is adopted to improve the accuracy of the estimation model. The procedure is carried out on data from the ISBSG repository to demonstrate its empirical value.
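A typical first statistical check in such a procedure is whether a candidate metric correlates with the quality measure at all. A minimal sketch with the sample Pearson coefficient; the module-size/defect figures below are invented toy data, not ISBSG records.

```python
import math
import statistics

def pearson(xs, ys):
    """Sample Pearson correlation between a metric and a quality measure."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented toy data: module size (LOC) vs. defect count.
sizes = [120, 450, 300, 800, 650]
defects = [2, 7, 4, 15, 11]
r = pearson(sizes, defects)  # strongly positive for this toy sample
```

Metrics that pass such screening would then feed the neuro-fuzzy estimation model described in the abstract.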

    Rule-Based System Architecting of Earth Observing Systems: Earth Science Decadal Survey

    This paper presents a methodology to explore the architectural trade space of Earth observing satellite systems, and applies it to the Earth Science Decadal Survey. The architecting problem is formulated as a combinatorial optimization problem with three sets of architectural decisions: instrument selection, assignment of instruments to satellites, and mission scheduling. A computational tool was created to automatically synthesize architectures based on valid combinations of options for these three decisions and evaluate them according to several figures of merit, including satisfaction of program requirements, data continuity, affordability, and proxies for fairness, technical, and programmatic risk. A population-based heuristic search algorithm is used to search the trade space. The novelty of the tool is that it uses a rule-based expert system to model the knowledge-intensive components of the problem, such as scientific requirements, and to capture the nonlinear positive and negative interactions between instruments (synergies and interferences), which drive both requirement satisfaction and cost. The tool is first demonstrated on the past NASA Earth Observing System program and then applied to the Decadal Survey. Results suggest that the Decadal Survey architecture is dominated by other more distributed architectures in which DESDYNI and CLARREO are consistently broken down into individual instruments. Acknowledgments: "La Caixa" Foundation; Charles Stark Draper Laboratory; Goddard Space Flight Center
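The instrument-to-satellite assignment decision above is combinatorial: with n satellites and k instruments there are n^k assignments, which is why the paper uses a population-based heuristic rather than enumeration. A minimal sketch of the decision encoding, on a tiny invented example (the instrument names are placeholders, not the Decadal Survey payload list):

```python
from itertools import product

def enumerate_assignments(instruments, n_satellites):
    """Yield every instrument-to-satellite assignment as a dict.
    There are n_satellites ** len(instruments) of them, so this brute
    force is feasible only for tiny cases."""
    for choice in product(range(n_satellites), repeat=len(instruments)):
        yield dict(zip(instruments, choice))

# Tiny invented example: 3 instruments, 2 satellites -> 2^3 = 8 options.
small = list(enumerate_assignments(["radar", "lidar", "radiometer"], 2))
```

A heuristic search would evolve such assignment vectors directly, scoring each candidate with the rule-based figures of merit instead of visiting all n^k options.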