
    A Soft Computing Framework for Software Effort Estimation

    Accurate software estimation, including cost estimation, quality estimation, and risk analysis, is a major issue in software project management. In this paper, we present a soft computing framework to tackle this challenging problem. We first use a preprocessing neuro-fuzzy inference system to handle the dependencies among contributing factors and decouple their effects into individual contributions. We then use a neuro-fuzzy bank to calibrate the parameters of the contributing factors. To extend the framework to fields that lack an appropriate algorithmic model of their own, we propose a default algorithmic model that can be replaced when a better model is available. One feature of this framework is that its architecture is inherently independent of the choice of algorithmic model and of the nature of the estimation problem. By integrating neural networks, fuzzy logic, and algorithmic models into one scheme, the framework offers learning ability, the capability to integrate both expert knowledge and project data, good interpretability, and robustness to imprecise and uncertain inputs. Validation using industry project data shows that the framework produces good results when used to predict software cost.
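    The abstract does not give implementation details, so the Python sketch below only illustrates the general idea under assumed names and constants: a COCOMO-like "default" algorithmic model whose contributing-factor multipliers are calibrated from past project data. It is not the authors' implementation.

```python
# Hypothetical sketch: a default algorithmic model (COCOMO-like, assumed here)
# whose per-factor multipliers are calibrated from project data so that the
# relative estimation error shrinks. All names and constants are illustrative.
import numpy as np

def effort(size_kloc, ratings, multipliers, a=2.94, b=1.10):
    """Default model: effort = a * size^b * product of factor multipliers."""
    em = np.prod([multipliers[f][r] for f, r in ratings.items()])
    return a * size_kloc ** b * em

def calibrate(projects, multipliers, lr=0.01, epochs=200):
    """Nudge each factor's multiplier to reduce relative error on past projects."""
    for _ in range(epochs):
        for size, ratings, actual in projects:
            pred = effort(size, ratings, multipliers)
            err = (pred - actual) / actual          # signed relative error
            for f, r in ratings.items():
                # shrink the multiplier if we over-estimated, grow it otherwise
                multipliers[f][r] *= (1 - lr * err)
    return multipliers
```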

    A New Calibration for Function Point Complexity Weights

    Function Point (FP) is a useful software metric that was first proposed twenty-five years ago; since then, it has steadily evolved into a functional size metric consolidated in the well-accepted International Function Point Users Group (IFPUG) Counting Practices Manual, version 4.2. While the software development industry has grown rapidly, the weight values used in the standard FP count have remained the same, which raises critical questions about their validity. In this paper, we discuss the concept of calibrating Function Points, with the aims of estimating a software size that better fits specific software applications, reflecting current software industry trends, and improving the cost estimation of software projects. We propose an FP calibration model, the Neuro-Fuzzy Function Point Calibration Model (NFFPCM), that integrates the learning ability of neural networks with the ability of fuzzy logic to capture human knowledge. Empirical validation using release 8 of the International Software Benchmarking Standards Group (ISBSG) data repository shows a 22% improvement in the mean MRE of software effort estimation after calibration.
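    For readers unfamiliar with the quantities involved, the short sketch below shows how an unadjusted FP count is formed from component counts and complexity weights, and how the mean MRE reported above is computed. The weight table holds the standard IFPUG values; the calibrated weights the paper learns would replace them. This is a minimal illustration, not the paper's model.

```python
# Standard IFPUG complexity weights per component: (low, average, high).
IFPUG_WEIGHTS = {
    "EI":  (3, 4, 6),      # external inputs
    "EO":  (4, 5, 7),      # external outputs
    "EQ":  (3, 4, 6),      # external inquiries
    "ILF": (7, 10, 15),    # internal logical files
    "EIF": (5, 7, 10),     # external interface files
}

def unadjusted_fp(counts, weights=IFPUG_WEIGHTS):
    """counts: {component: (n_low, n_avg, n_high)} -> unadjusted FP total."""
    return sum(n * w for comp, ns in counts.items()
                     for n, w in zip(ns, weights[comp]))

def mean_mre(actual, predicted):
    """Mean Magnitude of Relative Error over paired effort values."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)
```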

    Towards an Early Software Estimation Using Log-Linear Regression and a Multilayer Perceptron Model

    Software estimation is a tedious and daunting task in project management and software development. Software effort is notoriously difficult to predict, and estimators have been struggling for decades to provide new models that improve software estimation. The most critical part of software estimation is when an estimate is required in the early stages of the software life cycle, where the problem to be solved has not yet been completely revealed. This paper presents a novel log-linear regression model based on the Use Case Point (UCP) model to calculate software effort from use case diagrams. A fuzzy logic approach is used to calibrate the productivity factor in the regression model. Moreover, a multilayer perceptron (MLP) neural network model was developed to predict software effort based on software size and team productivity. Experiments show that the proposed approach outperforms the original UCP model. Furthermore, a comparison between the MLP and log-linear regression models was conducted based on project size. The results demonstrate that the MLP model can surpass the regression model on small projects, but the log-linear regression model gives better results when estimating larger projects.
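    As a hedged sketch of the log-linear idea only: fit log(effort) as a linear function of log(size) and log(productivity) by least squares, then predict in the original scale. The exact regressors and the fuzzy calibration of the productivity factor used in the paper are not reproduced here; the function and variable names are assumptions.

```python
# Log-linear regression sketch: log(effort) = a + b*log(UCP) + c*log(productivity).
import numpy as np

def fit_log_linear(ucp, productivity, effort):
    """Least-squares fit in log space; returns [a, b, c]."""
    X = np.column_stack([np.ones(len(ucp)), np.log(ucp), np.log(productivity)])
    coeffs, *_ = np.linalg.lstsq(X, np.log(effort), rcond=None)
    return coeffs

def predict(coeffs, ucp, productivity):
    """Back-transform to the original effort scale."""
    a, b, c = coeffs
    return np.exp(a + b * np.log(ucp) + c * np.log(productivity))
```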

    Multifunctional optimized group method data handling for software effort estimation

    Demand for accurate software effort estimation keeps growing: stakeholders need effective and efficient software development processes supported by estimation models that remain accurate across all types of data. Nevertheless, finding an effort estimation model with consistently good accuracy is difficult. Group Method of Data Handling (GMDH) algorithms have been widely used for modelling and identifying complex systems and can potentially be applied to software effort estimation. However, few studies have determined the best architecture and the optimal weight coefficients of the transfer function for a GMDH model. This study proposes a hybrid multifunctional GMDH with Artificial Bee Colony (GMDH-ABC) based on a combination of four individual GMDH models, namely GMDH-Polynomial, GMDH-Sigmoid, GMDH-Radial Basis Function, and GMDH-Tangent. The best GMDH architecture is determined using an L9 Taguchi orthogonal array. Five datasets (COCOMO, Desharnais, Albrecht, Kemerer, and ISBSG) were used to validate the proposed models. Missing values in the datasets were imputed with the developed MissForest Multiple Imputation method (MFMI), and the Mean Absolute Percentage Error (MAPE) was used as the performance measure. The results showed that the GMDH-ABC model outperformed the individual GMDH models, achieving more than 50% improvement over standard conventional GMDH models and the benchmark ANN model on all datasets. Compared with the conventional GMDH-LSM, accuracy improved by 49% on the COCOMO dataset and by 71%, 63%, 67%, and 82% on the Desharnais, Albrecht, Kemerer, and ISBSG datasets, respectively. These results indicate that the proposed GMDH-ABC model can achieve higher accuracy in software effort estimation.
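    The sketch below is illustrative only, not the paper's implementation: a single GMDH neuron is a quadratic (Ivakhnenko) polynomial of two inputs whose six weights are fitted by least squares in the conventional GMDH-LSM; the proposed GMDH-ABC replaces this fit with an Artificial Bee Colony search, and MAPE scores the resulting estimates.

```python
import numpy as np

def gmdh_features(x1, x2):
    """Quadratic polynomial basis of a single GMDH (Ivakhnenko) neuron."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def fit_neuron_lsm(x1, x2, y):
    """Conventional GMDH-LSM weight fit; an ABC search would replace this step."""
    weights, *_ = np.linalg.lstsq(gmdh_features(x1, x2), y, rcond=None)
    return weights

def mape(actual, predicted):
    """Mean Absolute Percentage Error used to compare the models."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return 100 * np.mean(np.abs((actual - predicted) / actual))
```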

    Soft computing and non-parametric techniques for effective video surveillance systems

    This thesis proposes several interconnected objectives for the design of a video-surveillance system intended to operate under a wide range of conditions. First, an evaluation metric for the detector and tracking system is proposed, based on a minimal reference (ground truth). This technique answers the demand for fast and easy adjustment of the system so that it adapts to different environments. The thesis also proposes an optimization technique based on Evolutionary Strategies and the combination of fitness functions in several steps. The objective is to obtain the tuning parameters of the detector and tracking system that give the best performance over a wide range of possible situations. Finally, the construction of a classifier is proposed in which a non-parametric statistical technique models the distribution of the input data regardless of the source that generated them. Short-term detectable activities are chosen that follow a time pattern which can easily be modelled with Hidden Markov Models (HMMs). The proposal consists of a modification of the Baum-Welch algorithm so that the emission probabilities of the HMM are modelled by a non-parametric technique based on kernel density estimation (KDE).
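    The fragment below only sketches the emission-modelling idea described above, under assumptions: each HMM state keeps a kernel density estimate fitted to the observations currently attributed to it, and Baum-Welch evaluates b_i(o_t) through that KDE. The state responsibilities, the full re-estimation loop, and multivariate features are omitted; this is not the thesis's code.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fit_state_kdes(observations, responsibilities):
    """One weighted KDE per state; responsibilities[t, i] ~ gamma_t(i) from the E-step."""
    obs = np.asarray(observations, dtype=float)            # shape (T,) for 1-D features
    return [gaussian_kde(obs, weights=responsibilities[:, i])
            for i in range(responsibilities.shape[1])]

def emission_probs(kdes, observations):
    """Matrix of b_i(o_t) values consumed by the forward-backward recursions."""
    obs = np.asarray(observations, dtype=float)
    return np.column_stack([kde(obs) for kde in kdes])
```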

    28th International Symposium on Temporal Representation and Reasoning (TIME 2021)

    The 28th International Symposium on Temporal Representation and Reasoning (TIME 2021) was planned to take place in Klagenfurt, Austria, but had to move to an online conference due to the uncertainties and restrictions caused by the pandemic. Since its first edition in 1994, the TIME Symposium has been quite unique in the panorama of scientific conferences, as its main goal is to bring together researchers from distinct research areas involved in the management and representation of temporal data as well as reasoning about temporal aspects of information. Moreover, the TIME Symposium aims to bridge theoretical and applied research and to serve as an interdisciplinary forum for exchange among researchers from the areas of artificial intelligence, database management, logic and verification, and beyond.