
    Feature weighting techniques for CBR in software effort estimation studies: A review and empirical evaluation

    Context: Software effort estimation is one of the most important activities in the software development process. Unfortunately, estimates are often substantially wrong. Numerous estimation methods have been proposed, including Case-based Reasoning (CBR). To improve CBR estimation accuracy, many researchers have proposed feature weighting techniques (FWT). Objective: Our purpose is to systematically review the empirical evidence to determine whether FWT leads to improved predictions. In addition, we evaluate these techniques from the perspectives of (i) approach, (ii) strengths and weaknesses, (iii) performance and (iv) experimental evaluation approach, including the data sets used. Method: We conducted a systematic literature review of published, refereed primary studies on FWT (2000-2014). Results: We identified 19 relevant primary studies, which reported a range of different techniques. 17 of the 19 make benchmark comparisons with standard CBR, and 16 of those 17 studies report improved accuracy. Using a one-sample sign test, this positive impact is significant (p = 0.0003). Conclusion: The actionable conclusion from this study is that our review of all relevant empirical evidence supports the use of FWTs, and we recommend that researchers and practitioners give serious consideration to their adoption.
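    The core idea of feature weighting in CBR effort estimation can be sketched as follows: weight each project feature in the similarity measure, retrieve the nearest historical projects, and average their efforts. This is a minimal illustration, not any specific technique from the reviewed studies; the feature names, weights and data below are hypothetical.

    ```python
    import math

    def weighted_distance(case_a, case_b, weights):
        """Weighted Euclidean distance between two project feature vectors."""
        return math.sqrt(sum(w * (a - b) ** 2
                             for a, b, w in zip(case_a, case_b, weights)))

    def cbr_estimate(target, history, weights, k=2):
        """Predict effort as the mean effort of the k most similar past projects."""
        ranked = sorted(history,
                        key=lambda rec: weighted_distance(target, rec[0], weights))
        return sum(effort for _, effort in ranked[:k]) / k

    # Hypothetical normalized features: (size, complexity, team experience)
    history = [((0.2, 0.3, 0.8), 120.0),
               ((0.7, 0.6, 0.4), 480.0),
               ((0.3, 0.2, 0.9), 150.0)]
    weights = (0.6, 0.3, 0.1)   # feature weights, e.g. found by a search procedure
    print(cbr_estimate((0.25, 0.25, 0.85), history, weights))  # -> 135.0
    ```

    With uniform weights this reduces to standard CBR; the reviewed FWTs differ mainly in how the weight vector is learned.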

    Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX)

    The modeling and analysis generic interface for external numerical codes (MAGIX) is a model optimizer developed under the framework of the coherent set of astrophysical tools for spectroscopy (CATS) project. The MAGIX package provides an easy interface between existing codes and an iterating engine that attempts to minimize deviations of the model results from available observational data, constraining the values of the model parameters and providing corresponding error estimates. Many models (and, in principle, not only astrophysical models) can be plugged into MAGIX to explore their parameter space and find the set of parameter values that best fits observational/experimental data. MAGIX complies with the data structures and reduction tools of ALMA (Atacama Large Millimeter Array), but can be used with other astronomical and non-astronomical data. Comment: 12 pages, 15 figures, 2 tables, paper is also available at http://www.aanda.org/articles/aa/pdf/forth/aa20063-12.pd
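    The fit loop that MAGIX automates around external codes can be illustrated in miniature: evaluate a model over a parameter grid and keep the parameter value that minimizes the deviation from the data. This is a toy sketch only; the one-parameter model, the synthetic data and the crude grid search are assumptions, and MAGIX itself offers far more capable algorithms and wraps arbitrary external codes.

    ```python
    import math

    def chi_square(model, params, xs, ys):
        """Sum of squared deviations between model predictions and data."""
        return sum((y - model(x, *params)) ** 2 for x, y in zip(xs, ys))

    def model(x, amp):
        """Hypothetical one-parameter model: y = amp * exp(-x)."""
        return amp * math.exp(-x)

    xs = [0.0, 1.0, 2.0]
    ys = [2.0 * math.exp(-x) for x in xs]   # synthetic "observations"

    # Crude grid search over amp in [0, 5] with step 0.1
    best = min((a / 10 for a in range(0, 51)),
               key=lambda a: chi_square(model, (a,), xs, ys))
    print(best)  # -> 2.0
    ```

    Swapping the model function for a call into an external numerical code, and the grid search for a proper optimizer, gives the general shape of what the MAGIX framework provides.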

    Big data analytics: Computational intelligence techniques and application areas

    Big Data has a significant impact on developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications in real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study for intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and deployment.

    Optimizing Effort and Time Parameters of COCOMO II Estimation using Fuzzy Multi-objective PSO

    The estimation of software effort is an essential and crucial activity in the software development life cycle. Software effort estimation is a challenge that often arises in software projects, and a poor estimate results in worse project management. Various software cost estimation models have been introduced to resolve this problem, among which the Constructive Cost Model II (COCOMO II) is to a large extent the most significant and widely used model for cost estimation. To estimate the effort and development time of a software project, the COCOMO II model uses cost drivers, scale factors and lines of code. However, the model is still lacking in accuracy for both effort and development time estimation. In this study, we investigate the influence of components and attributes to achieve a new accuracy improvement of the COCOMO II model, and we introduce the use of Gaussian Membership Function (GMF) Fuzzy Logic and the Multi-Objective Particle Swarm Optimization (MOPSO) algorithm for calibrating and optimizing the COCOMO II model parameters. The proposed method is applied to the Nasa93 dataset. Experimental results show that the proposed method is able to reduce error down to 11.891% and 8.082% from the perspective of the COCOMO II model. The method achieves better results than those of previous research, deals proficiently with inexplicit data input, and further improves the reliability of the estimation method.
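    The COCOMO II effort equation that the study calibrates can be sketched directly: effort in person-months is PM = A * Size^E * prod(EM_i), where E = B + 0.01 * sum(SF_j), Size is in KSLOC, SF_j are the five scale factors and EM_i the effort multipliers. The sketch below uses the published post-architecture constants A = 2.94 and B = 0.91, and adds a Gaussian membership function of the kind GMF fuzzification applies to cost-driver ratings; the example project values are hypothetical, and this is not the paper's calibrated model.

    ```python
    import math

    A, B = 2.94, 0.91   # COCOMO II post-architecture calibration constants

    def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
        """Effort in person-months: PM = A * Size^E * prod(EM_i),
        with exponent E = B + 0.01 * sum(SF_j)."""
        e = B + 0.01 * sum(scale_factors)
        pm = A * ksloc ** e
        for em in effort_multipliers:
            pm *= em
        return pm

    def gaussian_mf(x, center, sigma):
        """Gaussian membership function, as used to fuzzify a driver rating."""
        return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

    # Hypothetical project: 100 KSLOC, all ratings nominal
    sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # nominal values of the 5 scale factors
    em = [1.0] * 17                       # nominal effort multipliers
    print(round(cocomo2_effort(100, sf, em), 1))   # roughly 465 person-months
    ```

    Fuzzy-PSO calibration in this setting would replace the crisp EM lookup with membership-weighted values from functions like gaussian_mf, and search for the constants and function parameters that minimize estimation error on a dataset such as Nasa93.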

    Insights on Research Techniques towards Cost Estimation in Software Design

    Software cost estimation is one of the most challenging tasks in project management, needed to ensure smoother development operation and target achievement. Various standard tools and techniques for cost estimation have evolved and are practiced in the industry at present. However, the overall picture of the effectiveness of such techniques has not been investigated to date. This paper begins its contribution by presenting taxonomies of conventional cost-estimation techniques and then investigates research trends towards the most frequently addressed problems. The paper also reviews the existing techniques in a well-structured manner in order to highlight the problems addressed, the techniques used, the advantages associated, and the limitations explored in the literature. Finally, we also outline the open research issues as an added contribution to this manuscript.