
    Insights on Research Techniques towards Cost Estimation in Software Design

    Software cost estimation is one of the most challenging tasks in project management, needed to ensure smooth development operation and target achievement. Various standard tools and techniques for cost estimation have evolved and are practiced in the industry at present. However, the overall effectiveness of such techniques has never been investigated comprehensively to date. This paper initiates its contribution by presenting taxonomies of conventional cost-estimation techniques and then investigates the research trends towards the problems most frequently addressed in them. The paper also reviews the existing techniques in a well-structured manner in order to highlight the problems addressed, the techniques used, the advantages associated with them, and the limitations explored in the literature. Finally, we also brief the explored open research issues as an added contribution to this manuscript.

    Machine Learning and Integrative Analysis of Biomedical Big Data.

    Recent developments in high-throughput technologies have accelerated the accumulation of massive amounts of omics data from multiple sources: genome, epigenome, transcriptome, proteome, metabolome, etc. Traditionally, data from each source (e.g., genome) is analyzed in isolation using statistical and machine learning (ML) methods. Integrative analysis of multi-omics and clinical data is key to new biomedical discoveries and advancements in precision medicine. However, data integration poses new computational challenges and exacerbates those associated with single-omics studies. Specialized computational approaches are required to effectively and efficiently perform integrative analysis of biomedical data acquired from diverse modalities. In this review, we discuss state-of-the-art ML-based approaches for tackling five specific computational challenges associated with integrative analysis: the curse of dimensionality, data heterogeneity, missing data, class imbalance, and scalability issues.
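One of the five challenges listed above, class imbalance, is commonly tackled by resampling the training set before model fitting. Below is a minimal sketch in plain Python; the `oversample_minority` helper and its data are invented for illustration, and random oversampling is only one of many strategies the review covers:

```python
import random

def oversample_minority(samples, labels, seed=0):
    """Balance classes by randomly duplicating under-represented samples."""
    rng = random.Random(seed)
    by_class = {}
    for sample, label in zip(samples, labels):
        by_class.setdefault(label, []).append(sample)
    target = max(len(group) for group in by_class.values())
    out_samples, out_labels = [], []
    for label, group in by_class.items():
        # Draw random duplicates until this class matches the largest one.
        extra = [rng.choice(group) for _ in range(target - len(group))]
        balanced = group + extra
        out_samples.extend(balanced)
        out_labels.extend([label] * len(balanced))
    return out_samples, out_labels
```

Duplication in the input space is the simplest option; synthetic-sample methods and class-weighted losses are common alternatives when plain duplication overfits.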

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering, inter alia, rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining, and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual, temporally distributed events within a multiple-data-stream environment is explored, and a range of techniques is surveyed, covering model-based approaches, `programmed' AI, and machine-learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise.
Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together, updating each other to increase detection rates and lower false-positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
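The hybrid scheme described above can be sketched in a few lines. This is an illustrative toy, not any system from the report: the rules, field names, and threshold are invented, with a rule list standing in for the programmed component and a z-score against a learned baseline standing in for the anomaly-learning component:

```python
# Hypothetical hybrid misuse detector. Known misuses are screened by
# programmed rules; unseen misuses are flagged as deviations from a
# baseline of normal behaviour learned from training traffic.
KNOWN_MISUSE_RULES = [
    lambda ev: ev.get("failed_logins", 0) > 5,          # brute-force rule
    lambda ev: ev.get("dest") == "premium-rate-line",   # toll-fraud rule
]

def train_baseline(normal_events, key="calls_per_min"):
    """Learn the mean and standard deviation of a feature from normal traffic."""
    vals = [ev[key] for ev in normal_events]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    return mean, std

def classify(event, baseline, key="calls_per_min", z_cut=3.0):
    if any(rule(event) for rule in KNOWN_MISUSE_RULES):
        return "known-misuse"                     # programmed component
    mean, std = baseline
    z = abs(event.get(key, mean) - mean) / (std or 1.0)
    return "anomaly" if z > z_cut else "normal"   # learned component
```

The sketch also exhibits the failure modes the report discusses: the rule list misses misuses it was never programmed for (false negatives), while the baseline check flags any unfamiliar but innocent pattern (false positives).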

    A Simple Neural Network Approach to Software Cost Estimation

    The effort invested in a software project is one of the most challenging and most analyzed variables in recent years in the process of project management. Software cost estimation predicts the amount of effort and development time required to build a software system. It is one of the most critical tasks, and it helps the software industries to effectively manage their software development process. There are a number of cost estimation models, and each of these models has its own pros and cons in estimating the development cost and effort. This paper investigates the use of back-propagation neural networks for software cost estimation. The model is designed in such a manner that it accommodates the widely used COCOMO model and improves its performance. It deals effectively with imprecise and uncertain input and enhances the reliability of software cost estimates. The model is tested using three publicly available software development datasets, and the test results from the trained neural network are compared with those of the COCOMO model. From the experimental results it was concluded that, using the proposed neural network model, the accuracy of cost estimation can be improved and the estimated cost can be very close to the actual cost.
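COCOMO's core relation, effort = a * KLOC^b, is linear in log space (log E = log a + b * log KLOC), so the simplest gradient-trained estimator is a single neuron fitted by the delta rule that back-propagation generalises. The sketch below is only that one-neuron stand-in, not the paper's full network, and its data are invented:

```python
import math

# Hypothetical project data: size in KLOC and effort in person-months.
kloc   = [2.0, 8.0, 30.0, 120.0]
effort = [7.0, 25.0, 95.0, 400.0]

# Work in log space, where effort = a * KLOC^b becomes a straight line.
x = [math.log(k) for k in kloc]
y = [math.log(e) for e in effort]

log_a, b, lr = 0.0, 1.0, 0.01
for _ in range(3000):
    for xi, yi in zip(x, y):
        err = (log_a + b * xi) - yi   # prediction error in log space
        log_a -= lr * err             # delta-rule step on the intercept
        b     -= lr * err * xi        # delta-rule step on the exponent

def predict(kloc_value):
    """Estimated effort in person-months for a project of the given size."""
    return math.exp(log_a) * kloc_value ** b
```

A full back-propagation network replaces this single linear unit with hidden layers, which is what lets it absorb COCOMO's cost drivers and imprecise inputs rather than just the size term.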

    Data mining in soft computing framework: a survey

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally, fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed-media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed-media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.

    A Technique to Stock Market Prediction Using Fuzzy Clustering and Artificial Neural Networks

    Stock market prediction is essential and of great interest because successful prediction of stock prices may promise substantial benefits. These tasks are highly complicated and very difficult. Many researchers have made valiant attempts in data mining to devise an efficient system for stock market movement analysis. In this paper, we have developed an efficient approach to stock market prediction by employing fuzzy C-means clustering and artificial neural networks. This research has been encouraged by the need to predict the stock market in order to guide investors on buy-and-hold strategy and to make profit. Firstly, the original stock market data are converted into interpreted historical (financial) data, i.e., technical indicators. Based on these technical indicators, the datasets required for analysis are created. Subsequently, the fuzzy-clustering technique is used to generate different training subsets. Then, based on the different training subsets, different ANN models are trained to formulate different base models. Finally, a meta-learner, a fuzzy system module, is employed to predict the stock price. The results for the stock market prediction are validated through evaluation metrics, namely mean absolute deviation, mean square error, root mean square error, and mean absolute percentage error, which are used to estimate the forecasting accuracy in the stock market. Comparative analysis is carried out against a single Neural Network (NN) and an existing neural technique. The obtained results show that the proposed approach produces better results than the other techniques in terms of accuracy.
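The four accuracy metrics named above have standard definitions; a small self-contained sketch (the inputs in any usage are hypothetical forecasts, not the paper's data):

```python
import math

def forecast_metrics(actual, predicted):
    """MAD, MSE, RMSE and MAPE: standard forecasting-accuracy metrics."""
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mad = sum(abs(e) for e in errors) / n                              # mean absolute deviation
    mse = sum(e * e for e in errors) / n                               # mean square error
    rmse = math.sqrt(mse)                                              # root mean square error
    mape = 100.0 * sum(abs(e) / abs(a) for e, a in zip(errors, actual)) / n  # mean absolute % error
    return {"MAD": mad, "MSE": mse, "RMSE": rmse, "MAPE": mape}
```

For example, `forecast_metrics([100.0, 200.0], [90.0, 220.0])` gives a MAD of 15.0 and a MAPE of 10.0.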

    Effort Estimation of Agile and Web-Based Software Using Artificial Neural Networks

    The agile methodology of software development is accepted as a superior alternative to conventional methods of software development because of its inherent benefits, such as iterative development, rapid delivery and reduced risk. Hence, software developers are required to estimate efficiently the effort necessary to develop projects by the agile methodology, because the requirements keep changing. The Web has become a part and parcel of our lives, and people depend on the Internet for almost everything these days. Many business units depend on the Internet for communication with clients and for outsourcing load to other branches. In such a scenario, there is a necessity for efficient development of web-based software. For improving the efficiency of software development, resource utilization must be optimum. For achieving this, we need to be able to ascertain effectively what kind of people and materials are required, and in what quantity, for development. This research aims at developing efficient effort estimation models for agile and web-based software by using various neural networks, such as the Feed-Forward Neural Network (FFNN), Radial Basis Function Neural Network (RBFN), Functional Link Artificial Neural Network (FLANN) and Probabilistic Neural Network (PNN), and at providing a comparative assessment of their performance. The approach used for agile software effort estimation is the Story Point Approach, and that for web-based software effort estimation is the IFPUG Function Point Approach.
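As a toy illustration of the quantities the Story Point Approach works from (the helper and its numbers are invented, not taken from the paper): total story points divided by team velocity gives a first-cut schedule, and the neural networks studied here learn a refined mapping from such features to actual effort:

```python
def sprints_needed(total_story_points, velocity_per_sprint):
    """Ceiling division: sprints required to burn down the backlog
    at the given velocity (points completed per sprint)."""
    full, rem = divmod(total_story_points, velocity_per_sprint)
    return full + (1 if rem else 0)
```

For instance, a 101-point backlog at a velocity of 20 points per sprint needs 6 sprints; a trained estimator would adjust such a raw figure for changing requirements and team factors.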