RECOMMENDATIONS FOR IMPROVING SOFTWARE COST ESTIMATION IN DOD ACQUISITION
Acquisition initiatives within the Department of Defense (DOD) are becoming increasingly reliant on software. While the DOD has ample experience in estimating hardware acquisition costs, its expertise in estimating software acquisition costs is lacking. The objective of this capstone project is to summarize current software cost estimating methods, analyze existing software cost estimating models, and suggest areas and methods for improvement. To accomplish this, surveys were conducted to gather program cost data, which were run through existing cost estimating models. The outputs were then compared to actual program costs, establishing a baseline for the effectiveness of existing methods and guiding suggestions for areas of improvement. The Software Resource Data Reports (SRDR) data used appeared to contain spurious reporting from at least one source, and the base cost estimation models were not found to be sufficiently accurate in our study. The capstone finds that calibrating the cost models to the available data improved those models dramatically. In all, the capstone recommends performing data realism checks on SRDR submissions to ensure data accuracy, and calibrating cost models for each contractor with the available data before using them to estimate DOD acquisition costs. Civilian, Department of the Army. Approved for public release. Distribution is unlimited.
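The capstone's central recommendation, calibrating a cost model to each contractor's historical data, can be sketched as follows. This is a hedged illustration only: the COCOMO-style model form, its coefficients, and the (KSLOC, actual effort) pairs are invented stand-ins, not the study's SRDR data or actual models.

```python
# Illustrative calibration of a parametric cost model to one contractor's
# historical data. Model form and all numbers are hypothetical.

def cocomo_like(ksloc, a=2.94, b=1.0997):
    """A COCOMO II-style nominal effort model (person-months)."""
    return a * ksloc ** b

def calibrate_multiplier(history):
    """Fit one multiplicative calibration factor so model outputs match
    actuals in a least-squares sense: c = sum(actual*est) / sum(est^2)."""
    ests = [cocomo_like(k) for k, _ in history]
    acts = [a for _, a in history]
    return sum(e * a for e, a in zip(ests, acts)) / sum(e * e for e in ests)

# Hypothetical (KSLOC, actual person-months) pairs for one contractor.
history = [(10, 45.0), (25, 110.0), (50, 240.0)]
c = calibrate_multiplier(history)

def calibrated(ksloc):
    """Contractor-calibrated estimate."""
    return c * cocomo_like(ksloc)
```

The single-factor fit is the simplest possible calibration; the study's point is only that even modest per-contractor adjustment of a model's outputs against known actuals can improve accuracy substantially.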
Empirical research on Software Effort Estimation Accuracy
Improving the software development process has been named by both the European Union and the United States government as an important task for society. The persistent problem of effort overruns and estimation inaccuracy is a central part of that problem. Empirical research on software effort estimation is a key part of the continuing effort by researchers and practitioners to improve the way software development projects are carried out.
As part of this effort, a study of eighteen recent projects at a Norwegian software consultancy was carried out. The project managers responsible for the projects were interviewed and asked to provide key project data and their assessments of different project properties related to effort estimation. The study focused on answering research questions related to:
⢠The effect the contractor-customer relationship and customer properties have on estimation accuracy
⢠The effect utilizing experience data has on estimation accuracy
⢠The role of estimation accuracy when assessing software project success
The analysis of the collected empirical data showed that reduced effort overruns were associated with more frequent contact with the customer and with contracts that share risk between contractor and customer.
Utilization of experience data, and the use of checklists, was also found to have a positive impact on estimation accuracy.
No strong correlation was found between project managers' assessments of project success and estimation accuracy, indicating that estimation accuracy and the project manager's success assessment contribute two different but important viewpoints when software project success is assessed.
In addition to the empirical study and its results, the thesis presents a review of existing group combination techniques for software effort estimation. The review was motivated by recent studies suggesting that estimating as a group is beneficial. The techniques reviewed vary widely in how they structure the interaction among group members and how their opinions are aggregated. The review gives a thorough discussion of the reasoning behind the techniques and the consequences they have.
The empirical data collected during the work with this thesis suggests different ways in which software contractors could improve their estimation ability and reduce their effort overruns.
The conclusions of this thesis are that, to increase estimation accuracy, software contractors should: (i) involve the customer and nurture the customer relationship; (ii) add some repeatable structure to the estimation process, but be careful not to add too much; (iii) gather and utilize experience data in the estimation process; and (iv) evaluate projects when they are done. The evaluation should gather both objective data on effort, schedule, and functionality compliance, and subjective assessments of project success from key stakeholders such as the customer, users, project manager, developers, and management.
A Systematic Mapping of Factors Affecting Accuracy of Software Development Effort Estimation
Software projects often do not meet their scheduling and budgeting targets. Inaccurate estimates are often responsible for this mismatch. This study investigates extant research on factors that affect accuracy of software development effort estimation. The purpose is to synthesize existing knowledge, propose directions for future research, and improve estimation accuracy in practice. A systematic mapping study (a comprehensive review of existing research) is conducted to identify such factors and their impact on estimation accuracy. Thirty-two factors assigned to four categories (estimation process, estimator's characteristics, project to be estimated, and external context) are identified in a variety of research studies. Although the significant impact of several factors has been shown, results are limited by the lack of insight into the extent of these impacts. Our results imply a shift in research focus and design to gather more in-depth insights. Moreover, our results emphasize the need to argue for specific design decisions to enable a better understanding of possible influences of the study design on the credibility of the results. For software developers, our results provide a useful map to check the assumptions that undergird their estimates, to build comprehensive experience databases, and to adequately staff design projects.
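Estimation accuracy in this literature is commonly summarized with measures such as MMRE and PRED(25). A minimal sketch, using invented project data rather than anything from the study:

```python
# Illustrative accuracy measures for effort estimates. Data are invented.

def mre(actual, estimate):
    """Magnitude of relative error for one project."""
    return abs(actual - estimate) / actual

def mmre(pairs):
    """Mean MRE across (actual, estimate) pairs."""
    return sum(mre(a, e) for a, e in pairs) / len(pairs)

def pred(pairs, threshold=0.25):
    """PRED(25): fraction of estimates within 25% of the actual effort."""
    return sum(1 for a, e in pairs if mre(a, e) <= threshold) / len(pairs)

# Hypothetical (actual, estimated) effort in person-hours:
projects = [(100, 80), (200, 210), (150, 120)]
```

Both measures are simple to compute, which is partly why they recur across the studies this mapping surveys, despite known criticisms of MMRE's bias toward underestimates.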
Effective Utilization of Historical Data to Increase Organizational Performance: Focus on Sales/ Tendering and Projects
Master's thesis in Offshore technology. In the Oil and Gas industry there was historically little focus on this topic, as cost was not a major factor. But the dramatic drop in oil prices to below US$40 per barrel at the end of 2015 left prices more than 60 percent lower than in previous years. It is clear that the sector is going through one of the most transformative periods in its history. This situation has created new challenges for all O&G company leaders, forcing them to change their business strategies.
Operating companies in the Oil and Gas industry have been focusing on reducing costs and increasing organizational performance. Accordingly, supplier companies need to sharpen their focus on efficiency and optimization of resources to be able to sustain and grow in a competitive market. This demands better control of estimates and costs in future sales/tendering processes. As one Operations Manager put it, "An informed organization saves cost and wins faster." The only way for an organization to obtain reliable information is by analyzing what happened in the past and what was learned from it; in other words, by utilizing historical data from previous projects and developing benchmarking metrics. Further, use of historical data can improve estimation and scheduling, support strategic planning, and improve organizational processes.
Historical project data can help any organization make strategic business decisions, and can provide a distinct advantage over competitors. Historical data can help management decide which projects are right for the company's future and which should be avoided. Further, it can help the organization learn from past mistakes and win future bids by not repeating them. Most top managers understand the importance of having and using historical project information. The problem is that very few companies have the methodologies, procedures, and systems in place to use this information effectively to improve their project processes and to support the estimation, scheduling, and control of future projects (opportunities).
The present work focuses on historical data, the estimation process, and lessons learned for enhancing organizational performance. Further, the work includes a case study and a number of expert interviews conducted at ABB.
The work discusses how to collect, normalize, and analyze historical project data to develop practical information. Three models have been developed: a project estimation process with a feedback loop, a lessons-learned process model, and a historical data utilization process. Recommendations have been made to use historical data to establish references for the sales/tendering department for future estimates, which can reduce dependency on manual or single-person judgment and improve the estimation process. Suggestions have also been made for establishing a lessons-learned process that can improve organizational performance.
The results of the analysis show that by applying the recommended processes, organizations can achieve efficiency through easy access to and storage of the historical database, easy access to lessons learned, and measurable KPIs. In addition, use of key variables such as project complexity and severity of requirements in the estimation and historical data processes can form a better basis for data analysis and utilization.
A requirements engineering framework for integrated systems development for the construction industry
Computer Integrated Construction (CIC) systems are computer environments through which collaborative working can be undertaken. Although many CIC systems have been developed to demonstrate communication and collaboration within construction projects, uptake of CIC by the industry is still inadequate. This is mainly because the research methodologies of CIC development projects are too incomplete to bridge the technology transfer gap. Defining comprehensive methodologies for the development of these systems and their effective implementation on real construction projects is therefore vital. Requirements Engineering (RE) can contribute to the effective uptake of these systems because it drives systems development for the targeted audience. This paper proposes a requirements engineering approach for industry-driven CIC systems development. While several CIC systems are investigated to build broad and deep contextual knowledge in the area, the EU-funded research project DIVERCITY (Distributed Virtual Workspace for Enhancing Communication within the Construction Industry) is analysed as the main case study because its requirements engineering approach has the potential to define a framework for adapting requirements engineering to contribute towards the uptake of CIC systems.
A Double Machine Learning Trend Model for Citizen Science Data
1. Citizen and community-science (CS) datasets have great potential for
estimating interannual patterns of population change given the large volumes of
data collected globally every year. Yet, the flexible protocols that enable
many CS projects to collect large volumes of data typically lack the structure
necessary to keep consistent sampling across years. This leads to interannual
confounding, as changes to the observation process over time are confounded
with changes in species population sizes.
2. Here we describe a novel modeling approach designed to estimate species
population trends while controlling for the interannual confounding common in
citizen science data. The approach is based on Double Machine Learning, a
statistical framework that uses machine learning methods to estimate population
change and the propensity scores used to adjust for confounding discovered in
the data. Additionally, we develop a simulation method to identify and adjust
for residual confounding missed by the propensity scores. Using this new
method, we can produce spatially detailed trend estimates from citizen science
data.
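The residualize-then-regress idea at the core of Double Machine Learning can be sketched on simulated data. This is an illustration only: plain least squares stands in for the flexible ML learners and cross-fitting used in practice, and the confounding structure and true trend value are invented.

```python
# Illustrative partialling-out estimate of a "trend" (year effect) that is
# confounded with observation-process covariates. All data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 3))                  # observation-process covariates
year = 0.5 * x[:, 0] + rng.normal(size=n)    # "treatment", confounded with x
theta = -0.3                                 # true population trend (invented)
y = theta * year + x @ np.array([1.0, -0.5, 0.2]) + rng.normal(size=n)

def residualize(target, covars):
    """Remove the part of `target` predictable from `covars` (least squares
    stands in for the ML learners a real DML application would use)."""
    beta, *_ = np.linalg.lstsq(covars, target, rcond=None)
    return target - covars @ beta

# Regress outcome residuals on treatment residuals to recover the trend.
y_res = residualize(y, x)
t_res = residualize(year, x)
theta_hat = (t_res @ y_res) / (t_res @ t_res)
```

A naive regression of `y` on `year` alone would be biased here because `year` is correlated with `x`; removing what `x` predicts from both sides isolates the trend, which is the role the propensity-score machinery plays in the paper's model.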
3. To illustrate the approach, we estimated species trends using data from
the CS project eBird. We used a simulation study to assess the ability of the
method to estimate spatially varying trends in the face of real-world
confounding. Results showed that the trend estimates distinguished between
spatially constant and spatially varying trends at a 27 km resolution. Error
rates on the estimated direction of population change (increasing/decreasing)
were low, and correlations on the estimated magnitude were high.
4. The ability to estimate spatially explicit trends while accounting for
confounding in citizen science data has the potential to fill important
information gaps, helping to estimate population trends for species, regions,
or seasons without rigorous monitoring data.
Time Predictions: Understanding and Avoiding Unrealism in Project Planning and Everyday Life
time predictions; human judgement; overoptimism; uncertainty; project management
Demand Forecasting: Evidence-Based Methods
In recent decades, much comparative testing has been conducted to determine which forecasting methods are more effective under given conditions. This evidence-based approach leads to conclusions that differ substantially from current practice. This paper summarizes the primary findings on what to do, and what not to do. When quantitative data are scarce, impose structure by using expert surveys, intentions surveys, judgmental bootstrapping, prediction markets, structured analogies, and simulated interaction. When quantitative data are abundant, use extrapolation, quantitative analogies, rule-based forecasting, and causal methods. Among causal methods, use econometrics when prior knowledge is strong, data are reliable, and few variables are important. When there are many important variables and extensive knowledge, use index models. Use structured methods to incorporate prior knowledge from experiments and experts' domain knowledge as inputs to causal forecasts. Combine forecasts from different forecasters and methods. Avoid methods that are complex, that have not been validated, and that ignore domain knowledge; these include intuition, unstructured meetings, game theory, focus groups, neural networks, stepwise regression, and data mining.
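The paper's advice to combine forecasts from different forecasters and methods is, in its simplest and most robust form, an equal-weight average. A minimal sketch with invented numbers:

```python
# Illustrative equal-weight forecast combination. Values are hypothetical.

def combine(forecasts):
    """Equal-weight combination of point forecasts from several methods,
    which the forecasting literature finds hard to beat in practice."""
    return sum(forecasts) / len(forecasts)

# Hypothetical demand forecasts from extrapolation, a causal model,
# and a structured expert survey:
methods = [120.0, 135.0, 126.0]
combined = combine(methods)  # 127.0
```

Equal weighting sidesteps the estimation error that weighted schemes introduce, which is one reason the evidence favors simple combinations unless there is strong prior knowledge about method accuracy.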