
    Cost Estimation Tool for Commercial Software Development Industries

    ABSTRACT: As the cost of software development continues to rise, there is a need to better understand the development process and to calibrate existing models and methods so that they remain applicable to current environments, for example by proposing hybrid tools built on established techniques. The main aim of this paper is to improve cost estimation accuracy across all phases of the software development lifecycle by calibrating COCOMO with function points as the size measure in place of the SLOC used in the original COCOMO model. We discuss the working of the proposed software estimation tool, a hybrid implementation of several software estimation and measurement techniques that helps an organization determine metrics such as effort, time and cost, which are essential for improving turnaround time. The tool also applies these metrics to project planning, scheduling and tracking.
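
    A minimal sketch of the kind of calculation such a tool performs, with assumed illustrative values: function points are converted to an estimated size and the basic COCOMO equations then yield effort and schedule. The SLOC-per-function-point factor and the COCOMO coefficients below are generic textbook numbers, not the calibrated parameters proposed in the paper.

        # Hypothetical illustration only: COCOMO driven by function points instead of raw SLOC.
        # Coefficients are generic basic-COCOMO (organic mode) values, not this paper's calibration.

        def cocomo_from_function_points(function_points,
                                        sloc_per_fp=50.0,   # assumed language conversion factor
                                        a=2.4, b=1.05,      # effort coefficients (organic mode)
                                        c=2.5, d=0.38):     # schedule coefficients
            kloc = function_points * sloc_per_fp / 1000.0   # size in thousands of SLOC
            effort_pm = a * kloc ** b                       # effort in person-months
            duration_m = c * effort_pm ** d                 # development time in months
            return {"KLOC": kloc, "effort_pm": effort_pm, "duration_months": duration_m}

        print(cocomo_from_function_points(300))             # e.g. a 300-function-point application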

    On the Utility of Historical Project Statistics for Cost and Schedule Estimation: Results from a Simulation-based Case Study

    The article of record as published may be found at https://doi.org/10.1016/0164-1212(90)90035-K
    Estimating the duration and cost of software projects has traditionally been, and continues to be, fraught with peril. This is in spite of the fact that over the last decade a large number of quantitative software estimation models have been developed. Our objective in this article is to challenge two fundamental assumptions that underlie research practices in the area of software estimation, which may be directly contributing to the industry's poor track record to date. Both concern the "fitness" of raw historical project statistics for calibrating and evaluating (new) estimation models. A system dynamics model of the software development process is developed and used as the experimentation vehicle for this study. An overview of the model's structure is presented, followed by a discussion of the two experiments conducted and their results. In the first, we demonstrate why it is inadequate to assess the accuracy of (new) estimation tools simply on the basis of how accurately they replicate old projects. Second, we show why raw historical project results do not necessarily constitute the most "preferred" and reliable benchmark for future estimation.
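
    For context, the evaluation practice questioned in the first experiment is typically carried out as below: a candidate model is scored by how closely its estimates replicate completed projects, for example with the mean magnitude of relative error (MMRE). The project figures in this sketch are invented purely for illustration.

        # Illustration of scoring an estimation model against raw historical projects with MMRE,
        # the style of benchmark whose adequacy the article questions. Figures are invented.

        historical = [   # (actual effort in person-months, model's estimate for the same project)
            (120.0, 104.0),
            (300.0, 342.0),
            (45.0, 61.0),
        ]

        def mmre(pairs):
            """Mean magnitude of relative error over (actual, estimated) pairs."""
            return sum(abs(act - est) / act for act, est in pairs) / len(pairs)

        print(f"MMRE = {mmre(historical):.2f}")   # a low score here need not imply a good future estimator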

    Super-resolution imaging and estimation of protein copy numbers at single synapses with DNA-PAINT

    In the brain, the strength of each individual synapse is defined by the complement of proteins present, or the "local proteome." Activity-dependent changes in synaptic strength are the result of changes in this local proteome and of posttranslational protein modifications. Although most synaptic proteins have been identified, we still know little about protein copy numbers in individual synapses and about variations between synapses. We use DNA-point accumulation for imaging in nanoscale topography (DNA-PAINT) as a single-molecule super-resolution imaging technique to visualize and quantify protein copy numbers in single synapses. The imaging technique provides near-molecular spatial resolution, is unaffected by photobleaching, enables imaging of large fields of view, and provides quantitative molecular information. We demonstrate these benefits by accessing copy numbers of surface AMPA-type receptors at single synapses of rat hippocampal neurons along dendritic segments.
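
    The abstract does not spell out the counting procedure, but a common way DNA-PAINT data are made quantitative is qPAINT-style counting from binding-event kinetics. The sketch below illustrates that general idea with invented numbers and an assumed single-site calibration; it is not necessarily the authors' method.

        # Hedged illustration of qPAINT-style counting: copy number is estimated from the rate of
        # imager binding events in a region, relative to a calibrated single-docking-site rate.
        # All numbers are invented; this is not necessarily the procedure used in the paper.

        def qpaint_copy_number(binding_events, acquisition_s, single_site_rate_hz):
            """Estimate the number of binding sites (~protein copies) in a region of interest."""
            influx_rate = binding_events / acquisition_s      # events per second in the ROI
            return influx_rate / single_site_rate_hz          # copies = ROI rate / single-site rate

        # Example: 1200 events over a 600 s acquisition with a calibrated
        # single-site rate of 0.02 events/s gives an estimate of ~100 copies.
        print(qpaint_copy_number(1200, 600.0, 0.02))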

    Half a billion simulations: evolutionary algorithms and distributed computing for calibrating the SimpopLocal geographical model

    Multi-agent geographical models integrate very large numbers of spatial interactions. To validate such models, a large amount of computing is required for their simulation and calibration. Here, a new data processing chain that includes an automated calibration procedure based on evolutionary algorithms is tested on a computational grid. It is applied for the first time to a geographical model designed to simulate the evolution of an early urban settlement system. The method reduces the computing time and provides robust results. Using it, we identify several parameter settings that minimise three objective functions quantifying how closely the model results match a reference pattern. Because the values of each parameter are very close across these settings, the estimation considerably reduces the initially possible domain of variation of the parameters. The model is thus a useful tool for multiple further applications to empirical historical situations.
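
    A schematic sketch of the calibration loop follows, with invented parameter bounds and placeholder objective functions standing in for SimpopLocal and its reference pattern. The real study runs a distributed multi-objective evolutionary search on a computing grid, whereas this toy version simply scalarises the three objectives.

        # Toy evolutionary calibration against three objectives. Bounds and objectives are
        # placeholders, not the SimpopLocal model; summing the objectives is a simplification
        # of the multi-objective search used in the study.
        import random

        BOUNDS = [(0.0, 1.0), (0.0, 10.0), (1.0, 100.0)]   # assumed parameter ranges

        def objectives(params):
            """Placeholder for the three measures of fit against a reference pattern."""
            a, b, c = params
            return (abs(a - 0.3), abs(b - 4.2) / 10.0, abs(c - 57.0) / 100.0)

        def mutate(params, sigma=0.1):
            return [min(hi, max(lo, p + random.gauss(0.0, sigma * (hi - lo))))
                    for p, (lo, hi) in zip(params, BOUNDS)]

        def calibrate(generations=200, pop_size=20):
            pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
            for _ in range(generations):
                children = [mutate(random.choice(pop)) for _ in range(pop_size)]
                pop = sorted(pop + children, key=lambda p: sum(objectives(p)))[:pop_size]
            return pop[0], objectives(pop[0])

        best, scores = calibrate()
        print("best parameters:", best, "objective values:", scores)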

    The Dark Energy Survey Data Management System

    The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg2 grizY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms, orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used Teragrid to process 10 simulated DES nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data, and internal crosschecks in the case of the real data, indicate that astrometric and photometric data quality is excellent.
    Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik
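
    Purely as an orientation to the kind of workflow described (ingest raw exposures, run an ensemble of calibration codes, catalog and archive the products), here is a minimal, hypothetical orchestration sketch; the stage names and functions are placeholders and do not reflect the actual DESDM framework or its APIs.

        # Hypothetical sketch of a nightly-processing flow in the spirit of the DESDM description.
        # Every function below is a placeholder, not part of the real DESDM software.

        def ingest(raw_paths):
            """Register raw exposures in the archive (placeholder)."""
            return list(raw_paths)

        def detrend(exposure):
            """Stand-in for instrumental calibration (bias/flat corrections, etc.)."""
            return f"detrended:{exposure}"

        def calibrate(image):
            """Stand-in for astrometric and photometric calibration codes."""
            return f"calibrated:{image}"

        def catalog(image):
            """Stand-in for object detection and ingestion into the archive database."""
            return {"image": image, "objects_ingested": True}

        def process_night(raw_paths):
            return [catalog(calibrate(detrend(exp))) for exp in ingest(raw_paths)]

        print(process_night(["raw/exp0001.fits", "raw/exp0002.fits"]))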