    Ada software productivity prototypes: A case study

    Get PDF
    A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study were collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews, and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivity between subtasks. Factors considered are the percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principal findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and the use of modern programming practices are important; without such use, Ada is just a large, complex language which can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.
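
    The abstract reports its productivity findings only qualitatively. As a purely hypothetical illustration of the kind of module-level comparison it describes (the numbers below are invented, not the study's data), productivity can be taken as source lines of code per work-month and related to programmer experience:

        import numpy as np

        # Hypothetical module-level figures for illustration only; the study's
        # actual data are not reproduced in the abstract.
        sloc        = np.array([80_000, 60_000, 45_000, 70_000])   # delivered source lines per module
        work_months = np.array([400, 420, 200, 250])                # effort per module
        experience  = np.array([2.0, 1.5, 6.0, 5.0])                # average years of experience

        productivity = sloc / work_months                           # SLOC per work-month
        r = np.corrcoef(experience, productivity)[0, 1]
        print("productivity per module:", productivity.round(1))
        print("correlation with experience:", round(r, 2))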

    Hierarchical Expert Networks for Meta-Learning

    Full text link
    The goal of meta-learning is to train a model on a variety of learning tasks, such that it can adapt to new problems within only a few iterations. Here we propose a principled information-theoretic model that optimally partitions the underlying problem space such that specialized expert decision-makers solve the resulting sub-problems. To drive this specialization, we impose the same kind of information-processing constraints on both the partitioning and the expert decision-makers. We argue that this specialization leads to efficient adaptation to new tasks. To demonstrate the generality of our approach, we evaluate it on three meta-learning domains: image classification, regression, and reinforcement learning. Comment: Presented at the 4th ICML Workshop on Life Long Machine Learning, 202
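
    The abstract does not state the objective in closed form. A minimal sketch of the kind of information-constrained specialization it alludes to, in the style of a Blahut-Arimoto-like alternation between expert assignments and an expert prior, might look like the following; the utilities, the inverse temperature beta, and the iteration count are assumptions for illustration, not the authors' formulation:

        import numpy as np

        # Toy sketch (not the authors' code): information-constrained selection of
        # expert decision-makers. Each expert e has a utility U[e, x] for input x.
        # The "posterior" over experts trades off expected utility against the
        # divergence from a prior, controlled by an assumed inverse temperature beta.

        rng = np.random.default_rng(0)
        n_experts, n_inputs = 3, 5
        U = rng.normal(size=(n_experts, n_inputs))      # hypothetical utilities
        beta = 2.0                                      # resource constraint (assumed)

        prior = np.full(n_experts, 1.0 / n_experts)     # p(e)
        for _ in range(50):                             # Blahut-Arimoto-style iteration
            # posterior p(e|x) proportional to p(e) * exp(beta * U[e, x])
            logits = np.log(prior)[:, None] + beta * U
            post = np.exp(logits - logits.max(axis=0))
            post /= post.sum(axis=0)
            # update the marginal p(e) as the average posterior over inputs
            prior = post.mean(axis=1)

        print("expert marginal:", prior.round(3))
        print("assignments per input:", post.argmax(axis=0))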

    An Information-theoretic On-line Learning Principle for Specialization in Hierarchical Decision-Making Systems

    Full text link
    Information-theoretic bounded rationality describes utility-optimizing decision-makers whose limited information-processing capabilities are formalized by information constraints. One of the consequences of bounded rationality is that resource-limited decision-makers can join together to solve decision-making problems that are beyond the capabilities of each individual. Here, we study an information-theoretic principle that drives division of labor and specialization when decision-makers with information constraints are joined together. We devise an on-line learning rule based on this principle that learns a partitioning of the problem space such that the resulting sub-problems can be solved by specialized linear policies. We demonstrate the approach on decision-making problems whose complexity exceeds the capabilities of individual decision-makers but which can be solved by combining the decision-makers optimally. The strength of the model is that it is abstract and principled, yet has direct applications in classification, regression, reinforcement learning, and adaptive control.
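
    As a rough illustration of the on-line specialization idea (not the paper's exact rule), each incoming sample can be softly assigned to the expert whose linear policy currently fits it best, and only that expert is updated, so the experts carve up the problem space over time. The target function, learning rate, and assignment temperature below are assumptions:

        import numpy as np

        # Illustrative sketch: on-line specialization of linear regressors.
        # Each sample is softly assigned to an expert based on its current loss,
        # and only the responsible expert's weights are nudged.

        rng = np.random.default_rng(1)
        n_experts, dim = 2, 2
        W = rng.normal(scale=0.1, size=(n_experts, dim))   # linear policies (assumed init)
        lr, beta = 0.05, 5.0                               # step size / assignment sharpness (assumed)

        for step in range(2000):
            x = rng.normal(size=dim)
            # piecewise-linear target: a different linear rule on each half-space
            y = 2.0 * x[0] if x[0] > 0 else -3.0 * x[1]
            errs = (W @ x - y) ** 2                        # per-expert squared error
            resp = np.exp(-beta * errs)                    # soft responsibilities
            resp /= resp.sum()
            e = rng.choice(n_experts, p=resp)              # sample the responsible expert
            W[e] -= lr * 2 * (W[e] @ x - y) * x            # gradient step for that expert only

        print(np.round(W, 2))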

    Jean Chaintron, un militant communiste dans la préfectorale (1944-1947)

    No full text
    This thesis examines the prefectoral period of the communist activist Jean Chaintron, from September 1944 to November 1947, in Haute-Vienne. It first looks at the conditions under which a political activist came to be appointed to the post of prefect. It then turns to the administrative side of J. Chaintron's prefectoral experience. Finally, the work addresses the political dimension of appointing a prefect who remained an activist and a member of the Central Committee of the French Communist Party. This affiliation called into question the neutrality of a representative of the government and gave rise to conflicts and controversy.

    Estimating Software-Development Costs With Greater Accuracy

    Get PDF
    COCOMOST is a computer program for use in estimating software-development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost-estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known, pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
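
    The abstract does not reproduce the underlying cost model. For orientation, a generic COCOMO-style effort equation, effort = a * KSLOC^b * EAF, calibrated by an exhaustive search over the tuning constants (a, b) against historical projects, could be sketched as follows; the data, grid, and error measure are illustrative assumptions and not COCOMOST's actual '2cee' methodology:

        import itertools
        import numpy as np

        # Illustrative sketch only: a generic COCOMO-style model,
        #   effort = a * (KSLOC ** b) * EAF,
        # calibrated by an exhaustive search over (a, b) against historical projects.
        hist_ksloc  = np.array([12.0, 45.0, 90.0, 300.0])     # hypothetical project sizes (KSLOC)
        hist_effort = np.array([40.0, 160.0, 350.0, 1400.0])  # hypothetical person-months
        eaf = 1.0                                              # effort-adjustment factor (assumed)

        def mmre(a, b):
            """Mean magnitude of relative error for a candidate tuning (a, b)."""
            pred = a * hist_ksloc ** b * eaf
            return np.mean(np.abs(pred - hist_effort) / hist_effort)

        grid_a = np.linspace(1.5, 4.0, 26)
        grid_b = np.linspace(0.9, 1.3, 41)
        best = min(itertools.product(grid_a, grid_b), key=lambda ab: mmre(*ab))
        print("best (a, b):", np.round(best, 3), "MMRE:", round(mmre(*best), 3))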

    Partielle Normalisierung der kognitiven Funktionen und deren neuronalen Korrelate bei schwer depressiven Patienten nach Elektrokonvulsionstherapie (ECT)

    Full text link
    The dissertation addresses the extent of cognitive deficits in depressed patients before and after electroconvulsive therapy (ECT). The first study showed that certain memory processes, such as immediate memory, improved after successful ECT, while others, such as delayed memory, remained impaired. The second study demonstrated that the depressive symptomatology was accompanied by functional deficits in secondary areas of the auditory network, which persisted after ECT. The last two studies, using a cross-modal affective priming paradigm, showed that depressed patients before ECT exhibited hardly any cerebral activation in the limbic system as a neural response to the emotional stimuli, compared with healthy controls. In addition, they responded significantly more slowly both before and after ECT. After ECT, symptomatic improvement was associated with a partial increase in neural activity.

    Accurate Estimates Without Local Data?

    Get PDF
    The article of record as published may be found at http://dx.doi.org/10.1002/spip.414. Models of software projects input project details and output predictions via their internal tunings. The output predictions, therefore, are affected by variance in the project details P and variance in the internal tunings T. Local data is often used to constrain the internal tunings (reducing T). While constraining internal tunings with local data is always the preferred option, there exist some models for which constraining tuning is optional. We show empirically that, for the USC COCOMO family of models, the effects of P dominate the effects of T; i.e., the output variance of these models can be controlled without using local data to constrain the tuning variance (in ten case studies, we show that the estimates generated by only constraining P are very similar to those produced by constraining T with historical data). We conclude that, if possible, models should be designed such that the effects of the project options dominate the effects of the tuning options. Such models can be used for the purposes of decision making without elaborate, tedious, and time-consuming data collection from the local domain. Copyright © 2009 John Wiley & Sons, Ltd.
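
    The P-versus-T argument can be illustrated with a small Monte Carlo experiment: sample the project details and the internal tunings over plausible ranges and compare how much each contributes to the spread of a COCOMO-style estimate. The ranges and nominal values below are assumptions, not the paper's calibration data:

        import numpy as np

        # Toy illustration of the P-vs-T variance argument (assumed ranges):
        # effort = a * KSLOC^b * EAF, where (a, b) are the internal tunings T
        # and (KSLOC, EAF) are the project details P.

        rng = np.random.default_rng(2)
        N = 100_000

        def effort(a, b, ksloc, eaf):
            return a * ksloc ** b * eaf

        # vary only the tunings T, holding the project details P at nominal values
        a, b = rng.uniform(2.2, 3.2, N), rng.uniform(0.95, 1.15, N)
        spread_T = effort(a, b, 50.0, 1.0).std()

        # vary only the project details P, holding the tunings T at nominal values
        ksloc, eaf = rng.uniform(10.0, 200.0, N), rng.uniform(0.5, 1.5, N)
        spread_P = effort(2.7, 1.05, ksloc, eaf).std()

        print(f"spread from tunings T: {spread_T:.1f}   spread from project details P: {spread_P:.1f}")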

    How Engineers Really Think About Risk: A Study of JPL Engineers

    Get PDF
    25th International Forum on COCOMO and Systems/Software Cost Modeling presentation.

    Source Lines Counter (SLiC) Version 4.0

    Get PDF
    Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming-language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms, including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
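
    SLiC itself is not shown here. A minimal sketch of counting logical source statements for a C-like language (strip comments and blank lines, then count statement terminators and block openers) could look like the following; the heuristics are deliberately simplified assumptions, and SLiC's per-language counting rules are considerably more elaborate:

        import re
        import sys

        # Minimal illustrative counter (not SLiC itself): counts logical source
        # statements in a C-like file by stripping comments and blank lines, then
        # counting statement terminators and compound-statement openers.

        def logical_sloc(text: str) -> int:
            text = re.sub(r"/\*.*?\*/", "", text, flags=re.DOTALL)   # remove block comments
            text = re.sub(r"//[^\n]*", "", text)                      # remove line comments
            count = 0
            for line in text.splitlines():
                stripped = line.strip()
                if not stripped:
                    continue
                # count semicolon-terminated statements and opening braces
                count += stripped.count(";") + stripped.count("{")
            return count

        if __name__ == "__main__":
            for path in sys.argv[1:]:
                with open(path, encoding="utf-8", errors="replace") as fh:
                    print(path, logical_sloc(fh.read()))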