
    Volume III: A Technology for Predictable Assembly from Certifiable Components

    This report is the final volume in a three-volume series on component-based software engineering. Volumes I and II identified market conditions and technical concepts of component-based software technology, respectively. Volume III (this report) focuses on how component technology can be extended to achieve predictable assembly from certifiable components (PACC). An assembly of software components is predictable if its runtime behavior can be predicted from the properties of its components and their patterns of interaction. A component is certifiable if its (predictive) properties can be objectively measured or otherwise verified by independent third parties. This report identifies the key technical concepts of PACC, with an emphasis on the theory of prediction-enabled component technology (PECT).
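
    The notion of prediction from component properties can be made concrete with a small, purely hypothetical sketch (the component names, latency values, and sequential-pipeline pattern below are illustrative assumptions, not taken from the report): given certified worst-case latencies for each component and a known interaction pattern, an assembly-level latency bound follows by calculation rather than by testing the integrated system.

        # Hypothetical illustration of predictable assembly: the end-to-end
        # worst-case latency of a sequential pipeline is predicted from the
        # certified worst-case latencies of its components.

        certified_wcet_ms = {            # per-component certified properties
            "sensor_reader": 2.0,
            "filter": 5.5,
            "actuator_driver": 1.5,
        }

        pipeline = ["sensor_reader", "filter", "actuator_driver"]   # interaction pattern

        def predicted_latency(assembly, wcet):
            # For a purely sequential pattern the assembly property is the sum of
            # the component properties; other patterns need other reasoning models.
            return sum(wcet[name] for name in assembly)

        print(predicted_latency(pipeline, certified_wcet_ms))   # 9.0 ms upper bound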

    Software Component Certification: 10 Useful Distinctions

    Using software components to develop mission-critical systems poses a number of technical, organizational, and economic challenges. One persistent and largely unaddressed challenge is how the consumers of software components—that is, the developers of mission-critical systems—can obtain a meaningful level of trust in the runtime behavior of software components. The most frequently cited concerns are centered on issues of security; for example, trust that a component does not contain malicious code or exhibit vulnerabilities that can be exploited by malicious code. There are, however, other concerns about software component behavior that can be just as important. For example, in an embedded weapon system, it may be crucial to trust that a component will always execute a function within a particular time bound or never introduce unbounded priority inversion. Certification is a practical, proven means of establishing trust in various sorts of things in other disciplines and is, therefore, a natural contender for developing trust in software components. This technical note does not propose a particular certification regimen for components. Rather, it introduces a series of 10 distinctions that can help in understanding different aspects of certification in the context of software components.

    Issues and Techniques of CASE Integration with Configuration Management

    Commercial computer-aided software engineering (CASE) tool technology has emerged as an important component of practical software development environments. Issues of CASE tool integration have received heightened attention in recent years, with various commercial products and technical approaches promising to make inroads into this difficult problem. One aspect of CASE integration that has not been adequately addressed is the integration of CASE tools with configuration management (CM), including both CM policies and systems. Organizations need to address how to make CASE tools from different vendors work effectively with their CM policies and tools (in effect, how to integrate CASE with CM) within the context of the rapidly evolving state of commercial integration technology. This report describes key issues of the integration of CASE with CM from a third-party integrator's perspective, i.e., how to approach the integration of CASE and CM in such a way as not to require fundamental changes to the implementation of the tools or CM systems themselves.

    Snapshot of CCL: A Language for Predictable Assembly

    The Construction and Composition Language (CCL) plays several roles in our approach to achieving automated predictable assembly. CCL is used to produce specifications that contain structural, behavioral, and analysis-specific information about component technologies, as well as components and assemblies in such technologies. These specifications are translated to one or more reasoning frameworks that analyze and predict the runtime properties of assemblies. CCL processors can also be used to automate many of the constructive activities of component-based development through various forms of program generation. Using a common specification for prediction and construction improves confidence that analysis models match implementations. This report presents a snapshot of CCL by examining a small example CCL specification.
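
    CCL syntax is not reproduced here; the sketch below is a deliberately simplified, hypothetical stand-in (plain Python data structures with invented component and pin names) for the idea the abstract describes: a single specification of components and their connections feeding both an analysis model and generated glue code.

        # Hypothetical stand-in for a CCL-like specification: one description of
        # components, pins, and connections drives both prediction and construction.

        assembly_spec = {
            "components": {
                "clock":  {"out": ["tick"]},
                "logger": {"in":  ["msg"]},
            },
            "connections": [("clock.tick", "logger.msg")],
        }

        def to_analysis_model(spec):
            # Hand the connection topology to a reasoning framework
            # (here reduced to a bare edge list).
            return {"edges": list(spec["connections"])}

        def generate_glue(spec):
            # Emit wiring calls for some target component runtime.
            return "\n".join(f"connect({src!r}, {dst!r})"
                             for src, dst in spec["connections"])

        print(to_analysis_model(assembly_spec))
        print(generate_glue(assembly_spec))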

    Using the Vickrey-Clarke-Groves Auction Mechanism for Enhanced Bandwidth Allocation in Tactical Data Networks

    A mechanism is an institution, such as an auction, a voting protocol, or a market, that defines the rules for how humans are allowed to interact and governs the procedure by which collective decisions are made. Computational mechanisms arise where computational agents work on behalf of humans. This report describes an investigation of the potential for using computational mechanisms to improve the quality of a combat group's common operating picture in a setting where network bandwidth is scarce. Technical details are provided about a robust emulation of a tactical data network (based loosely on the Navy LINK-11) that was developed for the study. The report also outlines the basic principles of mechanism design, as well as the features of the Vickrey-Clarke-Groves (VCG) auction mechanism implemented for the study. The report describes how the VCG mechanism was used to allocate network bandwidth for sensor data fusion. Empirical results of the investigation are presented, and ideas for further exploration are offered. The overall conclusion of the study is that computational mechanism design is a promising alternative to traditional systems approaches to resource allocation in systems that are highly dynamic, involve many actors engaged in varying activities, and have varying—and possibly competing—goals.
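
    The flavor of the VCG mechanism can be shown with a heavily simplified sketch (unit-demand bidders competing for k identical transmission slots; the study's actual bid and network model was richer than this): each winner pays the externality it imposes on the other bidders, which is what makes truthful bidding a dominant strategy.

        # Simplified VCG auction for k identical bandwidth slots, unit demand.

        def vcg_allocate(bids, k):
            # bids: bidder -> reported value for one slot.
            ranked = sorted(bids, key=bids.get, reverse=True)
            winners = ranked[:k]

            def others_welfare(excluded):
                vals = sorted((v for b, v in bids.items() if b != excluded),
                              reverse=True)
                return sum(vals[:k])

            payments = {}
            for w in winners:
                with_w = sum(bids[x] for x in winners if x != w)   # others' value now
                payments[w] = others_welfare(w) - with_w           # externality of w
            return winners, payments

        # Example: three sensor feeds compete for two transmission slots.
        winners, pay = vcg_allocate({"radar": 9.0, "sonar": 7.0, "video": 4.0}, k=2)
        print(winners, pay)   # ['radar', 'sonar'] {'radar': 4.0, 'sonar': 4.0}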

    Using Containers to Enforce Smart Constraints for Performance in Industrial Systems

    Today, software engineering is concerned less with individual programs than with large-scale networks of interacting programs. For large-scale networks, engineering problems emerge that go well beyond functional correctness (the purview of programming) and encompass equally crucial nonfunctional qualities such as security, performance, availability, and fault tolerance. A pivotal challenge, then, is to provide techniques to routinely construct systems that have predictable nonfunctional quality. These techniques impose constraints on the problem being solved and on the form solutions can take. This technical note shows how smart constraints can be embedded in software infrastructure, so that systems conforming to those constraints are predictable by construction.
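
    As a rough, hypothetical illustration of this idea (the class and budget below are invented for illustration, not taken from the note), a container can mediate every invocation of the custom code it hosts and reject executions that violate a declared performance constraint, so that an assembly built only from contained components stays within its predicted behavior.

        # Sketch: custom code runs inside a container that mediates all calls
        # and enforces a declared per-invocation execution budget.

        import time

        class Container:
            def __init__(self, component, budget_ms):
                self._component = component
                self._budget_ms = budget_ms

            def invoke(self, *args, **kwargs):
                start = time.perf_counter()
                result = self._component(*args, **kwargs)
                elapsed_ms = (time.perf_counter() - start) * 1000.0
                if elapsed_ms > self._budget_ms:
                    # A violated constraint invalidates the performance prediction.
                    raise RuntimeError(f"budget {self._budget_ms} ms exceeded "
                                       f"({elapsed_ms:.2f} ms)")
                return result

        # The assembly only ever sees the container, never the raw custom code.
        doubler = Container(lambda x: x * 2, budget_ms=5.0)
        print(doubler.invoke(21))   # 42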

    Distributed Object Technology With CORBA and Java: Key Concepts and Implications

    The purpose of this report is to analyze the potential impact of distributed object technology (DOT) on software engineering practice. The analysis culminates with the conclusion that the technology will have a significant influence on both the design and reengineering of information systems and the processes used to build them. We see a profound impact and fundamental change in both technical thinking and practice as a result of the related technologies we group together as DOT.

    Statistical Models for Empirical Component Properties and Assembly-Level Property Predictions: Toward Standard Labeling

    One risk inherent in the use of software components has been that the behavior of assemblies of components is discovered only after their integration. The objective of our work is to enable designers to use known (and certified) component properties as parameters to models that can be used to predict assembly-level properties. Our concern in this paper is with empirical component properties and compositional reasoning, rather than formal properties and reasoning. Empirical component properties must be measured; assessing the effectiveness of predictions based on these properties also involves measurement. This, in turn, introduces systematic and random measurement error. As a consequence, statistical models are needed to describe empirical component properties and predictions. In this position paper, we identify the statistical models that we have found useful in our research and that we believe can form a basis for standard industry labels for component properties and prediction theories.
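
    A minimal sketch of what such a statistical label might contain (the sample values and the normal-approximation interval are illustrative assumptions, not the paper's models): a component property is reported as a sample mean with an uncertainty bound, and a naive sequential-composition prediction propagates that uncertainty, assuming independent measurement errors.

        # Statistical 'label' for an empirical component property and a simple
        # assembly-level prediction that propagates measurement uncertainty.

        import statistics

        def label(samples, z=1.96):
            # Return (mean, half-width of an approximate 95% confidence interval).
            mean = statistics.fmean(samples)
            sem = statistics.stdev(samples) / len(samples) ** 0.5   # standard error
            return mean, z * sem

        comp_a = label([10.1, 9.8, 10.3, 10.0, 9.9])   # measured latencies, ms
        comp_b = label([4.9, 5.2, 5.0, 5.1, 4.8])

        # Sequential composition: means add; independent errors add in quadrature.
        pred_mean = comp_a[0] + comp_b[0]
        pred_err = (comp_a[1] ** 2 + comp_b[1] ** 2) ** 0.5
        print(f"predicted assembly latency: {pred_mean:.2f} +/- {pred_err:.2f} ms")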

    Workshop on Model-Driven Architecture and Program Generation

    This technical note summarizes the results of a workshop held on June 2, 2006, at the Software Engineering Institute in Pittsburgh, Pennsylvania (USA). The workshop explored business and technical aspects of program generation in the context of the Object Management Group's model-driven architecture development approach. The workshop was structured around consideration of the perspectives of five different communities: standards body, vendor, acquisition, development, and research. This note recapitulates these individual perspectives and highlights important themes.

    Pin Component Technology (V1.0) and Its C Interface

    Pin is a simple component technology suitable for building embedded software applications. Pin implements the container idiom for software components. Containers provide a prefabricated shell in which custom code executes and through which all interactions between custom code and its external environment are mediated. Pin is a component technology for pure assembly—systems are assembled by selecting components and connecting their interfaces (which are composed of communication channels called pins). This report describes the main concepts of Pin and documents the C-language interface to Pin V1.0.
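
    The pure-assembly idea can be illustrated with a small model (an invented illustration of the idiom in Python, not Pin's actual C interface): components expose named input and output pins, and a system is built solely by connecting output pins to input pins, with no hand-written glue code.

        # Illustrative model of pure assembly with pins (not the Pin V1.0 API).

        class Component:
            def __init__(self, name):
                self.name = name
                self._sinks = {}     # output pin -> list of (component, input pin)

            def connect(self, out_pin, other, in_pin):
                self._sinks.setdefault(out_pin, []).append((other, in_pin))

            def emit(self, out_pin, payload):
                for target, in_pin in self._sinks.get(out_pin, []):
                    target.receive(in_pin, payload)

            def receive(self, in_pin, payload):
                print(f"{self.name}.{in_pin} <- {payload}")

        # Assembly: select components and connect their pins.
        sensor, logger = Component("sensor"), Component("logger")
        sensor.connect("reading", logger, "msg")
        sensor.emit("reading", 42)   # prints: logger.msg <- 42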