200 research outputs found

    Grosch's law: a statistical illusion?

    In this paper Grosch's law, a central law on economies of scale in computer hardware pricing, is discussed. The history and various validation efforts are examined in detail. It is shown how the last set of validations, performed during the eighties, may rest on a statistical misinterpretation, although this effect may have been present in all validation attempts, including the earliest ones. Simulation experiments reveal that constant returns to scale, combined with decreasing computer prices, may give the illusion of Grosch's law when regression models are fitted to computer prices spanning many years. The paper also shows how the appropriate definition of computer capacity, and in particular Kleinrock's power definition, plays a central role in economies of scale for computer prices.
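    The claimed illusion can be illustrated with a small simulation. This is a sketch of the mechanism the abstract describes, not the paper's actual experiment; all numbers (growth rates, noise levels, sample sizes) are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 10 "years" of machines. Within each year, capacity is exactly
# proportional to price (constant returns to scale), but capacity per
# dollar improves by 40% per year (falling computer prices), and typical
# prices drift upward over time.
rows = []
for t in range(10):
    log_price = 0.2 * t + rng.normal(0.0, 0.5, size=25)
    log_capacity = log_price + t * np.log(1.4) + rng.normal(0.0, 0.1, size=25)
    rows.append(np.column_stack([log_price, log_capacity]))

data = np.vstack(rows)

# Pooled log-log regression of capacity on price across all years.
slope, intercept = np.polyfit(data[:, 0], data[:, 1], deg=1)
print(f"pooled elasticity: {slope:.2f}")
```

Within any single year the true elasticity is exactly 1, yet the pooled fit reports an elasticity well above 1 (apparent economies of scale), because price is correlated with vintage and vintage carries the technological price decline.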

    Modeling the dialogue aspects of an information system.

    In this paper we investigate the techniques offered by current object-oriented development methods for specifying the user-system dialogue aspect of a software system. Current development methods give few guidelines on how to model this aspect, and the available techniques need refinement and elaboration to fit this particular task in the software specification process. The paper first compares a number of approaches. The common elements of these approaches are summarized and further developed into one comprehensive set of techniques that addresses the needs of functional requirements analysis.

    Complexity measures for object-oriented conceptual models of an application domain.

    According to Norman Fenton, little work has been done on measuring the complexity of the problems underlying software development. Nonetheless, it is believed that this attribute has a significant impact on software quality and development effort. A substantial portion of the underlying problems is captured in the conceptual model of the application domain. Based on previous work on conceptual modelling of application domains, the attribute 'complexity of a conceptual model' is formally defined in this paper using elementary concepts from Measure Theory. Moreover, a number of complexity measures are defined and validated against this complexity definition. It is argued and demonstrated that these problem domain measures are part of a solution to the problem outlined by Norman Fenton.

    Queue lengths and waiting times in the two-class two-server queue with nonpreemptive heterogeneous priority structures.

    Our aim is to analyze a multiserver queue with nonpreemptive heterogeneous priority structures, which arises in the performance evaluation of batch initiator settings in MVS. We use matrix-geometric methods and derive the stationary distribution of queue lengths and waiting times for the Markovian two-class two-server case.

    On closed queueing networks with mixed preemptive resume priority servers.

    This paper discusses a typical closed queueing network model in which multiple preemptive resume servers are present, with a different priority structure at each priority node. An algorithm is developed that is applicable to the three-node two-class model, and results are compared to point estimates obtained from simulation. The algorithm is partly based on the Delay/MVA algorithm developed by Bondi and Chuang, because of the accuracy with which it calculates instant arrival queue lengths at FCFS servers. Results are also compared with results obtained from the Shadow Approximation.
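    For background, the algorithms mentioned here extend classical Mean Value Analysis. Below is the standard exact MVA recursion for a single-class product-form closed network, which is the base case such priority approximations build on; it is not the paper's Delay/MVA variant, and the three-station service demands are hypothetical numbers.

```python
def exact_mva(demands, n_customers):
    """Exact Mean Value Analysis for a single-class closed
    product-form network of FCFS queueing stations.

    demands[k] = total service demand of one customer at station k.
    Returns (network throughput, per-station mean queue lengths).
    """
    q = [0.0] * len(demands)          # mean queue lengths with 0 customers
    x = 0.0
    for n in range(1, n_customers + 1):
        # Arrival theorem: an arriving customer sees the station as it
        # is with one customer fewer in the network.
        r = [d * (1.0 + qk) for d, qk in zip(demands, q)]
        x = n / sum(r)                # throughput via Little's law
        q = [x * rk for rk in r]      # updated mean queue lengths
    return x, q

# Three-station example (hypothetical demands, in seconds).
throughput, queues = exact_mva([0.10, 0.06, 0.04], n_customers=8)
print(f"throughput = {throughput:.3f}/s")
```

Preemption and per-node priority structures break the product-form assumption behind this recursion, which is why the paper needs an approximate algorithm validated against simulation.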

    Characterising aggregations with existence dependency.

    The concept of aggregation is considered one of the basic principles of object-oriented analysis. There is, however, no standard definition of this concept, and each object-oriented analysis method has its own definition of aggregation. The aim of this paper is not to discuss the different types of aggregation that exist. Rather, having assessed the complexity of the concept, we illustrate how a basic set of formal concepts is sufficient to define the structural and behavioral aspects of the different existing flavours of aggregation. If a development method wants to offer a rich concept such as aggregation, it can define the semantics of the desired flavour of aggregation using these core formal concepts. Analysts then have the choice to use the aggregation defined by the method or to fall back on the core concepts if a different flavour of aggregation is needed to model the situation at hand.

    Generic object models and business process (re)design.

    This paper explores the capabilities of generic object-relationship models in the context of business process modeling and business process re-engineering. The presentation is based on a framework for strategic business function typology. It is shown how generic models can be developed for each kind of business function within the typology. Business process re-engineering can then be represented by transformations of business models, corresponding to shifts within the typology framework. Although the results of the paper are presented by means of one particular dialect of the object-relationship approach, they remain valid for all object-oriented approaches that make use of objects and relationships. This paper contributes to the further formalisation of business process modeling.

    Activity Based Costing techniques for workload characterization.

    This paper addresses the problem of non-captured service demands in workload monitoring data. Capture ratios are the coefficients that correct measured workload service demands so that they fit the global system monitoring data. This paper proposes new techniques for determining capture ratios by means of Activity Based Costing. The techniques are illustrated with a case study, which also demonstrates the non-trivial nature of capture ratios in practical performance analysis.
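    The idea behind capture ratios can be sketched numerically. This is a minimal illustration of the concept, not the paper's specific technique: all workload names, CPU figures, and the I/O cost driver below are hypothetical.

```python
# Hypothetical monitoring interval: CPU seconds attributed to each
# workload by the workload monitor, versus the total CPU seconds
# reported by the system-level monitor.
captured = {"batch": 120.0, "online": 300.0, "reporting": 80.0}
total_cpu = 600.0                      # 100s of CPU went uncaptured

# Classical approach: one global capture ratio applied uniformly,
# scaling every workload's demand by the same factor.
global_ratio = total_cpu / sum(captured.values())
uniform = {w: d * global_ratio for w, d in captured.items()}

# ABC-style approach (sketch): distribute the uncaptured CPU according
# to a cost driver -- here, hypothetical I/O counts per workload --
# rather than proportionally to captured CPU.
io_counts = {"batch": 5000, "online": 1000, "reporting": 4000}
uncaptured = total_cpu - sum(captured.values())
total_io = sum(io_counts.values())
abc = {w: captured[w] + uncaptured * io_counts[w] / total_io
       for w in captured}

print("uniform :", {w: round(v, 1) for w, v in uniform.items()})
print("ABC     :", {w: round(v, 1) for w, v in abc.items()})
```

Both corrections reconcile the workload demands with the global total, but they attribute the overhead very differently: an I/O-heavy, CPU-light workload receives far more of the uncaptured time under the driver-based allocation.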

    DISTANCE: a framework for software measure construction.

    In this paper we present a framework for software measurement that is specifically suited to the measurement needs of empirical software engineering research. The framework offers an approach to measurement that builds upon the easily imagined, detected and visualised concepts of similarity and dissimilarity between software entities. These concepts are used both to model the software attributes of interest and to define the corresponding software measures. Central to the framework is a process model that embeds constructive procedures for attribute modelling and measure construction into a goal-oriented approach to empirical software engineering studies. The measurement-theoretic principles underlying our approach ensure the construct validity of the resulting measures. The approach was tested on a popular suite of object-oriented design measures. We further show that our measure construction method compares favourably to related work.
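    The dissimilarity-based idea can be illustrated with a toy measure. This is a sketch in the spirit of the framework, not the DISTANCE method itself: an attribute is modelled as the distance from an entity to a reference "null" entity, here using the symmetric-difference metric on finite sets, with hypothetical class designs as the entities.

```python
def sym_diff_distance(a: set, b: set) -> int:
    """Symmetric-difference distance: a genuine metric on finite sets
    (non-negative, symmetric, satisfies the triangle inequality)."""
    return len(a ^ b)

def interface_size(methods: set) -> int:
    """Measure a class's interface size as its dissimilarity from the
    reference 'empty' class, i.e. the entity with no methods at all."""
    return sym_diff_distance(methods, set())

# Hypothetical classes from an object-oriented design.
stack = {"push", "pop", "peek"}
queue = {"enqueue", "dequeue", "peek"}

print(interface_size(stack))            # 3
print(sym_diff_distance(stack, queue))  # 4: dissimilarity of the two designs
```

Because the underlying distance is a metric, measures defined this way inherit well-understood mathematical properties, which is the kind of measurement-theoretic grounding the abstract refers to as construct validity.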