
    A conceptual modeling methodology based on niches and granularity

    This paper presents a methodology for conceptual modeling based on a new modeling primitive, the niche, and the associated constructs of granularity and reconciliation. A niche is an environment where entities interact for a specific purpose, playing specific roles, according to the norms and constraints of that environment. Granularity refers to the relative level of power or influence of an entity within a niche. Reconciliation is a relationship from N entities onto one reconciled entity; it explicitly represents a situation where two or more different perspectives of the same entity have been reconciled, by negotiation, into a single consensus view. The proposed methodology provides a systematic way of designing conceptual models, together with a process for normalising inappropriate relationships. Normalising is a prescriptive process for identifying and remedying inconsistencies within a model based on granularities. Drawing on a number of case studies, we show how niches and granularity make complexity easier to manage, highlight inaccuracies in a model, identify opportunities for achieving project goals, and reduce semantic heterogeneity.
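    To make the three constructs concrete, the sketch below shows one way they might be represented in code. Everything here is our illustration, not notation from the paper: the class names, the integer granularity scale, and the normalising rule are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    granularity: int  # relative power/influence within a niche (assumed integer scale)

@dataclass
class Niche:
    purpose: str                      # the niche's specific purpose
    members: list[Entity] = field(default_factory=list)

    def relationship_ok(self, a: Entity, b: Entity) -> bool:
        """Assumed normalising rule: flag relationships between entities
        whose granularity gap is too large to be modelled directly."""
        return abs(a.granularity - b.granularity) <= 1

def reconcile(perspectives: list[Entity], consensus_name: str) -> Entity:
    """N-to-1 reconciliation: collapse several perspectives of the same entity
    into a single consensus entity (the granularity choice here is assumed)."""
    return Entity(consensus_name, max(p.granularity for p in perspectives))

# Hypothetical usage: two departmental views of "customer" reconciled into one.
sales_view = Entity("customer(sales)", granularity=2)
support_view = Entity("customer(support)", granularity=1)
customer = reconcile([sales_view, support_view], "customer")
```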

    Planck early results. II. The thermal performance of Planck

    The performance of the Planck instruments in space is enabled by their low operating temperatures, 20 K for LFI and 0.1 K for HFI, achieved through a combination of passive radiative cooling and three active mechanical coolers. The scientific requirement for very broad frequency coverage led to two detector technologies with widely different temperature and cooling needs. Active coolers could satisfy these needs; a helium cryostat, as used by previous cryogenic space missions (IRAS, COBE, ISO, Spitzer, AKARI), could not. Radiative cooling is provided by three V-groove radiators and a large telescope baffle. The active coolers are a hydrogen sorption cooler (<20 K), a 4He Joule-Thomson cooler (4.7 K), and a 3He-4He dilution cooler (1.4 K and 0.1 K). The flight system was at ambient temperature at launch and cooled in space to operating conditions. The HFI bolometer plate reached 93 mK on 3 July 2009, 50 days after launch. The solar panel always faces the Sun, shadowing the rest of Planck, and operates at a mean temperature of 384 K. At the other end of the spacecraft, the telescope baffle operates at 42.3 K and the telescope primary mirror at 35.9 K. The temperatures of key parts of the instruments are stabilized by both active and passive methods. Temperature fluctuations are driven by changes in the distance from the Sun, by sorption cooler cycling and fluctuations in gas-liquid flow, and by fluctuations in the cosmic ray flux on the dilution and bolometer plates. These fluctuations do not compromise the science data.
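    The cooling chain described above reads naturally as an ordered sequence of temperature stages. The snippet below simply encodes the temperatures quoted in the abstract; the data structure and labels are ours, not Planck documentation.

```python
# Planck cooling chain, warmest to coldest, with operating temperatures
# quoted in the abstract (kelvin). The list structure itself is illustrative.
COOLING_CHAIN = [
    ("solar panel (mean)",          384.0),
    ("telescope baffle",             42.3),
    ("telescope primary mirror",     35.9),
    ("hydrogen sorption cooler",     20.0),   # <20 K stage, serves LFI
    ("4He Joule-Thomson cooler",      4.7),
    ("3He-4He dilution cooler",       1.4),   # intermediate dilution stage
    ("dilution cooler, cold end",     0.1),   # serves HFI
    ("HFI bolometer plate",           0.093), # 93 mK reached on 3 July 2009
]

for stage, temp_k in COOLING_CHAIN:
    print(f"{stage:32s} {temp_k:8.3f} K")
```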

    Data Engineering. Proceedings of the 15th International Conference


    What Do We Know and How Well Do We Know It? Current Knowledge about Software Engineering Practices

    Context: The ‘prescriptions’ used in software engineering for developing and maintaining systems make use of a set of ‘practice models’, which have largely been derived by codifying the successful experiences of expert practitioners. Aim: To review the ways in which empirical practices, and evidence-based studies in particular, have begun to provide more systematic sources of evidence about what practices work, when, and why. Method: This review examines the current situation regarding empirical studies in software engineering and considers some of the ways in which evidence-based studies can inform and influence practice. Results: A mix of secondary and tertiary studies has been used to illustrate the issues. Conclusion: The corpus of evidence-based knowledge for software engineering is still developing. However, outcomes so far are encouraging and indicate that in the future we can expect evidence-based research to play a larger role in informing practice, standards, and teaching.

    Documenting and designing QVTo model transformations through mathematics

    Model transformations play an essential role in Model Driven Engineering (MDE), as they provide the means to use models as first-class artifacts in the software development process. While a number of languages exist that are specifically designed for programming model transformations, the practical challenges of documenting and designing model transformations are hardly addressed. In this paper we demonstrate how QVTo model transformations can be described and designed informally through the mathematical notation of set theory and functions. We align the QVTo concepts with the mathematical concepts and, building on the latter, formulate two design principles for developing QVTo transformations: structural decomposition and the chaining of model transformations.
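    As a rough illustration of the second principle, a transformation can be viewed as a function between model sets, and chaining as function composition. The sketch below is our own rendering of that idea in Python, not the paper's QVTo code; the pipeline steps are hypothetical placeholders.

```python
from typing import Callable

# Treat a model transformation as a function from one model space to another,
# following the paper's alignment of QVTo concepts with set theory/functions.
Model = dict            # stand-in for a model instance (illustrative)
Transformation = Callable[[Model], Model]

def chain(*steps: Transformation) -> Transformation:
    """Chaining transformations = function composition, applied left to right."""
    def composed(model: Model) -> Model:
        for step in steps:
            model = step(model)
        return model
    return composed

# Hypothetical pipeline: normalize a source model, then map it onto a target
# metamodel. Both steps are placeholders for real QVTo mappings.
normalize: Transformation = lambda m: {**m, "normalized": True}
to_target: Transformation = lambda m: {"target": m}

pipeline = chain(normalize, to_target)
print(pipeline({"name": "example"}))
```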

    Responsible data science: using event data in a “people friendly” manner

    The omnipresence of event data and powerful process mining techniques make it possible to quickly learn process models describing what people and organizations really do. Recent breakthroughs in process mining have produced powerful techniques to discover the real processes, to detect deviations from normative process models, and to analyze bottlenecks and waste. Process mining and other data science techniques can be used to improve processes within any organization. However, there are also great concerns about the use of data for such purposes. Increasingly, customers, patients, and other stakeholders worry about “irresponsible” forms of data science. Automated, data-driven decisions may be unfair or non-transparent. Confidential data may be shared unintentionally or abused by third parties. Each step in the “data science pipeline” (from raw data to decisions) may introduce inaccuracies: for example, if the data used to learn a model reflect existing social biases, the algorithm is likely to incorporate those biases. These concerns could lead to resistance against the large-scale use of data and make it impossible to reap the benefits of process mining and other data science approaches. This paper discusses Responsible Process Mining (RPM) as a new challenge in the broader field of Responsible Data Science (RDS). Rather than avoiding the use of (event) data altogether, we strongly believe that techniques, infrastructures, and approaches can be made responsible by design. Failing to address the challenges of RPM/RDS may lead to a society where (event) data are misused or analysis results are deeply mistrusted.
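    As one small example of what "responsible by design" could mean for event data, the sketch below pseudonymizes case identifiers with a keyed hash before analysis, so that events can still be correlated per case without exposing who the case refers to. This is our illustration of a standard privacy technique, not a method proposed in the paper; all names and the key-handling scheme are assumptions.

```python
import hashlib
import hmac

# Assumption: the secret key is managed out of band by the data owner, so
# third parties seeing the log cannot reverse the case identifiers.
SECRET_KEY = b"replace-with-a-real-secret"

def pseudonymize_case_id(case_id: str) -> str:
    """Keyed hash (HMAC-SHA256) of a case ID, truncated for readability."""
    return hmac.new(SECRET_KEY, case_id.encode(), hashlib.sha256).hexdigest()[:16]

# Hypothetical event log: same case ID across events, so process mining can
# still reconstruct the case after pseudonymization.
event_log = [
    {"case": "patient-1042", "activity": "register", "ts": "2017-01-05T09:30"},
    {"case": "patient-1042", "activity": "triage",   "ts": "2017-01-05T09:45"},
]

safe_log = [{**e, "case": pseudonymize_case_id(e["case"])} for e in event_log]
print(safe_log)
```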