
    Applying formal methods to standard development: the open distributed processing experience

    Since their introduction, formal methods have been applied in various ways to different standards. This paper gives an account of these applications, focusing on one application in particular: the development of a framework for creating standards for Open Distributed Processing (ODP). Following an introduction to ODP, the paper gives an insight into the current work on formalising the architecture of the Reference Model of ODP (RM-ODP), highlighting the advantages to be gained. The different approaches currently being taken are shown, together with their associated advantages and disadvantages. The paper concludes that there is no one all-purpose approach which can be used in preference to all others, but that a combination of approaches is desirable to best fulfil the potential of formal methods in developing an architectural semantics for ODP.

    Sensory Measurements: Coordination and Standardization

    Do sensory measurements deserve the label of “measurement”? We argue that they do. They fit with an epistemological view of measurement held in current philosophy of science, and they face the same kinds of epistemological challenges as physical measurements do: the problem of coordination and the problem of standardization. Both problems are addressed, for all measurements, through the process of “epistemic iteration.” We also argue for distinguishing the problem of standardization from the problem of coordination. To exemplify our claims, we draw on olfactory performance tests, especially studies linking olfactory decline to neurodegenerative disorders.

    Stabilizing knowledge through standards - A perspective for the humanities

    It is usual to consider that standards generate mixed feelings among scientists. They are often seen as not really reflecting the state of the art in a given domain and as a hindrance to scientific creativity. Still, scientists should theoretically be best placed to bring their expertise into standards development, while being more neutral on issues that may typically be tied to competing industrial interests. Even if developing standards in the humanities could be thought of as more complex still, we will show how it can be made feasible through the experience gained both within the Text Encoding Initiative consortium and the International Organisation for Standardisation. By taking the specific case of lexical resources, we will try to show how this brings about new ideas for designing future research infrastructures in the human and social sciences.

    An Analysis of Service Ontologies

    Services are increasingly shaping the world’s economic activity. Service provision and consumption have been profiting from advances in ICT, but the decentralization and heterogeneity of the involved service entities still pose engineering challenges. One of these challenges is to achieve semantic interoperability among these autonomous entities. Semantic web technology aims at addressing this challenge on a large scale, and has matured over recent years. This is evident from the various efforts reported in the literature in which service knowledge is represented in terms of ontologies developed either in individual research projects or in standardization bodies. This paper aims at analyzing the most relevant service ontologies available today for their suitability to cope with the service semantic interoperability challenge. We take the vision of the Internet of Services (IoS) as our motivation to identify the requirements for service ontologies. We adopt a formal approach to ontology design and evaluation in our analysis. We start by defining informal competency questions derived from a motivating scenario, and we identify relevant concepts and properties in service ontologies that match the formal ontological representation of these questions. We analyze the service ontologies with our concepts and questions, so that each ontology is positioned and evaluated according to its utility. The gaps we identify as the result of our analysis provide an indication of open challenges and future work.

    Life Cycle Costing and Food Systems: Concepts, Trends, and Challenges of Impact Valuation

    Our global food systems create pervasive environmental, social, and health impacts. Impact valuation is an emerging concept that aims to quantify all environmental, social, and health costs of food systems in an attempt to make the true cost of food more transparent. It is also designed to facilitate the transformation of global food systems. The concept of impact valuation is emerging at the same time as, and partly as a response to, calls for the development of legal mechanisms to address environmental, social, and health concerns. Information has long been understood both as a necessary precursor for regulation and as a regulatory tool in and of itself. With global supply chains and widespread impacts, the data necessary to produce robust and complete impact valuation require participation and cooperation from a variety of food system actors. New costing methods, beyond basic accounting, are necessary to incorporate the full scope of impacts and stakeholders. Furthermore, there is a range of unanswered questions surrounding the realization of impact valuation methods, e.g. data sharing, international privacy, corporate transparency, limitations on valuation itself, and data collection standardization. Because of the proliferation of calls for costing tools, this article steps back and assesses the current development of impact valuation methods. In this article, we review current methods and initiatives for the implementation of food system impact valuation. We conclude that in some instances, calls for the implementation of costing have outpaced available and reliable data collection and current costing techniques. Many existing initiatives are being developed without adequate consideration of the legal challenges that hinder implementation. Finally, we conclude with a reminder that although impact valuation tools are most often sought and implemented in service of market-based tools for reform, they can also serve as a basis for robust public policies.

    ClouNS - A Cloud-native Application Reference Model for Enterprise Architects

    The capability to operate cloud-native applications can generate enormous business growth and value. But enterprise architects should be aware that cloud-native applications are vulnerable to vendor lock-in. We investigated cloud-native application design principles, public cloud service providers, and industrial cloud standards. All results indicate that most cloud service categories seem to foster vendor lock-in situations, which might be especially problematic for enterprise architectures. This might sound disillusioning at first. However, we present a reference model for cloud-native applications that relies only on a small subset of well-standardized IaaS services. The reference model can be used for codifying cloud technologies. It can guide technology identification, classification, adoption, research, and development processes for cloud-native applications and for vendor-lock-in-aware enterprise architecture engineering methodologies.