
    archiTECTONICS: Pre- and Trans-Disciplinary Reference in Beginning Design

    This presentation was part of the session "Pedagogy: Procedures, Scaffolds, Strategies, Tactics" at the 24th National Conference on the Beginning Design Student. Pedagogical approaches to beginning design in architecture often assume trans-disciplinary modes of exploration to filter problem parameters and sculpt perceptual outlook for iterative potential. A closer look suggests moments within the architectural design process that come before, or around, the discipline itself, in the form of other disciplines accompanied by basic principles such as visual literacy. Iterating and perceiving through every disciplinary dynamic, instance, and action in the process of designing transcends, builds on, and structures its neighbor, supporting explorative sequencing, intention, and the growth of sensibilities in design resolution. An acute awareness of disciplinary state in a maturing design process can alleviate obscurity of ideological foundation and facilitate growth toward trans-disciplinary thinking, making, and communicating in a root discipline such as architecture. How can beginning design instructors guide young designers to keep ideas and concepts for design in focus, recognizing that root disciplines transcend pre- and trans-disciplinary processes? Does recognizing variation in pace, induced by digital and analog tools, and in the intention of design iteration by discipline, instill clarity through pre-disciplinary thinking, perception, and operation? Trans-disciplinary exercise provokes awareness of pre-disciplinary foundations, furthering possibilities for unique root-disciplinary understandings and results. The developed exercise, archiTECTONIC, recognizes and cycles through reasoning, conceptualization, and iteration in a trans-disciplinary sequence, allowing the beginning design student to recognize pre-disciplinary ideology, pace, and purpose when processing ideas through the fundamentals of architectural design. Engaging this as a strategy for seeing, thinking, and maneuvering through a dynamic process provides design liberty and clarity for processing and communicating in a root discipline, in this case architecture.

    Issues about the Adoption of Formal Methods for Dependable Composition of Web Services

    Web Services provide interoperable mechanisms for describing, locating and invoking services over the Internet; composition further makes it possible to build complex services out of simpler ones for complex B2B applications. While current studies on these topics mostly focus, from the technical viewpoint, on standards and protocols, this paper investigates the adoption of formal methods, especially for composition. We logically classify and analyze three different (but interconnected) kinds of important issues towards this goal, namely foundations, verification and extensions. The aim of this work is to identify the proper questions on the adoption of formal methods for dependable composition of Web Services, not necessarily to find the optimal answers. Nevertheless, we still try to propose some tentative answers based on our proposal for a composition calculus, which we hope can animate a proper discussion.

    A Calculus for Orchestration of Web Services

    Service-oriented computing, an emerging paradigm for distributed computing based on the use of services, calls for the development of tools and techniques to build safe and trustworthy systems and to analyse their behaviour. Many researchers have therefore proposed to use process calculi, a cornerstone of current foundational research on the specification and analysis of concurrent, reactive, and distributed systems. In this paper, we follow this approach and introduce CWS, a process calculus expressly designed for specifying and combining service-oriented applications while modelling their dynamic behaviour. We show that CWS can model all the phases of the life cycle of service-oriented applications, such as publication, discovery, negotiation, orchestration, deployment, reconfiguration and execution. We illustrate the specification style that CWS supports by means of a large case study from the automotive domain and a number of more specific examples drawn from it.

    Fairness Testing: Testing Software for Discrimination

    This paper defines software fairness and discrimination and develops a testing-based method for measuring if and how much software discriminates, focusing on causality in discriminatory behavior. Evidence of software discrimination has been found in modern software systems that recommend criminal sentences, grant access to financial products, and determine who is allowed to participate in promotions. Our approach, Themis, generates efficient test suites to measure discrimination. Given a schema describing valid system inputs, Themis generates discrimination tests automatically and does not require an oracle. We evaluate Themis on 20 software systems, 12 of which come from prior work with explicit focus on avoiding discrimination. We find that (1) Themis is effective at discovering software discrimination, (2) state-of-the-art techniques for removing discrimination from algorithms fail in many situations, at times discriminating against as much as 98% of an input subdomain, (3) Themis optimizations are effective at producing efficient test suites for measuring discrimination, and (4) Themis is more efficient on systems that exhibit more discrimination. We thus demonstrate that fairness testing is a critical aspect of the software development cycle in domains with possible discrimination and provide initial tools for measuring software discrimination.

    Comment: Sainyam Galhotra, Yuriy Brun, and Alexandra Meliou. 2017. Fairness Testing: Testing Software for Discrimination. In Proceedings of the 2017 11th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE'17), Paderborn, Germany, September 4-8, 2017. https://doi.org/10.1145/3106237.3106277
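    The schema-driven, oracle-free idea described in the abstract can be sketched as follows. This is an illustrative reconstruction, not Themis itself: the loan_decision function, the schema contents, and the sampling strategy are all assumptions made for the example. Causal discrimination is estimated by perturbing only the protected attribute of each sampled input and checking whether the system's output changes.

```python
import random

# Hypothetical system under test; Themis treats the software as a black box.
# This toy model is deliberately biased by gender in a mid-income band.
def loan_decision(age, income, gender):
    return income > 40000 and (gender == "male" or income > 60000)

def causal_discrimination(system, schema, protected, samples=1000, seed=0):
    """Estimate the fraction of inputs whose output flips when only the
    protected attribute is varied (no oracle needed: the system is its
    own reference under the perturbation)."""
    rng = random.Random(seed)
    flips = 0
    for _ in range(samples):
        # Draw a random valid input from the schema.
        point = {k: rng.choice(v) for k, v in schema.items()}
        # Re-run the system for every value of the protected attribute.
        outputs = {system(**{**point, protected: value})
                   for value in schema[protected]}
        if len(outputs) > 1:
            flips += 1
    return flips / samples

# Schema of valid inputs (invented for illustration).
schema = {
    "age": list(range(18, 80)),
    "income": list(range(10000, 100000, 5000)),
    "gender": ["male", "female"],
}
score = causal_discrimination(loan_decision, schema, "gender")
```

    Here the toy model discriminates only in the band where income alone does not decide the outcome, so the estimated score reflects the size of that input subdomain.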

    Operational specification for FCA using Z

    We present an outline of a process by which operational software requirements specifications can be written for Formal Concept Analysis (FCA). The Z notation is used to specify the FCA model and the formal operations on it. We posit a novel approach whereby key features of Z and FCA can be integrated and put to work in contemporary software development, thus promoting operational specification as a useful application of conceptual structures.
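    The FCA model that the paper specifies in Z rests on two derivation operators: the shared attributes of a set of objects, and the objects possessing a set of attributes. The sketch below illustrates them in Python under an assumed objects-to-attributes dictionary encoding of the formal context; the paper's actual specification is in Z schemas, not code.

```python
# Minimal Formal Concept Analysis sketch over a toy context.
def intent(objects, context):
    """Attributes shared by all the given objects."""
    if not objects:
        return {a for attrs in context.values() for a in attrs}
    return set.intersection(*(context[o] for o in objects))

def extent(attrs, context):
    """Objects possessing all the given attributes."""
    return {o for o, a in context.items() if attrs <= a}

# Toy formal context: objects mapped to their attribute sets.
context = {
    "duck": {"flies", "swims"},
    "swan": {"flies", "swims"},
    "dog":  {"runs"},
}

# A formal concept is a pair (E, I) with intent(E) == I and extent(I) == E.
E = extent({"swims"}, context)
I = intent(E, context)
assert extent(I, context) == E  # closure: (E, I) is a formal concept
```

    The closure property checked by the final assertion is exactly the invariant an operational Z specification of FCA would maintain over its state.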

    A customisable pipeline for continuously harvesting socially-minded Twitter users

    On social media platforms, and Twitter in particular, specific classes of users such as influencers have been given satisfactory operational definitions in terms of network and content metrics. Others, for instance online activists, are no less important, but their characterisation still requires experimenting. We make the hypothesis that such interesting users can be found within temporally and spatially localised contexts, i.e., small but topical fragments of the network containing interactions about social events or campaigns with a significant footprint on Twitter. To explore this hypothesis, we have designed a continuous user profile discovery pipeline that produces an ever-growing dataset of user profiles by harvesting and analysing contexts from the Twitter stream. The profiles dataset includes key network and content-based user metrics, enabling experimentation with user-defined score functions that characterise specific classes of online users. The paper describes the design and implementation of the pipeline and its empirical evaluation on a case study consisting of healthcare-related campaigns in the UK, showing how it supports the operational definitions of online activism by comparing three experimental ranking functions. The code is publicly available.

    Comment: In Proceedings of ICWE 2019, June 2019, Korea
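    A user-defined score function of the kind the pipeline enables might look like the sketch below. The metric names and weights are invented for illustration; the paper compares three experimental ranking functions whose exact definitions are not reproduced here. The point is only that once profiles carry network and content metrics, ranking a class of users reduces to sorting by a plain function over those metrics.

```python
# Hypothetical harvested profiles with network/content metrics.
profiles = [
    {"user": "a", "followers": 120,  "retweets_received": 40, "topical_tweets": 30},
    {"user": "b", "followers": 9000, "retweets_received": 15, "topical_tweets": 2},
    {"user": "c", "followers": 300,  "retweets_received": 80, "topical_tweets": 55},
]

def activist_score(p):
    """Example score favouring sustained topical engagement over raw
    audience size (weights are arbitrary, for illustration only)."""
    return (2.0 * p["topical_tweets"]
            + 1.0 * p["retweets_received"]
            + 0.001 * p["followers"])

ranked = sorted(profiles, key=activist_score, reverse=True)
```

    With these weights, the high-follower but low-engagement account ranks last, which is the kind of behaviour an "activist" ranking would want and an "influencer" ranking would not.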

    Bell nonlocality, signal locality and unpredictability (or What Bohr could have told Einstein at Solvay had he known about Bell experiments)

    The 1964 theorem of John Bell shows that no model that reproduces the predictions of quantum mechanics can simultaneously satisfy the assumptions of locality and determinism. On the other hand, the assumptions of signal locality plus predictability are also sufficient to derive Bell inequalities. This simple theorem, previously noted but published only relatively recently by Masanes, Acin and Gisin, has fundamental implications that are not entirely appreciated. Firstly, nothing can be concluded about the ontological assumptions of locality or determinism independently of each other -- it is possible to reproduce quantum mechanics with deterministic models that violate locality as well as with indeterministic models that satisfy locality. On the other hand, the operational assumption of signal locality is an empirically testable (and well-tested) consequence of relativity. Thus Bell inequality violations imply that we can trust that some events are fundamentally unpredictable, even if we cannot trust that they are indeterministic. This result grounds the quantum-mechanical prohibition of arbitrarily accurate predictions on the assumption of no superluminal signalling, regardless of any postulates of quantum mechanics. It also sheds new light on an early stage of the historical debate between Einstein and Bohr.

    Comment: Substantially modified version; added HMW as co-author
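    For concreteness, the Bell inequalities referred to can be written in their standard CHSH form (textbook material, not specific to this paper). For correlations E(a,b) between outcomes of measurement settings a, a' and b, b' on two separated systems, the relevant assumptions (locality plus determinism, or equally signal locality plus predictability) imply

```latex
S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |S| \le 2,
```

    whereas quantum mechanics predicts, and experiments confirm, values up to $|S| = 2\sqrt{2}$, violating the bound.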

    On the formal statement of the special principle of relativity

    The aim of the paper is to develop a proper mathematical formalism which can help to clarify the necessary conceptual plugins to the special principle of relativity and leads to a deeper understanding of the principle in its widest generality.