2,948 research outputs found

    Implementing atomic rendezvous within a transactional framework

    The authors address the problem of implementing the CSP (communicating sequential processes) rendezvous within a transactional framework. Instead of implementing a fair nondeterministic choice and assuming the correct functioning of processors and communication media, the authors propose an efficient transactional implementation of the atomic rendezvous in the presence of processor failures in a multiprocessor machine. Both atomicity and efficiency are obtained by using high-speed stable storage devices.
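    As a rough illustration of treating a rendezvous as a transaction, the sketch below (not the authors' algorithm; all names are invented) pairs a single sender and a single receiver, records a "prepare" entry in a list standing in for stable storage before the exchange becomes visible, and records a "commit" entry when the exchange completes atomically.

```python
import threading

# Hedged illustration only: a single sender / single receiver CSP-style
# rendezvous treated as a tiny transaction. The message exchange is
# recorded in a list standing in for a stable storage device before it is
# committed; recovery (replaying the log after a crash) is not shown.
class TransactionalRendezvous:
    def __init__(self):
        self._cond = threading.Condition()
        self._offer = None           # value offered by the sender
        self._delivered = False
        self._stable_log = []        # stand-in for high-speed stable storage

    def send(self, value):
        with self._cond:
            while self._offer is not None:                # one pending offer at a time
                self._cond.wait()
            self._stable_log.append(("prepare", value))   # log intent before exposing it
            self._offer = value
            self._delivered = False
            self._cond.notify_all()
            while not self._delivered:                    # block until the receiver commits
                self._cond.wait()

    def receive(self):
        with self._cond:
            while self._offer is None:
                self._cond.wait()
            value = self._offer
            self._stable_log.append(("commit", value))    # exchange becomes effective atomically
            self._offer = None
            self._delivered = True
            self._cond.notify_all()
            return value
```

    Two threads calling send and receive then meet at the rendezvous; a real implementation would also replay the log after a crash, which this toy version does not attempt.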

    Towards Goal-Directed Diagnosis (Preliminary Report)

    Recent research has abstracted diagnosis away from the activity needed to acquire information and to act on diagnosed disorders. In some problem domains, however, such abstraction is counter-productive and does not reflect real-life practice, which integrates diagnostic and therapeutic activity. Trauma management is a case in point. Here, we discuss a formalization of the integrated approach taken in TraumAID, a system we have developed to serve as an artificial aide to residents and physicians dealing with multiple trauma. Among other things, the active pursuit of information raises the question of what is and what is not worth pursuing. In TraumAID 2.0, we take the view that the process of diagnosis should continue only as long as it is likely to make a difference to future actions. That view is formalized in the goal-directed diagnostic paradigm (GDD). Unlike other diagnostic paradigms, goal-directed diagnosis is first and foremost concerned with setting goals based on its conclusions. It regards the traditional construction of an explanation for the faulty behavior as secondary. In order to explicitly represent goal-directedness, the diagnostic process is viewed as search in a space of attitude-beliefs. From this, we derive a high-level algorithm that produces appropriate requests for action while searching for an explanation. A complete explanation, however, is not the criterion for terminating action. Such a criterion, we argue, is better treated in terms of goal-means tradeoffs. TraumAID's architecture, in so far as it embodies this goal-directed approach, assigns to a complementary planner the resolution of such tradeoffs.
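    A minimal sketch of the stopping criterion described above, assuming hypothetical recommend and update functions (this is not TraumAID's actual representation of attitude-beliefs): diagnosis continues only while some pending test could still change the recommended actions.

```python
# Hedged sketch of the stopping rule described above; the names
# (recommend, update, possible_results, perform) are hypothetical and do
# not reflect TraumAID's actual data structures.
def goal_directed_diagnosis(beliefs, pending_tests, recommend, update):
    """recommend(beliefs) -> set of actions; update(beliefs, test, result) -> new beliefs."""
    while pending_tests:
        current_plan = recommend(beliefs)
        # A test is worth pursuing only if some possible outcome would
        # change the recommended actions (a goal-means consideration).
        useful = [t for t in pending_tests
                  if any(recommend(update(beliefs, t, r)) != current_plan
                         for r in t.possible_results)]
        if not useful:
            break                      # further diagnosis makes no difference
        test = useful[0]               # in practice: e.g. the cheapest useful test
        beliefs = update(beliefs, test, test.perform())
        pending_tests.remove(test)
    return recommend(beliefs)
```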

    A Logical Method for Policy Enforcement over Evolving Audit Logs

    We present an iterative algorithm for enforcing policies represented in a first-order logic, which can, in particular, express all transmission-related clauses in the HIPAA Privacy Rule. The logic has three features that raise challenges for enforcement: uninterpreted predicates (used to model subjective concepts in privacy policies), real-time temporal properties, and quantification over infinite domains (such as the set of messages containing personal information). The algorithm operates over audit logs that are inherently incomplete and evolve over time. In each iteration, the algorithm provably checks as much of the policy as possible over the current log and outputs a residual policy that can only be checked when the log is extended with additional information. We prove correctness and termination properties of the algorithm. While these results are developed in a general form, accounting for many different sources of incompleteness in audit logs, we also prove that for the special case of logs that maintain a complete record of all relevant actions, the algorithm effectively enforces all safety and co-safety properties. The algorithm can significantly help automate enforcement of policies derived from the HIPAA Privacy Rule. (Comment: Carnegie Mellon University CyLab technical report, 51 pages.)
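    The iteration structure can be pictured with the hedged sketch below; the reduce step of the actual algorithm is far more involved (it handles first-order quantification and uninterpreted predicates), so the three-valued "atoms" here are only a stand-in, and all names are illustrative.

```python
# Hedged sketch of the iterative enforcement loop: each pass checks what
# the current audit log can decide and carries the undecidable remainder
# forward as a residual to re-check when the log grows.
def enforce(policy_atoms, read_log_extension):
    """policy_atoms: callables log -> True (satisfied), False (violated) or None (not yet decidable)."""
    log, residual = [], list(policy_atoms)
    while residual:
        extension = read_log_extension()        # newly available audit records
        if extension is None:                   # the log will not grow further
            break
        log.extend(extension)
        still_open = []
        for atom in residual:
            verdict = atom(log)
            if verdict is False:
                report_violation(atom, log)     # illustrative reporting hook
            elif verdict is None:
                still_open.append(atom)         # re-check once the log is extended
        residual = still_open
    return residual                             # the residual policy to carry forward

def report_violation(atom, log):
    print("policy violated:", atom)
```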

    Data mining by means of generalized patterns

    The thesis is mainly focused on the study and application of pattern discovery algorithms that aggregate database knowledge to discover and exploit valuable correlations, hidden in the analyzed data, at different abstraction levels. The aim of the research effort described in this work is two-fold: the discovery of associations, in the form of generalized patterns, from large data collections, and the inference of semantic models, i.e., taxonomies and ontologies, suitable for driving the mining process.
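    To make the notion of a generalized pattern concrete, here is a small, illustrative sketch (not the thesis's algorithm): items are lifted to their taxonomy ancestors before support counting, so correlations such as "beverage with croissant" can emerge even when the individual items are infrequent.

```python
from itertools import combinations
from collections import Counter

# Illustrative sketch: generalize items to their taxonomy ancestors and
# count generalized pairs, keeping those whose support clears a threshold.
# The taxonomy is a simple child -> parent map.
def generalized_pairs(transactions, parent, min_support):
    def ancestors(item):
        out = {item}
        while item in parent:
            item = parent[item]
            out.add(item)
        return out

    counts = Counter()
    for tx in transactions:
        # each transaction contributes every distinct generalized item once
        gen_items = set().union(*(ancestors(i) for i in tx))
        for pair in combinations(sorted(gen_items), 2):
            counts[pair] += 1
    n = len(transactions)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

# example: 'espresso' and 'tea' both generalize to 'beverage'
taxonomy = {"espresso": "coffee", "coffee": "beverage", "tea": "beverage"}
txs = [{"espresso", "croissant"}, {"tea", "croissant"}]
print(generalized_pairs(txs, taxonomy, min_support=0.5))
```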

    Doctor of Philosophy

    Asynchronous design has very promising potential even though it has largely received a cold reception from industry. Part of this reluctance has been due to the necessity of custom design languages and computer-aided design (CAD) flows to design, optimize, and validate asynchronous modules and systems. Next-generation asynchronous flows should support modern programming languages (e.g., Verilog) and application-specific integrated circuit (ASIC) CAD tools. They also have to support multifrequency designs that mix synchronous (clocked) and asynchronous (unclocked) blocks. This work presents a novel relative timing (RT) based methodology for generating multifrequency designs using synchronous CAD tools and flows. Synchronous CAD tools must be constrained for them to work with asynchronous circuits. The identification of these constraints and a characterization flow to automatically derive them are presented. The effect of the constraints on the designs and the way they are handled by the synchronous CAD tools are analyzed and reported. Automating the generation of asynchronous design templates and of the constraints themselves is an important problem. Algorithms for automating the addition of reset to asynchronous circuits and for power and/or performance optimization of the circuits using logical effort are explored, filling an important hole in the automation flow. Presenting cyclic asynchronous circuits to the CAD tools as directed acyclic graphs (DAGs) is necessary for applying synchronous CAD optimizations such as sizing and path delay optimization and for using static timing analysis (STA) on these circuits. A thorough investigation of the requirements of cycle cutting while preserving timing paths is presented, together with an algorithm to automate the process. A large set of designs for four-phase handshake protocol circuit implementations with early and late data validity is characterized for area, power, and performance. Benchmark circuits with automated scripts to generate various configurations for better understanding of the designs are proposed and analyzed. Extensions to the methodology, such as scan insertion using automatic test pattern generation (ATPG) tools to add testability of the datapath in bundled-data asynchronous circuit implementations, and timing closure approaches are also described. The energy, area, and performance of purely asynchronous circuits and of circuits with mixed synchronous and asynchronous blocks are explored. Results indicate the benefits that can be derived by generating circuits with asynchronous components using this methodology.
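    The cycle-cutting step mentioned above can be illustrated with a simple back-edge search on the netlist graph; the dissertation's algorithm additionally chooses cuts that preserve the relevant timing paths, which this hedged sketch does not attempt.

```python
# Rough sketch of cycle cutting: find back edges with a depth-first search
# and report them as the arcs to "cut" (e.g. via timing-loop break
# constraints) so the graph presented to synchronous STA tools is a DAG.
def back_edges(graph):
    """graph: dict node -> iterable of successor nodes."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in graph}
    cuts = []

    def dfs(u):
        color[u] = GRAY
        for v in graph.get(u, ()):
            if color.get(v, WHITE) == GRAY:
                cuts.append((u, v))          # edge that closes a cycle
            elif color.get(v, WHITE) == WHITE:
                color.setdefault(v, WHITE)
                dfs(v)
        color[u] = BLACK

    for v in list(graph):
        if color[v] == WHITE:
            dfs(v)
    return cuts

# e.g. a two-stage handshake loop a -> b -> a yields one edge to cut
print(back_edges({"a": ["b"], "b": ["a"]}))
```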

    TRANSFORMING GOVERNMENT AGENCIES’ APPROACH TO EPARTICIPATION THROUGH EFFICIENT EXPLOITATION OF SOCIAL MEDIA

    Government agencies are making considerable investments to exploit the capabilities offered by ICT, and especially the Internet, to increase citizens’ engagement in their decision- and policy-making processes. However, this first generation of e-participation has been characterised by limited usage of the ‘official’ e-consultation spaces of government agencies by citizens. The emergence of Web 2.0 social media offers significant opportunities for overcoming this problem and proceeding to a second generation of broader, deeper and more advanced e-participation. This paper presents a methodology for the efficient exploitation of Web 2.0 social media by government agencies in order to broaden and enhance e-participation. It is based on a central platform which enables posting content and deploying micro web applications (‘Policy Gadgets’, or Padgets) to multiple popular Web 2.0 social media, and also collecting users’ interactions with them (e.g. views, comments, ratings) in an efficient manner using their application programming interfaces (APIs). These interaction data undergo various levels of processing, such as calculation of useful analytics, opinion mining and simulation modelling, in order to provide effective support to public decision and policy makers. The proposed methodology allows government agencies to adopt advanced and highly effective ‘hybrid’ e-participation approaches.
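    The data-collection step can be sketched as follows; the platform clients and their fetch_interactions() method are hypothetical placeholders for the real social media APIs, so only the aggregation across platforms is shown.

```python
from collections import defaultdict

# Illustrative sketch of collecting Padget interaction data from several
# platforms and computing simple analytics. The client objects and their
# fetch_interactions() method are invented placeholders, not real APIs.
def collect_padget_analytics(padget_id, platform_clients):
    totals = defaultdict(int)
    comments = []
    for client in platform_clients:
        # hypothetical call returning e.g. {"views": 120, "ratings": [4, 5], "comments": [...]}
        data = client.fetch_interactions(padget_id)
        totals["views"] += data.get("views", 0)
        ratings = data.get("ratings", [])
        totals["rating_sum"] += sum(ratings)
        totals["rating_count"] += len(ratings)
        comments.extend(data.get("comments", []))
    avg_rating = (totals["rating_sum"] / totals["rating_count"]
                  if totals["rating_count"] else None)
    # the collected comments would typically feed an opinion-mining step downstream
    return {"views": totals["views"], "avg_rating": avg_rating, "comments": comments}
```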

    Formalizing Knowledge by Ontologies: OWL and KIF

    In recent years, the activities of knowledge formalization and sharing that allow for semantically enabled management of information have been attracting growing attention, especially in distributed environments like the Web. In this report, after a general introduction to the basics of knowledge abstraction and its formalization through ontologies, we briefly present a list of relevant formal languages used to represent knowledge: CycL, FLogic, LOOM, KIF, Ontolingua, RDF(S) and OWL. Then we focus our attention on the Web Ontology Language (OWL) and the Knowledge Interchange Format (KIF). OWL is the main language used to describe and share ontologies over the Web; there are three OWL sublanguages with a growing degree of expressiveness. We describe its structure as well as the way it is used to reason over asserted knowledge. Moreover, we briefly present three relevant OWL ontology editors, Protégé, SWOOP and OntoTrack, and two important OWL reasoners, Pellet and FaCT++. KIF is mainly a standard to describe knowledge among different computer systems so as to facilitate its exchange. We describe the main elements of KIF syntax; we also consider Sigma, an environment for creating, testing, modifying, and performing inference with KIF ontologies. We comment on some meaningful examples of both OWL and KIF ontologies and, in conclusion, we compare their main expressive features.
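    For a concrete flavour of the OWL material discussed in the report, the sketch below builds a two-class hierarchy with the rdflib Python library (assumed available; depending on the rdflib version, serialize() returns str or bytes) and prints it as Turtle. The classes and property are invented for illustration only.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

# Tiny illustrative OWL ontology built with rdflib and serialized to Turtle.
EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

g.add((EX.Clinician, RDF.type, OWL.Class))
g.add((EX.Surgeon, RDF.type, OWL.Class))
g.add((EX.Surgeon, RDFS.subClassOf, EX.Clinician))   # class hierarchy
g.add((EX.treats, RDF.type, OWL.ObjectProperty))
g.add((EX.treats, RDFS.domain, EX.Clinician))

print(g.serialize(format="turtle"))
```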