
    A Compiled Implementation of Strong Reduction

    Motivated by applications to proof assistants based on dependent types, we develop and prove correct a strong reducer and β-equivalence checker for the λ-calculus with products, sums, and guarded fixpoints. Our approach is based on compilation to the bytecode of an abstract machine performing weak reductions on non-closed terms, derived with minimal modifications from the ZAM machine used in the Objective Caml bytecode interpreter, and complemented by a recursive "read back" procedure. An implementation in the Coq proof assistant demonstrates important speed-ups compared with the original interpreter-based implementation of strong reduction in Coq.
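    The weak-evaluation-plus-readback recipe can be illustrated with a minimal normalization-by-evaluation sketch in Python. This is not the paper's ZAM-based bytecode implementation: it covers only the pure λ-calculus with de Bruijn indices, evaluating terms weakly to closures and neutral applications, then reading values back into normal forms.

```python
# Terms: ("var", i) with de Bruijn index i, ("lam", body), ("app", f, a).
# Values: ("clo", body, env), ("nvar", level), ("napp", f, a).

def eval_(t, env):
    """Weak evaluation: never reduces under a lambda."""
    tag = t[0]
    if tag == "var":
        return env[t[1]]
    if tag == "lam":
        return ("clo", t[1], env)
    return apply_(eval_(t[1], env), eval_(t[2], env))

def apply_(f, a):
    if f[0] == "clo":
        _, body, env = f
        return eval_(body, (a,) + env)
    return ("napp", f, a)  # stuck: neutral application

def readback(v, depth):
    """Recursively read a weak value back into a normal-form term."""
    if v[0] == "clo":
        # Apply the closure to a fresh free variable, then go under the binder
        fresh = ("nvar", depth)
        return ("lam", readback(apply_(v, fresh), depth + 1))
    if v[0] == "nvar":
        return ("var", depth - 1 - v[1])
    _, f, a = v
    return ("app", readback(f, depth), readback(a, depth))

def normalize(t):
    return readback(eval_(t, ()), 0)

# Strong reduction under a binder: λy. (λx. x) y  normalizes to  λy. y
term = ("lam", ("app", ("lam", ("var", 0)), ("var", 0)))
print(normalize(term))  # → ('lam', ('var', 0))
```

    The redex under the binder is only reached during readback, which mirrors the paper's division of labour between the weak abstract machine and the recursive read-back procedure.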

    Explicit robust schemes for implementation of a class of principal value-based constitutive models: Symbolic and numeric implementation

    The issue of developing effective and robust schemes to implement a class of Ogden-type hyperelastic constitutive models is addressed. To this end, special-purpose functions (running under MACSYMA) are developed for the symbolic derivation, evaluation, and automatic FORTRAN code generation of explicit expressions for the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid over the entire deformation range, since the singularities resulting from repeated principal-stretch values have been theoretically removed. The required computational algorithms are outlined, and the resulting FORTRAN computer code is presented.
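    The symbolic workflow the abstract describes (derive stresses from a principal-stretch strain energy, then emit Fortran) can be sketched with SymPy standing in for MACSYMA. The one-term Ogden energy and all names below are illustrative assumptions, not the paper's code:

```python
import sympy as sp

l1, l2, l3, mu, alpha = sp.symbols("lambda1 lambda2 lambda3 mu alpha",
                                   positive=True)

# One-term Ogden strain energy in principal stretches (illustrative form)
W = (mu / alpha) * (l1**alpha + l2**alpha + l3**alpha - 3)

# Principal stresses obtained symbolically: tau_i = lambda_i * dW/dlambda_i
tau = [sp.simplify(l * sp.diff(W, l)) for l in (l1, l2, l3)]

# Automatic Fortran code generation, echoing the MACSYMA-to-FORTRAN step
print(sp.fcode(tau[0], assign_to="tau1"))
```

    A full analogue of the paper would also differentiate the stresses to obtain the tangent stiffness and remove the repeated-stretch singularities before code generation, which is where the real work lies.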

    COST ESCALATION ANALYSIS (PRICE ADJUSTMENT) ON MULTI-YEAR (PLURAL-YEARS) INFRASTRUCTURE PROJECTS

    The Kalimati Long Storage Project for raw water, implemented in Sidoarjo Regency under the supervision of the Brantas River Basin authority, is a multi-year project: its activities run for two years, or approximately 720 working days. Multi-year projects carry risks during implementation. One of these risks is the adjustment of the unit prices of contract components (building materials, labor, and equipment) relative to the contract value at the time of the auction. When bidding in the tender process, contractors must account for construction costs, overhead, and profit. The accuracy of the construction cost estimate must track the project stages from planning and design through to the final estimate at project completion. This study follows the escalation calculation regulated by Presidential Regulation Number 70 of 2012, Article 92, as stipulated in the project contract, and was conducted on a multi-year project based on government regulations and the literature. From the analysis of the cost escalation for October 2018 – December 2019, there is a weight difference of 6.74% from the escalation value: the calculated escalation value is Rp 13,457,629,000.00, while the cost increase is Rp 199,732,279,557.48 with a total payment of Rp 357,235,377,870.00. Under the alternative weight difference of 6.75% from the escalation value, the calculated escalation value is Rp 13,485,882,216.22, with the same cost increase of Rp 199,732,279,557.48 and a total payment of Rp 357,263,630,870.00.
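    Price-adjustment (escalation) clauses of this kind typically scale a base unit price by a fixed coefficient plus index ratios for each cost component. The sketch below illustrates that arithmetic; the coefficients and index values are illustrative assumptions, not the Kalimati project's figures or the exact formula of the regulation:

```python
# Illustrative escalation of the form Hn = Ho * (a + b*Bn/Bo + c*Cn/Co + ...),
# where `a` is the fixed (overhead/profit) coefficient and each ratio is a
# price index (current / base) for materials, labor, or equipment.
def adjusted_unit_price(Ho, fixed, components):
    """components: list of (coefficient, index_now, index_base)."""
    factor = fixed + sum(c * In / Io for c, In, Io in components)
    return Ho * factor

Ho = 1_000_000.0  # base unit price in Rp (assumed)
components = [
    (0.45, 112.0, 100.0),  # materials: 12% index rise
    (0.25, 108.0, 100.0),  # labor: 8% index rise
    (0.15, 105.0, 100.0),  # equipment: 5% index rise
]
Hn = adjusted_unit_price(Ho, fixed=0.15, components=components)
print(round(Hn, 2))  # → 1081500.0
```

    The escalation value paid is then the difference between the adjusted and original prices, summed over the affected work items and payment periods.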

    Statistical Characterization of Wildfire Dynamics: Studying the Relation Between Burned Area and Head of the Fire

    We show that the probability density function (PDF) of an area enclosed by a random perimeter is driven by the PDF of the integration bounds and the mean value of the perimeter function. With reference to wildfires, if the integration interval is aligned with the main direction of propagation, the PDF of the burned area is driven by the PDF of the position of the head of the fire times the mean value of the fire front. We show that if the random fire front perimeter is modelled as an ellipse-like curve with stochastic noise, the relation between the probability distribution of the burned area and that of the head of the fire is linear, with the constant mean value of the perimeter as a factor. Different random noise models have been developed and implemented for this purpose, and different behaviours of the stochastic position of the head of the fire have been studied, including Viegas' rate-of-spread model. We show that for the realistic propagation model given by the operational wildfire cellular automata simulator Propagator [46], the PDF of the burned area is still driven by the position of the head of the fire and the mean value of the fire front perimeter, and the shape of the perimeter is not relevant. This has been shown for five case studies, ranging from simple to realistic, complicated cases.
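    The claimed linear relation, with the mean front width as the constant factor, can be checked with a small Monte Carlo sketch. The head-position distribution and the additive noise model below are assumptions for illustration, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)

W_BAR = 2.0   # mean front width: the constant factor in the linear relation
NOISE = 0.3   # stochastic perturbation of the front
M = 200       # integration resolution along the propagation direction

def burned_area(head):
    """Area swept up to the head position, with a noisy front width."""
    dx = head / M
    w = np.clip(W_BAR + NOISE * rng.standard_normal(M), 0.0, None)
    return w.sum() * dx  # Riemann sum of width over [0, head]

heads = rng.gamma(shape=4.0, scale=0.5, size=20_000)  # random head positions
areas = np.array([burned_area(h) for h in heads])

# Zero-mean noise integrates out, so E[area] ≈ W_BAR * E[head]:
# the area statistics are driven by the head position alone.
rel_err = abs(areas.mean() - W_BAR * heads.mean()) / (W_BAR * heads.mean())
print(rel_err < 0.02)  # → True
```

    In this toy setting the burned-area distribution is just the head-position distribution rescaled by the mean width, matching the abstract's claim that the detailed perimeter shape is not relevant.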

    Understanding the challenges to implementing case management for people with dementia in primary care in England: a qualitative study using Normalization Process Theory

    Background Case management has been suggested as a way of improving the quality and cost-effectiveness of support for people with dementia. In this study we adapted and implemented a successful United States model of case management in primary care in England. The results are reported elsewhere, but a key finding was that little case management took place. This paper reports the findings of the process evaluation, which used Normalization Process Theory to understand the barriers to implementation. Methods Ethnographic methods were used to explore the views and experiences of case management. Interviews with 49 stakeholders (patients, carers, case managers, and health and social care professionals) were supplemented with observation of case managers during meetings and initial assessments with patients. Transcripts and field notes were analysed initially using the constant comparative approach, and emerging themes were then mapped onto the framework of Normalization Process Theory. Results The primary focus during implementation was on the case managers as isolated individuals, with little attention paid to the social or organizational context within which they worked. Barriers relating to each of the four main constructs of Normalization Process Theory were identified: a lack of clarity over the scope and boundaries of the intervention (coherence); variable investment in the intervention (cognitive participation); a lack of resources, skills, and training to deliver case management (collective action); and limited reflection and feedback on the case manager role (reflexive monitoring).
Conclusions Despite the intuitive appeal of case management to all stakeholders, there were multiple barriers to implementation in primary care in England, including difficulties in embedding case managers within existing well-established community networks, the challenges of protecting time for case management, and case managers' inability to identify, and act on, emerging patient and carer needs (an essential, but previously unrecognised, training need). In the light of these barriers it is unclear whether primary care is the most appropriate setting for case management in England. The process evaluation highlights key aspects of implementation and training to be addressed in future studies of case management for dementia.

    Auto-Configuration of ACL Policy in Case of Topology Change in Hybrid SDN

    © 2016 IEEE. Software-defined networking (SDN) has emerged as a new network architecture that decouples the control and management planes from the data plane at forwarding devices. However, SDN deployment is not widely adopted due to the budget constraints of organizations, which are reluctant to invest heavily in establishing a new network infrastructure from scratch. One feasible solution is to deploy a limited number of SDN-enabled devices alongside traditional (legacy) network devices, incrementally replacing the traditional network with SDN; this is called a hybrid SDN architecture. Network management and control in hybrid SDN are vital tasks that require significant effort and resources, and handling them manually is error prone. Whenever the network topology changes, network policies (e.g., access control lists) configured at the interfaces of forwarding devices (switches/routers) may be violated, creating severe security threats for the whole network and degrading network performance. In this paper, we propose a new approach for hybrid SDN that auto-detects the interfaces of forwarding devices and the network policies affected by a change in network topology. In the proposed approach, we model the network-wide policy and the local policy at each forwarding device using a three-tuple and a six-tuple, respectively. We compute a graph to represent the topology of the network and detect a possible topology change using a graph difference technique. In the case of a topology change, we verify the policy against the updated topology by traversing a tree using the six-tuple. If any policy implementation is violated, the affected interfaces are indicated, along with the policies that need to be reconfigured; the policies are then configured on the updated topology according to the specification in an improved way.
Simulation results show that our proposed approach enhances network efficiency in terms of successful packet delivery ratio, the ratio of packets violating policy, and normalized overhead.
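    The detection step (graph difference on topology snapshots, then flagging the policies on affected interfaces) can be sketched as follows. The rule fields and device names are illustrative stand-ins, not the paper's actual six-tuple encoding:

```python
from dataclasses import dataclass

# Local ACL policy as a six-field record (fields are illustrative assumptions)
@dataclass(frozen=True)
class AclRule:
    device: str
    interface: str
    src: str
    dst: str
    proto: str
    action: str  # "permit" or "deny"

def topology_diff(old_edges, new_edges):
    """Graph difference: (removed, added) links between two snapshots."""
    old, new = set(old_edges), set(new_edges)
    return old - new, new - old

def affected_rules(rules, removed, added):
    """Rules configured on a device touched by a changed link."""
    changed = {dev for edge in removed | added for dev in edge}
    return [r for r in rules if r.device in changed]

old = {("r1", "s1"), ("s1", "s2")}
new = {("r1", "s1"), ("s1", "s3")}   # link to s2 replaced by a link to s3
rules = [AclRule("s2", "eth0", "10.0.0.0/24", "any", "tcp", "deny"),
         AclRule("r1", "eth1", "any", "any", "ip", "permit")]

removed, added = topology_diff(old, new)
print([r.device for r in affected_rules(rules, removed, added)])  # → ['s2']
```

    In the paper's setting, each flagged rule would then be re-verified against the updated topology and reinstalled on the appropriate interfaces; here the sketch stops at identifying which rules the topology change touches.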

    A knowledge acquisition assistant for the expert system shell Nexpert-Object

    This study addresses the problems of knowledge acquisition in expert system development and examines programs whose goal is to solve part of these problems. Among them are knowledge acquisition tools, which provide the knowledge engineer with a set of Artificial Intelligence primitives; knowledge acquisition aids, which offer the knowledge engineer guidance in knowledge elicitation; and, finally, automated systems, which try to replace the human interviewer with a machine interface. We propose an alternative technique to these approaches: an interactive syntactic analyzer of an emerging knowledge base written with the expert system shell Nexpert Object. This program is intended to help the knowledge engineer during the editing of a knowledge base, from both a knowledge engineering and a knowledge representation point of view. The implementation is a Desk Accessory written in C, running on the Macintosh concurrently with Nexpert Object.