
    Using Resources as Synchronizers to Manage Mobile Process Adaptation

    Process management in Mobile Ad-hoc NETworks (MANETs) has to deal with different types of tasks and resources. Teams can be formed with specific goals, such as reconnaissance of a damaged area for disaster assessment, where each member of a team is assigned some task to be performed according to some policy. However, in real situations, it is possible that task assignments and policies have to be revised due to different causes. In addition to the typical causes for dynamic changes in adaptive workflows, mobility introduces some specific problems, e.g. the need for new connectivity-maintaining tasks, or the reassignment of tasks originally assigned to members who have become unreachable or who have insufficient resources to complete the original plan. As these modifications occur dynamically, it is difficult to manage them through hard-coded programs. Rather, we propose the use of a rule-based formalism, expressed in terms of multi-set rewriting. This supports a resource-centered view, in which both data dependencies between tasks and plan-dependent ordering of tasks are expressed as the production and consumption of resources of different types. In turn, rules are themselves seen as resources, so that they are subject to the same rewriting process, in order to redefine process schemas. The paper illustrates these notions and formalisms, and shows some cases of their application.
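    The resource-centered rewriting described above can be sketched in a few lines; the resource names, rule shape, and the "survey" task below are illustrative assumptions, not the paper's actual formalism.

```python
from collections import Counter

# A multi-set rewriting rule fires when its left-hand side (resources to
# consume) is contained in the current state, replacing it with its
# right-hand side (resources to produce).
def applicable(state, lhs):
    return all(state[r] >= n for r, n in lhs.items())

def fire(state, lhs, rhs):
    if not applicable(state, lhs):
        raise ValueError("rule not applicable")
    return state - Counter(lhs) + Counter(rhs)

# Hypothetical example: a survey task consumes a free team member and a
# drone, producing a busy member and a survey report.
state = Counter({"member_free": 2, "drone": 1})
survey = ({"member_free": 1, "drone": 1}, {"member_busy": 1, "report": 1})
state = fire(state, *survey)
```

    In the same spirit, a rule could itself appear in the multiset as a resource and be consumed or replaced by another rule, which is how the paper's redefinition of process schemas would show up in such a sketch.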

    Simulation of the ambulatory processes in the biggest Brazilian cardiology hospital: a Petri net approach

    This paper presents a simulation of ambulatory processes using a timed Petri net (TPN). The simulation considers the flow of patients in the biggest Brazilian cardiology hospital. The TPN is used as a decision support system (DSS) to improve the processes and to reduce the waiting time of patients in the ambulatory, and in this way to assure a high-quality service to the patients. Simulations were carried out using the software Visual Object Net++. This software is free, and therefore the presented solution is a low-cost one. Providing a low-cost solution is of great importance in this work, since the hospital is supported by government funding and operates with limited financial resources. The patients' flow in the hospital can be viewed as a service, and the modelling and optimization of these services bring more efficiency to the system as well as improve the human factors involved. The results showed that some changes could be made in the processes to improve the performance of the system.
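    As a rough illustration of the idea (not the hospital's actual model), a timed Petri net can be simulated with an event queue: a transition consumes a token from its input place and delivers it to its output place after a delay. The place names, delays, and patient count below are invented.

```python
import heapq

# Tokens are patients; each transition (src, dst, delay) moves a token
# from src to dst after `delay` minutes.
places = {"waiting": 3, "consult": 0, "done": 0}
transitions = [("waiting", "consult", 10.0),   # triage -> consultation
               ("consult", "done", 20.0)]      # consultation -> discharge

clock, events = 0.0, []   # events: (arrival_time, destination_place)
finish_times = []

while True:
    fired = False
    for src, dst, delay in transitions:
        if places[src] > 0:
            places[src] -= 1
            heapq.heappush(events, (clock + delay, dst))
            fired = True
    if not fired:
        if not events:
            break                      # no enabled transitions, no pending tokens
        clock, place = heapq.heappop(events)
        places[place] += 1
        if place == "done":
            finish_times.append(clock)  # completion time of one patient

print(finish_times)
```

    This toy model places no capacity limit on transitions; a realistic DSS model would add places for staff and consultation rooms so that queueing, and hence patient waiting time, actually emerges.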

    Static verification tool improvement in ASIC design flow: Tool Evaluation using a real design

    Verification is now the most time-consuming step in the design flow for digital circuits. Design organizations constantly research improvements that accelerate verification tasks, so that functional and efficient silicon can be released to a demanding market and the company's competitive position improved. Today, EDA (Electronic Design Automation) tools are part of the development of every designed circuit and contribute to the verification work. Automating and simplifying the verification flow helps designers focus on resolving the underlying system issues. The company is interested in improving its verification flow to take advantage of the features available on the EDA market. Recognition of more synchronization structures, an improved hierarchical verification flow, and an incremental verification flow could potentially improve verification throughput in the company. This study examines how changing the tool improves the company's static verification flow. The research reviews the background of EDA tools and the most relevant theory needed to understand the roles of the tools and the static rule checks they perform. In addition, the deployment of static verification tools will be discussed, and clock-domain crossing and lint checking tools from an EDA toolkit will be introduced. A modular inspection flow compatible with the server infrastructure will be built for this purpose. The company's proprietary and problematic synchronization structures will be implemented as an interface-level script, so that the inspection software recognizes the structures used in specific use cases. Design constraints are developed to improve the accuracy of the results. In addition, this study measures the performance of the programs: the use of computing resources is measured in the company's design environment and compared to the current verification flow. The quantity and quality of the reported messages are also compared to the old flow, and the user experience and correctness of the results are briefly assessed. In the study, the configured software checks are run on the company's own subsystem under development, and the suitability of the software for the organization's purposes is determined by examining the results. Flow performance, duration, and utilization of computational resources are measured with the generated script and the software reports. In addition, the functionality and user-friendliness of the software's graphical user interface are briefly reviewed. The research finds that the clock-domain crossing verification flow becomes more than three times faster and the lint verification flow a quarter faster. The inspections detect almost three times more potential issues in the code. The new tool flow requires less than a third of the disk space and system memory consumed by the old verification flow. In addition, the new software is observed to be more pleasant to use overall: it is perceived as somewhat more challenging to learn, but in return it provides more information for solving the underlying issues in the code. Finally, it is concluded that the company should consider adding the tool to complement its verification tool belt, owing to its performance, verification thoroughness, and more moderate use of computation-server resources.

    On Managing Process Variants as an Information Resource

    Many business solutions provide best-practice process templates, both generic and for specific industry sectors. However, it is often the variance from template solutions that provides organizations with intellectual capital and competitive differentiation. Although variance must comply with various contractual, regulatory and operational constraints, it is still an important information resource, representing preferred work practices. In this paper, we present a modeling framework that is conducive to constrained variance, supporting user-driven process adaptations. The focus of the paper is on providing a means of utilizing these adaptations effectively for process improvement through effective management of the process variants repository (PVR). In particular, we discuss a facility that provides query functionality for the PVR, specifically targeted at effective search and retrieval of process variants.
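    A minimal sketch of the kind of PVR query facility discussed above; the repository schema, field names, and task names are invented for illustration, not taken from the paper.

```python
# Each variant records its task sequence plus metadata such as how often
# it has been used (a proxy for "preferred work practice").
variants = [
    {"id": "v1", "tasks": ["receive", "check", "approve", "ship"], "uses": 42},
    {"id": "v2", "tasks": ["receive", "approve", "ship"], "uses": 7},
    {"id": "v3", "tasks": ["receive", "check", "escalate", "ship"], "uses": 15},
]

def query(repo, contains=None, min_uses=0):
    """Retrieve variants that include a given task and meet a usage threshold."""
    return [v["id"] for v in repo
            if (contains is None or contains in v["tasks"])
            and v["uses"] >= min_uses]

print(query(variants, contains="check", min_uses=10))
```

    A real facility would of course also check the contractual and regulatory constraints mentioned above before admitting a variant to the repository.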

    Prescriptive Semantics For Big-Step Modelling Languages

    With the popularity of model-driven methodologies and the abundance of modelling languages, a major question for a modeller is: Which language is suitable for modelling a system under study? To answer this question, one not only needs to know the range of relevant languages for modelling the system under study, but also needs to be able to compare these languages. In this dissertation, I consider these challenges from a semantic point of view for a diverse range of behavioural modelling languages that I refer to as the family of Big-Step Modelling Languages (BSMLs). There is a plethora of BSMLs, including statecharts, its variants, SCR, un-clocked variants of synchronous languages (e.g., Esterel and Argos), and reactive modules. BSMLs are often used to model systems that continuously interact with their environments. In a BSML model, the reaction of the model to an environmental input is a big step, which consists of a sequence of small steps, each of which can be the concurrent execution of a set of transitions. To provide a systematic method to understand and compare the semantics of BSMLs, this dissertation introduces the big-step semantic deconstruction framework that deconstructs the semantic design space of BSMLs into eight high-level, independent semantic aspects together with the enumeration of the common semantic options of each semantic aspect. The dissertation also presents a comparative analysis of the semantic options of each semantic aspect to assist one to choose one semantic option over another. A key idea in the big-step semantic deconstruction is that the high-level semantic aspects in the deconstruction recognize a big step as a whole, rather than only considering its constituent transitions operationally. A novelty of the big-step semantic deconstruction is that it lends itself to a systematic semantic formalization of most of the languages in the deconstruction. 
    The dissertation presents a parametric, formal semantic definition method whose parameters correspond to the semantic aspects of the deconstruction, and thus it produces prescriptive semantics: The manifestation of a semantic option in the semantics of a BSML can be clearly identified. The way transitions are ordered to form a big step in a BSML is a source of semantic complexity: A modeller needs to be aware of the possible orders of the execution of transitions when constructing and analyzing a model. The dissertation introduces three semantic quality attributes, each of which exempts a modeller from considering an aspect of ordering in big steps. The ranges of BSMLs that support each of these semantic quality attributes are formally specified. These specifications indicate that achieving a semantic quality attribute in a BSML is a cross-cutting concern over the choices of its different semantic options. The semantic quality attributes, together with the semantic analysis of individual semantic options, can be used in tandem to assist a modeller or a semanticist in comparing two BSMLs or creating a new, desired BSML from scratch. Through the big-step semantic deconstruction, I have discovered that some of the semantic aspects of BSMLs can be uniformly described as forms of synchronization. The dissertation presents a general synchronization framework for behavioural modelling languages. This framework is based on a notion of synchronization between transitions of complementary roles. It is parameterized by the number of interactions a transition can take part in (one vs. many) and the arity of the interaction mechanisms (exclusive vs. shared); considering these parameters for each of the two complementary roles yields 16 synchronization types. To enhance BSMLs with the capability to use the synchronization types, a synchronizer syntax is introduced for BSMLs, resulting in the family of Synchronizing Big-Step Modelling Languages (SBSMLs). 
    Using the expressiveness of SBSMLs, the dissertation describes how, underlying the semantics of many modelling constructs, such as multi-source, multi-destination transitions, various composition operators, and workflow patterns, there is a notion of synchronization that can be systematically modelled in SBSMLs.
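    The count of 16 synchronization types follows directly from the two parameters named in the abstract; this sketch only enumerates the combinations and assumes nothing beyond what is stated there.

```python
from itertools import product

# Each of the two complementary roles is parameterized by the number of
# interactions a transition can take part in (one vs. many) and the arity
# of the interaction mechanism (exclusive vs. shared).
cardinality = ["one", "many"]
arity = ["exclusive", "shared"]

role_options = list(product(cardinality, arity))        # 4 options per role
sync_types = list(product(role_options, role_options))  # 4 x 4 = 16 types

print(len(sync_types))
```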

    Design and verification of a digitally controlled output stage for automotive safety applications

    In recent decades, the importance of electronics in the automotive environment has increased exponentially. Electronic Integrated Circuits (ICs) are now playing a primary role in the economy of vehicles, especially since special laws and strict safety requirements have been introduced. The aim of the thesis, developed within the Austramicrosystem design centre of Navacchio (PI), is the design and verification of a digitally controlled output stage. Output stages are the final components of many sensor-based ICs. In fact, their typical signal chain starts from the sensing of a physical phenomenon, passes through its transduction into an electrical quantity and its digital conversion and processing, and ends with the driving of an actuator. The task of an output stage is to interpret the input digital signal and consequently drive an actuator. The target of this work was to improve the performance of the company's current output stage solutions. This target has been achieved through the development and realization of a digitally controlled loop. The proposed solution guarantees a performance improvement and adds the possibility of cyclically monitoring the output voltage, detecting issues and reporting errors. A control algorithm has been developed and validated through its insertion in a mathematical model of the system. Then, to experimentally validate this control algorithm, an Integrated Circuit has been designed, realized and finally measured. This thesis follows the workflow behind the realization of the Integrated Circuit and its subsequent measurement.
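    The general shape of such a digitally controlled loop, sample the output, nudge a digital control code toward the target, and flag out-of-range readings, can be sketched as follows. All thresholds, step sizes, and the toy plant model are assumptions for illustration, not the thesis design.

```python
# One cycle of a hypothetical bang-bang-style digital control loop for an
# output stage: compare the measured output voltage against the target,
# adjust the drive code by one LSB, and report gross faults.
def control_step(code, target, measured, tolerance=0.05, max_code=255):
    fault = abs(measured - target) > target * 0.5  # gross out-of-range check
    if measured < target - tolerance and code < max_code:
        code += 1            # output too low: raise the drive code
    elif measured > target + tolerance and code > 0:
        code -= 1            # output too high: lower the drive code
    return code, fault

# Toy plant: output voltage proportional to the control code
# (a hypothetical 20 mV per LSB).
code, target = 0, 5.0
for _ in range(300):
    measured = code * 0.02
    code, fault = control_step(code, target, measured)

print(code, fault)   # settles once the output is within the tolerance band
```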

    Many-core and heterogeneous architectures: programming models and compilation toolchains

    The abstract is in the attachment.

    Interactive Model-Based Compilation: A Modeller-Driven Development Approach

    There is a growing tendency to use domain-specific languages, which help domain experts stay focussed on abstract problem solutions. It is important to carefully design these languages and tools, which fundamentally perform model-to-model transformations. The quality of both usually determines the effectiveness of the subsequent development and therefore the quality of the final applications. However, as the complexity and safety requirements of modern systems grow, it becomes increasingly burdensome to create highly customized languages and difficult to provide reasonable overviews within these tools. This thesis introduces a new interactive model-based compilation methodology. Compilations for arbitrary model-to-model transformations are themselves described as models. They can be instantiated for particular inputs, e.g., a program, to create concrete compilation runs, which return the result of that compilation. The compilation instance is interactively observable. Intermediate results serve as new inputs and as documentation. They can be used to create highly customized views and facilitate understandability. This methodology guides modellers from the start of the compilation to the final result so that they can interactively refine their models. The methodology has been implemented and validated as the KIELER Compiler (KiCo) and is available as part of the KIELER open-source project. It is used to implement the current reference compiler for the SCCharts language, a statecharts dialect designed for specifying safety-critical reactive systems based on a synchronous model of computation. The interactive model-based compilation approach was key to the rapid prototyping of three different compilation strategies, as well as new language extensions, variations and closely related languages. The results are verified with benchmarks, which are again modelled using the same approach and technology. 
    The usability of the SCCharts language and the KiCo tooling is documented with long-term surveys and real-life industrial, academic and teaching examples.
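    The core idea, a compilation that is itself a model whose instantiation exposes every intermediate result, can be sketched in a few lines. The transformations below are deliberately trivial string passes for illustration; they are not KiCo's actual compilation passes.

```python
# A "compilation model": an ordered list of named model-to-model
# transformations (here just string-to-string functions).
compilation_model = [
    ("normalize", str.strip),
    ("lowercase", str.lower),
    ("tokenize",  str.split),
]

def instantiate(model, source):
    """Run the compilation for one input, recording every intermediate result."""
    snapshots = [("input", source)]
    for name, transform in model:
        source = transform(source)
        snapshots.append((name, source))
    return snapshots

run = instantiate(compilation_model, "  Emit Signal A  ")
for name, result in run:     # each step is observable, like a customized view
    print(name, "->", result)
```

    Because every snapshot is kept, each intermediate result can serve both as documentation and as the input to a further, refined run, which is the interactive refinement loop the abstract describes.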

    A study protocol for development and validation of a clinical prediction model for frailty (ModulEn): a new European commitment to tackling frailty

    There is a growing need to implement and evaluate technological solutions that allow the early detection of age-related frailty and enable assessment of the predictive value of frailty components. The broad use of these solutions may ensure an efficient and sustainable response of health and social care systems to the challenges related to demographic aging. In this paper, we present the protocol of the ModulEn study, which aims to develop and validate a predictive model for frailty. For this purpose, a sample of older adults aged 65-80 years, recruited from the community, will be invited to use an electronic device, ACM Kronowise® 2.0. This device allows proactive and continuous monitoring of circadian health, physical activity, and sleep and eating habits. It will be used over a period of seven to ten days. The participants will also be given questionnaires evaluating the variables of interest, including frailty level, as well as their experience and satisfaction with the use of the device. Data from these two sources will be combined and the relevant associations identified. In our view, the implications of this study's findings for clinical practice include the possibility of developing and validating tools for the timely prevention of frailty progression. In the long term, ModulEn may contribute to a critical reduction of the frailty burden in Europe.