
    Formal and Informal Methods for Multi-Core Design Space Exploration

    We propose a tool-supported methodology for design-space exploration for embedded systems. It provides means to define high-level models of applications and multi-processor architectures and evaluate the performance of different deployment (mapping, scheduling) strategies while taking uncertainty into account. We argue that this extension of the scope of formal verification is important for the viability of the domain. Comment: In Proceedings QAPL 2014, arXiv:1406.156
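    The kind of mapping/scheduling evaluation under uncertainty described in this abstract can be illustrated with a minimal Python sketch (not the authors' tool or models): enumerate task-to-core mappings, sample uncertain execution times, and rank mappings by expected makespan. The three tasks, their execution-time intervals and the two-core platform below are hypothetical.

```python
import itertools
import random

random.seed(0)
tasks = {"fft": (4.0, 6.0), "filter": (2.0, 3.0), "encode": (5.0, 8.0)}  # (min, max) exec time in ms, hypothetical
cores = ["core0", "core1"]

def makespan(mapping, durations):
    # independent tasks under a static mapping: makespan is the load of the busiest core
    load = {c: 0.0 for c in cores}
    for task, core in mapping.items():
        load[core] += durations[task]
    return max(load.values())

def expected_makespan(mapping, samples=1000):
    # Monte Carlo over uncertain execution times (uniform intervals)
    total = 0.0
    for _ in range(samples):
        durations = {t: random.uniform(lo, hi) for t, (lo, hi) in tasks.items()}
        total += makespan(mapping, durations)
    return total / samples

# exhaustive design-space exploration over all task-to-core mappings
candidates = (dict(zip(tasks, combo)) for combo in itertools.product(cores, repeat=len(tasks)))
best = min(candidates, key=expected_makespan)
print("best mapping:", best, "expected makespan:", round(expected_makespan(best), 2))
```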

    A unified framework for solving a general class of conditional and robust set-membership estimation problems

    In this paper we present a unified framework for solving a general class of problems arising in the context of set-membership estimation/identification theory. More precisely, the paper aims at providing an original approach for the computation of optimal conditional and robust projection estimates in a nonlinear estimation setting where the operator relating the data and the parameter to be estimated is assumed to be a generic multivariate polynomial function and the uncertainties affecting the data are assumed to belong to semialgebraic sets. By noticing that the computation of both the conditional and the robust projection optimal estimators requires the solution to min-max optimization problems that share the same structure, we propose a unified two-stage approach based on semidefinite-relaxation techniques for solving such estimation problems. The key idea of the proposed procedure is to recognize that the optimal functional of the inner optimization problems can be approximated to any desired precision by a multivariate polynomial function by suitably exploiting recently proposed results in the field of parametric optimization. Two simulation examples are reported to show the effectiveness of the proposed approach. Comment: Accepted for publication in the IEEE Transactions on Automatic Control (2014)
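    The min-max structure of the conditional/robust projection estimate can be illustrated with a toy Python sketch that replaces the paper's semidefinite-relaxation machinery with naive gridding; the scalar polynomial model y = theta^2 + d, the noise bound |d| <= 0.1 and the measured value are all hypothetical.

```python
import numpy as np

y_meas = 2.0                                # observed datum (hypothetical)
d_grid = np.linspace(-0.1, 0.1, 201)        # the semialgebraic uncertainty set reduced to an interval
theta_grid = np.linspace(0.0, 3.0, 601)     # candidate estimates

def worst_case_error(theta_hat):
    # inner maximization: largest estimation error over all parameters consistent with the data
    feasible = np.sqrt(np.clip(y_meas - d_grid, 0.0, None))   # theta = sqrt(y - d) for the model y = theta^2 + d
    return float(np.max(np.abs(theta_hat - feasible)))

# outer minimization: the estimate with the smallest worst-case error
best = min(theta_grid, key=worst_case_error)
print(f"min-max estimate: {best:.3f}  worst-case error: {worst_case_error(best):.3f}")
```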

    Seismic Risk Analysis of Revenue Losses, Gross Regional Product and Transportation Systems

    Natural threats like earthquakes, hurricanes or tsunamis have shown serious impacts on communities. In the past, major earthquakes in the United States like Loma Prieta 1989 and Northridge 1994, or recent events in Italy like the L'Aquila 2009 or Emilia 2012 earthquakes, emphasized the importance of preparedness and awareness to reduce social impacts. Earthquakes impacted businesses and dramatically reduced the gross regional product. Seismic hazard is traditionally assessed using Probabilistic Seismic Hazard Analysis (PSHA). PSHA represents the hazard well at a specific location, but it is unsatisfactory for spatially distributed systems. Scenario earthquakes overcome this problem by representing the actual distribution of shaking over a spatially distributed system. The performance of distributed productive systems during the recovery process needs to be explored. Scenario earthquakes have been used to assess the risk in bridge networks and the social losses in terms of gross regional product reduction. The proposed method for scenario earthquakes has been applied to a real case study: Treviso, a city in the north-east of Italy. The proposed method for scenario earthquakes requires three models: one representation of the sources (Italian Seismogenic Zonation 9), one attenuation relationship (Sabetta and Pugliese 1996) and a model of the occurrence rate of magnitudes (Gutenberg-Richter). A methodology has been proposed to reduce thousands of scenarios to a subset consistent with the hazard at each location. Earthquake scenarios, along with the Monte Carlo method, have been used to simulate business damage. The response of business facilities to earthquakes has been obtained from fragility curves for precast industrial buildings. Furthermore, from business damage the reduction of productivity has been simulated using economic data from the national statistical service and a proposed piecewise "loss of functionality" model. To simulate the economic process in the time domain, an innovative business recovery function has been proposed. The proposed method has been applied to generate scenario earthquakes at the locations of bridges and business areas. The proposed selection methodology has been applied to reduce 8000 scenarios to a subset of 60. Subsequently, these scenario earthquakes have been used to calculate three system performance parameters: the risk in transportation networks, the risk in terms of business damage and the losses of gross regional product. A novel model for the business recovery process has been tested. The proposed model has been used to represent the business recovery process and simulate the effects of government aid allocated for reconstruction. The proposed method has efficiently modeled the seismic hazard using scenario earthquakes. The scenario earthquakes presented have been used to assess possible consequences of earthquakes in seismic-prone zones and to increase preparedness. Scenario earthquakes have been used to simulate the effects on the economy of the impacted area; a significant gross regional product reduction has been shown, up to 77% for an earthquake with 0.0003 probability of occurrence. The results showed that the limited funds available after the disaster can be distributed in a more efficient way.
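    As an illustration only (not the thesis' models), the scenario-generation and Monte Carlo damage step can be sketched in Python as follows; the Gutenberg-Richter parameters, the attenuation law (a generic placeholder, not Sabetta and Pugliese 1996), the fragility-curve parameters and the source-to-site distance are all hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
b, m_min, m_max = 1.0, 4.5, 7.0            # Gutenberg-Richter b-value and magnitude bounds (hypothetical)
beta = b * np.log(10)

def sample_magnitudes(n):
    # inverse-transform sampling from the truncated exponential Gutenberg-Richter distribution
    u = rng.random(n)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta

def pga(m, r_km):
    # placeholder attenuation law, ln(PGA[g]) = -3.5 + 0.8*M - 1.1*ln(R + 10)  (hypothetical coefficients)
    return np.exp(-3.5 + 0.8 * m - 1.1 * np.log(r_km + 10.0))

def p_damage(pga_g, median=0.35, dispersion=0.6):
    # lognormal fragility curve for a precast industrial building (hypothetical parameters)
    return norm.cdf(np.log(pga_g / median) / dispersion)

mags = sample_magnitudes(8000)                   # thousands of candidate scenarios, as in the abstract
damage = p_damage(pga(mags, r_km=20.0))          # one facility assumed 20 km from the source
print(f"mean damage probability across scenarios: {damage.mean():.3f}")
```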

    A Framework for QoS-aware Execution of Workflows over the Cloud

    The Cloud Computing paradigm is providing system architects with a new powerful tool for building scalable applications. Clouds allow allocation of resources on a "pay-as-you-go" model, so that additional resources can be requested during peak loads and released after that. However, this flexibility calls for appropriate dynamic reconfiguration strategies. In this paper we describe SAVER (qoS-Aware workflows oVER the Cloud), a QoS-aware algorithm for executing workflows involving Web Services hosted in a Cloud environment. SAVER allows execution of arbitrary workflows subject to response time constraints. SAVER uses a passive monitor to identify workload fluctuations based on the observed system response time. The information collected by the monitor is used by a planner component to identify the minimum number of instances of each Web Service which should be allocated in order to satisfy the response time constraint. SAVER uses a simple Queueing Network (QN) model to identify the optimal resource allocation. Specifically, the QN model is used to identify bottlenecks, and predict the system performance as Cloud resources are allocated or released. The parameters used to evaluate the model are those collected by the monitor, which means that SAVER does not require any particular knowledge of the Web Services and workflows being executed. Our approach has been validated through numerical simulations, whose results are reported in this paper.
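    A minimal Python sketch of the capacity-planning idea (not SAVER's actual QN model): treat each Web Service as an M/M/c queue and add instances to the current bottleneck until the summed mean response time along the workflow meets the constraint. The arrival rates, per-instance service rates and response-time target below are hypothetical.

```python
import math

def mmc_response_time(lam, mu, c):
    # mean response time of an M/M/c queue via the Erlang-C waiting probability
    a, rho = lam / mu, lam / (c * mu)
    if rho >= 1.0:
        return math.inf
    tail = a**c / (math.factorial(c) * (1.0 - rho))
    p_wait = tail / (sum(a**k / math.factorial(k) for k in range(c)) + tail)
    return p_wait / (c * mu - lam) + 1.0 / mu

# (name, arrival rate req/s, service rate per instance req/s) -- hypothetical workflow tiers
services = [("auth", 40.0, 25.0), ("catalog", 40.0, 10.0), ("checkout", 15.0, 5.0)]
target = 0.60   # end-to-end mean response time constraint in seconds (hypothetical)

alloc = {name: 1 for name, _, _ in services}
while True:
    times = {n: mmc_response_time(lam, mu, alloc[n]) for n, lam, mu in services}
    if sum(times.values()) <= target:
        break
    bottleneck = max(times, key=times.get)     # grow the slowest tier first
    alloc[bottleneck] += 1

print("instances per service:", alloc)
```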