Verification of Hierarchical Artifact Systems
Data-driven workflows, of which IBM's Business Artifacts are a prime
exponent, have been successfully deployed in practice, adopted in industrial
standards, and have spawned a rich body of research in academia, focused
primarily on static analysis. The present work represents a significant advance
on the problem of artifact verification, by considering a much richer and more
realistic model than in previous work, incorporating core elements of IBM's
successful Guard-Stage-Milestone model. In particular, the model features task
hierarchy, concurrency, and richer artifact data. It also allows database key
and foreign key dependencies, as well as arithmetic constraints. The results
show decidability of verification and establish its complexity, making use of
novel techniques including a hierarchy of Vector Addition Systems and a variant
of quantifier elimination tailored to our context. Comment: Full version of the accepted PODS paper.
Wang-Landau Algorithm: a Theoretical Analysis of the Saturation of the Error
In this work we present a theoretical analysis of the convergence of the
Wang-Landau algorithm [Phys. Rev. Lett. 86, 2050 (2001)] which was introduced
years ago to calculate the density of states in statistical models. We study
the dynamical behavior of the error in the calculation of the density of
states. We conclude that the saturation of the error is due to the
decreasing variations of the refinement parameter. To overcome this limitation,
we present an analytical treatment in which the refinement parameter is scaled
down as a power law instead of exponentially. An extension of the analysis to
the N-fold way variation of the method is also discussed. Comment: 7 pages, 5 figures
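The power-law refinement schedule discussed above can be illustrated on a toy problem. The sketch below is illustrative only (model, parameters and schedule are assumptions, not the paper's analysis): it runs Wang-Landau sampling for the density of states of N coins, where the exact answer is the binomial coefficients, and scales the log refinement factor down roughly as 1/t instead of exponentially.

```python
import math
import random

def wang_landau(N=10, sweeps=20000):
    """Wang-Landau estimate of ln g(E) for E = number of 'up' coins.

    Exact answer: g(E) = C(N, E). The refinement ln_f decays as a
    power law (~1/t), a schedule in the spirit of the abstract's
    proposal, instead of the original exponential halving.
    """
    state = [0] * N
    E = 0
    ln_g = [0.0] * (N + 1)  # running log density-of-states estimate
    t = 0
    for _ in range(sweeps):
        for _ in range(N):
            t += 1
            ln_f = 1.0 / max(t / N, 1.0)  # power-law refinement ~ 1/t
            i = random.randrange(N)
            E_new = E + (1 - 2 * state[i])  # flipping coin i shifts E by +-1
            # Accept with probability min(1, g(E)/g(E_new))
            if math.log(random.random()) < ln_g[E] - ln_g[E_new]:
                state[i] ^= 1
                E = E_new
            ln_g[E] += ln_f
    return ln_g

if __name__ == "__main__":
    random.seed(0)
    ln_g = wang_landau(N=10, sweeps=5000)
    # Differences ln_g[E] - ln_g[0] should approach ln C(10, E)
    print([round(x - ln_g[0], 2) for x in ln_g])
```

Under this schedule the estimate keeps refining instead of freezing at a saturated error level, which is the qualitative behaviour the abstract analyses.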
Hybrid Governance and the Attribution of Political Responsibility: Experimental Evidence from the United States
How does the mode of public service delivery affect the attribution of responsibility for public goods? Through a survey experiment on a sample of more than 1,000 Americans, we provide evidence of how the allocation of public goods shapes voters’ support for incumbent politicians. We find that voters prefer a mixture of public–private financing and management when it comes to the delivery of infrastructure. However, once performance information is available, the mode of infrastructure delivery no longer influences their voting intention. The successful delivery of these infrastructure projects is what ultimately matters to voters. Moreover, this preference for a mixture of public and private involvement in public service delivery is stronger among citizens with high political knowledge, who are more likely to punish the incumbent for a failed first phase of the public service delivery. These findings deepen our understanding of how hybrid forms of public service delivery are perceived by voters and how performance information affects evaluations of the performance of public services and politicians alike.
Do androids dream of electric fences? Safety-aware reinforcement learning with latent shielding
The growing trend of fledgling reinforcement learning systems making their way into real-world applications has been accompanied by growing concerns for their safety and robustness. In recent years, a variety of approaches have been put forward to address the challenges of safety-aware reinforcement learning; however, these methods often either require a handcrafted model of the environment to be provided beforehand, or that the environment is relatively simple and low-dimensional. We present a novel approach to safety-aware deep reinforcement learning in high-dimensional environments called latent shielding. Latent shielding leverages internal representations of the environment learnt by model-based agents to “imagine” future trajectories and avoid those deemed unsafe. We experimentally demonstrate that this approach leads to improved adherence to formally-defined safety specifications.
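A minimal sketch of the shielding idea, assuming a generic one-step function `imagine` standing in for the learned latent dynamics model and a safety predicate `is_unsafe` (both names are hypothetical, not the paper's API): candidate actions whose imagined rollouts hit an unsafe state are filtered out before execution.

```python
import random

def latent_shield(state, actions, imagine, is_unsafe,
                  horizon=5, n_rollouts=8):
    """Return the subset of actions whose imagined futures stay safe.

    imagine(state, action) -> next state: stand-in for one step of a
    learned latent dynamics model. is_unsafe(state) -> bool: a formal
    safety predicate. Both are illustrative assumptions.
    """
    safe = []
    for a in actions:
        violated = False
        for _ in range(n_rollouts):
            s = imagine(state, a)  # first imagined step takes action a
            for _ in range(horizon - 1):
                if is_unsafe(s):
                    violated = True
                    break
                # Continue the rollout with arbitrary follow-up actions
                s = imagine(s, random.choice(actions))
            if violated or is_unsafe(s):
                violated = True
                break
        if not violated:
            safe.append(a)
    return safe

# Toy check: from position 2, moving +1 immediately enters the
# unsafe region s >= 3, so only -1 survives the shield.
print(latent_shield(2, [-1, 1], lambda s, a: s + a,
                    lambda s: s >= 3, horizon=1))  # → [-1]
```

The design point is that the environment model is learned (latent), so no handcrafted model is needed, matching the motivation in the abstract.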
Partnership Communities
To our knowledge, we undertake the first quantitative and broadly comparative study of the structure and performance of partnership communities. Our study addresses several important research questions. How connected are the members of partnership communities? How can we understand the quality of the projects a community undertakes? How do political institutions shape their structure and performance? After defining partnership communities as networked communities of private firms which form the consortia that enter into long-term contractual arrangements with governments, we show how they are affected by government demand for partners. We then provide an overview of those factors predicting success in financing projects. Finally, we focus on the political economy of partnership communities. We develop and test theoretical predictions about how national institutions shape partnership communities and the quality of projects. We also investigate voters' preferences over alternative arrangements of infrastructure delivery before drawing out implications for research and practice.
An observing system for the collection of fishery and oceanographic data
Fishery Observing System (FOS) was developed as a first and basic step towards fish stock abundance nowcasting/forecasting within the framework of the EU research program Mediterranean Forecasting System: Toward an Environmental Prediction (MFSTEP). The study of the relationship between abundance and environmental parameters also represents a crucial point towards forecasting. Eight fishing vessels were progressively equipped with FOS instrumentation to collect fishery and oceanographic data. The vessels belonged to different harbours of the Central and Northern Adriatic Sea. For this pilot application, anchovy (Engraulis encrasicolus, L.) was chosen as the target species. Geo-referenced catch data, associated with in-situ temperature and depth, were the FOS products, but other parameters were associated with catch data as well. MFSTEP numerical circulation models provide many of these data. In particular, salinity was extracted from re-analysis data of numerical circulation models. Satellite-derived sea surface temperature (SST) and chlorophyll were also used as independent variables. Catch and effort data were used to estimate an abundance index (CPUE – Catch per Unit of Effort). Considering that catch records were gathered by different fishing vessels with different technical characteristics and operating on different fish densities, a standardized value of CPUE was calculated. A spatial and temporal average CPUE map was obtained together with a monthly mean time series in order to characterise the variability of anchovy abundance during the period of observation (October 2003–August 2005). In order to study the relationship between abundance and oceanographic parameters, Generalized Additive Models (GAM) were used.
Preliminary results revealed a complex scenario: the southern sector of the domain is characterised by a stronger relationship than the central and northern sectors, where the interactions between the environment and the anchovy distribution are hidden by a higher percentage of variability within the system which is still unexplained. GAM analysis showed that increasing the number of explanatory variables also increased the portion of variance explained by the model. Data exchange and interdisciplinary efforts will therefore be crucial for the success of this research activity.
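The CPUE standardization step can be sketched as follows. The haul records, the per-vessel normalization, and the monthly averaging convention are all illustrative assumptions, not the paper's exact procedure:

```python
from collections import defaultdict

# Hypothetical haul records: (vessel, month, catch_kg, effort_hours)
hauls = [
    ("V1", "2003-10", 120.0, 6.0),
    ("V1", "2003-11", 90.0, 5.0),
    ("V2", "2003-10", 300.0, 6.0),
    ("V2", "2003-11", 150.0, 5.0),
]

# Raw CPUE per haul: catch divided by effort
cpue = [(v, m, c / e) for v, m, c, e in hauls]

# Standardize per vessel: divide by that vessel's mean CPUE, so fleets
# with different technical characteristics become comparable (one
# simple convention; the paper's standardization may differ).
by_vessel = defaultdict(list)
for v, _, u in cpue:
    by_vessel[v].append(u)
vessel_mean = {v: sum(us) / len(us) for v, us in by_vessel.items()}

std_cpue = [(m, u / vessel_mean[v]) for v, m, u in cpue]

# Monthly mean standardized CPUE, as in the time series of the abstract
by_month = defaultdict(list)
for m, u in std_cpue:
    by_month[m].append(u)
monthly = {m: sum(us) / len(us) for m, us in by_month.items()}
print(monthly)
```

The resulting monthly index is the kind of abundance series that would then enter a GAM against temperature, salinity, SST and chlorophyll.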
Corticomuscular Coherence Is Tuned to the Spontaneous Rhythmicity of Speech at 2-3 Hz
Human speech features rhythmicity that frames distinctive, fine-grained speech patterns. Speech can thus be counted among rhythmic motor behaviors that generally manifest characteristic spontaneous rates. However, the critical neural evidence for tuning of articulatory control to a spontaneous rate of speech has not been uncovered. The present study examined the spontaneous rhythmicity in speech production and its relationship to cortex–muscle neurocommunication, which is essential for speech control. Our MEG results show that, during articulation, coherent oscillatory coupling between the mouth sensorimotor cortex and the mouth muscles is strongest at the frequency of spontaneous rhythmicity of speech at 2–3 Hz, which is also the typical rate of word production. Corticomuscular coherence, a measure of efficient cortex–muscle neurocommunication, thus reveals behaviorally relevant oscillatory tuning for spoken language. Peer reviewed.
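Corticomuscular coherence of this kind is estimated with a magnitude-squared coherence spectrum between the cortical and muscle signals. A minimal synthetic sketch (sampling rate, noise level and window length are illustrative, not the study's MEG pipeline), in which two noisy channels share a 2.5 Hz drive:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 100.0                      # Hz, illustrative sampling rate
t = np.arange(0, 120, 1 / fs)   # two minutes of synthetic data

# Shared 2.5 Hz drive (stand-in for the speech rhythm) plus
# independent noise in each channel
drive = np.sin(2 * np.pi * 2.5 * t)
meg = drive + 0.8 * rng.standard_normal(t.size)  # "cortical" channel
emg = drive + 0.8 * rng.standard_normal(t.size)  # "muscle" channel

# Welch-averaged magnitude-squared coherence; long segments give
# enough frequency resolution to resolve the 2-3 Hz band
f, Cxy = coherence(meg, emg, fs=fs, nperseg=2048)
peak = f[np.argmax(Cxy)]
print(f"peak coherence at {peak:.2f} Hz")
```

On real data the coherence peak landing at the speaker's spontaneous 2–3 Hz rate is the tuning result the abstract reports.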
Automated Verification of Quantum Protocols using MCMAS
We present a methodology for the automated verification of quantum protocols
using MCMAS, a symbolic model checker for multi-agent systems. The method is
based on the logical framework developed by D'Hondt and Panangaden for
investigating epistemic and temporal properties, built on the model for
Distributed Measurement-based Quantum Computation (DMC), an extension of the
Measurement Calculus to distributed quantum systems. We describe the
translation map from DMC to interpreted systems, the typical formalism for
reasoning about time and knowledge in multi-agent systems. Then, we introduce
dmc2ispl, a compiler into the input language of the MCMAS model checker. We
demonstrate the technique by verifying the Quantum Teleportation Protocol, and
discuss the performance of the tool. Comment: In Proceedings QAPL 2012, arXiv:1207.055
Thermal fluctuations of an interface near a contact line
The effect of thermal fluctuations near a contact line of a liquid interface
partially wetting an impenetrable substrate is studied analytically and
numerically. Promoting both the interface profile and the contact line position
to random variables, we explore the equilibrium properties of the corresponding
fluctuating contact line problem based on an interfacial Hamiltonian involving
a "contact" binding potential. To facilitate an analytical treatment we
consider the case of a one-dimensional interface. The effective boundary
condition at the contact line is determined by a dimensionless parameter that
encodes the relative importance of thermal energy and substrate energy at the
microscopic scale. We find that this parameter controls the transition from a
partially wetting to a pseudo-partial wetting state, the latter being
characterized by a thin prewetting film of fixed thickness. In the partial
wetting regime, instead, the profile typically approaches the substrate via an
exponentially thinning prewetting film. We show that, independently of the
physics at the microscopic scale, Young's angle is recovered sufficiently far
from the substrate. The fluctuations of the interface and of the contact line
give rise to an effective disjoining pressure, exponentially decreasing with
height. Fluctuations therefore provide a regularization of the singular contact
forces occurring in the corresponding deterministic problem. Comment: 40 pages, 12 figures
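The exponentially decaying, fluctuation-induced disjoining pressure described above can be written schematically (symbols illustrative, not the paper's notation) as

```latex
\Pi(\ell) \simeq \Pi_0 \, e^{-\ell/\xi},
```

with \(\ell\) the height above the substrate and \(\xi\) a decay length set by the interplay of thermal energy and substrate energy.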
Integrated HTA and FMECA methodology for the evaluation of robotic surgery
Robotic surgery has improved markedly since the beginning of the twenty-first century, reaching high levels of technical and clinical performance. The most widely used surgical robot worldwide is the da Vinci® system made by Intuitive Surgical Inc.
The aim of this study was to evaluate robotic surgery at the hospital scale (Hospital-Based Health Technology Assessment) in comparison with open and laparoscopic procedures, combining this with a FMECA analysis to accurately assess the aspects involving patient and staff safety.
The total number of robotic procedures directly observed in the surgical department and reported in this study was 44, comprising 28 urology interventions and 16 general surgeries. The study confirmed the clinical benefits of the robot, but also the greater complexity of managing the whole surgical system in terms of structural needs, staff and technology.
Future steps will require a larger number of robotic procedures in order to strengthen the reliability of the analysis and to complete the socio-economic assessment with medium- and long-term observation. Finally, a new FMECA application will be essential to monitor the real effects of the suggested actions on the evaluated risks, covering both the already known and any new failure modes.
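A FMECA of the kind combined with the HTA above typically ranks failure modes by a Risk Priority Number, RPN = severity × occurrence × detectability. A minimal sketch with invented failure modes and scores (all names and numbers are illustrative, not from the study):

```python
# Hypothetical failure modes, each scored 1-10 for severity (S),
# occurrence (O) and detectability (D), as in a standard FMECA.
# RPN = S * O * D is used to prioritise corrective actions.
failure_modes = [
    ("instrument arm collision", 7, 3, 4),
    ("console video loss",       8, 2, 2),
    ("sterile drape tear",       4, 5, 3),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda x: x[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{name}: RPN={rpn}")
```

Re-running such a scoring after corrective actions is exactly the monitoring step the abstract recommends for both known and newly identified failure modes.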