445 research outputs found
Normalisation Control in Deep Inference via Atomic Flows
We introduce `atomic flows': they are graphs obtained from derivations by
tracing atom occurrences and forgetting the logical structure. We study simple
manipulations of atomic flows that correspond to complex reductions on
derivations. This allows us to prove, for propositional logic, a new and very
general normalisation theorem, which contains cut elimination as a special
case. We operate in deep inference, which is more general than other syntactic
paradigms, and where normalisation is more difficult to control. We argue that
atomic flows are a significant technical advance for normalisation theory,
because 1) the technique they support is largely independent of syntax; 2)
indeed, it is largely independent of logical inference rules; 3) they
constitute a powerful geometric formalism, which is more intuitive than syntax
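The flows described above can be pictured as small labelled graphs. The following is a minimal sketch, assuming a plain vertex/edge encoding and the six standard atomic-flow generators (identity, cut, weakening, coweakening, contraction, cocontraction); the class and method names are illustrative, not from the paper.

```python
from dataclasses import dataclass

# Hypothetical minimal encoding of an atomic flow: a directed graph whose
# vertices are labelled by the six standard generators and whose edges
# trace atom occurrences through a derivation.
KINDS = {"identity", "cut", "weakening", "coweakening",
         "contraction", "cocontraction"}

@dataclass
class Flow:
    vertices: dict  # vertex id -> generator kind
    edges: set      # (source id, target id) pairs, one per atom occurrence

    def add(self, vid, kind):
        assert kind in KINDS
        self.vertices[vid] = kind

    def connect(self, a, b):
        assert a in self.vertices and b in self.vertices
        self.edges.add((a, b))

# A flow for a derivation that introduces an atom and immediately cuts it:
f = Flow({}, set())
f.add("i1", "identity")
f.add("c1", "cut")
f.connect("i1", "c1")   # the atom traced from its introduction to the cut
```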
Program Analysis Scenarios in Rascal
Rascal is a meta programming language focused on the implementation of domain-specific languages and on the rapid construction of tools for software analysis and software transformation. In this paper we focus on the use of Rascal for software analysis. We illustrate a range of scenarios for building new software analysis tools through a number of examples, including one showing integration with an existing Maude-based analysis. We then focus on ongoing work on alias analysis and type inference for PHP, showing how Rascal is being used, and sketching a hypothetical solution in Maude. We conclude with a high-level discussion on the commonalities and differences between Rascal and Maude when applied to program analysis
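Rascal code itself is not reproduced here; as a toy illustration of the kind of fact an alias analysis computes, the sketch below runs a flow-insensitive may-alias analysis over simple copy assignments using union-find, in Python. Everything in it is an assumption for illustration, not the PHP analysis described in the paper.

```python
# Illustrative only: a flow-insensitive may-alias analysis over copy
# assignments (x = y), using union-find with path halving. A toy
# stand-in for the kind of fact an alias analysis computes.
parent = {}

def find(v):
    parent.setdefault(v, v)
    while parent[v] != v:
        parent[v] = parent[parent[v]]  # path halving
        v = parent[v]
    return v

def assign(lhs, rhs):          # record "lhs = rhs" as a may-alias fact
    parent[find(lhs)] = find(rhs)

def may_alias(a, b):
    return find(a) == find(b)

assign("b", "a")
assign("c", "b")
print(may_alias("a", "c"))     # True: c = b = a
print(may_alias("a", "d"))     # False: d was never assigned
```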
Changing market environment in private banking : a quantitative study of the DACH Private Banking market investigating customer behaviour with a focus on personality traits
The Private Banking segment faces several dynamics in the market environment, pressuring its
profitability and threatening its longterm existence. Based on an extensive literature review
and quantitative data collection, this thesis aims at clarifying incumbent banks’ action fields
and their challenges to counteract. Survey results about customers’ behaviour, needs and
expectations in Private Banking identify broad consent in digital infrastructure endorsement,
cost awareness and little significance of human advisors. However, varying opinions are found
on unrelated banking approaches. Propensities in this context are statistically significantly
influenced by (a) Age, (b) Time spent with financial concerns and (c) Personality Traits. The
author concludes the need for digital upgrades. Additionally, Private Banking institutions
should consider a differentiated customer approach for new controversial business forms to
retain and acquire classic Private Banking customers as well as attract potential future target groups.
Leveraging service-oriented business applications to a rigorous rule-centric dynamic behavioural architecture.
Today’s market competitiveness and globalisation are putting pressure on organisations to join their efforts, to focus more on cooperation and interaction and to add value to their businesses. That is, most information systems supporting these cross-organisations are characterised as service-oriented business applications, where all the emphasis is put on inter-service interactions rather than intra-service computations.
Unfortunately, for the development of such inter-organisational service-oriented business systems, current service technology proposes only ad-hoc, manual and static standard web-service languages such as WSDL, BPEL and WS-CDL [3, 7].
The main objective of the work reported in this thesis is thus to leverage the development of service-oriented business applications towards more reliability and dynamic adaptability, placing emphasis on the use of business rules to govern activities while composing services. The best available software-engineering techniques for adaptability, mainly aspect-oriented mechanisms, are also to be integrated with advanced formal techniques. More specifically, the proposed approach consists of the following incremental steps. First, it models any business activity behaviour governing any service-oriented business process as Event-Condition-Action (ECA) rules. Second, such informal rules are made more interaction-centric, using adapted architectural connectors. Third, still at the conceptual level, with the aim of adapting such ECA-driven connectors, this approach borrows aspect-oriented ideas and mechanisms, and proposes to intercept events, select the properties required for interacting entities, explicitly and separately execute such ECA-driven behavioural interactions and finally dynamically weave the results into the entities involved. To ensure compliance and to preserve the implementation of this architectural conceptualisation, the work adopts the Maude language as an executable operational formalisation. For that purpose, Maude is first endowed with the notions of components and interfaces. Further, ECA-driven behavioural interactions are specified and implemented as aspects. Finally, capitalising on Maude reflection, the thesis demonstrates how to weave such interaction executions into associated services
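The ECA-rule step described above can be sketched as follows: a minimal Python stand-in for an ECA-driven connector that intercepts events and fires matching rules. The rule contents and event payload are hypothetical, and the aspect-weaving and Maude machinery are deliberately elided.

```python
# A minimal sketch of Event-Condition-Action (ECA) rules governing a
# service interaction. Rule names and the event payload are invented
# for illustration.
class EcaRule:
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

class Connector:
    """An ECA-driven connector: intercepts events and fires matching rules."""
    def __init__(self):
        self.rules = []

    def register(self, rule):
        self.rules.append(rule)

    def intercept(self, event, payload):
        fired = []
        for r in self.rules:
            if r.event == event and r.condition(payload):
                fired.append(r.action(payload))
        return fired

# Hypothetical example: reject orders above a credit limit.
connector = Connector()
connector.register(EcaRule(
    "order_placed",
    condition=lambda p: p["amount"] > p["credit_limit"],
    action=lambda p: f"reject order {p['id']}",
))
print(connector.intercept("order_placed",
                          {"id": 7, "amount": 900, "credit_limit": 500}))
# ['reject order 7']
```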
Automated Validation of State-Based Client-Centric Isolation with TLA+
Clear consistency guarantees on data are paramount for the design and implementation of distributed systems. When implementing distributed applications, developers require approaches to verify the data consistency guarantees of an implementation choice. Crooks et al. define a state-based and client-centric model of database isolation. This paper formalizes this state-based model in TLA+, reproduces their examples and shows how to model check runtime traces and algorithms with this formalization. The formalized model in TLA+ enables semi-automatic model checking for different implementation alternatives for transactional operations and allows checking of conformance to isolation levels. We reproduce examples of the original paper and confirm the isolation guarantees of the combination of the well-known 2-phase locking and 2-phase commit algorithms. Using model checking, this formalization can also help find bugs in incorrect specifications. This improves the feasibility of automated checking of isolation guarantees in synthesized synchronization implementations and provides an environment for experimenting with new designs.
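The TLA+ specification itself is not reproduced here; as a rough illustration of the state-based view (each committed transaction advances the database from one state to the next, and a transaction's reads must be consistent with the state it executed against), below is a brute-force serializability check in Python. The trace encoding is an assumption for illustration, not the paper's formalization.

```python
from itertools import permutations

# Illustrative brute-force check in the spirit of Crooks et al.'s
# state-based model: an execution is serializable if some ordering of the
# committed transactions yields a state sequence in which every
# transaction's reads agree with the state it executed against.
def serializable(initial_state, transactions):
    """transactions: list of (reads, writes) dicts over keys."""
    for order in permutations(transactions):
        state, ok = dict(initial_state), True
        for reads, writes in order:
            if any(state.get(k) != v for k, v in reads.items()):
                ok = False
                break
            state.update(writes)
        if ok:
            return True
    return False

# T1 reads x=0 then writes x=1; T2 reads x=1 then writes y=1.
t1 = ({"x": 0}, {"x": 1})
t2 = ({"x": 1}, {"y": 1})
print(serializable({"x": 0}, [t2, t1]))  # True: the order T1, T2 works
```

A model checker like TLC explores such state sequences exhaustively and symbolically; this sketch only enumerates orderings of a tiny concrete trace.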
Using natural language processing for question answering in closed and open domains
With regard to the growth in the amount of social, environmental, and biomedical information available digitally, there is a growing need for Question Answering (QA) systems that can empower users to master this new wealth of information. Despite recent progress in QA, the quality of interpretation and extraction of the desired answer is not adequate. We believe that striving for higher accuracy in QA systems remains an open research problem; that is, it is better to have no answer than a wrong one. However, there are diverse queries which state-of-the-art QA systems cannot interpret and answer properly.
The problem of interpreting a question in a way that preserves its syntactic-semantic structure is considered one of the most important challenges in this area. In this work we focus on the problems of semantic-based QA systems, analyzing the effectiveness of NLP techniques, query mapping, and answer inferencing in both closed (first scenario) and open (second scenario) domains. For this purpose, the architecture of a Semantic-based closed and open domain Question Answering System (hereafter “ScoQAS”) over ontology resources is presented with two different prototypes: an ontology-based closed domain and an open domain over Linked Open Data (LOD) resources.
The ScoQAS is based on NLP techniques combining semantic-based structure-feature patterns for question classification and creating a question syntactic-semantic information structure (QSiS). The QSiS builds constraints to formulate the related terms on syntactic-semantic aspects and generates a question graph (QGraph), which facilitates inference towards a precise answer in the closed domain. In addition, our approach provides a convenient method to map the formulated comprehensive information into a SPARQL query template to query the LOD resources in the open domain.
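The SPARQL-template mapping step can be illustrated with a toy sketch; the template, property name and QSiS encoding below are assumptions for illustration, not ScoQAS's actual output.

```python
# A hedged sketch of the open-domain step: mapping formulated question
# information into a SPARQL query template to be run against Linked Open
# Data. The template and properties are illustrative only.
TEMPLATE = """SELECT ?answer WHERE {{
  ?entity rdfs:label "{entity}"@en .
  ?entity {property} ?answer .
}} LIMIT {limit}"""

def to_sparql(qsis, limit=10):
    """qsis: a toy stand-in for the question syntactic-semantic structure."""
    return TEMPLATE.format(entity=qsis["focus"],
                           property=qsis["relation"],
                           limit=limit)

query = to_sparql({"focus": "Barcelona", "relation": "dbo:populationTotal"})
print(query)
```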
The main contributions of this dissertation are as follows:
1. Developing ScoQAS architecture integrated with common and specific components compatible with closed and open domain ontologies.
2. Analysing user’s question and building a question syntactic-semantic information structure (QSiS), which is constituted by several processes of the methodology: question classification, Expected Answer Type (EAT) determination, and generated constraints.
3. Presenting an empirical semantic-based structure-feature pattern for question classification and generalizing heuristic constraints to formulate the relations between the features in the recognized pattern in syntactic and semantic terms.
4. Developing a syntactic-semantic QGraph for representing core components of the question.
5. Presenting an empirical graph-based answer inference in the closed domain.
In a nutshell, a semantic-based QA system is presented, with experimental results over the closed and open domains. The efficiency of the ScoQAS is evaluated using measures such as precision, recall, and F-measure on LOD challenges in the open domain. We focus on quantitative evaluation in the closed domain scenario. Due to the lack of predefined benchmarks in the first scenario, we define measures that demonstrate the actual complexity of the problem and the actual efficiency of the solutions. The results of the analysis corroborate the performance and effectiveness of our approach in achieving reasonable accuracy.
Enabling Resilience in Cyber-Physical-Human Water Infrastructures
Rapid urbanization and growth in urban populations have forced community-scale infrastructures (e.g., water, power and natural gas distribution systems, and transportation networks) to operate at their limits. Aging (and failing) infrastructures around the world are becoming increasingly vulnerable to operational degradation, extreme weather, natural disasters and cyber attacks/failures. These trends have wide-ranging socioeconomic consequences and raise public safety concerns. In this thesis, we introduce the notion of cyber-physical-human infrastructures (CPHIs): smart community-scale infrastructures that bridge technologies with physical infrastructures and people. CPHIs are highly dynamic stochastic systems characterized by complex physical models that exhibit region-wide variability and uncertainty under disruptions. Failures in these distributed settings tend to be difficult to predict and estimate, and expensive to repair. Real-time fault identification is crucial to ensure continuity of lifeline services to customers at adequate levels of quality. Emerging smart community technologies have the potential to transform our failing infrastructures into robust and resilient future CPHIs. In this thesis, we explore one such CPHI: community water infrastructures. Current urban water infrastructures, which are decades (sometimes over 100 years) old, encompass diverse geophysical regimes. Water stress concerns include the scarcity of supply and an increase in demand due to urbanization. Deterioration and damage to the infrastructure can disrupt water service; contamination events can result in economic and public health consequences. Unfortunately, little investment has gone into modernizing this key lifeline. To enhance the resilience of water systems, we propose an integrated middleware framework for quick and accurate identification of failures in complex water networks that exhibit uncertain behavior.
Our proposed approach integrates IoT-based sensing, domain-specific models and simulations with machine learning methods to identify failures (pipe breaks, contamination events). The composition of techniques results in cost-accuracy-latency tradeoffs in fault identification, inherent in CPHIs due to the constraints imposed by cyber components, physical mechanics and human operators. Three key resilience problems are addressed in this thesis: isolation of multiple faults under a small number of failures, state estimation of water systems under extreme events such as earthquakes, and contaminant source identification in water networks using human-in-the-loop sensing. By working with real-world water agencies (WSSC, DC and LADWP, LA), we first develop an understanding of the operations of water CPHI systems. We design and implement a sensor-simulation-data integration framework, AquaSCALE, and apply it to localize multiple concurrent pipe failures. We use a mixture of infrastructure measurements (i.e., historical and live water pressure/flow), environmental data (i.e., weather) and human inputs (i.e., Twitter feeds), combined and enhanced with the domain model and supervised learning techniques, to locate multiple failures at fine levels of granularity (individual pipeline level) with detection time reduced by orders of magnitude (from hours/days to minutes). We next consider the resilience of water infrastructures under extreme events (i.e., earthquakes); the challenge here is the lack of a priori knowledge and the increased number and severity of damages to infrastructures. We present a graphical-model-based approach for efficient online state estimation, where offline graph factorization partitions a given network into disjoint subgraphs, and belief-propagation-based inference is executed on the fly in a distributed manner on those subgraphs.
Our proposed approach can isolate 80% of broken pipes and 99% of loss-of-service to end-users during an earthquake. Finally, we address issues of water quality; today this is a human-in-the-loop process where operators need to gather water samples for lab tests. We incorporate the necessary abstractions with event processing methods into a workflow, which iteratively selects and refines the set of potential failure points via human-driven grab sampling. Our approach utilizes Hidden Markov Model based representations for event inference, along with reinforcement learning methods for further refining event locations and reducing the cost of human effort. The proposed techniques are integrated into a middleware architecture, which enables components to communicate and collaborate with one another. We validate our approaches through a prototype implementation with multiple real-world water networks, supply-demand patterns from water utilities and policies set by the U.S. EPA. While our focus here is on water infrastructures in a community, the developed end-to-end solution is applicable to other infrastructures and community services which operate in disruptive and resource-constrained environments
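The Hidden Markov Model inference mentioned above can be sketched with the standard forward algorithm; the states, probabilities and observation labels below are made-up illustrations, not values from the thesis.

```python
# Illustrative sketch: the forward algorithm over hypothetical
# contamination-source states, updated as grab-sample observations
# arrive. All numbers are invented.
def forward(states, init, trans, emit, observations):
    """Return P(state | observations so far) after the last observation."""
    belief = dict(init)
    for obs in observations:
        # predict: propagate belief through the transition model
        predicted = {s: sum(belief[p] * trans[p][s] for p in states)
                     for s in states}
        # update: weight by the likelihood of the observation, renormalize
        belief = {s: predicted[s] * emit[s][obs] for s in states}
        total = sum(belief.values())
        belief = {s: v / total for s, v in belief.items()}
    return belief

states = ["junction_A", "junction_B"]
init = {"junction_A": 0.5, "junction_B": 0.5}
trans = {s: {t: (0.9 if s == t else 0.1) for t in states} for s in states}
emit = {"junction_A": {"clean": 0.2, "contaminated": 0.8},
        "junction_B": {"clean": 0.7, "contaminated": 0.3}}

belief = forward(states, init, trans, emit, ["contaminated", "contaminated"])
# belief should now favour junction_A as the likely source
```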
On the proof complexity of deep inference
We obtain two results about the proof complexity of deep inference: (1) Deep-inference proof systems are as powerful as Frege ones, even when both are extended with the Tseitin extension rule or with the substitution rule; (2) there are analytic deep-inference proof systems that exhibit an exponential speedup over analytic Gentzen proof systems that they polynomially simulate
Behavioural analysis of sessions using the calculus of structures
This paper describes an approach to the behavioural analysis of sessions. The approach is made possible by the calculus of structures — a deep inference proof calculus, generalising the sequent calculus, where inference rules are applied in any context. The approach involves specifications of global and local sessions inspired by the Scribble language. The calculus features a novel operator that synchronises parts of a protocol that must be treated atomically. Firstly, the calculus can be used to determine whether local sessions can be composed in a type-safe fashion such that sessions are capable of successfully completing. Secondly, the calculus defines a subtyping relation for sessions that allows causal dependencies to be weakened while retaining termination potential. Consistency and complexity results follow from proof theory