9 research outputs found

    Weaving Rules into Models@run.time for Embedded Smart Systems

    Smart systems are characterised by their ability to analyse measured data live and to react to changes according to expert rules. Such systems therefore exploit appropriate data models together with actions triggered by domain-related conditions. The challenge is that smart systems usually need to process thousands of updates to detect which rules must be triggered, often on restricted hardware such as a Raspberry Pi. Although various approaches have been investigated to efficiently check conditions on data models, they either assume that the model fits into main memory or rely on high-latency persistent storage systems that severely degrade the reactivity of smart systems. To tackle this challenge, we propose a novel composition process that weaves executable rules into a data model with lazy loading abilities. We quantitatively show, on a smart building case study, that our approach can handle, at low latency, large sets of rules on top of large-scale data models on restricted hardware. Comment: pre-print version, published in the proceedings of the MOMO-17 Workshop.
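    The abstract does not include the authors' API; as a rough illustration of the general idea (attaching condition/action rules to model elements that are loaded on demand), consider the following minimal C sketch, in which all names such as Node, Rule, and apply_update are hypothetical rather than taken from the paper.

```c
#include <stdio.h>
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical model node: attributes are resolved lazily on first access. */
typedef struct Node {
    int    id;
    bool   loaded;        /* has the node been fetched from storage yet?  */
    double temperature;   /* example attribute of a smart-building model  */
} Node;

/* A rule woven into the model: a condition over a node plus an action. */
typedef struct Rule {
    bool (*condition)(const Node *);
    void (*action)(Node *);
} Rule;

/* Simulated lazy load: fetch attributes only when they are first needed. */
static void ensure_loaded(Node *n) {
    if (!n->loaded) {
        n->temperature = 20.0 + n->id;  /* stand-in for a storage read */
        n->loaded = true;
    }
}

static bool too_hot(const Node *n) { return n->temperature > 24.0; }
static void open_window(Node *n)   { printf("node %d: open window\n", n->id); }

/* On each update, only the touched node is loaded and its rules checked. */
static void apply_update(Node *n, double new_temp, const Rule *rules, size_t count) {
    ensure_loaded(n);
    n->temperature = new_temp;
    for (size_t i = 0; i < count; i++)
        if (rules[i].condition(n))
            rules[i].action(n);
}

int main(void) {
    Node rooms[3] = { {1, false, 0}, {2, false, 0}, {3, false, 0} };
    Rule rules[]  = { { too_hot, open_window } };
    apply_update(&rooms[1], 26.5, rules, 1);   /* triggers the action */
    apply_update(&rooms[2], 21.0, rules, 1);   /* no rule fires       */
    return 0;
}
```

    Because only the touched node is resolved, the working set stays small even when the full model does not fit into main memory, which is the property the abstract targets on devices like the Raspberry Pi.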

    Reducing Run-Time Adaptation Space via Analysis of Possible Utility Bounds

    Self-adaptive systems often employ dynamic programming or similar techniques to select optimal adaptations at run-time. These techniques suffer from the “curse of dimensionality”, which increases the cost of run-time adaptation decisions. We propose a novel approach that improves upon state-of-the-art proactive self-adaptation techniques by reducing the number of possible adaptations that need to be considered for each run-time adaptation decision. The approach, realized in a tool called Thallium, employs a combination of automated formal modeling techniques to (i) analyze a structural model of the system showing which configurations are reachable from other configurations and (ii) compute the utility that can be generated by the optimal adaptation over a bounded horizon in both the best- and worst-case scenarios. It then constructs triangular possibility values from those optimized bounds to automatically compare adjacent adaptations for each configuration, keeping only the alternatives with the best range of potential results. The experimental results corroborate Thallium’s ability to significantly reduce the number of states that need to be considered for each adaptation decision, freeing up vital resources at run-time.
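    Thallium's construction of triangular possibility values is not detailed in the abstract; the hypothetical C sketch below only illustrates the underlying pruning idea, namely discarding any adaptation whose best-case bounded-horizon utility cannot exceed the worst case already guaranteed by some alternative.

```c
#include <stdio.h>
#include <stdbool.h>
#include <stddef.h>

/* Bounded-horizon utility interval for one candidate adaptation. */
typedef struct {
    const char *name;
    double worst;   /* utility guaranteed in the worst-case scenario */
    double best;    /* utility achievable in the best-case scenario  */
} Adaptation;

/* Keep a candidate only if no alternative's worst case already exceeds
 * its best case; such a candidate can never be the optimal choice.     */
static bool worth_keeping(const Adaptation *a, const Adaptation *all, size_t n) {
    for (size_t i = 0; i < n; i++)
        if (&all[i] != a && all[i].worst > a->best)
            return false;
    return true;
}

int main(void) {
    Adaptation candidates[] = {
        { "add-server",      5.0, 9.0 },
        { "reduce-fidelity", 1.0, 4.0 },   /* best case 4.0 < 5.0: pruned */
        { "do-nothing",      4.5, 7.0 },
    };
    size_t n = sizeof candidates / sizeof candidates[0];
    for (size_t i = 0; i < n; i++)
        if (worth_keeping(&candidates[i], candidates, n))
            printf("keep %s\n", candidates[i].name);
    return 0;
}
```

    Pruned candidates never need to be expanded during a run-time decision, which is where the reported reduction of the adaptation space comes from.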

    An Analysis of x86-64 Inline Assembly in C Programs

    C codebases frequently embed nonportable and unstandardized elements such as inline assembly code. Such elements are not well understood, which poses a problem for tool developers who aspire to support C code. This paper investigates the use of x86-64 inline assembly in 1264 C projects from GitHub and combines qualitative and quantitative analyses to answer questions that tool authors may have. We found that 28.1% of the most popular projects contain inline assembly code, although the majority contain only a few fragments with just one or two instructions. The most popular instructions constitute a small subset concerned largely with multicore semantics, performance optimization, and hardware control. Our findings are intended to help developers of C-focused tools, those testing compilers, and language designers seeking to reduce the reliance on inline assembly. They may also aid the design of tools focused on inline assembly itself.
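    As a reminder of what such short fragments typically look like, here are two common GCC-style x86-64 examples in C, one concerned with multicore memory ordering and one with performance measurement; they are illustrative of the categories reported in the study, not excerpts from the analyzed projects.

```c
#include <stdint.h>
#include <stdio.h>

/* A full memory barrier: a typical one-instruction fragment used for
 * multicore memory-ordering semantics.                                */
static inline void memory_fence(void) {
    __asm__ __volatile__("mfence" ::: "memory");
}

/* Reading the time-stamp counter: a typical two-instruction fragment
 * used for low-overhead performance measurement.                      */
static inline uint64_t read_tsc(void) {
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void) {
    uint64_t start = read_tsc();
    memory_fence();
    uint64_t end = read_tsc();
    printf("elapsed cycles: %llu\n", (unsigned long long)(end - start));
    return 0;
}
```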

    Derivation and consistency checking of models in early software product line engineering

    Dissertation submitted for the degree of Doctor in Informatics Engineering. Software Product Line Engineering (SPLE) should offer the ability to express the derivation of product-specific assets while checking their consistency. The derivation of product-specific assets is possible using general-purpose programming languages in combination with techniques such as conditional compilation and code generation. Consistency checking, in turn, can be achieved through consistency rules in the form of architectural and design guidelines, programming conventions, and well-formedness rules. Current approaches present four shortcomings: (1) they focus on code derivation only, (2) they ignore consistency problems between the variability model and other complementary specification models used in early SPLE, (3) they force developers to learn new, difficult-to-master languages to encode the derivation of assets, and (4) they offer no tool support. This dissertation presents solutions that contribute to tackling these four shortcomings. These solutions are integrated in the approach Derivation and Consistency Checking of models in early SPLE (DCC4SPL) and its corresponding tool support. The two main components of our approach are the Variability Modelling Language for Requirements (VML4RE), a domain-specific language and derivation infrastructure, and the Variability Consistency Checker (VCC), a verification technique and tool. We validate DCC4SPL by demonstrating that it is appropriate for finding inconsistencies in early SPL model-based specifications and for specifying the derivation of product-specific models. European Project AMPLE, contract IST-33710; Fundação para a Ciência e Tecnologia - SFRH/BD/46194/2008.
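    The conditional compilation technique mentioned above, the code-level baseline that the dissertation's model-level approach goes beyond, can be illustrated with a small C example; the feature names are hypothetical.

```c
#include <stdio.h>

/* Product configuration: in a real product line these flags would be set
 * by the build system, e.g. -DFEATURE_ENCRYPTION, per selected variant.  */
#define FEATURE_ENCRYPTION 1
/* #define FEATURE_COMPRESSION 1 */

static void send_message(const char *payload) {
#ifdef FEATURE_COMPRESSION
    printf("compressing payload\n");      /* only in variants with compression */
#endif
#ifdef FEATURE_ENCRYPTION
    printf("encrypting payload\n");       /* only in variants with encryption  */
#endif
    printf("sending: %s\n", payload);
}

int main(void) {
    send_message("hello");
    return 0;
}
```

    VML4RE and VCC, by contrast, target the derivation and consistency checking of models rather than code, addressing shortcoming (1) above.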

    A Method and Tool for Finding Concurrency Bugs Involving Multiple Variables with Application to Modern Distributed Systems

    Concurrency bugs are extremely hard to detect due to the huge interleaving space. They occur in the real world more and more often because of the prevalence of multi-threaded programs taking advantage of multi-core hardware and of microservice-based distributed systems moving more and more applications to the cloud. As the most common non-deadlock concurrency bugs, atomicity violations have been studied in many recent works; however, those methods are applicable only to single-variable atomicity violations and do not consider the specific challenge of distributed systems that use both pessimistic and optimistic concurrency control. This dissertation presents a tool that uses model checking to predict atomicity-violation concurrency bugs involving two shared variables or shared resources. We developed a unique method for inferring the correlation between shared variables in multi-threaded programs and shared resources in microservice-based distributed systems; the method is based on dynamic analysis and is able to detect correlations that would be missed by static analysis. For multi-threaded programs, we use a binary instrumentation tool to capture runtime information about shared variables and synchronization events, and for microservice-based distributed systems, we use a web proxy to capture HTTP-based traffic about API calls and the shared resources they access, including distributed locks. Based on the detected correlation and the runtime trace, the tool can explore a vast interleaving space of a multi-threaded program or a microservice-based distributed system given a small set of captured test runs. It is applicable to large real-world systems, can predict atomicity violations missed by related work on multi-threaded programs, and found a couple of previously unknown atomicity violations in real-world open-source microservice-based systems. A limitation is that redundant model checking may be performed if two recorded interleaved traces yield the same partial order model.
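    A classic shape of the bug class targeted here, a two-variable atomicity violation, is sketched below in C with POSIX threads; the example is generic textbook material, not taken from the dissertation.

```c
#include <pthread.h>
#include <stdio.h>

/* Two correlated shared variables: 'len' must always equal the number of
 * valid elements stored in 'buf'.                                          */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static int buf[16];
static int len = 0;

/* BUG: each update is individually protected, but the pair is not atomic.
 * A reader scheduled between the two critical sections sees 'len' already
 * advanced while buf[len - 1] still holds stale data.                      */
static void *writer(void *arg) {
    (void)arg;
    pthread_mutex_lock(&lock);
    len = len + 1;               /* update shared variable 1 */
    pthread_mutex_unlock(&lock);

    pthread_mutex_lock(&lock);
    buf[len - 1] = 42;           /* update shared variable 2 */
    pthread_mutex_unlock(&lock);
    return NULL;
}

static void *reader(void *arg) {
    (void)arg;
    pthread_mutex_lock(&lock);
    if (len > 0)
        printf("last element: %d\n", buf[len - 1]);  /* may print 0, not 42 */
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, writer, NULL);
    pthread_create(&t2, NULL, reader, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return 0;
}
```

    Each individual access is protected by a lock, so single-variable detectors see no problem; the violation only appears once the correlation between the two variables is taken into account, which is the correlation inference the tool automates.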

    Psychometrics in Behavioral Software Engineering: A Methodological Introduction with Guidelines

    A meaningful and deep understanding of the human aspects of software engineering (SE) requires psychological constructs to be considered. Psychology theory can facilitate the systematic and sound development as well as the adoption of instruments (e.g., psychological tests, questionnaires) to assess these constructs. In particular, to ensure high quality, the psychometric properties of instruments need evaluation. In this paper, we provide an introduction to psychometric theory for the evaluation of measurement instruments for SE researchers. We present guidelines that enable using existing instruments and developing new ones adequately. We conducted a comprehensive review of the psychology literature framed by the Standards for Educational and Psychological Testing. We detail activities used when operationalizing new psychological constructs, such as item pooling, item review, pilot testing, item analysis, factor analysis, statistical properties of items, reliability, validity, and fairness in testing and test bias. We provide an openly available example of a psychometric evaluation based on our guidelines. We hope to encourage a culture change in SE research towards the adoption of established methods from psychology. To improve the quality of behavioral research in SE, studies focusing on introducing, validating, and then using psychometric instruments need to become more common. Comment: 56 pages (pp. 1-36 for the main paper, pp. 37-56 working example in the appendix), 8 figures in the main paper. Accepted for publication at ACM TOSEM.
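    As one concrete instance of the reliability evaluation listed among these activities, internal consistency is commonly estimated with Cronbach's alpha; the formula below is standard psychometric background rather than a result of the paper:

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
\]

    where \(k\) is the number of items in the instrument, \(\sigma^{2}_{Y_i}\) is the variance of item \(i\), and \(\sigma^{2}_{X}\) is the variance of the total score.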

    The CHORCH Approach: How to Model B2Bi Choreographies for Orchestration Execution

    The establishment and implementation of cross-organizational business processes is an implication of today's market pressure for efficiency gains. In this context, Business-To-Business integration (B2Bi) focuses on the information integration aspects of business processes. A core task of B2Bi is providing adequate models that capture the message exchanges between integration partners. Following the terminology used in the SOA domain, such models will be called choreographies in the context of this work. Despite the enormous economic importance of B2Bi, existing choreography languages fall short of fulfilling all relevant requirements of B2Bi scenarios. Dedicated B2Bi choreography standards allow for inconsistent outcomes of basic interactions and do not provide unambiguous semantics for advanced interaction models. In contrast to this, more formal or technical choreography languages may provide unambiguous modeling semantics, but do not offer B2Bi domain concepts or an adequate level of abstraction. Defining valid and complete B2Bi choreography models becomes a challenging task in the face of these shortcomings. At the same time, invalid or underspecified choreography definitions are particularly costly considering the organizational setting of B2Bi scenarios. Models are not only needed to bridge the typical gap between business and IT, but also as a means of negotiation among the business users of the integration partners on the one hand and among their IT experts on the other. Misunderstandings between any two negotiation partners potentially affect the agreements between all other negotiation partners. The CHORCH approach offers tailored support for B2Bi by combining the strengths of dedicated B2Bi standards with formal rigor. As the choreography specification format, the ebXML Business Process Specification Schema (ebBP) standard is used. ebBP provides dedicated B2Bi domain concepts such as so-called BusinessTransactions (BTs) that abstractly specify the exchange of a request business document and an optional response business document. In addition, ebBP provides a format for specifying the sequence of BT executions for capturing complex interaction scenarios. CHORCH improves the offering of ebBP in several ways. Firstly, the execution model of BTs, which allows for inconsistent outcomes among the integration partners, is redefined such that only consistent outcomes are possible. Secondly, two binary choreography styles are defined as a B2Bi implementation contract format in order to streamline implementation projects. Both choreography styles are formalized and provided with a formal execution semantics to ensure unambiguity. In addition, validity criteria are defined that ensure implementability using BPEL-based orchestrations. Thirdly, the analysis of the synchronization dependencies of complex B2Bi scenarios is supported by means of a multi-party choreography style combined with an analysis framework. This choreography style is also formalized, and standard state machine semantics are reused to ensure unambiguity. Moreover, validity criteria are defined that allow for analyzing corresponding models for typical multi-party choreography issues. Altogether, CHORCH provides choreography styles that are B2Bi-adequate, simple, unambiguous, and implementable. The choreography styles are B2Bi-adequate in providing B2Bi domain concepts, in abstracting from low-level implementation details, and in covering the majority of real-world B2Bi scenarios. 
Simplicity is fostered by using state machines as the underlying specification paradigm. This allows for thinking in the states of a B2Bi scenario and for simple control flow structures. Unambiguity is provided by formal execution semantics, whereas implementability (for the binary choreography styles) is ensured by providing mapping rules to BPEL-based implementations. The validation of CHORCH’s choreography styles is performed in two ways. Firstly, the implementation of the binary choreography styles based on Web Services and BPEL technology is demonstrated, which proves implementability using relatively low-cost technologies. Moreover, the analysis algorithms for the multi-party choreography style are validated using a Java-based prototype. Secondly, an abstract visualization of the choreography styles based on BPMN is provided that abstracts from the technicalities of the ebBP standard. This proves the amenability of CHORCH to development methods that start out with visual models. CHORCH defines how to use BPMN choreographies for the purpose of B2Bi choreography modeling and translates the formal rules for choreography validity into simple composition rules that demonstrate valid ways of connecting the respective modeling constructs. In summary, CHORCH allows integration partners to start out with a high-level visual model of their interactions in BPMN that identifies the types and sequences of the BusinessTransactions to be used. For multi-party choreographies, a framework for analyzing synchronization dependencies is then available. For binary choreographies, an ebBP refinement can be derived that fills in the technical parameters needed to derive the implementation. Finally, Web Services- and BPEL-based implementations can be generated. Thus, CHORCH allows for stepwise closing of the semantic gap between the information perspective of business process models and the corresponding implementations. It is noteworthy that CHORCH uses international standards throughout all relevant layers, i.e., BPMN, ebBP, Web Services and BPEL, which helps in bridging the heterogeneous IT landscapes of B2Bi partners. In addition, the adoption of core CHORCH deliverables as international standards of the RosettaNet community gives testament to the practical relevance of the approach and promises dissemination throughout the B2Bi community.
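    The ebBP BusinessTransaction described above (a request document with an optional response) lends itself to the state-based thinking the abstract emphasizes; the following C sketch shows one plausible state machine for the requesting partner, with state and event names invented for illustration rather than taken from the CHORCH formalization.

```c
#include <stdio.h>

/* States of a single BusinessTransaction (request plus optional response),
 * viewed from the requesting partner.                                      */
typedef enum {
    BT_START,
    BT_REQUEST_SENT,
    BT_SUCCESS,
    BT_FAILURE
} BtState;

/* Events observed by the requester. */
typedef enum {
    EV_SEND_REQUEST,
    EV_RECEIVE_ACCEPT,
    EV_RECEIVE_REJECT,
    EV_TIMEOUT
} BtEvent;

/* Transition function: if both partners apply the same rules, they agree
 * on a single, consistent outcome (success or failure) of the transaction. */
static BtState bt_step(BtState s, BtEvent e) {
    switch (s) {
    case BT_START:
        if (e == EV_SEND_REQUEST) return BT_REQUEST_SENT;
        break;
    case BT_REQUEST_SENT:
        if (e == EV_RECEIVE_ACCEPT) return BT_SUCCESS;
        if (e == EV_RECEIVE_REJECT || e == EV_TIMEOUT) return BT_FAILURE;
        break;
    default:
        break;
    }
    return s;  /* ignore events that are not allowed in the current state */
}

int main(void) {
    BtState s = BT_START;
    s = bt_step(s, EV_SEND_REQUEST);
    s = bt_step(s, EV_RECEIVE_ACCEPT);
    printf("transaction ended in state %d (2 = success)\n", (int)s);
    return 0;
}
```

    CHORCH's redefined BT execution model additionally guarantees that the integration partners always agree on the final state, i.e., that only consistent transaction outcomes are possible.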

    Organizational support for goal-oriented and context-sensitive measurement and evaluation

    The success of a software product or service largely depends on satisfying a certain level of quality, observed on the product or service itself as well as on the resources and processes involved in producing and operating it. Knowing, quantitatively and objectively, the quality level of a product, service, process, or resource is an essential input for decision making on the path towards improvement. However, achieving this is not trivial, since quality is an abstract and relative concept that must be considered from different points of view, involving different aspects and particular application contexts. To this end, it is crucial for an organization to have Quality Assurance programs and processes that define the tasks needed to detect and correct quality problems, as well as Measurement and Evaluation processes that provide quantitative and objective information about the actual quality levels of each relevant entity involved. However, setting up, executing, and maintaining a robust and consistent measurement and evaluation program is not a simple task. When measurement and evaluation programs are implemented in software organizations, certain key technical aspects must be addressed. First, the activities to be performed, as well as the resources and artifacts used and produced during their execution, must be clearly established through the specification of a process. These activities must be organized so that they can be coordinated with support activities related to engineering activities. Second, a conceptual framework must be explicitly established that enables a common understanding of the terms and relationships used in these activities across the organization's projects, as well as the consistent exchange and reuse of instances of those concepts, so that results are repeatable, comparable, and consistent. Third, specific and appropriate methods, techniques, and tools must be used to effectively carry out the defined activities. In this regard, there is a considerable number of proposals that define approaches, processes, models, methods, and tools for carrying out measurement and evaluation activities. However, most of these proposals do not provide an integrated approach covering all of these elements. In addition, many of them lack a conceptual base that defines, in a clear and structured way, the concepts and relationships involved in such activities. Where a defined terminology does exist, there is no general consensus (even among proposals from the same source) on the terms involved that would allow different proposals to be integrated, further complicating the implementation of measurement and evaluation programs. This thesis presents the results of research aimed at addressing this gap by proposing an ontology-based conceptual framework that defines, in a clear and structured way, the information elements (terms and relationships) used to specify the design and implementation of measurement and evaluation activities. The proposed framework was built from the terms defined in the relevant literature, following a goal-oriented, organization-focused approach. In addition, the framework follows a context-sensitive approach to provide more coherent and consistent decision-making support across the organization's projects. The resulting framework constitutes a support platform for defining processes and methods for executing measurement and evaluation projects in the organization. The proposal also includes a supporting architecture that allows the framework to be integrated into the organization's application domain. Finally, these elements have been implemented in a web tool that makes the benefits of the proposal visible. Facultad de Informática.
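    The thesis's ontological framework itself is not reproduced in the abstract; the following minimal C sketch only illustrates the kind of concepts that such goal-oriented, context-sensitive measurement and evaluation frameworks typically relate (an attribute of an entity, a measure for it, and a context-dependent indicator that interprets the measure); all names and values are hypothetical.

```c
#include <stdio.h>
#include <stdbool.h>

/* Minimal stand-ins for typical measurement-and-evaluation concepts. */
typedef struct {
    const char *entity;       /* e.g. a service under evaluation        */
    const char *attribute;    /* e.g. "average response time"           */
} Attribute;

typedef struct {
    Attribute attr;
    const char *unit;         /* e.g. "ms"                              */
    double value;             /* the measure obtained for the attribute */
} Measure;

typedef struct {
    const char *context;      /* e.g. "mobile clients" vs "desktop"     */
    double acceptable_max;    /* threshold agreed for that context      */
} Indicator;

/* Interpretation step: the same measure may be acceptable in one context
 * and unacceptable in another.                                           */
static bool satisfies(const Measure *m, const Indicator *ind) {
    return m->value <= ind->acceptable_max;
}

int main(void) {
    Measure m = { { "checkout service", "average response time" }, "ms", 180.0 };
    Indicator desktop = { "desktop clients", 150.0 };
    Indicator mobile  = { "mobile clients",  250.0 };
    printf("%s in %s: %s\n", m.attr.attribute, desktop.context,
           satisfies(&m, &desktop) ? "acceptable" : "needs improvement");
    printf("%s in %s: %s\n", m.attr.attribute, mobile.context,
           satisfies(&m, &mobile) ? "acceptable" : "needs improvement");
    return 0;
}
```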