    GRACE as a unifying approach to graph-transformation-based specification*

    *This work was partially supported by the ESPRIT Working Group Applications of Graph Transformation (APPLIGRAPH) and the EC TMR Network GETGRATS (General Theory of Graph Transformation Systems).

    In this paper, we sketch some basic ideas and features of the graph-transformation-based specification language GRACE. The aim of GRACE is to support the modeling of a wide spectrum of graph and graphical processes in a structured and uniform way, including visualization and verification.

    An Abstract Module Concept for Graph Transformation Systems

    Graph transformation systems are a well-known formal specification technique that supports the rule-based specification of the dynamic behaviour of systems. Recently, many specification languages for graph transformation systems have been developed, and modularization techniques are needed in order to deal with large and complex graph transformation specifications, to enhance the reuse of specifications, and to hide implementation details. In this paper we present an abstract categorical approach to modularization of graph transformation systems. Modules are called cat-modules and are defined over a generic category cat of graph transformation specifications and morphisms. We describe the main characteristics and properties of cat-modules, their interconnection operations, namely union, composition and refinement of modules, and some compatibility properties between such operations.
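
    As a concrete, if drastic, simplification of the module operations described above, the following Python sketch models a module as an export interface plus a hidden body of rules and implements union as plain set union. Every name here is an invented illustration: actual cat-modules are defined categorically, and their union is characterized by universal properties rather than by set operations.

        # Minimal, set-based stand-in for module union over graph
        # transformation rules. Hypothetical illustration only; the paper's
        # cat-modules are categorical constructions, not Python sets.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Rule:
            name: str
            lhs: frozenset  # labels the rule matches
            rhs: frozenset  # labels after rewriting

        @dataclass
        class Module:
            export: frozenset = frozenset()  # rules visible to clients
            body: frozenset = frozenset()    # hidden implementation rules

            def union(self, other: "Module") -> "Module":
                # Componentwise union; the categorical version is a pushout
                # over a shared sub-specification.
                return Module(self.export | other.export, self.body | other.body)

        grow = Rule("grow", frozenset({"A"}), frozenset({"A", "B"}))
        shrink = Rule("shrink", frozenset({"A", "B"}), frozenset({"A"}))
        m = Module(export=frozenset({grow})).union(Module(export=frozenset({shrink})))
        print(len(m.export))  # -> 2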

    Safe code transformations for speculative execution in real-time systems

    Although compiler optimization techniques are standard and successful in non-real-time systems, if naively applied, they can destroy safety guarantees and deadlines in hard real-time systems. For this reason, real-time systems developers have tended to avoid automatic compiler optimization of their code. However, real-time applications in several areas have been growing substantially in size and complexity in recent years. This size and complexity makes it impossible for real-time programmers to write optimal code, and consequently indicates a need for compiler optimization. Recently, researchers have developed or modified analyses and transformations to improve performance without degrading worst-case execution times. Moreover, these optimization techniques can sometimes transform programs which may not meet constraints/deadlines, or which result in timeouts, into deadline-satisfying programs. One such technique, speculative execution, also used for example in parallel computing and databases, can enhance performance by executing parts of the code whose execution may or may not be needed. In some cases, rollback is necessary if the computation turns out to be invalid. However, speculative execution must be applied carefully to real-time systems so that the worst-case execution path is not extended. Deterministic worst-case execution for satisfying hard real-time constraints, and speculative execution with rollback for improving average-case throughput, appear to lie on opposite ends of a spectrum of performance requirements and strategies. Nonetheless, this thesis shows that there are situations in which speculative execution can improve the performance of a hard real-time system, either by enhancing average performance while not affecting the worst case, or by actually decreasing the worst-case execution time. The thesis proposes a set of compiler transformation rules to identify opportunities for speculative execution and to transform the code. Proofs of semantic correctness and timeliness preservation are provided to verify the safety of applying the transformation rules to real-time systems. Moreover, an extensive experiment using simulation of randomly generated real-time programs has been conducted to evaluate the applicability and profitability of speculative execution. The simulation results indicate that speculative execution improves average execution time and program timeliness. Finally, a prototype implementation is described in which these transformations can be evaluated for realistic applications.
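
    The contrast drawn here between worst-case determinism and speculative throughput can be made concrete with a toy example. The sketch below is not taken from the thesis; it only conveys the control structure of speculation with rollback, and all function names are invented.

        # Toy illustration of speculative execution with rollback. In real
        # systems the speculated work would run on otherwise-idle resources,
        # and a safe transformation rule must ensure the worst-case path is
        # not lengthened; sequential Python cannot show either benefit.

        def heavy(x):
            return x * x  # stand-in for an expensive computation

        def slow_test(flag):
            return flag   # stand-in for a long-latency branch condition

        def original(flag, x):
            if slow_test(flag):
                return heavy(x)  # work starts only after the branch resolves
            return 0

        def speculative(flag, x):
            result = heavy(x)    # speculate: start the work early
            if slow_test(flag):
                return result    # speculation paid off
            return 0             # roll back: discard the unused result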

    Humanoid Robot handling Hand-Signs Recognition

    Recent advancements in human-robot interaction have led to tremendous improvements in humanoid robots, yet these robots still lack social acceptance among people. Though verbal communication is the primary means of human-robot interaction, non-verbal communication, which is proven to be an integral part of human interaction, is not widely used in humanoid robots. This thesis aims to achieve human-robot interaction via non-verbal communication, especially using hand-signs. It presents a prototype system that simulates hand-sign recognition in the NAO humanoid robot, and an online questionnaire is further used to examine people's opinions on the use of non-verbal communication to interact with a humanoid robot. The positive results derived from the study indicate people's willingness to use non-verbal communication as a means to communicate with humanoid robots, thus encouraging robot designers to use non-verbal communication for enhancing human-robot interaction.
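
    To give a rough sense of what hand-sign recognition can look like at its simplest, here is a dependency-free nearest-neighbor sketch over pose vectors. The thesis does not describe its NAO prototype at this level of detail, so every name and representation below is a hypothetical assumption.

        # Hypothetical sketch: classify a hand pose (a vector of joint
        # angles or landmark coordinates) by its nearest labeled template.
        def classify(pose, templates):
            """Return the label of the template closest to the observed pose."""
            def sq_dist(a, b):
                return sum((x - y) ** 2 for x, y in zip(a, b))
            return min(templates, key=lambda label: sq_dist(pose, templates[label]))

        templates = {"stop": [1.0, 1.0, 1.0], "point": [1.0, 0.1, 0.1]}
        print(classify([0.9, 0.2, 0.1], templates))  # -> point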

    Photocatalytic Water Splitting using a Modified Pt-TiO2: Kinetic Modeling and Hydrogen Production Efficiency

    Nowadays, the world experiences a high energy demand caused by the expansion of the industrial sector as well as by the increasing world population. There is, as a result, a steady depletion of non-renewable fossil fuels. This also leads to significant contaminant emissions such as CO2, contributing to greenhouse gases, and other noxious pollutants such as NOx and SOx. Thus, it is of high importance and interest to promote new alternative and environmentally friendly sources of energy. Heterogeneous photocatalysis, as practiced in the present PhD dissertation, is a promising alternative, producing hydrogen and simultaneously using a renewable organic scavenger (ethanol) at ambient conditions. In addition, heterogeneous photocatalysis can, in principle, be promoted by the interaction of a semiconductor material and photons in the solar light spectrum (UV-visible-IR radiation). The present PhD dissertation demonstrates that hydrogen can be produced photocatalytically using a modified Degussa P25 (TiO2)-Pt photocatalyst in a slurry medium under near-UV irradiation, with ethanol as a sacrificial reagent (scavenger). The modified DP25-Pt photocatalyst was prepared using the incipient wetness impregnation technique. The Pt-modified photocatalyst exhibited a reduced band gap of 2.73 eV. Experiments were performed in a Photo-CREC Water II Reactor (PCW-II Reactor). This novel unit provides both radial and axial symmetrical irradiation profiles. Macroscopic energy balances developed in this unit showed a 95% LVREA at 0.15 g of photocatalyst per liter of aqueous solution. Runs in the PCW-II Reactor showed hydrogen formation via H• radicals under oxygen-free conditions. The use of 2 v/v% ethanol as a sacrificial reagent enabled the production of significant hydrogen amounts, with the simultaneous formation of CH4 and C2H6 by-products. It is proven that hydrogen formation in the presence of ethanol is a function of the water solution pH and the Pt loading on the TiO2 photocatalyst. Regarding the consumption of the ethanol scavenger, experimental findings are supported by an "in series-parallel" reaction network and a kinetic model. Kinetic model parameters were estimated using numerical non-linear regression. These kinetic parameters were determined under rigorous statistical methods, adapted to give an adequate fit to the experimental data and to all the by-product species resulting from the photocatalytic hydrogen production "in series-parallel" kinetic model. Furthermore, hydrogen production, in the context of the present research, was also described using an "in parallel" reaction network; in this case, once again, kinetic parameters were established using carefully determined statistical methods. Concerning energy efficiencies, the best quantum yield obtained for hydrogen production, 7.9%, indicates a good degree of photon utilization. This is particularly true in view of the fact that hydrogen production requires two simultaneous or quasi-simultaneous photons interacting with a semiconductor site. It was also shown via the Photochemical Thermodynamic Efficiency Factors (PTEFs) that the observed PTEFs are in accordance with thermodynamics, remaining in all cases below 1. One can thus conclude, from the results of the present research, the value of a modified DP25-Pt photocatalyst operating in the Photo-CREC Water II Reactor for hydrogen production via photocatalytic water splitting.
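
    For orientation, one common definition of the hydrogen quantum yield that is consistent with the two-photon argument above is sketched below; the dissertation's exact definition may differ in its choice of numerator and photon accounting.

        \[
        \mathrm{QY}\,(\%) = \frac{2\,\dot{n}_{\mathrm{H_2}}}{\dot{n}_{\mathrm{photons,\,abs}}} \times 100
        \]

    Here \(\dot{n}_{\mathrm{H_2}}\) is the molar rate of hydrogen production and \(\dot{n}_{\mathrm{photons,\,abs}}\) is the molar rate of photon absorption; the factor of 2 reflects that two H• radicals, and hence two absorbed photons, are consumed per H2 molecule formed.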

    Correct-by-Construction Development of Dynamic Topology Control Algorithms

    Wireless devices influence our everyday lives today and will do so even more in the future. A wireless sensor network (WSN) consists of dozens to hundreds of small, cheap, battery-powered, resource-constrained sensor devices (motes) that cooperate to serve a common purpose. These networks are applied in safety- and security-critical areas (e.g., e-health, intrusion detection). The topology of such a system is an attributed graph consisting of nodes representing the devices and edges representing the communication links between devices. Topology control (TC) improves the energy consumption behavior of a WSN by blocking costly links. This allows a mote to reduce its transmission power. A TC algorithm must fulfill important consistency properties (e.g., that the resulting topology is connected). The traditional development process for TC algorithms only considers consistency properties during the initial specification phase. The actual implementation is carried out manually, which is error prone and time consuming. Thus, it is difficult to verify that the implementation fulfills the required consistency properties. The problem becomes even more severe if the development process is iterative. Additionally, many TC algorithms are batch algorithms, which process the entire topology, irrespective of the extent of the topology modifications since the last execution. Therefore, dynamic TC is desirable, which reacts to change events of the topology. In this thesis, we propose a model-driven correct-by-construction methodology for developing dynamic TC algorithms. We model local consistency properties using graph constraints and global consistency properties using second-order logic. Graph transformation rules capture the different types of topology modifications. To specify the control flow of a TC algorithm, we employ the programmed graph transformation language story-driven modeling. We presume that local consistency properties jointly imply the global consistency properties. We ensure the fulfillment of the local consistency properties by synthesizing weakest preconditions for each rule. The synthesized preconditions prohibit the application of a rule if and only if the application would lead to a violation of a consistency property. Still, this restriction is infeasible for topology modifications that need to be executed in any case. Therefore, as a major contribution of this thesis, we propose the anticipation loop synthesis algorithm, which transforms the synthesized preconditions into routines that anticipate all violations of these preconditions. This algorithm also enables the correct-by-construction runtime reconfiguration of adaptive WSNs. We provide tooling for both common evaluation steps: Cobolt allows the specified TC algorithms to be evaluated rapidly using the network simulator Simonstrator, and cMoflon generates embedded C code for hardware testbeds that build on the sensor operating system Contiki.
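
    The guarded-rule idea at the heart of this methodology can be illustrated with a small, hand-written example: block a costly link only when the connectivity constraint survives. This sketch assumes the third-party networkx library and checks the constraint by trial and undo, whereas the thesis synthesizes weakest preconditions that are checked before a rule is ever applied.

        # Toy version of applying the "block link" rule under the
        # connectivity consistency property. Not the thesis's algorithm:
        # synthesized preconditions avoid the remove-then-undo step.
        import networkx as nx

        def block_link_if_safe(topology: nx.Graph, u, v) -> bool:
            topology.remove_edge(u, v)
            if nx.is_connected(topology):
                return True                  # rule applied, constraint holds
            topology.add_edge(u, v)          # constraint violated: undo
            return False

        g = nx.Graph([("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")])
        print(block_link_if_safe(g, "a", "b"))  # True: cycle keeps connectivity
        print(block_link_if_safe(g, "c", "d"))  # False: would isolate "d"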

    Hybrid Architecture: The Integration of a Community Center on Existing Retail

    A community center serves people in close proximity. It draws residents of the immediate neighborhood and those commuting to that neighborhood for work and other purposes. It has the power to form a sense of community that many communities lack. It consolidates different wants and needs in one location. However, existing community centers, especially the ones in Hawai‘i, often lack these qualities. Simply put, most are essentially senior and childcare centers. Buildings labeled “community centers” are not designed with the programs and spatial qualities that would attract a wide range of age groups. The inconspicuous locations of most community centers do not convey their importance either. By incorporating a community center on top of an existing retail center, the resulting hybrid can create exciting changes that accommodate the programmatic and social needs of individuals. A retail development is ideal for the addition of a community center for several reasons. Retail has the ability to unite people in a way that few other places can. Everyone has shopped in one way or another. A retail center’s central and visible location can help create an identity for and magnify the significance of the integrated community center. However, the single-functionality of typical retail centers has caused many to go out of business. There is a growing desire for richer everyday life in today’s urban developments: creating enjoyable environments for buying goods and spending time. People visit retail environments wanting to shop, dine, socialize, and be entertained. Retail cannot function as a single entity. Rather, it is a subunit that supports other uses, such as a community center. More importantly, second-floor retail has proven unworkable through the years because Americans are accustomed to shopping on the street level. Thus, the addition of a community center above an existing retail development is a feasible solution that would promote positive changes to both building types. Successful civic facilities address pedestrian circulation and activity spaces, which can serve as catalysts for buying goods. The resulting hybrid development merges two disparate functions to support and benefit from each other. Areas where the two functions overlap can present opportunities for exciting interventions. This new mixed-use combination can increase efficiency by concentrating more uses into a central location. The architecture of a retail and community center can bring about numerous spatial and program changes to correspond to the needs and lifestyles of the residents that it serves.

    Removing and restoring control flow with the Value State Dependence Graph

    This thesis studies the practicality of compiling with only data flow information. Specifically, we focus on the challenges that arise when using the Value State Dependence Graph (VSDG) as an intermediate representation (IR). We perform a detailed survey of IRs in the literature in order to discover trends over time, and we classify them by their features in a taxonomy. We see how the VSDG fits into the IR landscape, and look at the divide between academia and the 'real world' in terms of compiler technology. Since most data flow IRs cannot be constructed for irreducible programs, we perform an empirical study of irreducibility in current versions of open source software, and then compare them with older versions of the same software. We also study machine-generated C code from a variety of different software tools. We show that irreducibility is no longer a problem, and is becoming less so with time. We then address the problem of constructing the VSDG. Since previous approaches in the literature have been poorly documented or ignored altogether, we give our approach to constructing the VSDG from a common IR: the Control Flow Graph. We show how our approach is independent of the source and target language, how it is able to handle unstructured control flow, and how it is able to transform irreducible programs on the fly. Once the VSDG is constructed, we implement Lawrence's proceduralisation algorithm in order to encode an evaluation strategy whilst translating the program into a parallel representation: the Program Dependence Graph. From here, we implement scheduling and then code generation using the LLVM compiler. We compare our compiler framework against several existing compilers, and show how removing control flow with the VSDG and then restoring it later can produce high quality code. We also examine specific situations where the VSDG can put pressure on existing code generators. Our results show that the VSDG represents a radically different, yet practical, approach to compilation.
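
    To give a flavor of what compiling from data flow alone involves, the sketch below extracts, for straight-line code, which values each assignment reads. It is a deliberately tiny stand-in: the VSDG proper also carries state edges and structured gamma (select) and theta (loop) nodes, none of which is attempted here.

        # Build a toy value-dependence map for straight-line assignments
        # using Python's own ast module. Illustration only; not the VSDG.
        import ast

        def value_deps(src: str) -> dict:
            """Map each assigned variable to the variables its value reads."""
            deps = {}
            for stmt in ast.parse(src).body:
                if isinstance(stmt, ast.Assign):
                    target = stmt.targets[0].id
                    deps[target] = {n.id for n in ast.walk(stmt.value)
                                    if isinstance(n, ast.Name)}
            return deps

        print(value_deps("a = 1\nb = a + 2\nc = a * b"))
        # {'a': set(), 'b': {'a'}, 'c': {'a', 'b'}}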