
    Mauve: a Component-based Modeling Framework for Real-time Analysis of Robotic Applications.

    Robots are increasingly used in very diverse situations (services to persons, military missions, crisis management, etc.) in which they must give guarantees of safety and reliability. To be truly integrated into everyday life, robots must fulfil certain requirements. Among these, we focus on the non-functional requirements on embedded software [1], and more specifically on real-time software requirements. These requirements are most often met by proving the schedulability of the embedded software. Analysing and validating such properties on existing hand-coded software requires reverse modelling of that software, which yields only approximations of its behaviour. These approximations may leave certification authorities unconvinced of the robot's dependability. This paper proposes an integrated development methodology that starts from software component modelling and leads both to validation of the embedded software and to generation of deployable embedded software.
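
    The abstract mentions proving schedulability without saying which analysis Mauve applies; purely as a generic illustration, the sketch below checks an invented periodic task set against the classical Liu-Layland utilization bound for rate-monotonic scheduling (a sufficient, not necessary, condition).

```python
# Illustrative sketch: Liu-Layland schedulability test for
# rate-monotonic scheduling. The task set below is hypothetical,
# not taken from the Mauve paper.

def rm_utilization_bound(n: int) -> float:
    """Liu-Layland bound: n periodic tasks are schedulable under
    rate-monotonic priorities if U <= n * (2^(1/n) - 1)."""
    return n * (2 ** (1.0 / n) - 1)

def is_schedulable(tasks: list[tuple[float, float]]) -> bool:
    """tasks: (worst-case execution time, period) pairs, same time unit."""
    utilization = sum(c / t for c, t in tasks)
    return utilization <= rm_utilization_bound(len(tasks))

# Hypothetical robot control tasks: (WCET ms, period ms).
tasks = [(2.0, 10.0), (5.0, 40.0), (10.0, 100.0)]
print(is_schedulable(tasks))  # True: U = 0.425 <= 0.7798
```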

    A CSP-Based Trajectory for Designing Formally Verified Embedded Control Software

    This paper presents, in a nutshell, a procedure for producing formally verified concurrent software. The design paradigm provides means for translating block-diagram models of systems from various problem domains into a graphical notation for process-oriented architectures. A briefly presented CASE tool supports code generation both for formal analysis of the software models and for implementation in a target language. For the formal analysis, a high-quality commercial formal checker is used.
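
    As a loose illustration of the process-oriented, CSP-like style the paper targets (the abstract names neither the tool nor its implementation language), the sketch below composes two processes that interact only through an explicit channel, approximated here with a bounded Python queue; all names are invented.

```python
# Minimal sketch of a process-oriented (CSP-like) design:
# two processes communicate only through an explicit channel.
# Names and structure are illustrative, not from the paper's tool.
import threading
import queue

channel = queue.Queue(maxsize=1)  # bounded queue approximating a rendezvous

def producer() -> None:
    """Sample a (fake) sensor and send readings over the channel."""
    for reading in [1.0, 2.0, 3.0]:
        channel.put(reading)      # blocks until the consumer takes the value
    channel.put(float("nan"))     # poison value to signal termination

def consumer() -> None:
    """Receive readings and act on them (here: just print)."""
    while True:
        value = channel.get()
        if value != value:        # NaN marks end of stream
            break
        print(f"control step with input {value}")

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```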

    Dialectic tensions in the financial markets: a longitudinal study of pre- and post-crisis regulatory technology

    This article presents the findings from a longitudinal research study on regulatory technology in the UK financial services industry. The financial crisis, together with serious corporate and mutual fund scandals, raised the profile of compliance as governmental bodies and institutional and private investors introduced a ‘tsunami’ of financial regulations. Adopting a multi-level analysis, this study examines how regulatory technology was used by financial firms to meet their compliance obligations pre- and post-crisis. Empirical data collected over 12 years trace the deployment of an investment management system in eight financial firms. Interviews with public regulatory bodies, financial institutions and technology providers reveal a culture of compliance with increased transparency, surveillance and accountability. Findings show that dialectic tensions arise as the pursuit of transparency, surveillance and accountability in compliance mandates is simultaneously rationalized, facilitated and obscured by regulatory technology. Responding to these challenges, regulatory bodies continue to impose revised compliance mandates on financial firms, forcing them to adapt their financial technologies in an ever-changing multi-jurisdictional regulatory landscape.

    NGN PLATFORMS FOR EMERGENCY


    Time lagged ordinal partition networks for capturing dynamics of continuous dynamical systems

    We investigate a generalised version of the recently proposed ordinal-partition time-series-to-network transformation algorithm. First, we introduce a fixed time lag for the elements of each partition, selected using techniques from traditional time-delay embedding. The resulting partitions define regions in the embedding phase space that are mapped to nodes in the network space. Edges are allocated between nodes based on temporal succession, creating a Markov-chain representation of the time series. We then apply this new transformation algorithm to time series generated by the Rössler system and find that periodic dynamics translate to ring structures whereas chaotic time series translate to band- or tube-like structures, indicating that our algorithm generates networks whose structure is sensitive to the system dynamics. Furthermore, we demonstrate that simple network measures, including the mean out-degree and the variance of out-degrees, can track changes in dynamical behaviour in a manner comparable to the largest Lyapunov exponent. We also apply the same analysis to experimental time series generated by a diode resonator circuit and show that the network size, mean shortest path length and network diameter are highly sensitive to the interior crisis captured in this particular data set.
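
    As a rough sketch of the transformation the abstract describes (with the embedding dimension d and time lag tau as free parameters; the paper's exact construction may differ), the code below maps each lagged window of a time series to its ordinal pattern, treats each distinct pattern as a node, and adds a directed edge between temporally successive patterns.

```python
# Illustrative sketch of an ordinal-partition network transform:
# windows -> ordinal patterns (nodes), temporal succession -> edges.
# Parameter choices are arbitrary; the paper's construction may differ.
from collections import Counter
from itertools import pairwise  # Python 3.10+
import numpy as np

def ordinal_pattern(window: np.ndarray) -> tuple[int, ...]:
    """Rank order of the window's values, e.g. [0.3, 0.1, 0.7] -> (1, 0, 2)."""
    return tuple(np.argsort(np.argsort(window)))

def ordinal_partition_network(x: np.ndarray, d: int = 3, tau: int = 5) -> Counter:
    """Return directed edge counts between successive ordinal patterns."""
    patterns = [ordinal_pattern(x[i : i + d * tau : tau])
                for i in range(len(x) - (d - 1) * tau)]
    return Counter(pairwise(patterns))  # (node_u, node_v) -> transition count

# Example: a noisy sine wave yields a small, ring-like cycle of patterns.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)
edges = ordinal_partition_network(x)
nodes = {n for e in edges for n in e}
print(len(nodes), "nodes,", len(edges), "distinct edges")
```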

    A low cost mobile mapping system (LCMMS) for field data acquisition: a potential use to validate aerial/satellite building damage assessment

    Among the major natural disasters that occurred in 2010, the Haiti earthquake was a real turning point concerning the availability, dissemination and licensing of a huge quantity of geospatial data. In a few days, several map products based on the analysis of remotely sensed datasets were delivered to users. This demonstrated the need for reliable methods to validate the increasing variety of open-source data and remote-sensing-derived products for crisis management, with the aim of correctly spatially referencing and interconnecting these data with other global digital archives. As far as building damage assessment is concerned, the need for accurate field data to overcome the limitations of both vertical- and oblique-view satellite and aerial images was evident. To cope with this need, a newly developed Low-Cost Mobile Mapping System (LCMMS) was deployed in Port-au-Prince (Haiti) and tested during a five-day survey in February-March 2010. The system allows for the acquisition of movies and single georeferenced frames by means of a transportable device easily installable on (or adaptable to) any type of vehicle. It is composed of four webcams with a total field of view of about 180 degrees and one Global Positioning System (GPS) receiver, with the main aim of rapidly covering large areas for effective usage in emergency situations. The main technical features of the LCMMS, its operational use in the field (and related issues) and a potential approach for the validation of satellite/aerial building damage assessments are thoroughly described in the article.
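
    The abstract does not say how frames are georeferenced; one plausible approach, sketched below purely as an assumption, is to tag each video frame with the GPS fix closest in time. Field names, rates and coordinates are hypothetical.

```python
# Hypothetical sketch: tag video frames with the nearest-in-time GPS fix.
# The LCMMS article is not this specific; rates and fields are invented.
import bisect
from dataclasses import dataclass

@dataclass
class GpsFix:
    t: float      # seconds since start of survey
    lat: float
    lon: float

def georeference(frame_times: list[float], fixes: list[GpsFix]) -> list[GpsFix]:
    """For each frame timestamp, pick the GPS fix with the closest timestamp.
    Assumes `fixes` is sorted by time (typical 1 Hz receiver output)."""
    fix_times = [f.t for f in fixes]
    tagged = []
    for ft in frame_times:
        i = bisect.bisect_left(fix_times, ft)
        # compare the two neighbours around the insertion point
        best = min((j for j in (i - 1, i) if 0 <= j < len(fixes)),
                   key=lambda j: abs(fix_times[j] - ft))
        tagged.append(fixes[best])
    return tagged

fixes = [GpsFix(t, 18.54 + t * 1e-5, -72.34) for t in range(60)]  # 1 Hz track
print(georeference([0.2, 1.7, 3.9], fixes))
```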

    An application generator for rapid prototyping of Ada real-time control software

    The need to increase engineering productivity and decrease software life-cycle costs in real-time system development motivates a method of rapid prototyping. A design-by-iterative-rapid-prototyping technique is described, together with a tool that facilitates this design methodology for the generation of embedded control software.
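
    The abstract does not describe the generator's input notation, so the sketch below only illustrates the general idea of an application generator: filling a code template from a small specification. The spec fields and the generated Ada periodic-task skeleton are invented for the example.

```python
# Illustrative application-generator sketch: emit an Ada periodic-task
# skeleton from a small spec. Spec fields and template are invented;
# the paper's actual generator input is not described in the abstract.
ADA_TEMPLATE = """\
task body {name} is
   Next : Ada.Real_Time.Time := Ada.Real_Time.Clock;
   Period : constant Ada.Real_Time.Time_Span :=
      Ada.Real_Time.Milliseconds ({period_ms});
begin
   loop
      {action};
      Next := Next + Period;
      delay until Next;
   end loop;
end {name};
"""

def generate_task(spec: dict) -> str:
    """Fill the template from a spec such as
    {'name': 'Servo_Loop', 'period_ms': 20, 'action': 'Update_Servo'}."""
    return ADA_TEMPLATE.format(**spec)

print(generate_task({"name": "Servo_Loop", "period_ms": 20,
                     "action": "Update_Servo"}))
```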

    Y2K Interruption: Can the Doomsday Scenario Be Averted?

    The management philosophy until recent years has been to replace workers with computers, which are available 24 hours a day, need no benefits, no insurance and never complain. But as the year 2000 approached, along with it came the fear of the millennium bug, generally known as Y2K, and the computers threatened to strike! Y2K, though an abbreviation of year 2000, generally refers to the computer glitches associated with the year 2000. To save memory and money, computer companies adopted a voluntary standard at the beginning of the computer era: any year designated by two digits, such as 99, was automatically converted to 1999 by prefixing the digits 19. This saved an enormous amount of memory, and thus money, because large databases containing birth dates or other dates only needed to store the last two digits, such as 65 or 86. But it also created a built-in flaw that could make computers inoperable from January 2000. The problem is that most of these old computers are programmed to convert 00 (for the year 2000) into 1900 and not 2000. Trouble could therefore arise when the systems had to deal with dates outside the 1900s. In 2000, for example, a programme that calculates the age of a person born in 1965 will subtract 65 from 00 and get -65. The problem is most acute in mainframe systems, but that does not mean PCs, UNIX and other computing environments are trouble-free. Any computer system that relies on date calculations must be tested, because the Y2K or millennium bug arises from the potential for “date discontinuity”, which occurs when the time expressed by a system, or any of its components, does not move in consonance with real time. Though attention has been focused on the potential problems linked with the change from 1999 to 2000, date discontinuity may occur at other times in and around this period.
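
    As a toy illustration of the two-digit date arithmetic described above (not code from any real affected system), the sketch below reproduces the age-calculation failure and one common remediation, the windowing technique that maps two-digit years onto a fixed century window; the pivot year chosen here is arbitrary.

```python
# Toy illustration of the Y2K two-digit-year bug and a windowing fix.
# Not code from any real affected system; the pivot year is arbitrary.

def age_two_digit(current_yy: int, birth_yy: int) -> int:
    """Buggy age calculation on two-digit years: in '00' (2000),
    a person born in '65' (1965) gets age 00 - 65 = -65."""
    return current_yy - birth_yy

def expand_windowed(yy: int, pivot: int = 30) -> int:
    """Windowing remediation: interpret 00..29 as 2000..2029
    and 30..99 as 1930..1999 (pivot chosen for illustration)."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(age_two_digit(0, 65))                       # -65 (the bug)
print(expand_windowed(0) - expand_windowed(65))   # 35 (correct age in 2000)
```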