
    Requirements analysis of the VoD application using the tools in TRADE

    This report contains a specification of requirements for a video-on-demand (VoD) application developed at Belgacom, used as a trial application in the 2RARE project. The specification contains three parts: an informal specification in natural language; a semiformal specification consisting of a number of diagrams intended to illustrate the informal specification; and a formal specification that makes the requirements on the desired software system precise. The informal specification is structured in such a way that it resembles official specification documents conforming to standards such as those of the IEEE or ESA. The semiformal specification uses some of the tools from a requirements engineering toolkit called TRADE (Toolkit for Requirements And Design Engineering). The purpose of TRADE is to combine the best ideas in current structured and object-oriented analysis and design methods within a traditional systems engineering framework. In the case of the VoD system, the systems engineering framework is useful because it provides techniques for allocation and flowdown of system functions to components. TRADE consists of semiformal techniques taken from structured and object-oriented analysis as well as a formal specification language, which provides constructs that correspond to the semiformal constructs. The formal specification language used in TRADE is LCM (Language for Conceptual Modeling), a syntactically sugared version of order-sorted dynamic logic with equality. The purpose of this report is to illustrate and validate the TRADE/LCM approach in the specification of distributed, communication-intensive systems.

    Pulsar data analysis with PSRCHIVE

    PSRCHIVE is an open-source, object-oriented, scientific data analysis software library and application suite for pulsar astronomy. It implements an extensive range of general-purpose algorithms for use in data calibration and integration, statistical analysis and modeling, and visualisation. These are utilised by a variety of applications specialised for tasks such as pulsar timing, polarimetry, radio frequency interference mitigation, and pulse variability studies. This paper presents a general overview of PSRCHIVE functionality with some focus on the integrated interfaces developed for the core applications. Comment: 21 pages, 5 figures; tutorial presented at the IPTA 2010 meeting in Leiden merged with a talk presented at the 2011 pulsar conference in Beijing; includes further research and development on algorithms for RFI mitigation and TOA bias correction
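    PSRCHIVE's actual RFI-mitigation algorithms live in its C++ library, but the basic idea behind one common approach can be sketched independently. The following stand-alone Python fragment flags frequency channels whose off-pulse RMS deviates strongly from the channel median, a simple median/MAD zapping heuristic; all names and data here are illustrative, not part of the PSRCHIVE API.

```python
from statistics import median

def flag_bad_channels(channel_rms, k=5.0):
    """Return indices of channels whose off-pulse RMS deviates from the
    channel median by more than k times the median absolute deviation (MAD).
    A toy median/MAD zapping heuristic, not the PSRCHIVE implementation."""
    med = median(channel_rms)
    deviations = [abs(r - med) for r in channel_rms]
    mad = median(deviations)
    if mad == 0:               # all channels identical: nothing to flag
        return []
    return [i for i, d in enumerate(deviations) if d > k * mad]

# A band of six channels; channel 4 is dominated by interference.
rms = [1.0, 1.1, 0.9, 1.05, 5.0, 0.95]
print(flag_bad_channels(rms))  # [4]
```

    Using the MAD rather than the standard deviation keeps the threshold itself robust against the very outliers being hunted.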

    DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    DSPSR is a high-performance, open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. Written primarily in C++, the library implements an extensive range of modular algorithms that can optionally exploit both multiple-core processors and general-purpose graphics processing units. After over a decade of research and development, DSPSR is now stable and in widespread use in the community. This paper presents a detailed description of its functionality, justification of major design decisions, analysis of phase-coherent dispersion removal algorithms, and demonstration of performance on some contemporary microprocessor architectures. Comment: 15 pages, 10 figures, to be published in PASA
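    The dispersion removal that DSPSR performs phase-coherently is driven by the cold-plasma dispersion delay between observing frequencies. As a minimal illustration of the quantity being corrected (not DSPSR code), the delay in milliseconds for a dispersion measure DM in pc cm^-3 between two frequencies in GHz is:

```python
def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Cold-plasma dispersion delay (ms) of the lower frequency relative to
    the higher one, for dispersion measure dm in pc cm^-3, using the
    standard dispersion constant of ~4.149 ms GHz^2 cm^3 pc^-1."""
    return 4.149 * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

# A pulse at DM = 100 arrives ~104 ms later at 1.2 GHz than at 1.5 GHz;
# coherent dedispersion removes this smearing within each channel as well.
print(round(dispersion_delay_ms(100.0, 1.2, 1.5), 3))  # 103.725
```

    Incoherent dedispersion only shifts whole channels by this delay; the coherent approach described in the paper deconvolves the dispersion inside each channel.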

    Computing and Visualizing Log-linear analysis interactively

    The purpose of this paper is to describe a simple program for computing log-linear analysis based on a direct manipulation interface that emphasizes the use of plots for guiding the analysis and evaluating the results obtained. The program described here works as a plugin for ViSta (Young 1997) and receives the name of LoginViSta (for Log-linear analysis in ViSta). ViSta is a statistical package based on Lisp-Stat, a statistical programming environment developed by Luke Tierney (1990) that features an object-oriented approach for statistical computing and one that allows for interactive and dynamic graphs.
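    LoginViSta itself is a Lisp-Stat plugin, but the computation at the heart of a log-linear analysis can be sketched independently of any package. The following Python fragment (a generic illustration, not ViSta code) fits the independence model to a two-way contingency table via its margin-based expected counts and computes the likelihood-ratio statistic G^2:

```python
from math import log

def independence_fit(table):
    """Expected counts under the independence log-linear model
    E[i][j] = row_i * col_j / N, plus the likelihood-ratio statistic
    G^2 = 2 * sum O * ln(O / E)."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    expected = [[ri * cj / n for cj in cols] for ri in rows]
    g2 = 2.0 * sum(
        o * log(o / e)
        for row_o, row_e in zip(table, expected)
        for o, e in zip(row_o, row_e)
        if o > 0
    )
    return expected, g2

observed = [[10, 20], [30, 40]]
expected, g2 = independence_fit(observed)
print(expected)       # [[12.0, 18.0], [28.0, 42.0]]
print(round(g2, 3))   # a small G^2: little evidence against independence
```

    Plotting observed against expected counts, as the paper advocates, is then a one-line step in any graphics system.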

    Public survey instruments for business administration using social network analysis and big data

    Purpose: The subject matter of this research is closely intertwined with the scientific discussion about the necessity of developing and implementing practice-oriented means of measuring social well-being that take into account the intensity of contacts between individuals. The aim of the research is to test a toolkit for analyzing social networks and to develop a research algorithm that identifies sources of consolidation of public opinion and key agents of influence. The research methodology is based on postulates of sociology, graph theory, social network analysis and cluster analysis. Design/Methodology/Approach: The basis for the empirical research was provided by data reflecting how social media users perceive the existing image of Russia and its activities in the Arctic, chosen as a model case. Findings: The algorithm makes it possible to estimate the density and intensity of connections between actors, to trace the main channels through which public opinion forms and the key agents of influence, to identify implicit patterns and trends, and to relate information flows and events to current news stories for the subsequent formation of a "cleansed" image of the object under study and of the key actors with whom this object is associated. Practical Implications/Originality: The work contributes to filling an existing gap in the scientific literature caused by insufficient elaboration of how social network analysis can be applied to sociological problems.
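    The network measures the described algorithm relies on, the density and intensity of connections and the identification of key agents of influence, have simple graph-theoretic definitions. A minimal stdlib-only Python sketch, with hypothetical data rather than the authors' toolkit:

```python
def density(edges, n):
    """Density of an undirected simple graph: the fraction of the
    n*(n-1)/2 possible ties that are actually present."""
    return 2 * len(edges) / (n * (n - 1))

def degree_centrality(edges, nodes):
    """Number of ties per actor; the highest-degree actors are candidate
    'key agents of influence'."""
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

nodes = ["a", "b", "c", "d"]
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c")]
print(density(edges, len(nodes)))   # 4 of 6 possible ties are present
deg = degree_centrality(edges, nodes)
print(max(deg, key=deg.get))        # 'a' is the most connected actor
```

    Real studies of this kind typically add clustering and community detection on top of these basic measures, as the abstract's reference to cluster analysis suggests.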

    Exploitation of TerraSAR-X Data for Land use/Land Cover Analysis Using Object-Oriented Classification Approach in the African Sahel Area, Sudan.

    Recently, object-oriented classification techniques based on image segmentation approaches have been studied using high-resolution satellite images to extract various kinds of thematic information. In this study, different land use/land cover (LULC) types were analysed by applying an object-oriented classification approach to dual TerraSAR-X images (HH and HV polarisation) of the African Sahel. For that purpose, the multi-resolution segmentation (MRS) of the Definiens software was used to create the image objects. Using the feature space optimisation (FSO) tool, the attributes of the TerraSAR-X image were optimised in order to obtain the best separability among classes for the LULC mapping. The backscattering coefficients (BSC) for some classes were observed to differ between the HH and HV polarisations. The best separation distance of the tested spectral, shape and textural features showed different variations among the discriminated LULC classes. The classification scheme yielded an overall accuracy of 84 % with a kappa value of 0.82, while accuracy differences among the classes were kept minimal. Finally, the results highlighted the importance of the combined use of TerraSAR-X data and object-oriented classification approaches as a useful source of information and technique for LULC analysis in the African Sahel drylands.
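    The accuracy figures quoted above (84 % overall, kappa 0.82) are derived from a confusion matrix of classified versus reference pixels. As a generic illustration with made-up counts, not the study's data, Cohen's kappa can be computed as:

```python
def cohens_kappa(confusion):
    """Cohen's kappa for a square confusion matrix: observed agreement
    (overall accuracy) corrected for agreement expected by chance."""
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(col) for col in zip(*confusion)]
    expected = sum(r * c for r, c in zip(row_tot, col_tot)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical two-class matrix (rows: reference, columns: classified).
cm = [[50, 10], [5, 35]]
print(round(cohens_kappa(cm), 3))  # 0.694
```

    Kappa below overall accuracy is the usual pattern, since some of the diagonal agreement would occur by chance alone.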

    The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments

    The GERDA and Majorana experiments will search for neutrinoless double-beta decay of germanium-76 using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether generated by Monte Carlo simulations or by an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the object-oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both GERDA and Majorana. Comment: 4 pages, 1 figure, proceedings for TAUP2011
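    MGDO's data objects and signal-processing tools are written in C++, but the design idea of encapsulating a digitized waveform together with general-purpose transforms can be sketched in a few lines of Python; every class and method name here is illustrative, not the MGDO API:

```python
class Waveform:
    """A minimal digitized-waveform data object: samples plus sampling
    period, with one simple transform attached. Illustrative only."""
    def __init__(self, samples, dt_ns):
        self.samples = list(samples)
        self.dt_ns = dt_ns          # sampling period in nanoseconds

    def duration_ns(self):
        return len(self.samples) * self.dt_ns

    def smoothed(self, window):
        """Return a new Waveform smoothed with a boxcar of `window`
        samples, a toy stand-in for MGDO-style signal transforms."""
        s = self.samples
        out = [sum(s[i:i + window]) / window
               for i in range(len(s) - window + 1)]
        return Waveform(out, self.dt_ns)

wf = Waveform([0, 0, 0, 4, 4, 4], dt_ns=10)
print(wf.duration_ns())            # 60
print(wf.smoothed(2).samples)      # the step edge is averaged out
```

    Keeping transforms as methods that return new objects of the same type is what lets such a library chain simulated and measured signals through one common format.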