Requirements analysis of the VoD application using the tools in TRADE
This report contains a specification of requirements for a video-on-demand (VoD) application developed at Belgacom, used as a trial application in the 2RARE project. The specification contains three parts: an informal specification in natural language; a semiformal specification consisting of a number of diagrams intended to illustrate the informal specification; and a formal specification that makes the requirements on the desired software system precise. The informal specification is structured in such a way that it resembles official specification documents conforming to standards such as those of the IEEE or ESA. The semiformal specification uses some of the tools from a requirements engineering toolkit called TRADE (Toolkit for Requirements And Design Engineering). The purpose of TRADE is to combine the best ideas in current structured and object-oriented analysis and design methods within a traditional systems engineering framework. In the case of the VoD system, the systems engineering framework is useful because it provides techniques for allocation and flowdown of system functions to components. TRADE consists of semiformal techniques taken from structured and object-oriented analysis as well as a formal specification language, which provides constructs that correspond to the semiformal constructs. The formal specification language used in TRADE is LCM (Language for Conceptual Modeling), which is a syntactically sugared version of order-sorted dynamic logic with equality. The purpose of this report is to illustrate and validate the TRADE/LCM approach in the specification of distributed, communication-intensive systems.
Pulsar data analysis with PSRCHIVE
PSRCHIVE is an open-source, object-oriented, scientific data analysis
software library and application suite for pulsar astronomy. It implements an
extensive range of general-purpose algorithms for use in data calibration and
integration, statistical analysis and modeling, and visualisation. These are
utilised by a variety of applications specialised for tasks such as pulsar
timing, polarimetry, radio frequency interference mitigation, and pulse
variability studies. This paper presents a general overview of PSRCHIVE
functionality with some focus on the integrated interfaces developed for the
core applications.
Comment: 21 pages, 5 figures; tutorial presented at IPTA 2010 meeting in Leiden merged with talk presented at 2011 pulsar conference in Beijing; includes further research and development on algorithms for RFI mitigation and TOA bias correction
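The RFI mitigation mentioned above can be illustrated with a generic median-based channel-zapping heuristic, a common approach in pulsar data reduction. This is a sketch of the general idea only, not PSRCHIVE's actual implementation; the function name and default threshold are assumptions:

```python
import numpy as np

def zap_channels(bandpass, threshold=3.0):
    """Flag frequency channels whose total power deviates from the
    median bandpass by more than `threshold` robust standard
    deviations (a median/MAD-based RFI-zapping heuristic)."""
    bandpass = np.asarray(bandpass, dtype=float)
    med = np.median(bandpass)
    # median absolute deviation, scaled to approximate a Gaussian sigma
    mad = np.median(np.abs(bandpass - med)) * 1.4826
    return np.abs(bandpass - med) > threshold * mad
```

The returned boolean mask marks channels to exclude from subsequent integration and timing steps.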
DSPSR: Digital Signal Processing Software for Pulsar Astronomy
DSPSR is a high-performance, open-source, object-oriented, digital signal
processing software library and application suite for use in radio pulsar
astronomy. Written primarily in C++, the library implements an extensive range
of modular algorithms that can optionally exploit both multiple-core processors
and general-purpose graphics processing units. After over a decade of research
and development, DSPSR is now stable and in widespread use in the community.
This paper presents a detailed description of its functionality, justification
of major design decisions, analysis of phase-coherent dispersion removal
algorithms, and demonstration of performance on some contemporary
microprocessor architectures.
Comment: 15 pages, 10 figures, to be published in PASA
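The phase-coherent dispersion removal analysed in the paper multiplies the Fourier transform of the baseband signal by the inverse of the interstellar medium's transfer function (the dispersion "chirp"). A minimal numpy sketch of that idea follows; it is not DSPSR's C++ implementation, and the function name and argument conventions are assumptions:

```python
import numpy as np

# standard dispersion constant, MHz^2 pc^-1 cm^3 s
DISPERSION_CONSTANT = 4.148808e3

def coherent_dedisperse(baseband, dm, f0_mhz, bw_mhz):
    """Remove interstellar dispersion from complex baseband samples by
    applying the inverse of the ISM transfer function in the Fourier
    domain (the phase-coherent method)."""
    n = baseband.size
    # frequency offsets relative to the centre frequency f0, in MHz
    f = np.fft.fftfreq(n, d=1.0 / bw_mhz)
    # phase of the dispersion kernel; the 1e6 converts MHz*s to cycles
    phase = (2.0 * np.pi * DISPERSION_CONSTANT * 1e6 * dm
             * f**2 / (f0_mhz**2 * (f0_mhz + f)))
    spectrum = np.fft.fft(baseband)
    return np.fft.ifft(spectrum * np.exp(1j * phase))
```

Because the kernel is a pure phase rotation, applying it with the opposite sign of DM exactly inverts it, which makes the routine easy to self-test.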
Computing and Visualizing Log-linear analysis interactively
The purpose of this paper is to describe a simple program for computing log-linear analysis based on a direct-manipulation interface that emphasizes the use of plots for guiding the analysis and evaluating the results obtained. The program described here works as a plugin for ViSta (Young 1997) and is named LoginViSta (for Log-linear analysis in ViSta). ViSta is a statistical package based on Lisp-Stat, a statistical programming environment developed by Luke Tierney (1990) that features an object-oriented approach to statistical computing and allows for interactive and dynamic graphs. (Pedro Valero-Mora and Forrest W. Young)
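As a rough illustration of what a log-linear analysis computes, the simplest model, independence in a two-way contingency table, can be fitted in a few lines. This sketch is generic and has no relation to LoginViSta's Lisp-Stat code:

```python
import numpy as np

def loglinear_independence(table):
    """Fit the independence log-linear model to a two-way table.

    Returns the expected counts under independence and the
    likelihood-ratio statistic G^2 used to evaluate the fit."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    # expected counts: outer product of the marginals divided by n
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    mask = table > 0  # 0 * log(0) contributes nothing to G^2
    g2 = 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))
    return expected, g2
```

A large G^2 relative to its chi-squared reference distribution signals that the independence model does not fit, and guides the analyst toward models with interaction terms.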
Object-oriented views: a novel approach for tool integration in design environments (dissertation)
Object-oriented databases have been proposed to serve as the data management component of integrated design environments. One central database represents a bottleneck, however, requiring all design tools to work on the same information model and preventing the extensibility of the system over time. In this dissertation, I propose a view-based object server that successfully addresses these problems by supporting design views tailored to the needs of individual design tools. A view on an object-oriented schema corresponds to a virtual subschema graph with restructured generalization and property decomposition hierarchies. I present a methodology for supporting multiple view schemata, called MultiView. MultiView is anchored on the following four ideas: (1) the customization of individual classes using object algebra, (2) the integration of these derived classes into one global schema graph, (3) the extraction of virtual and base classes from the global schema as required by the view, and (4) the generation of a class hierarchy for these selected view classes. MultiView's division of view specification into these well-defined tasks, some of which have been successfully automated, makes it a powerful tool for supporting the specification of views by non-database experts while enforcing view consistency. In this dissertation, I describe solutions for all four tasks underlying MultiView. For the first task, I have formulated class derivation operators modeled after the well-known relational algebra operators. For the second task, I have developed a classification algorithm that automatically integrates derived classes into one global schema. For the third task, I have designed a view definition language that can be used to declaratively specify the view classes required for a particular view. For the last task, I have developed an algorithm that generates a complete, minimal and consistent view schema.
I present proofs of correctness, complexity analysis, and numerous illustrative examples for all algorithms. MultiView is applied to address the tool integration problem in a behavioral synthesis system. For this purpose, I first develop a unified design object model for behavioral synthesis. I then formulate customized design views of this model tailored to the needs of particular design tools. The resulting system allows the design tools to work on their view of the information model, while MultiView assures the consistent integration of the diverse design data into one object model.
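The object-algebra derivation of virtual classes described above can be caricatured in a few lines: a "select" predicate restricts a class extent and a "hide" set drops properties, mirroring relational select and project. This is an illustrative toy under assumed names, not MultiView's actual operators:

```python
class ViewClass:
    """A hypothetical virtual class derived from a base class.

    `predicate` restricts membership (relational select) and
    `hidden` removes properties (relational project); the extent
    is computed on demand, never materialized, as in a view."""

    def __init__(self, base_objects, predicate=None, hidden=()):
        self.base_objects = base_objects
        self.predicate = predicate or (lambda obj: True)
        self.hidden = set(hidden)

    def extent(self):
        # derive the view's objects from the base extent on each access
        return [{k: v for k, v in obj.items() if k not in self.hidden}
                for obj in self.base_objects if self.predicate(obj)]
```

A tool then works against such a derived class as if it were a base class, while updates to the base extent are visible through every view.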
Public survey instruments for business administration using social network analysis and big data
Purpose: The subject matter of this research is closely intertwined with the scientific discussion about the necessity of developing and implementing practice-oriented means of measuring social well-being that take into account the intensity of contacts between individuals. The aim of the research is to test a toolkit for analyzing social networks and to develop a research algorithm to identify sources of consolidation of public opinion and key agents of influence. The research methodology is based on postulates of sociology, graph theory, social network analysis and cluster analysis. Design/Methodology/Approach: The basis for the empirical research was provided by data representing the reflection of social media users on the existing image of Russia and its activities in the Arctic, chosen as a model case. Findings: The algorithm makes it possible to estimate the density and intensity of connections between actors, to trace the main channels through which public opinion forms and the key agents of influence, to identify implicit patterns and trends, and to relate information flows and events to current news triggers and stories for the subsequent formation of a "cleansed" image of the object under study and of the key actors with whom this object is associated. Practical Implications: The work contributes to filling the existing gap in the scientific literature caused by insufficient elaboration of the issues of applying social network analysis to solve sociological problems. Originality/Value: The work contributes to filling the gap formed by insufficient development of the practical issues of using social network analysis to solve sociological problems. Peer reviewed.
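The density and degree-based influence measures that such an algorithm starts from are straightforward to compute. The following sketch is generic, not the authors' toolkit, and works on a plain undirected edge list:

```python
def network_summary(edges):
    """Density and degree centrality for an undirected graph.

    Density measures the intensity of connections between actors;
    the highest-degree actors serve as a first cut at the
    'key agents of influence'."""
    nodes = {v for edge in edges for v in edge}
    n = len(nodes)
    degree = {v: 0 for v in nodes}
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    # fraction of possible ties that are actually present
    density = 2 * len(edges) / (n * (n - 1)) if n > 1 else 0.0
    influencers = sorted(nodes, key=lambda v: -degree[v])
    return density, degree, influencers
```

Real studies would follow this with betweenness centrality and community detection to separate channels of opinion formation, but degree and density already rank the obvious hubs.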
Exploitation of TerraSAR-X Data for Land use/Land Cover Analysis Using Object-Oriented Classification Approach in the African Sahel Area, Sudan.
Recently, object-oriented classification techniques based on image segmentation approaches have been studied using high-resolution satellite images to extract various kinds of thematic information. In this study, different land use/land cover (LULC) types were analysed by applying an object-oriented classification approach to dual TerraSAR-X images (HH and HV polarisation) of the African Sahel. For that purpose, the multi-resolution segmentation (MRS) of the Definiens software was used for creating the image objects. Using the feature space optimisation (FSO) tool, the attributes of the TerraSAR-X image were optimised in order to obtain the best separability among classes for the LULC mapping. The backscattering coefficients (BSC) for some classes were observed to differ between the HH and HV polarisations. The best separation distance of the tested spectral, shape and textural features showed different variations among the discriminated LULC classes. The classification scheme yielded an overall accuracy of 84 % with a kappa value of 0.82, while accuracy differences among the classes were kept minimal. Finally, the results highlighted the importance of the combined use of TerraSAR-X data and object-oriented classification approaches as a useful source of information and technique for LULC analysis in the African Sahel drylands.
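The "best separation distance" reported by Definiens' FSO tool is proprietary, but a simple stand-in separability score, the distance between class means scaled by their pooled spread, conveys the idea. The formula below is a hypothetical illustration, not the measure the software computes:

```python
import numpy as np

def separation_distance(a, b):
    """Toy per-feature class-separability score: distance between the
    class means of feature samples `a` and `b`, scaled by the sum of
    their standard deviations. Larger means easier to discriminate."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return abs(a.mean() - b.mean()) / (a.std() + b.std() + 1e-12)
```

Ranking candidate features (backscatter, shape, texture) by such a score is the essence of selecting a feature space that best separates the LULC classes.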
The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments
The GERDA and Majorana experiments will search for neutrinoless double-beta
decay of germanium-76 using isotopically enriched high-purity germanium
detectors. Although the experiments differ in conceptual design, they share
many aspects in common, and in particular will employ similar data analysis
techniques. The collaborations are jointly developing a C++ software library,
MGDO, which contains a set of data objects and interfaces to encapsulate, store
and manage physical quantities of interest, such as waveforms and high-purity
germanium detector geometries. These data objects define a common format for
persistent data, whether it is generated by Monte Carlo simulations or an
experimental apparatus, to reduce code duplication and to ease the exchange of
information between detector systems. MGDO also includes general-purpose
analysis tools that can be used for the processing of measured or simulated
digital signals. The MGDO design is based on the Object-Oriented programming
paradigm and is very flexible, allowing for easy extension and customization of
the components. The tools provided by the MGDO libraries are used by both GERDA
and Majorana.
Comment: 4 pages, 1 figure, proceedings for TAUP2011
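The kind of data object the abstract describes, samples bundled with the metadata needed to interpret them, can be sketched as follows. This is a toy illustration in Python, not MGDO's actual C++ interface, and every name in it is an assumption:

```python
class Waveform:
    """Toy stand-in for a digitized-waveform data object: it
    encapsulates the samples together with the sampling metadata,
    so simulated and measured data share one persistent format."""

    def __init__(self, samples, sampling_period_ns, t0_ns=0.0):
        self.samples = list(samples)
        self.sampling_period_ns = sampling_period_ns
        self.t0_ns = t0_ns  # timestamp of the first sample

    def time_at(self, i):
        """Absolute time of sample i in nanoseconds."""
        return self.t0_ns + i * self.sampling_period_ns
```

Analysis transforms (filters, shapers, pulse-shape parameters) can then be written once against this interface and applied unchanged to Monte Carlo output or detector data.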