Spectral Line Removal in the LIGO Data Analysis System (LDAS)
Spectral lines, narrow frequency bands containing high power, are a feature of an
interferometric gravitational wave detector's output. Some lines are coherent
between interferometers, in particular, the 2 km and 4 km LIGO Hanford
instruments. This is of concern to data analysis techniques, such as the
stochastic background search, that use correlations between instruments to
detect gravitational radiation. Several techniques of `line removal' have been
proposed. Where a line is attributable to a measurable environmental
disturbance, a simple linear model may be fitted to predict, and subsequently
subtract away, that line. This technique has been implemented (as the command
oelslr) in the LIGO Data Analysis System (LDAS). We demonstrate its application
to LIGO S1 data.
Comment: 11 pages, 5 figures, to be published in CQG GWDAW02 proceedings
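The fit-and-subtract idea can be sketched in a few lines. This is a minimal illustration of fitting a linear model to a measured witness channel and subtracting the prediction, not the oelslr implementation; all names here are hypothetical.

```python
import math

def subtract_line(data, witness):
    """Fit the witness channel to the data by least squares and subtract
    the prediction. The coefficient minimising sum((d - a*w)^2) is
    a = <d, w> / <w, w>."""
    num = sum(d * w for d, w in zip(data, witness))
    den = sum(w * w for w in witness)
    a = num / den
    return [d - a * w for d, w in zip(data, witness)]

# Example: a sinusoidal line leaking into the data with gain 0.5.
n = 1000
witness = [math.sin(2 * math.pi * 60 * t / n) for t in range(n)]
data = [0.5 * w for w in witness]          # data dominated by the line
cleaned = subtract_line(data, witness)
residual = max(abs(x) for x in cleaned)    # essentially zero after removal
```

In practice the detector output also contains broadband noise and the line has a complex transfer function, so the real method fits a more general linear model; the scalar fit above only conveys the principle.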
Logarithmic growth dynamics in software networks
In a recent paper, Krapivsky and Redner (Phys. Rev. E, 71 (2005) 036118)
proposed a new growing network model in which each new node attaches to a
randomly selected node, as well as to all ancestors of the target node. The model
leads to a sparse graph with an average degree growing logarithmically with the
system size. Here we present compelling evidence that software networks are the
result of a similar class of growing dynamics. Both the predicted pattern of
network growth and the stationary in- and out-degree distributions are
consistent with the model. Our results support the view that large-scale software
topology is generated through duplication-rewiring mechanisms. Implications
of these findings are outlined.
Comment: 7 pages, 3 figures, published in Europhysics Letters (2005)
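The growth rule is simple to simulate. The sketch below (hypothetical names, pure Python) attaches each new node to a random target plus all of the target's ancestors, and shows the average degree rising only slowly with system size, consistent with the logarithmic prediction.

```python
import random

def grow(n, seed=0):
    """Grow an n-node network: each new node links to a uniformly random
    target and to all of the target's ancestors. Returns the average
    (total) degree 2*E/n."""
    rng = random.Random(seed)
    anc = [set()]                 # node 0 is the root with no ancestors
    edges = 0
    for v in range(1, n):
        t = rng.randrange(v)
        targets = {t} | anc[t]    # target plus all of its ancestors
        anc.append(targets)
        edges += len(targets)
    return 2 * edges / n

k_small = grow(200)
k_large = grow(3200)
# a 16-fold increase in size raises the average degree only modestly,
# consistent with logarithmic growth of the mean degree
```

The mean out-degree satisfies E[k_v] = 1 + (1/v) * sum of earlier means, a harmonic recursion that grows like ln v, which is the logarithmic behaviour the abstract refers to.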
The impact of perinatal severe acute respiratory syndrome coronavirus 2 infection during the peripartum period.
An Implicitization Challenge for Binary Factor Analysis
We use tropical geometry to compute the multidegree and Newton polytope of
the hypersurface of a statistical model with two hidden and four observed
binary random variables, solving an open question stated by Drton, Sturmfels
and Sullivant in "Lectures on Algebraic Statistics" (Problem 7.7). The model is
obtained from the undirected graphical model of the complete bipartite graph
by marginalizing two of the six binary random variables. We present
algorithms for computing the Newton polytope of its defining equation by
parallel walks along the polytope and its normal fan. In this way we compute
the vertices of the polytope. Finally, we also compute and certify its facets by
studying tangent cones of the polytope at vertices of the symmetry classes. The
Newton polytope has 17214912 vertices in 44938 symmetry classes and 70646
facets in 246 symmetry classes.
Comment: 25 pages, 5 figures, presented at Mega 09 (Barcelona, Spain)
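For intuition, the Newton polytope of a polynomial is the convex hull of its exponent vectors. The toy 2D sketch below only illustrates that object; the paper's computation lives in much higher dimension and uses tropical geometry, which this does not attempt.

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull in 2D (toy illustration)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h

    lower, upper = half(pts), half(list(reversed(pts)))
    return lower[:-1] + upper[:-1]

# Newton polytope of f = 1 + x^3 + y^3 + x*y: the convex hull of the
# exponent vectors of its monomials; (1, 1) lies strictly inside and drops out.
exponents = [(0, 0), (3, 0), (0, 3), (1, 1)]
hull = convex_hull(exponents)
```

The hull here is the triangle with vertices (0,0), (3,0), (0,3); computing the analogous hull for the implicit equation of the statistical model is what requires the paper's walk along the polytope and its normal fan.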
Validation of Kalman Filter alignment algorithm with cosmic-ray data using a CMS silicon strip tracker endcap
A Kalman Filter alignment algorithm has been applied to cosmic-ray data. We
discuss the alignment algorithm and an experiment-independent implementation
including outlier rejection and treatment of weakly determined parameters.
Using this implementation, the algorithm has been applied to data recorded with
one CMS silicon tracker endcap. Results are compared to both photogrammetry
measurements and data obtained from a dedicated hardware alignment system, and
good agreement is observed.
Comment: 11 pages, 8 figures. CMS NOTE-2010/00
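The outlier-rejection step mentioned above can be illustrated with a scalar Kalman filter that skips any update whose residual exceeds a cut in units of the innovation standard deviation. This is a toy sketch, not the CMS algorithm; names and thresholds are illustrative.

```python
import math

def kalman_estimate(measurements, r=1.0, p0=100.0, cut=3.0):
    """Scalar Kalman filter for a constant parameter with residual-based
    outlier rejection: updates with |residual| > cut * sqrt(S), where S is
    the innovation variance, are skipped."""
    x, p = 0.0, p0                     # state estimate and its variance
    for z in measurements:
        s = p + r                      # innovation variance
        if abs(z - x) > cut * math.sqrt(s):
            continue                   # reject outlier measurement
        k = p / s                      # Kalman gain
        x += k * (z - x)
        p *= (1 - k)
    return x

data = [5.1, 4.9, 5.0, 95.0, 5.2, 4.8]   # one gross outlier at 95.0
est = kalman_estimate(data)              # close to 5, unaffected by 95.0
```

The real alignment problem is a high-dimensional vector version of this update, with the additional treatment of weakly determined parameters that the note describes.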
Semantic mutation testing
Mutation testing is a powerful and flexible test technique. Traditional mutation testing makes a small change to the syntax of a description (usually a program) in order to create a mutant. A test suite is considered to be good if it distinguishes between the original description and all of the (functionally non-equivalent) mutants. These mutants can be seen as representing potential small slips, and thus mutation testing aims to produce a test suite that is good at finding such slips. It has also been argued that a test suite that finds such small changes is likely to find larger changes. This paper describes a new approach to mutation testing, called semantic mutation testing. Rather than mutate the description, semantic mutation testing mutates the semantics of the language in which the description is written. The mutations of the semantics of the language represent possible misunderstandings of the description language and thus capture a different class of faults. Since the likely misunderstandings are highly context dependent, this context should be used to determine which semantic mutants should be produced. The approach is illustrated through examples with statecharts and C code. The paper also describes a semantic mutation testing tool for C and the results of experiments that investigated the nature of some semantic mutation operators for C.
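A toy illustration of the distinction: rather than editing the program text, a semantic mutant reinterprets an operator. Here Python's '/' is read with C-style truncating division, a plausible misunderstanding of the description language. The example is illustrative only and is not taken from the paper's C tool.

```python
def mean_offset(a, b):
    # Original description, read with the language's actual semantics:
    # '/' is true (floating-point) division in Python 3.
    return (a + b) / 2

def mean_offset_mutant(a, b):
    # Semantic mutant: the SAME description text, but with '/' interpreted
    # as C-style truncating integer division -- a plausible misunderstanding
    # by a programmer coming from C (or Python 2).
    q = (a + b) / 2
    return int(q) if q >= 0 else -int(-q)  # truncate toward zero

# A test suite using only inputs on which both semantics agree cannot kill
# this mutant; the input (-7, 0) distinguishes them.
agree = (mean_offset(4, 2), mean_offset_mutant(4, 2))      # both give 3
differ = (mean_offset(-7, 0), mean_offset_mutant(-7, 0))   # -3.5 vs -3
```

A good test suite under semantic mutation must include inputs, such as negative operands here, on which the intended semantics and the misunderstood semantics diverge.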
ICON: A System for Implementing Constraints in Object-based Networks
In today's network management scenario, the network operator's interface to the network is through a Management Information Base (MIB). The MIB stores all management-related data, such as configuration information, performance statistics, trouble logs, and so on. Configuration management, which is at the core of network management, is implemented through the MIB in a three-step process: making updates to MIB data elements, checking the validity of the updates, and propagating the effects of the updates to the network elements. While all three steps need to be executed efficiently for the MIB to serve its intended goal, the second step of checking update validity is especially important from the management viewpoint. For example, if an operator mistakenly configures a ninth port on an eight-port card, it is essential that the MIB both detect and prevent this error. Allowing such operations to go through would have an adverse impact on the performance of the network (since it increases the network management traffic). Therefore, we focus primarily on the problem of checking the validity of updates to MIB data elements, which can be viewed as a specific instance of the general problem of constraint management in database systems. We introduce the design of ICON (Implementing Constraints in Object-based Networks), a proposed constraint management system. In ICON, constraints are expressed through rules. Each rule is composed of an event, a condition, and an action. Occurrence of the event triggers the rule, the condition is a boolean check, and the action is executed if the condition is satisfied. Rules and events are also treated as objects so that they can be created, modified, and deleted like other objects, thus providing a uniform view of rules and events in an OO context. The OO paradigm results in an extensible and reusable system. To our knowledge, little work has been done in this area, and we hope this paper will trigger further research.
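The rule structure can be sketched as event-condition-action objects. The hypothetical fragment below (which follows the ICON design only loosely; all names are invented) shows the eight-port example from the abstract being caught by such a rule.

```python
class Rule:
    """Event-condition-action rule; rules are themselves objects, so they
    can be created, modified, and deleted like any other object."""
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

    def fire(self, mib, update):
        if update.get("event") == self.event and self.condition(mib, update):
            return self.action(mib, update)
        return None

# Hypothetical MIB fragment: one eight-port card.
mib = {"card1": {"num_ports": 8, "configured": []}}

too_many_ports = Rule(
    event="configure_port",
    condition=lambda m, u: len(m[u["card"]]["configured"]) >= m[u["card"]]["num_ports"],
    action=lambda m, u: "reject",
)

def apply_update(mib, update, rules):
    """Validate an MIB update against all rules before committing it."""
    for rule in rules:
        if rule.fire(mib, update) == "reject":
            return False                      # invalid update detected
    mib[update["card"]]["configured"].append(update["port"])
    return True

for p in range(1, 9):
    apply_update(mib, {"event": "configure_port", "card": "card1", "port": p},
                 [too_many_ports])
ninth_ok = apply_update(mib, {"event": "configure_port", "card": "card1", "port": 9},
                        [too_many_ports])
# the rule rejects the ninth port on the eight-port card
```

The committed state never exceeds eight configured ports, which is exactly the validity check the abstract argues the MIB must perform before propagating updates to network elements.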
Wiselib: A Generic Algorithm Library for Heterogeneous Sensor Networks
One unfortunate consequence of the success story of wireless sensor networks
(WSNs) in separate research communities is an ever-growing gap between theory
and practice. Even though there is an increasing number of algorithmic methods
for WSNs, the vast majority has never been tried in practice; conversely, many
practical challenges are still awaiting efficient algorithmic solutions. The
main cause for this discrepancy is the fact that programming sensor nodes still
happens at a very technical level. We remedy the situation by introducing
Wiselib, our algorithm library that allows for simple implementations of
algorithms onto a large variety of hardware and software. This is achieved by
employing advanced C++ techniques such as templates and inline functions,
making it possible to write generic code that is resolved and bound at compile time,
resulting in virtually no memory or computation overhead at run time.
The Wiselib runs on different host operating systems, such as Contiki, iSense
OS, and ScatterWeb. Furthermore, it runs on virtual nodes simulated by Shawn.
For any algorithm, the Wiselib provides data structures that suit the specific
properties of the target platform. Algorithm code does not contain any
platform-specific specializations, allowing a single implementation to run
natively on heterogeneous networks.
In this paper, we describe the building blocks of the Wiselib, and analyze
the overhead. We demonstrate the effectiveness of our approach by showing how
routing algorithms can be implemented. We also report on results from
experiments with real sensor-node hardware.
Comment: 16 pages, 1 figure, 7 tables. Appears in European Conference on
Wireless Sensor Networks (EWSN 2010)
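Wiselib achieves platform independence with C++ templates bound at compile time; the Python analogue below can only mimic that separation dynamically (duck typing in place of templates), but it illustrates the key idea of algorithm code that touches the platform only through a minimal interface. All names are invented, not Wiselib's.

```python
class SimRadio:
    """Hypothetical platform facet: a simulated radio back-end, standing in
    for Contiki/iSense/Shawn bindings."""
    def __init__(self):
        self.sent = []

    def send(self, dest, payload):
        # record the message instead of transmitting it
        self.sent.append((dest, payload))

def flood(radio, neighbors, payload):
    """Platform-independent algorithm code: it uses the platform only
    through the radio's send() interface, so the same function runs
    unchanged against any back-end providing that interface."""
    for n in neighbors:
        radio.send(n, payload)

radio = SimRadio()
flood(radio, [2, 3, 5], b"hello")
# radio.sent now holds one message per neighbor
```

In Wiselib the equivalent binding is resolved at compile time, so unlike this sketch there is essentially no run-time dispatch overhead on the sensor node.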
Modularity and Variability of Distributed Software Architectures through Multi-view Refinement of AO-Connectors
The Verifying Compiler: A Grand Challenge for Computing Research
Abstract. This contribution proposes a set of criteria that distinguish a grand challenge in science or engineering from the many other kinds of short-term or long-term research problems that engage the interest of scientists and engineers. As an example drawn from Computer Science, it revives an old challenge: the construction and application of a verifying compiler that guarantees correctness of a program before running it.
Introduction. The primary purpose of the formulation and promulgation of a grand challenge is the advancement of science or engineering. A grand challenge represents a commitment by a significant section of the research community to work together towards a common goal, agreed to be valuable and achievable by a team effort within a predicted timescale. The challenge is formulated by th