Illustrating interference in interfering versions of programs
The need to integrate several versions of a program into a common one arises frequently, but it is a tedious and time consuming task to merge programs by hand. The program-integration algorithm recently proposed by S. Horwitz, J. Prins, and T. Reps provides a way to create a semantics-based tool for program integration. The integration algorithm is based on the assumption that any change in the behavior, rather than the text, of a program variant is significant and must be preserved in the merged program. An integration system based on this algorithm will either automatically combine several different but related variants of a base program, or else determine that the programs incorporate interfering changes.
In this paper we discuss how an integration tool can illustrate the causes of interference to the user when interference is detected. Our main technical result is an alternative characterization of the integration algorithm's interference criterion that is better suited to illustrating the causes of interference. We then propose six methods by which an integration system can display information to demonstrate the causes of interference to the user.
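The behavioral-change assumption above can be illustrated with a deliberately simplified sketch: sets of changed program points stand in for the program slices the actual Horwitz-Prins-Reps algorithm compares, and all names are hypothetical, so this is an illustration of the interference idea, not the algorithm itself.

```python
# Toy illustration (not the Horwitz-Prins-Reps algorithm itself): model each
# variant by the set of program points whose *behavior* it changed relative
# to the base. Two variants interfere when they change the behavior of the
# same point in different ways, since no merge can preserve both changes.

def interference(changed_a, changed_b, behavior_a, behavior_b):
    """Return the set of points whose changes conflict between variants.

    changed_a / changed_b  : points whose behavior each variant changed
    behavior_a / behavior_b: maps from point -> (abstract) new behavior
    """
    overlap = changed_a & changed_b
    return {p for p in overlap if behavior_a[p] != behavior_b[p]}

# Variant A changes points 1 and 3; variant B changes points 3 and 4.
a, b = {1, 3}, {3, 4}
beh_a = {1: "x+1", 3: "y*2"}
beh_b = {3: "y-2", 4: "z"}

print(interference(a, b, beh_a, beh_b))  # {3}: both changed point 3, differently
```

An integration tool reporting the conflicting set (here, point 3) is precisely the kind of interference illustration the paper proposes methods for displaying.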
Non-interference for deterministic interactive programs
We consider the problem of defining an appropriate notion of non-interference (NI) for deterministic interactive programs. Previous work on the security of interactive programs by O'Neill, Clarkson and Chong (CSFW 2006) builds on earlier ideas due to Wittbold and Johnson (Symposium on Security and Privacy 1990), and argues for a notion of NI defined in terms of strategies modelling the behaviour of users. We show that, for deterministic interactive programs, it is not necessary to consider strategies and that a simple stream model of the users' behaviour is sufficient. The key technical result is that, for deterministic programs, stream-based NI implies the apparently more general strategy-based NI (in fact we consider a wider class of strategies than those of O'Neill et al.). We give our results in terms of a simple notion of Input-Output Labelled Transition System, thus allowing application of the results to a large class of deterministic interactive programming languages.
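The stream model for the deterministic case can be sketched concretely. This is a toy model under assumed simplifications (a two-level lattice, programs as pure functions from input streams to output streams), not the paper's formal Input-Output Labelled Transition Systems:

```python
# A minimal sketch of stream-based non-interference for a deterministic
# program. Toy assumptions: events are (level, value) pairs with security
# levels "L" (public) and "H" (secret); a program is a pure function from
# an input stream to an output stream. All names are hypothetical.

def low_view(stream):
    """Project a stream onto its public (low-level) values."""
    return [v for (lvl, v) in stream if lvl == "L"]

def stream_ni(program, in_a, in_b):
    """Stream-based NI on one pair of runs: if two input streams agree on
    their low view, the resulting output streams must agree on theirs."""
    if low_view(in_a) != low_view(in_b):
        return True  # premise fails; this pair imposes no constraint
    return low_view(program(in_a)) == low_view(program(in_b))

# A secure program: echoes low inputs, discards high ones.
def echo_low(inputs):
    return [("L", v) for (lvl, v) in inputs if lvl == "L"]

# An insecure program: copies every input, including high ones, to low output.
def leak(inputs):
    return [("L", v) for (lvl, v) in inputs]

run_a = [("L", 1), ("H", 42)]
run_b = [("L", 1), ("H", 99)]
print(stream_ni(echo_low, run_a, run_b))  # True
print(stream_ni(leak, run_a, run_b))      # False
```

Because the program is deterministic, quantifying over pairs of streams like this suffices; the paper's result is that no richer user-strategy model can distinguish more programs.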
The Anatomy and Facets of Dynamic Policies
Information flow policies are often dynamic; the security concerns of a
program will typically change during execution to reflect security-relevant
events. A key challenge is how to best specify, and give proper meaning to,
such dynamic policies. A large number of approaches exist that tackle that
challenge, each yielding some important, but unconnected, insight. In this work
we synthesise existing knowledge on dynamic policies, with an aim to establish
a common terminology, best practices, and frameworks for reasoning about them.
We introduce the concept of facets to illuminate subtleties in the semantics of
policies, and closely examine the anatomy of policies and the expressiveness of
policy specification mechanisms. We further explore the relation between
dynamic policies and the concept of declassification.
Comment: Technical Report of publication under the same name in Computer Security Foundations (CSF) 201
Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC), version 4.0: User's manual
The information in the NASARC (Version 4.0) Technical Manual (NASA-TM-101453) and NASARC (Version 4.0) User's Manual (NASA-TM-101454) relates to the state of Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software development through November 1, 1988. The Technical Manual describes the NASARC concept and the algorithms used to implement the concept. The User's Manual provides information on computer system considerations, installation instructions, description of input files, and program operation instructions. Significant revisions were incorporated in the Version 4.0 software over prior versions. These revisions have further enhanced the modeling capabilities of the NASARC procedure and provide improved arrangements of predetermined arcs within the geostationary orbit. Array dimensions within the software were structured to fit within the currently available 12-megabyte memory capacity of the International Frequency Registration Board (IFRB) computer facility. A piecewise approach to predetermined arc generation in NASARC (Version 4.0) allows worldwide planning problem scenarios to be accommodated within computer run time and memory constraints with enhanced likelihood and ease of solution.
GR@PPA 2.8: initial-state jet matching for weak boson production processes at hadron collisions
The initial-state jet matching method introduced in our previous studies has
been applied to the event generation of single W and Z production processes and diboson (WW, WZ and ZZ) production processes at hadron
collisions in the framework of the GR@PPA event generator. The generated events
reproduce the transverse momentum spectra of weak bosons continuously in the
entire kinematical region. The matrix elements (ME) for hard interactions are
still at the tree level. As in previous versions, the decays of weak bosons are
included in the matrix elements. Therefore, spin correlations and phase-space
effects in the decay of weak bosons are exact at the tree level. The program
package includes custom-made parton shower programs as well as ME-based hard
interaction generators in order to achieve self-consistent jet matching. The
generated events can be passed to general-purpose event generators to make the
simulation proceed down to the hadron level.
Comment: 29 pages, 14 figures; minor changes to clarify the discussions, and corrections of typos
An Extension of the ns-3 LTE Module to Simulate Fractional Frequency Reuse Algorithms
We developed an extension for the LTE module of the ns-3 simulator to allow the simulation of Fractional Frequency Reuse (FFR) algorithms and the evaluation of their performance in an LTE scenario. In this paper, we describe the technical components of this extension, namely the new API for Fractional Frequency Reuse algorithms, the implementation of several state-of-the-art algorithms based on this API, and the implementation of the LTE downlink and uplink power control functionality required by many of these algorithms. Additionally, we provide an overview of the test suites included with our extension to validate its correct functionality, and discuss some example scenarios illustrating how our extension can be used in an LTE simulation.
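For a flavor of what such algorithms do, strict FFR partitions the band into a shared reuse-1 center subband plus per-cell reuse-3 edge subbands. The following is an illustrative sketch of that partitioning, not code from the ns-3 extension; the resource-block layout, cell count, and SINR threshold are made-up assumptions:

```python
# A language-agnostic sketch of the "strict FFR" scheme (illustrative
# Python, not ns-3 code). Resource blocks (RBs) are split into one center
# subband shared by all cells (reuse-1) and one protected edge subband per
# cell (reuse-3). All parameters below are hypothetical.

def strict_ffr_plan(num_rbs, num_cells=3, edge_fraction=0.25):
    """Split num_rbs resource blocks into a shared center subband plus
    num_cells disjoint edge subbands."""
    edge_per_cell = int(num_rbs * edge_fraction) // num_cells
    center = list(range(num_rbs - num_cells * edge_per_cell))
    start = len(center)
    edge = {c: list(range(start + c * edge_per_cell,
                          start + (c + 1) * edge_per_cell))
            for c in range(num_cells)}
    return center, edge

def allocate(user_sinr_db, cell_id, plan, edge_threshold_db=3.0):
    """Cell-edge users (low SINR) get their cell's protected edge subband,
    typically at higher power; cell-center users share the center subband."""
    center, edge = plan
    return edge[cell_id] if user_sinr_db < edge_threshold_db else center

plan = strict_ffr_plan(num_rbs=24)
print(allocate(1.5, cell_id=0, plan=plan))   # [18, 19]: edge subband of cell 0
print(allocate(10.0, cell_id=0, plan=plan))  # RBs 0..17: shared center subband
```

Per-user SINR reports and per-subband power offsets are exactly the hooks such an FFR API has to expose, which is why the extension also needed downlink and uplink power control.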
From Lagrangians to Events: Computer Tutorial at the MC4BSM-2012 Workshop
This is a written account of the computer tutorial offered at the Sixth
MC4BSM workshop at Cornell University, March 22-24, 2012. The tools covered
during the tutorial include: FeynRules, LanHEP, MadGraph, CalcHEP, Pythia 8,
Herwig++, and Sherpa. In the tutorial, we specify a simple extension of the
Standard Model, at the level of a Lagrangian. The software tools are then used
to automatically generate a set of Feynman rules, compute the invariant matrix
element for a sample process, and generate both parton-level and fully
hadronized/showered Monte Carlo event samples. The tutorial is designed to be
self-paced, and detailed instructions for all steps are included in this
write-up. Installation instructions for each tool on a variety of popular
platforms are also provided.
Comment: 58 pages, 1 figure
Handbook of aircraft noise metrics
Information is presented on 22 noise metrics associated with the measurement and prediction of the effects of aircraft noise. Some instantaneous frequency-weighted sound level measures, such as A-weighted sound level, are used to provide an instantaneous assessment of the aircraft noise level. Other multiple event metrics, such as day-night average sound level, were designed to relate sound levels measured over a period of time to subjective responses, in an effort to determine compatible land uses and aid in community planning. The various measures are divided into: (1) instantaneous sound level metrics; (2) duration-corrected single event metrics; (3) multiple event metrics; and (4) speech communication metrics. The scope of each measure is examined in terms of its definition, purpose, background, relationship to other measures, calculation method, example, equipment, references, and standards.
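One of the multiple-event metrics mentioned, the day-night average sound level (DNL), can be computed directly: hourly equivalent levels are energy-averaged over 24 hours, with a 10 dB penalty added to night hours (22:00 to 07:00). The sketch below uses made-up hourly data; the formula itself is the standard DNL definition:

```python
import math

# Day-night average sound level (DNL / Ldn): energy-average of 24 hourly
# equivalent levels, with a 10 dB penalty on night hours (22:00-07:00).
# The constant-60-dB input below is made-up example data.

NIGHT_HOURS = set(range(0, 7)) | {22, 23}  # 9 night hours: 22:00-07:00

def dnl(hourly_leq_db):
    """Compute DNL in dB from 24 hourly Leq values in dB."""
    assert len(hourly_leq_db) == 24
    energy = 0.0
    for hour, leq in enumerate(hourly_leq_db):
        penalty = 10.0 if hour in NIGHT_HOURS else 0.0
        energy += 10 ** ((leq + penalty) / 10)  # sum on an energy basis
    return 10 * math.log10(energy / 24)

# Constant 60 dB around the clock: the night penalty lifts DNL to ~66.4 dB.
print(round(dnl([60.0] * 24), 1))  # 66.4
```

The example shows why DNL exceeds the plain 24-hour average whenever there is any nighttime noise: the penalized night hours dominate the energy sum.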