McFSM: Globally Taming Complex Systems
Industrial computing devices, in particular cyber-physical, real-time, and
safety-critical systems, must react to external events and cooperate with
other devices to form a functional system. They are often
implemented with languages that focus on a simple, local description of how a
component reacts to external input data and stimuli. Despite the trend in
modern software architectures to structure systems into largely independent
components, the remaining interdependencies still create rich behavioural
dynamics even for small systems. Standard and industrial programming
approaches usually do not model or extensively describe the global properties
of an entire system. Although many approaches to this dilemma have been
suggested, implementing systems with complex interdependencies correctly
remains a hard and error-prone task.
We introduce multiple coupled finite state machines (McFSMs), a novel
mechanism that allows us to model and manage such interdependencies. It is
based on a consistent, well-structured and simple global description. A sound
theoretical foundation is provided, and associated tools allow us to generate
efficient low-level code in various programming languages using model-driven
techniques. We also present a domain specific language to express McFSMs and
their connections to other systems, to model their dynamic behaviour, and to
investigate their efficiency and correctness at compile-time.
Comment: To appear in SEsCPS@ICSE201
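The abstract does not show the McFSM notation itself, but the core idea can be sketched in ordinary code. The following Python fragment is purely illustrative (the machines, events, and coupling table are invented for this example, not taken from the paper): several local finite state machines are coupled through a single explicit global transition table, so cross-component behaviour is stated in one place instead of being scattered across local event handlers.

    from dataclasses import dataclass

    @dataclass
    class FSM:
        name: str
        state: str
        transitions: dict  # local transitions: (state, event) -> next state

        def fire(self, event):
            self.state = self.transitions.get((self.state, event), self.state)

    # Two local machines for a toy sensor/actuator pair.
    sensor = FSM("sensor", "idle", {("idle", "trigger"): "measuring",
                                    ("measuring", "done"): "idle"})
    valve = FSM("valve", "closed", {("closed", "open"): "open",
                                    ("open", "close"): "closed"})

    # Global coupling: when the combined system state matches and an event
    # arrives, several machines step together atomically.
    # (sensor state, valve state, event) -> event injected into each machine
    coupled = {
        ("measuring", "closed", "overpressure"): {"sensor": "done",
                                                  "valve": "open"},
    }

    def global_step(event):
        machines = {"sensor": sensor, "valve": valve}
        key = (sensor.state, valve.state, event)
        if key in coupled:
            for name, local_event in coupled[key].items():
                machines[name].fire(local_event)
        else:  # fall back to purely local behaviour
            for fsm in machines.values():
                fsm.fire(event)

    global_step("trigger")       # sensor: idle -> measuring
    global_step("overpressure")  # coupled step: sensor -> idle, valve -> open
    print(sensor.state, valve.state)  # idle open

A real McFSM toolchain would, according to the abstract, compile such a global description into efficient low-level code via model-driven techniques rather than interpret it at run time.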
Multi-mode states in decoy-based quantum key distribution protocols
Every security analysis of quantum key distribution (QKD) relies on a
faithful modeling of the employed quantum states. Many photon sources, like for
instance a parametric down conversion (PDC) source, require a multi-mode
description, but are usually only considered in a single-mode representation.
In general, the crucial assumption of decoy-based QKD protocols that signal
and decoy states are indistinguishable does not hold for all sources. We
derive new bounds on the single-photon transmission probability and
error rate for multi-mode states, and apply these bounds to the output state of
a PDC source. We observe two opposing effects on the secure key rate. First,
the multi-mode structure of the state gives rise to a new attack that decreases
the key rate. Second, more contributing modes shift the photon-number
distribution from thermal towards Poissonian, which increases the key rate.
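The new bounds themselves are derived in the paper; the second, rate-increasing effect, however, follows from standard photon-number statistics. A single PDC mode is thermally distributed, while M identical modes carrying a total mean photon number \mu follow a negative-binomial law that tends to a Poissonian as M grows:

    P_{1}(n) = \frac{\mu^{n}}{(1+\mu)^{n+1}}, \qquad
    P_{M}(n) = \binom{n+M-1}{n}\,\frac{(\mu/M)^{n}}{(1+\mu/M)^{n+M}}
    \;\xrightarrow{\;M\to\infty\;}\; e^{-\mu}\,\frac{\mu^{n}}{n!}.

The Poissonian limit suppresses the multi-photon tail relative to a thermal state of the same mean, which is what benefits the key rate.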
Observing Custom Software Modifications: A Quantitative Approach of Tracking the Evolution of Patch Stacks
Modifications to open-source software (OSS) are often provided in the form of
"patch stacks" - sets of changes (patches) that modify a given body of source
code. Maintaining patch stacks over extended periods of time is problematic
when the underlying base project changes frequently. This necessitates a
continuous and engineering-intensive adaptation of the stack. Nonetheless,
long-term maintenance is an important problem for changes that are not
integrated into projects, for instance when they are controversial or only of
value to a limited group of users.
We present and implement a methodology to systematically examine the temporal
evolution of patch stacks, track non-functional properties like integrability
and maintainability, and estimate the eventual economic and engineering effort
required to successfully develop and maintain patch stacks.
Our results provide a basis for quantitative research on patch stacks,
including statistical analyses and other methods that lead to actionable advice
on the construction and long-term maintenance of custom extensions to OSS.
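The abstract leaves the measurement machinery open, so the following Python sketch shows only one plausible metric such an approach could track (the repository path, branch, and tag names are hypothetical): replay the patch stack onto successive upstream releases with git and treat rebase failures as a signal of declining integrability.

    import subprocess

    def applies_cleanly(repo, stack_branch, old_base, new_base):
        """Replay the commits old_base..stack_branch onto new_base."""
        def git(*args, check=False):
            return subprocess.run(["git", "-C", repo, *args],
                                  capture_output=True, text=True, check=check)

        # Work on a detached HEAD so the real branch is never touched.
        git("checkout", "--detach", stack_branch, check=True)
        result = git("rebase", "--onto", new_base, old_base)
        if result.returncode != 0:
            git("rebase", "--abort")  # leave the repository clean again
            return False
        return True

    # Hypothetical example: a stack based on v4.0, checked against releases.
    for release in ["v4.0", "v4.1", "v4.2"]:
        ok = applies_cleanly("linux", "my-patch-stack", "v4.0", release)
        print(release, "clean" if ok else "conflicts")

A finer-grained variant would resolve or skip conflicting patches one by one to count exactly how many commits of the stack need manual rework per release.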
A Dual Model of Open Source License Growth
Every open source project needs to decide on an open source license. This
decision is of high economic relevance: Just which license is the best one to
help the project grow and attract a community? The most common question is:
Should the project choose a restrictive (reciprocal) license or a more
permissive one? As an important step towards answering this question, this
paper analyses actual license choice and correlated project growth from ten
years of open source projects. It provides closed analytical models and finds
that around 2001 a reversal in license choice occurred from restrictive towards
permissive licenses.
Comment: 14 pages, 6 figures
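The abstract does not state the functional form of its analytical models, so the following is only a generic illustration of what detecting such a reversal involves: track the permissive share of projects over time and locate the turning point where its trend changes direction,

    s(t) = \frac{N_p(t)}{N_p(t) + N_r(t)}, \qquad
    \left.\frac{ds}{dt}\right|_{t = t^\ast} = 0,

where N_p and N_r count projects under permissive and restrictive licenses, and t^\ast \approx 2001 marks the reversal the paper reports.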
The List is the Process: Reliable Pre-Integration Tracking of Commits on Mailing Lists
A considerable corpus of research on software evolution focuses on mining
changes in software repositories, but omits their pre-integration history.
We present a novel method for tracking this otherwise invisible evolution of
software changes on mailing lists by connecting all early revisions of changes
to their final version in repositories. Since artefact modifications on mailing
lists are communicated by updates to fragments (i.e., patches) only,
identifying semantically similar changes is a non-trivial task that our
approach solves in a language-independent way. We evaluate our method on
high-profile open source software (OSS) projects like the Linux kernel, and
validate its high accuracy using an elaborately created ground truth.
Our approach can be used to quantify properties of OSS development processes,
which is an essential requirement for using OSS in reliable or safety-critical
industrial products, where certifiability and conformance to processes are
crucial. The high accuracy of our technique makes it possible, to the best of
our knowledge for the first time, to quantitatively determine whether an open
development process effectively aligns with given formal process requirements.
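The matching step can be illustrated with a toy version of the problem (this is not the authors' algorithm, only a sketch of why such matching can be language-independent): two patch revisions are compared through the sets of lines they add and remove, so no knowledge of the patched language is needed.

    def changed_lines(patch_text):
        """Split a unified diff into its added and removed line sets."""
        added, removed = set(), set()
        for line in patch_text.splitlines():
            if line.startswith("+") and not line.startswith("+++"):
                added.add(line[1:].strip())
            elif line.startswith("-") and not line.startswith("---"):
                removed.add(line[1:].strip())
        return added, removed

    def similarity(patch_a, patch_b):
        """Jaccard similarity over the added/removed line sets."""
        a_add, a_rem = changed_lines(patch_a)
        b_add, b_rem = changed_lines(patch_b)
        union = len(a_add | b_add) + len(a_rem | b_rem)
        if union == 0:
            return 0.0
        return (len(a_add & b_add) + len(a_rem & b_rem)) / union

    v1 = "--- a/f.c\n+++ b/f.c\n-int x = 1;\n+int x = 2;"
    v2 = "--- a/f.c\n+++ b/f.c\n-int x = 1;\n+int x = 3;"
    print(similarity(v1, v2))  # 0.33...: one of three distinct lines agrees

Real mailing-list archives additionally require grouping many such pairwise scores into clusters of revisions and mapping each cluster to its integrated commit.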
Theory of quantum frequency conversion and type-II parametric down-conversion in the high-gain regime
Frequency conversion (FC) and type-II parametric down-conversion (PDC)
processes serve as basic building blocks for the implementation of quantum
optical experiments: type-II PDC enables the efficient creation of quantum
states such as photon-number states and Einstein-Podolsky-Rosen (EPR) states.
FC gives rise to technologies enabling efficient atom-photon
coupling, ultrafast pulse gates and enhanced detection schemes. However,
despite their widespread deployment, their theoretical treatment remains
challenging. In particular, the multi-photon components in the high-gain
regime and the explicit time dependence of the involved Hamiltonians hamper an
efficient theoretical description of these nonlinear optical processes.
In this paper, we investigate these effects and put forward two models that
enable a full description of FC and type-II PDC in the high-gain regime. We
present a rigorous numerical model relying on the solution of coupled
integro-differential equations that covers the complete dynamics of the
process. As an alternative, we develop a simplified model that, at the expense
of neglecting time-ordering effects, enables an analytical solution.
The simplified model approximates the correct solution with high fidelity over
a broad parameter range sufficient for many experimental situations, such as
low-efficiency FC, entangled photon-pair generation, and the heralding of
single photons from type-II PDC. However, our investigations reveal that the
rigorous model predicts decreased performance for FC processes in quantum
pulse-gate applications and an enhanced EPR-state generation rate during
type-II PDC when EPR squeezing values above 12 dB are considered.
Comment: 26 pages, 4 figures
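For orientation, the simplification behind the analytical model can be stated compactly. In the notation commonly used for broadband (Schmidt) modes \hat{A}_k, \hat{B}_k with mode-wise gains r_k and \theta_k (not necessarily the paper's own symbols), neglecting time ordering decouples both processes into independent two-mode transformations:

    \text{PDC:}\quad \hat{A}_k^{\text{out}} = \cosh(r_k)\,\hat{A}_k
                     + \sinh(r_k)\,\hat{B}_k^{\dagger},
    \qquad
    \text{FC:}\quad \hat{A}_k^{\text{out}} = \cos(\theta_k)\,\hat{A}_k
                    + \sin(\theta_k)\,\hat{B}_k.

On this scale, an EPR squeezing of S dB corresponds to r = S\ln(10)/20, so the 12 dB threshold mentioned above lies near r \approx 1.38, deep in the high-gain regime where time-ordering corrections matter.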
Approximate Approximation on a Quantum Annealer
Many problems of industrial interest are NP-complete and quickly exhaust the
resources of computational devices with increasing input sizes. Quantum
annealers (QA) are physical devices that aim at this class of problems by
exploiting quantum mechanical properties of nature. However, they compete with
efficient heuristics and probabilistic or randomised algorithms on classical
machines that allow for finding approximate solutions to large NP-complete
problems. While first implementations of QA have become commercially available,
their practical benefits are far from fully explored. To the best of our
knowledge, approximation techniques have not yet received substantial
attention. In this paper, we explore how approximate versions of problems, at
varying degrees of approximation, can be systematically constructed for
quantum annealer programs, and how this influences result quality and the
handling of larger problem instances on a given set of qubits. We illustrate
various approximation techniques on both simulations and real QA hardware,
using several seminal problems, and interpret the results to contribute
towards a better understanding of the real-world power and limitations of
current and future quantum computing.
Comment: Proceedings of the 17th ACM International Conference on Computing Frontiers (CF 2020)
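As a concrete, though hypothetical, example of the general idea (the paper's own techniques and benchmark problems are not reproduced here), a QUBO handed to an annealer can be weakened by dropping its smallest couplings, trading result quality for a sparser problem on the given set of qubits:

    def prune_qubo(Q, keep_fraction):
        """Keep only the largest-magnitude couplings of a QUBO dict."""
        diagonal = {k: v for k, v in Q.items() if k[0] == k[1]}
        couplings = sorted(((k, v) for k, v in Q.items() if k[0] != k[1]),
                           key=lambda kv: abs(kv[1]), reverse=True)
        kept = dict(couplings[:int(len(couplings) * keep_fraction)])
        return {**diagonal, **kept}

    def energy(Q, x):
        return sum(v * x[i] * x[j] for (i, j), v in Q.items())

    # Toy QUBO on three binary variables, keys (i, j) with i <= j.
    Q = {(0, 0): -2, (1, 1): -2, (2, 2): -1,
         (0, 1): 2, (1, 2): 2, (0, 2): 0.1}

    approx = prune_qubo(Q, keep_fraction=0.7)  # drops the weak (0, 2) term
    best = min(([a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)),
               key=lambda x: energy(approx, x))
    print(best, energy(Q, best))  # [1, 0, 1] -2.9: the approximate optimum,
                                  # scored against the original QUBO

On hardware, the pay-off of such pruning is a smaller minor embedding; the open question the paper targets is how quickly solution quality degrades as the approximation gets coarser.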
