Long-Term Average Cost in Featured Transition Systems
A software product line is a family of software products that share a common
set of mandatory features and whose individual products are differentiated by
their variable (optional or alternative) features. Family-based analysis of
software product lines takes as input a single model of a complete product line
and analyzes all its products at the same time. As the number of products in a
software product line may be large, this is generally preferable to analyzing
each product on its own. Family-based analysis, however, requires that standard
algorithms be adapted to accommodate variability.
In this paper we adapt the standard algorithm for computing limit average
cost of a weighted transition system to software product lines. Limit average
is a useful and popular measure for the long-term average behavior of a quality
attribute such as performance or energy consumption, but has hitherto not been
available for family-based analysis of software product lines. Our algorithm
operates on weighted featured transition systems, at a symbolic level, and
computes limit average cost for all products in a software product line at the
same time. We have implemented the algorithm and evaluated it on several
examples
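The abstract does not spell out the standard algorithm it adapts, but for a single weighted transition system the limit average cost reduces to an optimal mean-weight cycle, classically computed with Karp's minimum mean-cycle algorithm. The sketch below illustrates only that single-product baseline; the paper's actual contribution, lifting the computation symbolically to featured transition systems, is not reproduced here.

```python
def min_mean_cycle(n, edges):
    """Karp's algorithm: minimum mean-weight cycle in a directed graph.

    n: number of vertices (0..n-1); edges: list of (u, v, w) triples.
    Returns the minimum cycle mean, or None if the graph is acyclic.
    Initialising all vertices to weight 0 is equivalent to adding a
    zero-weight super-source, so the whole graph is covered.
    """
    INF = float("inf")
    # d[k][v] = minimum weight of a walk with exactly k edges ending at v
    d = [[INF] * n for _ in range(n + 1)]
    for v in range(n):
        d[0][v] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = None
    for v in range(n):
        if d[n][v] == INF:
            continue
        # Karp's formula: mean = max over k of (d_n(v) - d_k(v)) / (n - k)
        worst = max((d[n][v] - d[k][v]) / (n - k)
                    for k in range(n) if d[k][v] < INF)
        if best is None or worst < best:
            best = worst
    return best
```

For example, a two-state system cycling through costs 1 and 3 has limit average cost 2. A family-based analogue would replace the concrete weights with feature-guarded ones and return one value per product.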
Hydrogen and fuel cell technologies for heating: A review
The debate on low-carbon heat in Europe has become focused on a narrow range of technological options and has largely neglected hydrogen and fuel cell technologies, despite these receiving strong support towards commercialisation in Asia. This review examines the potential benefits of these technologies across different markets, particularly the current state of development and performance of fuel cell micro-CHP. Fuel cells offer some important benefits over other low-carbon heating technologies, and steady cost reductions through innovation are bringing fuel cells close to commercialisation in several countries. Moreover, fuel cells offer wider energy system benefits for high-latitude countries with peak electricity demands in winter. Hydrogen is a zero-carbon alternative to natural gas, which could be particularly valuable for those countries with extensive natural gas distribution networks, but many national energy system models examine neither hydrogen nor fuel cells for heating. There is a need to include hydrogen and fuel cell heating technologies in future scenario analyses, and for policymakers to take into account the full value of the potential contribution of hydrogen and fuel cells to low-carbon energy systems
The power of the crowd: promise and potential of crowdsourcing for education
Crowdsourcing is the term often used for processes of data collation and creation where
individuals or groups of users who are not necessarily located centrally generate content that is
then shared. While the term originates within the world of business, it has since gained traction
within a number of academic and professional disciplines. Drawing upon two examples that have
originated within the Republic of Ireland, this paper reflects on the educational potential of
crowdsourcing. Firstly, it reports a unique one-year open crowdsourcing initiative which compiled
a comprehensive A-Z directory of edtech tools for teaching and learning through collaborative
contributions. Secondly, it describes an initiative to develop a crowdsourced repository of study
tips and suggestions for adult, part-time, online and flexible learners embarking on further study.
These two case studies provide a valuable context for considering the wider potential of
crowdsourcing applications for teaching and learning purposes
High-fidelity single-shot readout for a spin qubit via an enhanced latching mechanism
The readout of semiconductor spin qubits based on spin blockade is fast but
suffers from a small charge signal. Previous work suggested large benefits from
additional charge mapping processes, however uncertainties remain about the
underlying mechanisms and achievable fidelity. In this work, we study the
single-shot fidelity and limiting mechanisms for two variations of an enhanced
latching readout. We achieve average single-shot readout fidelities > 99.3% and
> 99.86% for the conventional and enhanced readout respectively, the latter
being the highest to date for spin blockade. The signal amplitude is enhanced
to a full one-electron signal while preserving the readout speed. Furthermore,
layout constraints are relaxed because the charge sensor signal is no longer
dependent on being aligned with the conventional (2, 0) - (1, 1) charge dipole.
Silicon donor-quantum-dot qubits are used for this study, for which the dipole
insensitivity substantially relaxes donor placement requirements. One of the
readout variations also benefits from a parametric lifetime enhancement by
replacing the spin-relaxation process with a charge-metastable one. This
provides opportunities to further increase the fidelity. The relaxation
mechanisms in the different regimes are investigated. This work demonstrates a
readout that is fast, has one-electron signal and results in higher fidelity.
It further predicts that going beyond 99.9% fidelity in a few microseconds of
measurement time is within reach. Comment: Supplementary information is included with the paper
Network Structure, Efficiency, and Performance in WikiProjects
The internet has enabled collaborations at a scale never before possible, but
the best practices for organizing such large collaborations are still not
clear. Wikipedia is a visible and successful example of such a collaboration
which might offer insight into what makes large-scale, decentralized
collaborations successful. We analyze the relationship between the structural
properties of WikiProject coeditor networks and the performance and efficiency
of those projects. We confirm the existence of an overall
performance-efficiency trade-off, while observing that some projects are higher
than others in both performance and efficiency, suggesting the existence of
factors correlating positively with both. Namely, we find an association
between low-degree coeditor networks and both high performance and high
efficiency. We also confirm results seen in previous numerical and small-scale
lab studies: higher performance with less skewed node distributions, and higher
performance with shorter path lengths. We use agent-based models to explore
possible mechanisms for degree-dependent performance and efficiency. We present
a novel local-majority learning strategy designed to satisfy properties of
real-world collaborations. The local-majority strategy as well as a localized
conformity-based strategy both show degree-dependent performance and
efficiency, but in opposite directions, suggesting that these factors depend on
both network structure and learning strategy. Our results suggest possible
benefits to decentralized collaborations made of smaller, more tightly-knit
teams, and that these benefits may be modulated by the particular learning
strategies in use. Comment: 11 pages, 5 figures, to appear in ICWSM 201
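The local-majority strategy mentioned above can be illustrated with a toy synchronous dynamic on a coeditor-style network. This is a deliberately simplified sketch, assuming binary states and ties resolved by keeping the current value; the paper's agent-based model and its performance/efficiency metrics are richer than this.

```python
def local_majority(adj, state, rounds=50):
    """Toy synchronous local-majority dynamic on an undirected network.

    adj: dict node -> list of neighbours; state: dict node -> 0 or 1.
    Each round every node adopts the majority value among its
    neighbours (a tie keeps the node's current value). Returns the
    state after convergence (or after `rounds` iterations) together
    with the number of rounds taken, a crude proxy for efficiency.
    """
    state = dict(state)
    for r in range(rounds):
        new = {}
        for v, nbrs in adj.items():
            ones = sum(state[u] for u in nbrs)
            if 2 * ones > len(nbrs):
                new[v] = 1
            elif 2 * ones < len(nbrs):
                new[v] = 0
            else:
                new[v] = state[v]  # tie: keep current value
        if new == state:
            return state, r
        state = new
    return state, rounds
```

On a triangle with two nodes holding 1 and one holding 0, the dynamic converges to all-ones in a single update, hinting at how degree and tie density can shape both the outcome reached and the rounds needed to reach it.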
TQM implementation: An empirical examination and proposed generic model
Total quality management (TQM) is considered by many as an important quality and business performance improvement tool. The popularity of the concept has led to an explosion of TQM related literature. A careful review of the literature suggests that most publications recount the experiences or perceptions of the authors or deal with single case organisations. Furthermore, there is a dearth of empirical research and literature dealing with TQM's implementation process. This paper reports the findings of a research project that empirically examined the process of TQM implementation in a sample of organisations widely regarded as leading exponents of TQM. The paper presents a non-prescriptive model of the TQM implementation process derived from the findings and proposes an "outcome driven" approach as an alternative to the more commonplace TQM implementation strategies