Non-equilibrium and non-linear stationary state in thermoelectric materials
The efficiency of thermoelectric materials is characterized by the figure of
merit Z, which has long been regarded as an intrinsic material constant.
However, the accurate measurements in the present work reveal that Z has a
large size dependence and that a non-linear temperature distribution appears
as the stationary state in the thermoelectric material. These phenomena are
observed by the Harman method, which is the most appropriate way to
investigate thermoelectric properties because the dc and ac resistances are
measured with the same electrode configuration. We describe the anomalous
thermoelectric properties observed mainly in (Bi,Sb)2Te3 by the Harman method
and then argue that Z is not an intrinsic material constant but must be
defined as a physical quantity that depends on size and position within the
material.
Comment: 9 pages, 4 figures. Submitted to Applied Physics Letters.
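The Harman-method relation underlying these measurements can be sketched numerically: under ideal adiabatic, small-signal assumptions, the dimensionless figure of merit at the measurement temperature follows from the dc and ac resistances as ZT = R_dc/R_ac - 1. The function name and sample values below are illustrative, not taken from the paper.

```python
def harman_zt(r_dc, r_ac):
    """Estimate the dimensionless figure of merit ZT from Harman-method
    resistances, assuming ideal adiabatic, small-signal conditions:
    ZT = R_dc / R_ac - 1."""
    if r_ac <= 0 or r_dc < r_ac:
        raise ValueError("expect r_dc >= r_ac > 0")
    return r_dc / r_ac - 1.0

# Hypothetical (Bi,Sb)2Te3 sample: R_dc = 2.0 mOhm, R_ac = 1.0 mOhm
zt = harman_zt(2.0, 1.0)  # → 1.0
```

The size and position dependence reported in the abstract means that, in practice, ZT extracted this way would vary with the sample geometry rather than being a single material constant.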
True covariance simulation of the EUVE update filter
A covariance analysis of the performance and sensitivity of the attitude determination Extended Kalman Filter (EKF) used by the On Board Computer (OBC) of the Extreme Ultraviolet Explorer (EUVE) spacecraft is presented. The linearized dynamics and measurement equations of the error states are derived; these constitute the truth model describing the real behavior of the systems involved. The design model used by the OBC EKF is then obtained by reducing the order of the truth model. The covariance matrix of an EKF that uses the reduced-order model is not the correct covariance of the EKF estimation error, so a true covariance analysis has to be carried out in order to evaluate the correct accuracy of the OBC-generated estimates. The results of such an analysis are presented, indicating both the performance and the sensitivity of the OBC EKF.
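The gap between the filter-reported and true covariance can be illustrated for a single scalar state. The Joseph-form measurement update is valid for any gain, not only the optimal one, so evaluating it with the truth-model variance and the reduced-order filter's gain exposes the true estimation error. The numbers below are hypothetical; this is a one-dimensional sketch, not the EUVE truth model.

```python
def joseph_update(p, h, r, k):
    """Joseph-form measurement-update variance for a scalar state:
    (1 - k*h)^2 * p + k^2 * r.  Valid for ANY gain k, so feeding it the
    truth-model variance together with the reduced-order filter's gain
    yields the TRUE estimation-error variance."""
    a = 1.0 - k * h
    return a * p * a + k * r * k

# Hypothetical numbers: truth prior variance 2.0, but the design model
# believes 1.0; measurement noise 1.0; gain k = 0.5 from the design model.
p_true = joseph_update(2.0, 1.0, 1.0, 0.5)      # actual accuracy → 0.75
p_reported = joseph_update(1.0, 1.0, 1.0, 0.5)  # filter's own claim → 0.5
```

The filter reports a smaller variance than it actually achieves, which is exactly the optimism the true covariance analysis is designed to detect.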
Achievements, open problems and challenges for search based software testing
Search Based Software Testing (SBST) formulates testing as an optimisation problem, which can be attacked using computational search techniques from the field of Search Based Software Engineering (SBSE). We present an analysis of the SBST research agenda, focusing on the open problems and challenges of testing non-functional properties, in particular a topic we call 'Search Based Energy Testing' (SBET), Multi-objective SBST, and SBST for Test Strategy Identification. We conclude with a vision of FIFIVERIFY tools, which would automatically find faults, fix them, and verify the fixes. We explain why we think such FIFIVERIFY tools constitute an exciting challenge for the SBSE community, one that could already be within its reach.
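The "testing as optimisation" formulation at the core of SBST can be sketched with the classic branch-distance fitness and a minimal hill climber. This is a toy illustration; real SBST tools use far richer fitness functions and search operators.

```python
def branch_distance(x, target=42):
    """Korel-style branch distance for the branch 'if x == target':
    zero exactly when the branch would be taken."""
    return abs(x - target)

def hill_climb(fitness, x0=0, max_steps=1000):
    """Minimal local search over integer inputs: test-input generation
    recast as minimisation of a fitness function."""
    x = x0
    for _ in range(max_steps):
        if fitness(x) == 0:
            return x  # covering input found
        best = min((x - 1, x + 1), key=fitness)
        if fitness(best) >= fitness(x):
            break  # stuck in a local optimum
        x = best
    return x

# hill_climb(branch_distance) → 42, an input covering the target branch
```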
Sapienz: Multi-objective automated testing for android applications
We introduce Sapienz, an approach to Android testing that uses multi-objective search-based testing to automatically explore and optimise test sequences, minimising length while simultaneously maximising coverage and fault revelation. Sapienz combines random fuzzing, systematic and search-based exploration, exploiting seeding and multi-level instrumentation. Sapienz significantly outperforms (with large effect size) both the state-of-the-art technique Dynodroid and the widely-used tool Android Monkey, in 7/10 experiments for coverage, 7/10 for fault detection and 10/10 for fault-revealing sequence length. When applied to the top 1,000 Google Play apps, Sapienz found 558 unique, previously unknown crashes. So far we have managed to make contact with the developers of 27 crashing apps. Of these, 14 have confirmed that the crashes are caused by real faults. Of those 14, six already have developer-confirmed fixes.
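The multi-objective trade-off described above (maximise coverage and fault revelation, minimise sequence length) rests on Pareto dominance, which can be sketched as follows. The objective tuples are hypothetical stand-ins, not Sapienz's actual internal representation.

```python
def dominates(a, b):
    """Pareto dominance for test-sequence objectives ordered as
    (coverage: maximise, faults: maximise, length: minimise)."""
    cov_a, faults_a, len_a = a
    cov_b, faults_b, len_b = b
    no_worse = cov_a >= cov_b and faults_a >= faults_b and len_a <= len_b
    strictly_better = cov_a > cov_b or faults_a > faults_b or len_a < len_b
    return no_worse and strictly_better

def pareto_front(seqs):
    """Keep only the sequences no other sequence dominates."""
    return [s for s in seqs if not any(dominates(t, s) for t in seqs)]

# Hypothetical sequences: (coverage %, faults found, length)
seqs = [(60, 2, 100), (60, 2, 80), (70, 1, 120), (50, 3, 90)]
front = pareto_front(seqs)  # (60, 2, 100) is dominated by (60, 2, 80)
```

A multi-objective search evolves the population toward such a front rather than toward a single best sequence.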
Trivial compiler equivalence: A large scale empirical study of a simple, fast and effective equivalent mutant detection technique
Identifying equivalent mutants remains the largest impediment to the widespread uptake of mutation testing. Despite being researched for more than three decades, the problem remains. We propose Trivial Compiler Equivalence (TCE), a technique that exploits readily available compiler technology to address this long-standing challenge. TCE is directly applicable to real-world programs and can imbue existing tools with the ability to detect equivalent mutants and a special form of useless mutants called duplicated mutants. We present a thorough empirical study using 6 large open-source programs, several orders of magnitude larger than those used in previous work, and 18 benchmark programs with hand-analyzed equivalent mutants. Our results reveal that, on large real-world programs, TCE can discard more than 7% and 21% of all the mutants as equivalent and duplicated mutants, respectively. A human-based equivalence verification reveals that TCE has the ability to detect approximately 30% of all the existing equivalent mutants.
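TCE's core observation, that identical compiled code implies an equivalent or duplicated mutant, can be sketched over already-compiled binaries. The byte strings below are stand-ins for real compiler output; a real pipeline would invoke an optimising compiler on each mutant first.

```python
import hashlib

def tce_classify(original_bin, mutant_bins):
    """Classify mutants by comparing (hashes of) their compiled code:
    - a mutant whose object code equals the original's is trivially
      equivalent;
    - mutants whose object code equals each other's are duplicates."""
    def digest(code):
        return hashlib.sha256(code).hexdigest()

    orig = digest(original_bin)
    seen, equivalent, duplicated = {}, [], []
    for name, code in mutant_bins.items():
        d = digest(code)
        if d == orig:
            equivalent.append(name)
        elif d in seen:
            duplicated.append((seen[d], name))
        else:
            seen[d] = name
    return equivalent, duplicated

# Hypothetical optimised object code for an original and three mutants.
eq, dup = tce_classify(
    b"\x01\x02",
    {"m1": b"\x01\x02", "m2": b"\xff", "m3": b"\xff"},
)  # eq → ["m1"]; dup → [("m2", "m3")]
```

The appeal of the technique is that the compiler's optimiser does all the semantic work; the detection step itself is just byte comparison.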
A survey of app store analysis for software engineering
App Store Analysis studies information about applications obtained from app stores. App stores provide a wealth of information derived from users that would not exist had the applications been distributed via previous software deployment methods. App Store Analysis combines this non-technical information with technical information to learn trends and behaviours within these forms of software repositories. Findings from App Store Analysis have a direct and actionable impact on the software teams that develop software for app stores, and have led to techniques for requirements engineering, release planning, software design, security and testing. This survey describes and compares the areas of research that have been explored thus far, drawing out common aspects, trends and directions that future research should take to address open problems and challenges.
Phase reconstruction of strong-field excited systems by transient-absorption spectroscopy
We study the evolution of a V-type three-level system, whose two resonances
are coherently excited and coupled by two ultrashort laser pump and probe
pulses, separated by a varying time delay. We relate the quantum dynamics of
the excited multi-level system to the absorption spectrum of the transmitted
probe pulse. In particular, by analyzing the quantum evolution of the system,
we interpret how atomic phases are differently encoded in the
time-delay-dependent spectral absorption profiles when the pump pulse either
precedes or follows the probe pulse. We experimentally apply this scheme to
atomic Rb, whose fine-structure-split $5s\,^2S_{1/2}\rightarrow 5p\,^2P_{1/2}$
and $5s\,^2S_{1/2}\rightarrow 5p\,^2P_{3/2}$ transitions are driven by the
combined action of a pump pulse of variable intensity and a delayed probe
pulse. The provided understanding of the relationship between quantum phases
and absorption spectra represents an important step towards full time-dependent
phase reconstruction (quantum holography) of bound-state wave-packets in
strong-field light-matter interactions with atoms, molecules and solids.
Comment: 5 pages, 4 figures.
A survey of the use of crowdsourcing in software engineering
The term 'crowdsourcing' was initially introduced in 2006 to describe an emerging distributed problem-solving model by online workers. Since then it has been widely studied and practiced to support software engineering. In this paper we provide a comprehensive survey of the use of crowdsourcing in software engineering, seeking to cover all literature on this topic. We first review the definitions of crowdsourcing and derive our definition of Crowdsourced Software Engineering together with its taxonomy. Then we summarise industrial crowdsourcing practice in software engineering and corresponding case studies. We further analyse the software engineering domains, tasks and applications for crowdsourcing, and the platforms and stakeholders involved in realising Crowdsourced Software Engineering solutions. We conclude by exposing trends, open issues and opportunities for future research on Crowdsourced Software Engineering.
The Value of Exact Analysis in Requirements Selection
Uncertainty is characterised by incomplete understanding. It is inevitable in the early phase of requirements engineering and can lead to unsound requirement decisions. Inappropriate requirement choices may result in products that fail to satisfy stakeholders' needs, and may cause loss of revenue. To overcome uncertainty, requirements engineering decision support needs uncertainty management. In this research, we develop a decision support framework, METRO, for the Next Release Problem (NRP) to manage algorithmic uncertainty and requirements uncertainty. An exact NRP solver (NSGDP) lies at the heart of METRO. NSGDP's exactness eliminates the interference caused by existing approximate NRP solvers. We apply NSGDP to three NRP instances derived from a real-world NRP instance, RALIC, and compare it with NSGA-II, a widely-used approximate (inexact) technique. We find that the randomness of NSGA-II results in decision makers missing up to 99.95% of the optimal solutions and obtaining up to 36.48% inexact requirement selection decisions. The chance of getting an inexact decision using existing approximate approaches is negatively correlated with the implementation cost of a requirement (Spearman ρ up to -0.72). Compared to the inexact existing approach, NSGDP saves 15.21% of lost revenue, on average, for the RALIC dataset.
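The exactness that distinguishes NSGDP from approximate solvers can be illustrated on a single-objective Next Release Problem, formulated as a 0/1 knapsack and solved exactly by dynamic programming. This is an illustrative stand-in: the paper's NSGDP solver is multi-objective and its internals are not given in the abstract, and the revenues, costs and budget below are hypothetical.

```python
def exact_nrp(revenues, costs, budget):
    """Exact single-objective Next Release Problem: choose requirements
    maximising total revenue subject to an integer cost budget, via the
    standard 0/1 knapsack dynamic program."""
    best = [0] * (budget + 1)  # best[b] = max revenue within budget b
    for rev, cost in zip(revenues, costs):
        # Iterate budgets downwards so each requirement is used at most once.
        for b in range(budget, cost - 1, -1):
            best[b] = max(best[b], best[b - cost] + rev)
    return best[budget]

# Hypothetical requirements with revenues [10, 6, 7], costs [3, 2, 4],
# budget 5: the optimum selects the first two requirements.
exact_nrp([10, 6, 7], [3, 2, 4], 5)  # → 16
```

Unlike a randomised search such as NSGA-II, this dynamic program returns the same, provably optimal answer on every run, which is the property METRO relies on to separate requirements uncertainty from algorithmic noise.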