Validation of Ultrahigh Dependability for Software-Based Systems
Modern society depends on computers for a number of critical tasks in which failure can have very high costs. As a consequence, high levels of dependability (reliability, safety, etc.) are required from such computers, including their software. Whenever a quantitative approach to risk is adopted, these requirements must be stated in quantitative terms, and a rigorous demonstration of their being attained is necessary. For software used in the most critical roles, such demonstrations are not usually supplied. The fact is that the dependability requirements often lie near the limit of the current state of the art, or beyond, in terms not only of the ability to satisfy them, but also, and more often, of the ability to demonstrate that they are satisfied in the individual operational products (validation). We discuss reasons why such demonstrations cannot usually be provided with the means available: reliability growth models, testing with stable reliability, structural dependability modelling, as well as more informal arguments based on good engineering practice. We state some rigorous arguments about the limits of what can be validated with each of such means. Combining evidence from these different sources would seem to raise the levels that can be validated; yet this improvement is not such as to solve the problem. It appears that engineering practice must take into account the fact that no solution exists, at present, for the validation of ultra-high dependability in systems relying on complex software.
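The scale of the validation problem can be made concrete with the standard back-of-the-envelope reliability argument (a general calculation under a constant-failure-rate assumption, not a formula taken from this paper): failure-free operational testing alone cannot establish ultra-high dependability within any feasible test campaign.

```python
import math

def required_failure_free_hours(target_rate: float, confidence: float) -> float:
    """Hours of failure-free operational testing needed before one can
    claim, at the given confidence level, that the failure rate lies
    below target_rate. Assumes a constant failure rate (exponential
    time-to-failure model) -- an illustrative simplification.
    """
    alpha = 1.0 - confidence  # acceptable probability of a wrong claim
    # Solve exp(-target_rate * t) <= alpha for t.
    return -math.log(alpha) / target_rate

# An ultra-high dependability target of 1e-9 failures/hour at 95% confidence:
hours = required_failure_free_hours(1e-9, 0.95)
print(f"{hours:.3e} failure-free test hours (~{hours / 8766:.0f} years)")
```

The result is on the order of billions of test hours, which is why the abstract argues that testing, even combined with other evidence, cannot close the gap for the most critical targets.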
A Lightweight State Machine for Validating Use Case Descriptions
This paper presents a tool to provide an enaction capability for use case descriptions. Use cases have wide industry acceptance and are well suited for constructing initial approximations of the intended behaviour. However, use case descriptions are still relatively immature with respect to precise syntax and semantics. Hence, despite promising work on providing writing guidelines, rigorous validation of use case descriptions requires further support.

One approach to supporting validation is to use enaction. Indeed, enactable models have been used extensively within process modelling to clarify understanding of descriptions.

Given the importance of requirements validation, such automated support promises significant benefits. However, the need to produce formal descriptions to drive enaction is often seen as a barrier to the take-up of such technologies. That is, developers have traditionally been reluctant to increase the proportion of effort devoted to requirements activities. Our approach involves the development of a lightweight state machine, which obviates any need to create intermediate formal descriptions, thereby maintaining the simple nature of the use case description.

Hence, this 'lightweight' approach, which provides an enaction capability 'for minimal effort', increases the likelihood of industrial take-up.
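The enaction idea described above can be sketched as a small state machine that steps through a use case description, with extensions as alternative transitions. All class and method names below are invented for illustration; the paper's actual tool is not specified here.

```python
class UseCaseEnactor:
    """Illustrative sketch: treats each step of a use case's main success
    scenario as a state, and each extension as an alternative transition."""

    def __init__(self, steps, extensions=None):
        self.steps = list(steps)            # main success scenario, in order
        self.extensions = extensions or {}  # step index -> alternative step
        self.position = 0
        self.trace = []                     # record of the enacted scenario

    def enact_next(self, take_extension=False):
        """Enact one step; returns the step text, or None when complete."""
        if self.position >= len(self.steps):
            return None
        step = self.steps[self.position]
        if take_extension and self.position in self.extensions:
            step = self.extensions[self.position]
        self.trace.append(step)
        self.position += 1
        return step

# Enacting a small (hypothetical) 'withdraw cash' use case:
uc = UseCaseEnactor(
    steps=["Customer inserts card",
           "System validates card",
           "Customer enters amount",
           "System dispenses cash"],
    extensions={1: "System rejects invalid card"},
)
while (step := uc.enact_next()) is not None:
    print(step)
```

Because the model is built directly from the numbered steps, no intermediate formal notation is needed, which is the 'lightweight' property the abstract emphasises.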
Imaging X-Ray Polarimeter Explorer Systems Engineering Approach and Implementation
The Imaging X-ray Polarimetry Explorer (IXPE) is a NASA Small Explorer x-ray astrophysics mission being implemented by a geographically dispersed team. Each IXPE partner provides unique capabilities and experience which are utilized to design, build and launch the IXPE observatory. A rigorous and iterative systems engineering approach is essential to ensuring the successful realization of a reliable and cost-effective IXPE mission system. The IXPE collaboration and observatory complexity provide both unique challenges and advantages for project systems engineering. The project uses established and tailored systems engineering (SE) methods and teaming approaches to achieve the IXPE mission goals. The IXPE systems engineering team spans all partner organizations. Currently, the project is in system integration and test, working through structural environmental testing; vibration testing is just starting. Systems work is now focused on requirements management and maturity assessments, and on requirements verification and validation via sell-off packages (SOP) and interface control document (ICD) verification, while supporting environmental test planning and execution. IXPE verification, validation and characterization (V&V) starts at the component/unit level and rolls up to appropriate higher levels, where V&V compliance is assured through collaborative development by the cross-organizational V&V Team.
This paper provides a technical summary of the IXPE concept of operations and mission system (payload, spacecraft, observatory, ground system, launch vehicle), overviews the IXPE systems engineering approach (communications, project reviews, requirements analysis and management, baseline design and design trade studies, interface definition and documentation, resource management), describes the verification, validation and characterization activities (requirements validation, models and simulations validation, systems integration and test (I&T), system validation), discusses the risk and opportunity philosophy and its implementation, outlines COVID-19 accommodations, and itemizes some key challenges and lessons learned, followed by the path to launch and conclusions.
Synthesis of Logic Programs from Object-Oriented Formal Specifications
Early validation of requirements is crucial for the rigorous development of software. Without it, even the most formal of methodologies will produce the wrong outcome. One successful approach, popularised by some of the so-called lightweight formal methods, consists in generating (finite, small) models of the specifications. Another possibility is to build a running prototype from those specifications. In this paper we show how to obtain executable prototypes from formal specifications written in an object-oriented notation by translating them into logic programs. This has some advantages over other lightweight methodologies. For instance, we recover the possibility of dealing with recursive data types, as specifications that use them often lack finite models.
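The point about recursive data types can be illustrated in miniature: a model finder must bound such types to obtain finite models, whereas an executable prototype can enumerate the values lazily. The example below is invented for exposition (in Python rather than a logic program, and not the paper's translation scheme): a declarative description of binary search trees run directly as a generator.

```python
def trees(lo, hi):
    """Yield every binary search tree over the integer keys lo..hi,
    represented as nested tuples (key, left, right); None is the empty
    tree. The function is a direct reading of the recursive spec:
    a BST over lo..hi has some root k, a BST over lo..k-1 on the left,
    and a BST over k+1..hi on the right."""
    if lo > hi:
        yield None
        return
    for k in range(lo, hi + 1):
        for left in trees(lo, k - 1):
            for right in trees(k + 1, hi):
                yield (k, left, right)

# 'Querying the spec' rather than running an algorithm:
print(sum(1 for _ in trees(1, 3)))  # Catalan number C_3 = 5
```

Each clause of the generator corresponds to a case of the recursive definition, which is the sense in which a logic-program translation makes a specification executable.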
Animating formal specifications : a telephone simulation case study
Conference with proceedings, without peer review. We believe that a more rigorous method of specification and validation can be achieved by first developing a 'specification architecture' whose high-level semantics are based on object-oriented concepts. This architecture promotes the construction of new functionality in a formal manner using rigorous notions of composition and inheritance. An object-oriented approach will also facilitate incremental approaches to validation and verification. We present our first steps towards producing such an architecture for the Plain Old Telephone Service (POTS), which is specified and validated using a formal object-oriented language based on LOTOS. The method by which the formal model is derived from the informal understanding of the requirements is examined. Validation based on meta-analysis of the problem structure is elucidated.
Towards a method for rigorous development of generic requirements patterns
We present work in progress on a method for the engineering, validation and verification of generic requirements using domain engineering and formal methods. The need to develop a generic requirement set for subsequent system instantiation is complicated by the addition of the high levels of verification demanded by safety-critical domains such as avionics. Our chosen application domain is the failure detection and management function for engine control systems: here generic requirements drive a software product line of target systems. A pilot formal specification and design exercise is undertaken on a small (two-sensor) system element. This exercise has a number of aims: to support the domain analysis, to gain a view of appropriate design abstractions, for a B novice to gain experience in the B method and tools, and to evaluate the usability and utility of that method. We also present a prototype method for the production and verification of a generic requirement set in our UML-based formal notation, UML-B, and tooling developed in its support. The formal verification both of the structural generic requirement set, and of a particular application, is achieved via translation to the formal specification language, B, using our U2B and ProB tools.
Towards a methodology for rigorous development of generic requirements patterns
We present work in progress on a methodology for the engineering, validation and verification of generic requirements using domain engineering and formal methods. The need to develop a generic requirement set for subsequent system instantiation is complicated by the addition of the high levels of verification demanded by safety-critical domains such as avionics. We consider the failure detection and management function for engine control systems as an application domain where product line engineering is useful. The methodology produces a generic requirement set in our UML-based formal notation, UML-B. The formal verification both of the generic requirement set, and of a particular application, is achieved via translation to the formal specification language, B, using our U2B and ProB tools.
In-ear EEG biometrics for feasible and readily collectable real-world person authentication
The use of EEG as a biometrics modality has been investigated for about a decade; however, its feasibility in real-world applications is not yet conclusively established, mainly due to issues with collectability and reproducibility. To this end, we propose a readily deployable EEG biometrics system based on a 'one-fits-all' viscoelastic generic in-ear EEG sensor (collectability), which does not require skilled assistance or cumbersome preparation. Unlike most existing studies, we consider data recorded over multiple recording days and for multiple subjects (reproducibility) while, for rigour, the training and test segments are not taken from the same recording days. A robust approach is considered, based on the resting state with eyes closed paradigm and the use of both parametric (autoregressive model) and non-parametric (spectral) features, supported by simple and fast cosine distance, linear discriminant analysis and support vector machine classifiers. Both the verification and identification forensics scenarios are considered, and the achieved results are on par with studies based on impractical on-scalp recordings. Comprehensive analysis over a number of subjects, setups, and analysis features demonstrates the feasibility of the proposed ear-EEG biometrics, and its potential in resolving the critical collectability, robustness, and reproducibility issues associated with current EEG biometrics.
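The simplest leg of the pipeline above, autoregressive features compared by cosine distance, can be sketched as follows. This is a minimal illustration assuming nothing about the authors' implementation: AR coefficients are fit by ordinary least squares, and synthetic AR signals stand in for real ear-EEG recordings.

```python
import numpy as np

def ar_features(signal: np.ndarray, order: int = 4) -> np.ndarray:
    """Fit an autoregressive model x[t] = sum_k a_k * x[t-k] + noise by
    least squares and return the coefficient vector as a feature."""
    X = np.column_stack([signal[order - k - 1 : len(signal) - k - 1]
                         for k in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def cosine_distance(u: np.ndarray, v: np.ndarray) -> float:
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

rng = np.random.default_rng(0)

def simulate_ar(coeffs, n=2000):
    """Synthetic stand-in for an EEG recording: an AR process."""
    x = np.zeros(n)
    for t in range(len(coeffs), n):
        x[t] = np.dot(coeffs, x[t - len(coeffs):t][::-1]) + rng.standard_normal()
    return x

# Verification scenario: enrolment and probe from the 'same subject'
# (same AR process) versus an imposter (different process).
enrol    = ar_features(simulate_ar([0.5, -0.3]), order=2)
probe    = ar_features(simulate_ar([0.5, -0.3]), order=2)
imposter = ar_features(simulate_ar([-0.4, 0.2]), order=2)
print(cosine_distance(enrol, probe), cosine_distance(enrol, imposter))
```

The genuine pair yields a much smaller distance than the imposter pair, which is the decision the verification scenario thresholds; the full system additionally uses spectral features and LDA/SVM classifiers.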
…