Semantic Support for Log Analysis of Safety-Critical Embedded Systems
Testing is a relevant activity in the development life-cycle of safety-critical
embedded systems. In particular, much effort is spent on the analysis and
classification of test logs from SCADA subsystems, especially when failures
occur. Human expertise is needed to understand the reasons for failures, to
trace back the errors, and to understand which requirements are affected by
errors and which ones would be affected by possible changes in the system
design. Semantic techniques and full-text search are used to support human
experts in the analysis and classification of test logs, in order to speed up
and improve the diagnosis phase. Moreover, retrieval of tests and requirements
that may be related to the current failure is supported, in order to allow the
discovery of available alternatives and solutions for a better and faster
investigation of the problem.
Comment: EDCC-2014, BIG4CIP-2014, Embedded systems, testing, semantic
discovery, ontology, big data
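The kind of full-text-search support the abstract describes can be sketched minimally: a small ontology-like table maps failure keywords to failure classes and to the requirements they may trace back to, and a scan over the log tags matching lines. All names here (the keywords, the `REQ-*` identifiers, the structure of `FAILURE_ONTOLOGY`) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical ontology fragment: failure keywords mapped to a failure
# class and to requirement identifiers (all names are illustrative).
FAILURE_ONTOLOGY = {
    "timeout":  {"class": "communication", "requirements": ["REQ-COMM-01"]},
    "overflow": {"class": "memory",        "requirements": ["REQ-MEM-03"]},
    "sensor":   {"class": "hardware",      "requirements": ["REQ-HW-07"]},
}

def classify_log(lines):
    """Full-text search over log lines: tag each line that mentions a
    known failure keyword with its class and candidate requirements."""
    findings = []
    for n, line in enumerate(lines, start=1):
        low = line.lower()
        for keyword, info in FAILURE_ONTOLOGY.items():
            if keyword in low:
                findings.append({
                    "line": n,
                    "text": line.strip(),
                    "class": info["class"],
                    "requirements": info["requirements"],
                })
    return findings

log = [
    "10:02:11 INFO  heartbeat ok",
    "10:02:15 ERROR watchdog timeout on channel 3",
    "10:02:16 ERROR sensor reading out of range",
]
for f in classify_log(log):
    print(f["line"], f["class"], f["requirements"])
```

A real system would replace the keyword table with an ontology and a full-text index, but the trace from log line to failure class to affected requirement is the same shape.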
Model the System from Adversary Viewpoint: Threats Identification and Modeling
Security attacks are hard to understand and are often described with
unfriendly and limited details, making it difficult for security experts and
security analysts to create intelligible security specifications, for instance
to explain Why (attack objective), What (i.e., system assets, goals, etc.),
and How (attack method) the adversary achieved the attack goals. In this paper
we introduce a security attack meta-model for our SysML-Sec framework,
developed to improve threat identification and modeling through the explicit
representation of security concerns with knowledge representation techniques.
The proposed meta-model enables the specification of these concerns through
ontological concepts which define the semantics of the security artifacts
introduced using SysML-Sec diagrams. The meta-model also enables representing
the relationships that tie several such concepts together. This representation
is then used for reasoning about the knowledge introduced by system designers
as well as security experts through the graphical environment of the SysML-Sec
framework.
Comment: In Proceedings AIDP 2014, arXiv:1410.322
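The Why/What/How decomposition mentioned above can be illustrated with a tiny data model. This is only a sketch in the spirit of the abstract: the class names, fields, and the example attack are assumptions for illustration, not the meta-model actually defined in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """What: a system asset the adversary targets."""
    name: str

@dataclass
class AttackMethod:
    """How: the technique the adversary uses."""
    name: str
    steps: list = field(default_factory=list)

@dataclass
class Attack:
    objective: str        # Why: what the adversary wants to achieve
    targets: list         # What: system assets at risk
    method: AttackMethod  # How: the attack method

    def describe(self):
        assets = ", ".join(a.name for a in self.targets)
        return f"Why: {self.objective} | What: {assets} | How: {self.method.name}"

# A hypothetical attack instance, purely for illustration.
attack = Attack(
    objective="exfiltrate configuration data",
    targets=[Asset("firmware image"), Asset("key store")],
    method=AttackMethod("buffer overflow", ["fuzz input", "inject payload"]),
)
print(attack.describe())
```

In the actual framework these concerns are expressed as ontological concepts and relationships attached to SysML-Sec diagrams rather than as plain classes, but the same three questions structure the model.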
Informaticology: combining Computer Science, Data Science, and Fiction Science
Motivated by an intention to remedy current complications with Dutch
terminology concerning informatics, the term informaticology is positioned to
denote an academic counterpart of informatics where informatics is conceived of
as a container for a coherent family of practical disciplines ranging from
computer engineering and software engineering to network technology, data
center management, information technology, and information management in a
broad sense.
Informaticology escapes from the limitations of instrumental objectives and
the perspective of usage that both restrict the scope of informatics. That is
achieved by including fiction science in informaticology and by ranking fiction
science on equal terms with computer science and data science, and framing (the
study of) game design, development, assessment, and distribution, ranging from
serious gaming to entertainment gaming, as a chapter of fiction science. A
suggestion for the scope of fiction science is specified in some detail.
In order to illustrate the coherence of informaticology thus conceived, a
potential application of fiction to the ontology of instruction sequences and
to software quality assessment is sketched, thereby highlighting a possible
role of fiction (science) within informaticology but outside gaming.
Requirements modelling and formal analysis using graph operations
The increasing complexity of enterprise systems requires a more advanced
analysis of the representation of the expected services than is currently
possible. Consequently, the specification stage, which could be facilitated by
formal verification, becomes very important to the system life-cycle. This
paper presents a formal modelling approach which may be used to better
represent the reality of the system and to verify the expected or existing
properties of the system, taking into account the environmental
characteristics. To this end, we first propose a formalization process based
upon property specification, and secondly we use Conceptual Graph operations
to develop reasoning mechanisms for verifying requirements statements. The
graphical visualization of this reasoning enables us to correctly capture the
system specifications by making it easier to determine whether the desired
properties hold. The approach is applied to the field of enterprise modelling.
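The core Conceptual Graph operation behind such verification is projection: a requirement graph holds in the system model if it can be mapped onto the system graph. The sketch below simplifies heavily, representing each graph as a set of (concept, relation, concept) triples and using exact label matching; real CG projection also handles generalization over concept and relation type hierarchies. All concept and relation names are made up for illustration.

```python
def projects(requirement, system):
    """Simplified CG projection: every relation triple of the requirement
    graph must appear in the system graph (exact-label matching only)."""
    return all(triple in system for triple in requirement)

# A hypothetical system model as (concept, relation, concept) triples.
system_graph = {
    ("Order", "processedBy", "BillingService"),
    ("BillingService", "writesTo", "AuditLog"),
    ("Order", "ownedBy", "Customer"),
}

# Two requirement statements expressed as graphs.
req_traceability = {("BillingService", "writesTo", "AuditLog")}
req_encryption = {("AuditLog", "encryptedWith", "AES")}

print(projects(req_traceability, system_graph))  # True: property holds
print(projects(req_encryption, system_graph))    # False: property violated
```

When projection fails, the unmatched triples point directly at the part of the specification that needs attention, which is what the graphical visualization makes easy to see.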
A Lightweight Framework for Universal Fragment Composition
Domain-specific languages (DSLs) are useful tools for coping with complexity in software development. DSLs provide developers with appropriate constructs for specifying and solving the problems they are faced with. While the exact definition of DSLs can vary, they can roughly be divided into two categories: embedded and non-embedded. Embedded DSLs (E-DSLs) are integrated into general-purpose host languages (e.g. Java), while non-embedded DSLs (NE-DSLs) are standalone languages with their own tooling (e.g. compilers or interpreters). NE-DSLs can for example be found on the Semantic Web, where they are used for querying or describing shared domain models (ontologies). A common theme with DSLs is naturally their support of focused expressive power. However, in many cases they do not support non–domain-specific component-oriented constructs that can be useful for developers. Such constructs are standard in general-purpose languages (procedures, methods, packages, libraries etc.). While E-DSLs have access to such constructs via their host languages, NE-DSLs do not have this opportunity. Instead, to support such notions, each of these languages has to be extended and its tooling updated accordingly. Such modifications can be costly and must be done individually for each language. A solution method for one language cannot easily be reused for another. There currently exists no appropriate technology for tackling this problem in a general manner. Apart from identifying the need for a general approach to address this issue, we extend existing composition technology to provide a language-inclusive solution. We build upon fragment-based composition techniques and make them applicable to arbitrary (context-free) languages. We call this process the universalization of the composition techniques.
The techniques are called fragment-based since their view of components — reusable software units with interfaces — is as pieces of source code that conform to an underlying (context-free) language grammar. The universalization process is grammar-driven: given a base language grammar and a description of the compositional needs wrt. the composition techniques, an adapted grammar is created that corresponds to the specified needs. The result is thus an adapted grammar that forms the foundation for defining and composing the desired fragments. We further build upon this grammar-driven universalization approach to allow developers to define the non–domain-specific component-oriented constructs that are needed for NE-DSLs. Developers are able to define both what those constructs should be, and how they are to be interpreted (via composition). Thus, developers can effectively define language extensions and their semantics. This solution is presented in a framework that can be reused for different languages, even if their notion of 'components' differs. To demonstrate the approach and show its applicability, we apply it to two Semantic Web related NE-DSLs that are in need of component-oriented constructs. We introduce modules to the rule-based Web query language Xcerpt and role models to the Web Ontology Language OWL.
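The essence of fragment-based composition can be sketched in a few lines: fragments are pieces of source text with named open slots, and composition fills a slot with another fragment's body. The `<<name>>` slot syntax, the `Fragment` class, and the toy query-language snippet below are assumptions made for illustration; they are not the framework's actual notation or Xcerpt's real syntax.

```python
import re

class Fragment:
    """A source-code piece with named slots written as <<name>>."""

    def __init__(self, body):
        self.body = body

    def slots(self):
        """Names of the slots still open in this fragment."""
        return re.findall(r"<<(\w+)>>", self.body)

    def compose(self, slot, other):
        """Return a new fragment with `slot` filled by `other`'s body."""
        return Fragment(self.body.replace(f"<<{slot}>>", other.body))

# A "module"-like fragment for a toy query language, with an open slot,
# and a query fragment to plug into it (both purely illustrative).
module = Fragment("CONSTRUCT result FROM <<query>> END")
query = Fragment("all book [ title ]")

composed = module.compose("query", query)
print(composed.body)
print(composed.slots())
```

The grammar-driven part of the approach is what this sketch omits: in the framework, the base language's context-free grammar is adapted so that slots and fragments are well-formed phrases of the language, rather than untyped string holes as here.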