Using Rules to Adapt Applications for Business Models with High Evolutionary Rates
Nowadays, business models evolve permanently, since their requirements belong to a rapidly changing world. In a context where information travels around the world so fast, business models must be continuously adapted to the information managers receive. In such a world, traditional software development, needed to adapt software to changes, does not work properly, since business changes must reach production in shorter times. In this situation, it is necessary to move more quickly from the business idea to the production environment. This issue can be addressed by accelerating development: from the expert to the customer, with little or no technical intervention. This paper proposes an approach that empowers domain experts to develop adaptability solutions by using automated sets of production rules in a user-friendly way. Furthermore, a use case implementing this kind of development was applied to a real-world prototype
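The idea of letting domain experts adapt an application by editing rules rather than code can be sketched as a tiny forward-chaining production-rule engine. This is an illustrative sketch, not the paper's implementation; all names (the order/discount scenario, `run_rules`) are invented for the example.

```python
def run_rules(facts, rules):
    """Apply (condition, action) rules repeatedly until the fact set stabilises."""
    changed = True
    while changed:
        changed = False
        for condition, action in rules:
            update = action(facts) if condition(facts) else None
            # Only record a change if the action adds genuinely new facts.
            if update and not update.items() <= facts.items():
                facts.update(update)
                changed = True
    return facts

# Business rules as data: a domain expert edits only this table, never the engine.
rules = [
    (lambda f: f.get("order_total", 0) > 100,
     lambda f: {"discount": 0.10}),
    (lambda f: f.get("discount", 0) > 0,
     lambda f: {"final_total": f["order_total"] * (1 - f["discount"])}),
]

result = run_rules({"order_total": 200}, rules)  # derives discount, then final_total
```

Because the rules are plain data, changing the business model means editing the rule table, which is the kind of "expert to customer" shortcut the paper argues for.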
A service-based testbed for Trust Negotiation
Trust Negotiation allows users to develop trust incrementally, by disclosing credentials step by step. This way, services and resources can be shared in an open environment, and access rights can be granted on the basis of peer-to-peer trust relationships. This article presents a service-based testbed for Trust Negotiation. At its core is a generic framework based on the WS-Trust standard. It integrates a modular trust engine and a rule engine, which is used as a policy checker. The system is mainly oriented towards Web services composition and location-based social networking scenarios
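The step-by-step credential disclosure described above can be illustrated with a minimal negotiation loop. This is a toy model under invented assumptions (the article's framework is built on WS-Trust exchanges, not this function); the policy here simply says which provider credential must be shown before the requester releases each of its own.

```python
def negotiate(policy, requester_creds, max_rounds=10):
    """policy: required requester credential -> provider credential that
    must be disclosed first. Returns (success, requester disclosures)."""
    disclosed_by_requester = set()
    disclosed_by_provider = set()
    for _ in range(max_rounds):
        if set(policy) <= disclosed_by_requester:
            return True, disclosed_by_requester   # policy satisfied
        progress = False
        for needed, prerequisite in policy.items():
            if needed in disclosed_by_requester:
                continue
            if prerequisite not in disclosed_by_provider:
                disclosed_by_provider.add(prerequisite)   # provider goes first
                progress = True
            elif needed in requester_creds:
                disclosed_by_requester.add(needed)        # requester reciprocates
                progress = True
        if not progress:
            return False, disclosed_by_requester  # negotiation is stuck
    return False, disclosed_by_requester

# The provider wants an employee ID but must show its own seal first.
ok, shown = negotiate({"employee_id": "provider_seal"}, {"employee_id"})
```

Trust is built incrementally: each round discloses only what the other side's last disclosure has earned, which is the essence of the negotiation model.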
Adding Support for Automatic Enforcement of Security Policies in NFV Networks
This paper introduces an approach towards automatic enforcement of security policies in NFV networks and dynamic adaptation to network changes.
The approach relies on a refinement model that allows the dynamic transformation of high-level security requirements into configuration settings for the Network Security Functions (NSFs), and optimization models that allow the optimal selection of the NSFs to use.
These models are built on a formalization of the NSF capabilities, which serves to unequivocally describe what NSFs are able to do for security policy enforcement purposes.
The approach proposed is the first step towards a security policy aware NFV management, orchestration, and resource allocation system - a paradigm shift for the management of virtualized networks - and it requires minor changes to the current NFV architecture.
We show that our approach is feasible: it has been implemented by extending the OpenMANO framework and validated on several network scenarios.
Furthermore, performance tests show that policy refinement scales well enough to support current and future virtualized networks
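Capability-based refinement can be pictured with a small matching step: a high-level requirement is checked against the capabilities each NSF declares, and a covering NSF is selected before concrete settings are emitted. This is an assumed toy model, not the paper's formalization or optimization models; the catalog entries and costs are invented.

```python
# Hypothetical NSF catalog: declared capabilities plus a deployment cost.
NSF_CATALOG = {
    "iptables_vnf": {"capabilities": {"packet_filtering"}, "cost": 1},
    "squid_vnf":    {"capabilities": {"url_filtering"}, "cost": 2},
    "ngfw_vnf":     {"capabilities": {"packet_filtering", "url_filtering", "ids"},
                     "cost": 5},
}

def refine(requirement):
    """Map a high-level requirement to (chosen NSF, low-level configuration)."""
    needed = requirement["capabilities"]
    # Keep only NSFs whose declared capabilities cover the requirement.
    candidates = [(spec["cost"], name)
                  for name, spec in NSF_CATALOG.items()
                  if needed <= spec["capabilities"]]
    if not candidates:
        raise ValueError("no NSF covers the requested capabilities")
    _, chosen = min(candidates)   # cheapest covering NSF
    config = {"nsf": chosen,
              "action": requirement["action"],
              "target": requirement["target"]}
    return chosen, config

nsf, config = refine({"capabilities": {"packet_filtering"},
                      "action": "deny", "target": "10.0.0.0/8"})
```

The unambiguous capability declarations are what make the selection automatic: the requirement never names an NSF, only what must be enforced.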
Second CLIPS Conference Proceedings, volume 1
Topics covered at the 2nd CLIPS Conference, held at the Johnson Space Center, September 23-25, 1991, are given. Topics include rule groupings, fault detection using expert systems, decision making using expert systems, knowledge representation, computer-aided design, and debugging expert systems
Gamification as a Service: Conceptualization of a Generic Enterprise Gamification Platform
Gamification is a novel method to improve engagement, motivation, or participation in non-game contexts using game mechanics. To a large extent, gamification is a psychological- and design-oriented discipline, i.e., a lot of effort has to be spent already in the design phase of a gamification project. Subsequently, the design is implemented in information systems such as portals or enterprise resource planning applications. These systems act as mediators to transport a gameful design to its users.
However, the efforts for the subsequent development and integration process are often underestimated. In fact, most conceptual gamification designs are never implemented due to the high development costs that arise from building the gamification solution from scratch, imprecise design or technical requirements, and communication conflicts between different stakeholders in the project.
This thesis addresses these problems by systematically defining the phases and stakeholders of the overall gamification process. Furthermore, the thesis rigorously defines the conceptual requirements of gamification based on a broad literature review. The identified conceptual requirements are mapped to a domain-specific language, called the Gamification Modeling Language. Moreover, this thesis analyzes 29 existing gamification solutions that aim to decrease the implementation efforts of gamification. However, using the different language elements, it is shown that none of the existing solutions satisfies all requirements.
Therefore, a generic and reusable platform as runtime environment for gamification is proposed which fulfills all presented functional and non-functional requirements. As another benefit, it is shown how the Gamification Modeling Language can be automatically compiled into code for the gamification runtime environment and, thus, further reduces development efforts.
Based on the developed artifacts and five real gamified applications from industry, it is shown that the efforts for implementing gamification can be significantly reduced from several months or weeks to a few days. Since the technology is designed as a reusable service, future projects benefit continuously with regard to time and effort
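The kind of rule a generic gamification runtime evaluates can be sketched as events mapping to points, and points mapping to badges. This is a hedged illustration of the concept, not the thesis's platform or its Gamification Modeling Language; event names and thresholds are invented.

```python
from collections import defaultdict

class GamificationEngine:
    def __init__(self, point_rules, badge_rules):
        self.point_rules = point_rules   # event type -> points awarded
        self.badge_rules = badge_rules   # badge name -> points threshold
        self.points = defaultdict(int)
        self.badges = defaultdict(set)

    def record(self, user, event):
        """Award points for an event and grant any newly earned badges."""
        self.points[user] += self.point_rules.get(event, 0)
        for badge, threshold in self.badge_rules.items():
            if self.points[user] >= threshold:
                self.badges[user].add(badge)
        return self.points[user], self.badges[user]

engine = GamificationEngine(
    point_rules={"document_uploaded": 10, "comment_posted": 2},
    badge_rules={"contributor": 20},
)
engine.record("alice", "document_uploaded")
pts, badges = engine.record("alice", "document_uploaded")
```

Because the rules are configuration rather than application code, the same engine can serve many gamified applications, which is the "gamification as a service" argument.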
First CLIPS Conference Proceedings, volume 2
The topics of volume 2 of the First CLIPS Conference are associated with the following applications: quality control; intelligent databases and networks; Space Station Freedom; Space Shuttle and satellite; user interface; artificial neural systems and fuzzy logic; parallel and distributed processing; enhancements to CLIPS; aerospace; simulation and defense; advisory systems and tutors; and intelligent control
Optimisation and Decision Support during the Conceptual Stage of Building Design
Modern building design is complex and involves many different disciplines operating in a
fragmented manner. Appropriate computer-based decision support (DS) tools are sought
that can raise the level of integration of different activities at the conceptual stage, in order
to help create better design solutions. This project investigates opportunities that exist for
using techniques based upon the Genetic Algorithm (GA) to support critical activities of
conceptual building design (CBD). Collective independent studies have shown that the
GA is a powerful optimisation and exploratory search technique with widespread
application. The GA is essentially very simple yet it offers robustness and domain
independence. The GA efficiently searches a domain to exploit highly suitable
information. It maintains multiple solutions to problems simultaneously and is well suited
to non-linear problems and those of a discontinuous nature found in engineering design.
The literature search first examines traditional approaches to supporting conceptual design.
Existing GA techniques and applications are discussed which include pioneering studies in
the field of detailed structural design. Broader GA studies are also reported which have
demonstrated possibilities for investigating geometrical, topological and member size
variation. The tasks and goals of conceptual design are studied. A rationale is introduced,
aimed at enabling the GA to be applied in a manner that provides the most effective
support to the designer. Numerical experiments with floor planning are presented. These
studies provide a basic foundation for a subsequent design support system (DSS) capable
of generating structural design concepts.
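The GA mechanics described above (a population of candidate designs, selection toward fitter solutions, recombination and mutation) can be shown in miniature. This is an invented sketch, far simpler than the thesis's structured encoding: a single design variable (bay spacing) is optimised against a toy cost model with an assumed optimum of 7.5 m.

```python
import random

random.seed(42)  # deterministic run for the example

def fitness(bay_spacing):
    """Toy cost model: penalise deviation from an assumed optimum of 7.5 m."""
    return -abs(bay_spacing - 7.5)

def evolve(pop_size=20, generations=40, low=3.0, high=12.0):
    population = [random.uniform(low, high) for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: each parent is the best of a random sample of 3.
        parents = [max(random.sample(population, 3), key=fitness)
                   for _ in range(pop_size)]
        population = []
        for a, b in zip(parents[::2], parents[1::2]):
            # Arithmetic crossover plus Gaussian mutation, clamped to bounds.
            child = (a + b) / 2 + random.gauss(0, 0.2)
            population.append(min(max(child, low), high))
            # Elitism: the better parent of each pair survives unchanged.
            population.append(max(a, b, key=fitness))
    return max(population, key=fitness)

best = evolve()  # converges towards the assumed 7.5 m optimum
```

The GA maintains many solutions at once and needs only a fitness function, not gradients, which is why it copes with the discontinuous design spaces mentioned above.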
A hierarchical Structured GA (SGA) created by Dasgupta et al [1] is investigated to
support the generation of diverse structural design concepts. The SGA supports variation
in the size, shape and structural configuration of a building and in the choice of structural
frame type and floor system. The benefits and limitations of the SGA approach are
discussed. The creation of a prototype DSS, arbitrarily named Designer-Pro
(DPRO), is described. A detailed building design model is introduced which is required
for design development and appraisal. Simplifications, design rationale and generic
component modelling are mentioned. A cost-based single-criterion optimisation problem
(SCOP) is created in which other constraints are represented as design parameters.
The thesis describes the importance of the object-oriented programming (OOP) paradigm
for creating a versatile design model and the need for complementary graphical user
interface (GUI) tools to provide human-computer interaction (HCI) capabilities for control
and intelligent design manipulation. Techniques that increase flexibility in the generation
and appraisal of concepts are presented. Tools presented include a convergence plot of
design solutions that supports cursor-interrogation to reveal the details of individual
concepts. The graph permits study of design progression, or evolution of optimum design
solutions. A visualisation tool is also presented.
The DPRO system supports multiple operating modes, including single-design appraisal
and enumerative search (ES). Case study examples are provided which demonstrate the
applicability of the DPRO system to a range of different design scenarios. The DPRO
system performs well in all tests. A parametric study demonstrates the potential of the
system for DS. Limitations of the current approach and opportunities to broaden the study
form part of the scope for further work. Some suggestions for further study are made,
based upon newly-emerging techniques
Spatial ontologies for architectural heritage
Informatics and artificial intelligence have generated new requirements for digital archiving, information, and documentation. Semantic interoperability has become fundamental for the management and sharing of information. Constraints on data interpretation enable both database interoperability, for sharing and reusing data and schemas, and information retrieval in large datasets. Another challenging issue is the exploitation of automated reasoning. The solution is the use of domain ontologies as a reference for data modelling in information systems. This thesis considers the architectural heritage (AH) domain. Documentation in this field, particularly complex and multifaceted, is well known to be critical for the preservation, knowledge, and promotion of monuments. For these reasons, digital inventories, also exploiting standards and new semantic technologies, are developed by international organisations (the Getty Institute, the UN, the European Union). Geometric and geographic information is an essential part of a monument's description. It comprises a number of aspects (spatial, topological, and mereological relations; accuracy; multi-scale representation; time; etc.). Currently, geomatics makes it possible to obtain very accurate and dense 3D models (possibly enriched with textures) and derived products, in both raster and vector formats. Many standards have been published for the geographic field or the cultural heritage domain. However, the former are limited in the representation scales they foresee (the maximum is achieved by OGC CityGML), and their semantic values do not capture the full semantic richness of AH. The latter (especially the core ontology CIDOC-CRM, the Conceptual Reference Model of the Documentation Committee of the International Council of Museums) were employed to document museum objects.
Although CIDOC-CRM was recently extended to standing buildings and a spatial extension was included, the integration of complex 3D models has not yet been achieved. In this thesis, the aspects (especially spatial issues) to consider in the documentation of monuments are analysed. In light of these, OGC CityGML is extended to manage AH complexity. An approach 'from the landscape to the detail' is used, considering the monument within a wider system, which is essential for analysis and reasoning about such complex objects. An implementation test is conducted on a case study, preferring open source applications
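The mereological relations and automated reasoning mentioned above can be illustrated with a tiny, invented fragment: 'partOf' triples between building elements, plus a transitive-closure query. This is a plain-Python sketch of the idea, not the thesis's CityGML extension, and the entity names are hypothetical.

```python
# Invented triple store: (subject, predicate, object) mereological facts.
TRIPLES = {
    ("capital_03", "partOf", "column_12"),
    ("column_12", "partOf", "nave"),
    ("nave", "partOf", "church_example"),
    ("fresco_07", "partOf", "nave"),
}

def parts_of(whole):
    """All entities that are directly or transitively part of `whole`."""
    direct = {s for s, p, o in TRIPLES if p == "partOf" and o == whole}
    found = set(direct)
    for part in direct:
        found |= parts_of(part)   # follow the relation transitively
    return found

parts = parts_of("church_example")
```

Even this minimal model shows why formalised relations matter: a query 'from the landscape to the detail' can recover the capital as part of the church without that fact ever being stated explicitly.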
Making Business Rules Flexible in the Brazilian Army's Materiel Allocation System (Sistema de Dotação de Material)
Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2018.
Material distribution is a common theme in the logistics area. The distribution of Military
Employment Materials (MEM) in the Brazilian Army involves the cataloging of materials,
the definition of rules that associate MEMs with organizational units, and the execution
of rules to derive the materials for each military unit listed in the Foreseen Material Table
(QDM). Currently, QDM is generated almost entirely manually by filling spreadsheets for
each Military Organization (OM). The objective of this work is to present the solution
developed for automatic generation of QDMs from the definition of distribution rules and
later execution in a rules engine. The simplicity of defining material distribution rules, transforming
high-level definitions into inference-engine rule definitions, facilitates
maintenance and removes such definitions from an application's source code. This
motivated the use of generative programming, in terms of meta-programming, and an inference
engine specific to the Java language. Although the solution is specific to the 4th Sub-Chief's
Office of the Army General Staff (EME), it tends to be generic enough to be adopted,
after some adaptations, by other areas of the army or even by outside organizations
dealing with similar material distribution rules
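The rule-driven QDM generation described above can be sketched as rules associating materiel with unit attributes, executed against every unit to derive its foreseen table. This is a hypothetical illustration (the actual system uses a Java inference engine); unit types, items, and quantities are invented.

```python
# Distribution rules as data: (predicate over unit attributes, item, quantity).
RULES = [
    (lambda u: u["type"] == "infantry",   "rifle",     200),
    (lambda u: u["type"] == "infantry",   "radio_set",  20),
    (lambda u: u.get("motorized", False), "truck",      15),
]

def generate_qdm(units):
    """Run every rule against every unit and collect the foreseen materiel."""
    qdm = {}
    for unit in units:
        qdm[unit["name"]] = {item: qty
                             for predicate, item, qty in RULES
                             if predicate(unit)}
    return qdm

qdm = generate_qdm([
    {"name": "1st Battalion", "type": "infantry", "motorized": True},
    {"name": "2nd Company", "type": "artillery"},
])
```

Replacing per-organization spreadsheets with rules run by an engine is what makes the generation automatic: adding a unit or changing an allocation means editing data, not redoing manual tables.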