Covering rough sets based on neighborhoods: An approach without using neighborhoods
Rough set theory, a mathematical tool to deal with inexact or uncertain
knowledge in information systems, has originally described the indiscernibility
of elements by equivalence relations. Covering rough sets are a natural
extension of classical rough sets by relaxing the partitions arising from
equivalence relations to coverings. Recently, some topological concepts such as
neighborhood have been applied to covering rough sets. In this paper, we
further investigate the covering rough sets based on neighborhoods by
approximation operations. We show that the upper approximation based on
neighborhoods can be defined equivalently without using neighborhoods. To
analyze the coverings themselves, we introduce unary and composition operations
on coverings. A notion of homomorphism is provided to relate two covering
approximation spaces. We also examine the properties of approximations
preserved by the operations and homomorphisms, respectively.
Comment: 13 pages; to appear in the International Journal of Approximate Reasoning
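The neighborhood-based operators discussed above can be illustrated concretely. In one common formulation, the neighborhood N(x) is the intersection of all covering blocks containing x, and the lower and upper approximations of a set X test N(x) against X. The sketch below uses that standard definition with a toy universe and covering; it does not reproduce the paper's equivalent neighborhood-free characterization:

```python
def neighborhood(x, cover):
    """N(x): intersection of all blocks of the covering that contain x."""
    blocks = [K for K in cover if x in K]
    return set.intersection(*blocks)

def lower_approx(X, universe, cover):
    """Elements whose whole neighborhood lies inside X."""
    return {x for x in universe if neighborhood(x, cover) <= X}

def upper_approx(X, universe, cover):
    """Elements whose neighborhood meets X."""
    return {x for x in universe if neighborhood(x, cover) & X}

U = {1, 2, 3, 4}
C = [{1, 2}, {2, 3}, {3, 4}]   # a covering of U that is not a partition
X = {2, 3}
print(lower_approx(X, U, C))   # {2, 3}
print(upper_approx(X, U, C))   # {1, 2, 3, 4}
```

Because C is a covering rather than a partition, neighborhoods may overlap, and the upper approximation can be strictly larger than in the classical equivalence-relation case.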
A taxonomy for similarity metrics between Markov decision processes
Although the notion of task similarity is potentially interesting in a wide range of areas such as curriculum learning or automated planning, it has mostly been tied to transfer learning. Transfer is based on the idea of reusing the knowledge acquired in the learning of a set of source tasks in a new learning process on a target task, assuming that the target and source tasks are close enough. In recent years, transfer learning has succeeded in making reinforcement learning (RL) algorithms more efficient (e.g., by reducing the number of samples needed to achieve (near-)optimal performance). Transfer in RL is based on the core concept of similarity: whenever the tasks are similar, the transferred knowledge can be reused to solve the target task and significantly improve the learning performance. Therefore, the selection of good metrics to measure these similarities is a critical aspect when building transfer RL algorithms, especially when this knowledge is transferred from simulation to the real world. In the literature there are many metrics to measure the similarity between MDPs; hence, many definitions of similarity, or of its complement, distance, have been considered. In this paper, we propose a categorization of these metrics and analyze the definitions of similarity proposed so far in light of that categorization. We also follow this taxonomy to survey the existing literature and to suggest future directions for the construction of new metrics.
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This work has also been supported by the Madrid Government (Comunidad de Madrid-Spain) under the Multiannual Agreement with UC3M in the line of Excellence of University Professors (EPUC3M17), and in the context of the V PRICIT (Regional Programme of Research and Technological Innovation).
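As a concrete, deliberately simple example of the kind of metric being categorized, the sketch below compares two MDPs over the same state and action space by the worst-case gap between their reward functions plus a scaled total-variation distance between their transition distributions. This is an illustrative model-based distance under assumed tabular representations, not one of the specific metrics surveyed in the paper:

```python
import numpy as np

def mdp_distance(P1, R1, P2, R2, c=0.5):
    """Illustrative model-based distance between two tabular MDPs that
    share a state/action space: worst-case reward gap plus a scaled
    total-variation distance between transition distributions."""
    reward_gap = np.abs(R1 - R2)              # shape |S| x |A|
    tv = 0.5 * np.abs(P1 - P2).sum(axis=-1)   # TV per (s, a), shape |S| x |A|
    return float((reward_gap + c * tv).max())

# Two 2-state, 1-action MDPs; P[s][a] is the next-state distribution.
P1 = np.array([[[0.9, 0.1]], [[0.2, 0.8]]])
R1 = np.array([[1.0], [0.0]])
P2 = np.array([[[0.8, 0.2]], [[0.2, 0.8]]])
R2 = np.array([[1.0], [0.5]])
print(mdp_distance(P1, R1, P2, R2))  # 0.5, dominated by the reward gap in state 1
```

In the taxonomy's terms this is a model-based metric; behavioural alternatives (e.g., bisimulation-style distances) instead compare what the MDPs do rather than their parameters.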
Natural Communication
In Natural Communication, the author criticizes the current paradigm of specific goal orientation in the complexity sciences. His model of "natural communication" encapsulates modern theoretical concepts from mathematics and physics, in particular category theory and quantum theory. The author is convinced that only by looking to the past is it possible to establish continuity and coherence in the complexity sciences.
Security Analysis of System Behaviour - From "Security by Design" to "Security at Runtime" -
The Internet today provides the environment for novel applications and
processes which may evolve way beyond pre-planned scope and
purpose. Security analysis is growing in complexity with the increase
in functionality, connectivity, and dynamics of current electronic
business processes. Technical processes within critical
infrastructures also have to cope with these developments. To tackle
the complexity of the security analysis, the application of models is
becoming standard practice. However, model-based support for security
analysis is not only needed in pre-operational phases but also during
process execution, in order to provide situational security awareness
at runtime.
This cumulative thesis provides three major contributions to modelling
methodology.
Firstly, this thesis provides an approach for model-based analysis and
verification of security and safety properties in order to support
fault prevention and fault removal in system design or redesign.
Furthermore, some construction principles for the design of
well-behaved scalable systems are given.
The second topic is the analysis of the exposition of vulnerabilities
in the software components of networked systems to exploitation by
internal or external threats. This kind of fault forecasting allows
the security assessment of alternative system configurations and
security policies. Validation and deployment of security policies
that minimise the attack surface can now improve fault tolerance and
mitigate the impact of successful attacks.
Thirdly, the approach is extended to runtime applicability. An
observing system monitors an event stream from the observed system
with the aim to detect faults - deviations from the specified
behaviour or security compliance violations - at runtime.
Furthermore, knowledge about the expected behaviour given by an
operational model is used to predict faults in the near
future. Building on this, a holistic security management strategy is
proposed. The architecture of the observing system is described and
the applicability of model-based security analysis at runtime is
demonstrated utilising processes from several industrial scenarios.
The results of this cumulative thesis are provided by 19 selected
peer-reviewed papers.
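The runtime-monitoring idea in the third contribution, an observing system that watches an event stream and flags deviations from specified behaviour as faults, can be sketched as follows. The specification automaton and event names here are hypothetical illustrations, not taken from the thesis:

```python
class Monitor:
    """Minimal runtime monitor: tracks an event stream against a
    specification automaton and reports a fault as soon as an event
    deviates from the specified behaviour."""

    def __init__(self, transitions, start):
        self.transitions = transitions  # (state, event) -> next state
        self.state = start

    def observe(self, event):
        """Return None if the event conforms, or a fault description."""
        key = (self.state, event)
        if key not in self.transitions:
            return f"fault: event '{event}' not allowed in state '{self.state}'"
        self.state = self.transitions[key]
        return None

# Hypothetical specified behaviour: login, then requests, then logout.
spec = {
    ("idle", "login"): "active",
    ("active", "request"): "active",
    ("active", "logout"): "idle",
}
m = Monitor(spec, "idle")
for ev in ["login", "request", "logout", "request"]:
    fault = m.observe(ev)
    if fault:
        print(fault)  # the final 'request' violates the specification
```

Fault prediction, as described above, would extend this by simulating the operational model a few steps ahead of the observed state rather than only checking the current event.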
VI Workshop on Computational Data Analysis and Numerical Methods: Book of Abstracts
The VI Workshop on Computational Data Analysis and Numerical Methods (WCDANM) will be held on June 27-29, 2019, in the Department of Mathematics of the University of Beira Interior (UBI), Covilhã, Portugal. It is a unique opportunity to disseminate scientific research in Mathematics in general, with particular relevance to Computational Data Analysis and Numerical Methods in both theoretical and practical fields, using new techniques, with special emphasis on applications in Medicine, Biology, Biotechnology, Engineering, Industry, Environmental Sciences, Finance, Insurance, Management and Administration. The meeting will provide a forum for the discussion and debate of ideas of interest to the scientific community in general. New scientific collaborations among colleagues, in particular in Masters and PhD projects, are expected. The event is open to the entire scientific community (with or without a communication/poster).
Spacetime-Free Approach to Quantum Theory and Effective Spacetime Structure
Motivated by hints of the effective emergent nature of spacetime structure,
we formulate a spacetime-free algebraic framework for quantum theory, in which
no a priori background geometric structure is required. Such a framework is
necessary in order to study the emergence of effective spacetime structure in a
consistent manner, without assuming a background geometry from the outset.
Instead, the background geometry is conjectured to arise as an effective
structure of the algebraic and dynamical relations between observables that are
imposed by the background statistics of the system. Namely, we suggest that
quantum reference states on an extended observable algebra, the free algebra
generated by the observables, may give rise to effective spacetime structures.
Accordingly, perturbations of the reference state lead to perturbations of the
induced effective spacetime geometry. We initiate the study of these
perturbations, and their relation to gravitational phenomena.
Adding Threshold Concepts to the Description Logic EL
We introduce a family of logics extending the lightweight Description Logic EL that allows us to define concepts in an approximate way. The main idea is to use a graded membership function m, which for each individual and concept yields a number in the interval [0,1] expressing the degree to which the individual belongs to the concept. Threshold concepts C~t for ~ in {<, <=, >, >=} then collect all the individuals that belong to C with degree ~t. We further study this framework in two particular directions. First, we define a specific graded membership function deg and investigate the complexity of reasoning in the resulting Description Logic tEL(deg) w.r.t. both the empty terminology and acyclic TBoxes. Second, we show how to turn concept similarity measures into membership degree functions. It turns out that under certain conditions such functions are well-defined and therefore induce a wide range of threshold logics. Last, we present preliminary results on the computational complexity landscape of reasoning in this large family of threshold logics.
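The graded-membership idea can be made concrete with a small sketch: given a membership function m into [0,1], the threshold concept C~t collects exactly the individuals whose degree satisfies the comparison ~t. The function m and the example degrees below are toy assumptions for illustration, not the deg function defined in the paper:

```python
def threshold_extension(m, individuals, concept, rel, t):
    """Individuals x with m(x, concept) ~ t, for ~ in {<, <=, >, >=}."""
    ops = {
        "<":  lambda v: v < t,
        "<=": lambda v: v <= t,
        ">":  lambda v: v > t,
        ">=": lambda v: v >= t,
    }
    return {x for x in individuals if ops[rel](m(x, concept))}

# Toy graded membership degrees (hypothetical, in [0, 1]).
degrees = {("a", "Tall"): 0.9, ("b", "Tall"): 0.4, ("c", "Tall"): 0.7}
m = lambda x, C: degrees.get((x, C), 0.0)

print(threshold_extension(m, {"a", "b", "c"}, "Tall", ">=", 0.7))  # {'a', 'c'}
print(threshold_extension(m, {"a", "b", "c"}, "<", 0.5) if False else
      threshold_extension(m, {"a", "b", "c"}, "Tall", "<", 0.5))   # {'b'}
```

The classical, crisp extension of C is recovered as the threshold concept C>=1, which is what makes this a conservative generalization of EL-style membership.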
Contributions to Game Theory and Management. Vol. III. Collected papers presented on the Third International Conference Game Theory and Management.
The collection contains papers accepted for the Third International Conference Game Theory and Management (June 24-26, 2009, St. Petersburg University, St. Petersburg, Russia). The presented papers belong to the field of game theory and its applications to management. The volume may be recommended for researchers and post-graduate students of management, economics and applied mathematics departments.