
    An approach to GDPR based on object role modeling

    The General Data Protection Regulation 2016/679 (GDPR) is a set of legal rules intended to protect the privacy of individuals in the handling of their personal data and the movement of such data across countries. When those rules are taken into account in the operation of information systems, the systems become candidates for legal approval within that scope. This paper presents a model we are developing to help enterprises align their information systems with the GDPR requirements. The model serves to analyze enterprises with respect to their use of data subjects' personal data, allowing them to capture and improve the data protection capabilities set out in the GDPR. The main aim of our approach is to set a baseline for defining the requirements for establishing, implementing, maintaining and continually improving a data protection management system in organizations.

    Conceptual Modeling and the Quality of Ontologies: A Comparison between Object-Role Modeling and the Object Paradigm

    Ontologies are key enablers for sharing precise and machine-understandable semantics among different applications and parties. Yet, for ontologies to meet these expectations, their quality must be of a good standard. The quality of an ontology is strongly based on the design method employed. This paper addresses the design problems related to the modelling of ontologies, with a specific focus on issues related to the quality of the conceptualisations produced. The paper aims to demonstrate the impact of the modelling paradigm adopted on the quality of ontological models and, consequently, the potential impact that such a decision can have on the development of software applications. To this aim, an ontology conceptualised using the Object Role Modelling (ORM) approach is re-engineered into one modelled on the basis of the Object Paradigm (OP). Next, the two ontologies are analytically compared against a set of specified criteria. The comparison highlights that using the OP for ontology conceptualisation can provide more expressive, reusable, objective and temporal ontologies than those conceptualised with the ORM approach.

    ANFIS Modeling of Dynamic Load Balancing in LTE

    Modelling of ill-defined or unpredictable systems can be very challenging. Most approaches have relied on conventional mathematical models, which do not adequately capture some of the multifaceted challenges of such systems. Load balancing, a self-optimization operation of Self-Organizing Networks (SON), aims at ensuring an equitable distribution of users in the network. This translates into better user satisfaction and a more efficient use of network resources. Several methods for load balancing have been proposed. While some of them have a very sound theoretical basis, they are not practical. Furthermore, most of the proposed techniques rely on an iterative algorithm, which is not computationally efficient as it does not take the unpredictable fluctuation of network load into consideration. This chapter proposes the use of soft computing, specifically an Adaptive Neuro-Fuzzy Inference System (ANFIS) model, for dynamic QoS-aware load balancing in 3GPP LTE. The use of ANFIS offers the learning capability of neural networks and the knowledge representation of fuzzy logic for a load-balancing solution that is cost effective and closer to human intuition. Three key load parameters (the number of satisfied users in the network, the virtual load of the serving eNodeB, and the overall state of the target eNodeB) are used to adjust the hysteresis value for load balancing.
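    As a rough illustration of the mechanism described above, the sketch below hand-rolls a Sugeno-style fuzzy inference step that maps the three load indicators to a hysteresis adjustment. It is only a sketch under assumed membership functions, rule set and output range; the chapter's actual ANFIS model additionally learns such parameters through neural-network training.

```python
# Minimal sketch (not the chapter's implementation): a fuzzy-style inference that
# maps the three load indicators mentioned in the abstract to a handover
# hysteresis adjustment. Breakpoints, rules and the output range are assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def hysteresis_adjustment(satisfied_ratio, serving_load, target_load):
    """Return a hysteresis delta in dB from three normalised inputs (0..1)."""
    low = lambda x: tri(x, -0.5, 0.0, 0.6)   # fuzzify into 'low' degree
    high = lambda x: tri(x, 0.4, 1.0, 1.5)   # fuzzify into 'high' degree

    # Illustrative Sugeno-style rules: each rule fires with the product of its
    # antecedent degrees and proposes a constant hysteresis change (dB).
    rules = [
        (high(serving_load) * low(target_load) * low(satisfied_ratio), -1.0),  # offload users
        (low(serving_load) * high(target_load) * high(satisfied_ratio), +1.0), # keep users
        (low(serving_load) * low(target_load) * high(satisfied_ratio), 0.0),   # balanced
    ]
    num = sum(weight * out for weight, out in rules)
    den = sum(weight for weight, _ in rules)
    return num / den if den else 0.0

if __name__ == "__main__":
    # Overloaded serving cell, lightly loaded neighbour, unsatisfied users:
    print(hysteresis_adjustment(satisfied_ratio=0.3, serving_load=0.9, target_load=0.2))
```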

    Towards a Layered Architectural View for Security Analysis in SCADA Systems

    Supervisory Control and Data Acquisition (SCADA) systems support and control the operation of many critical infrastructures that our society depends on, such as power grids. Since SCADA systems have become a target for cyber attacks and the potential impact of a successful attack could lead to disastrous consequences in the physical world, ensuring the security of these systems is of vital importance. A fundamental prerequisite to securing a SCADA system is a clear understanding and a consistent view of its architecture. However, because of the complexity and scale of SCADA systems, this is challenging to acquire. In this paper, we propose a layered architectural view for SCADA systems, which aims at building a common ground among stakeholders and supporting the implementation of security analysis. In order to manage the complexity and scale, we define four interrelated architectural layers and use the concept of viewpoints to focus on a subset of the system. We indicate the applicability of our approach in the context of SCADA system security analysis.
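    To make the idea of layers and viewpoints concrete, the sketch below models an architectural description as plain data structures and uses one viewpoint to select the components relevant to a security analysis. The layer and component names are hypothetical placeholders, not the layers the paper defines.

```python
# Minimal sketch (assumed, not from the paper): a layered architecture description
# plus viewpoints that select a subset of components. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    layer: str                              # architectural layer the component sits in
    connects_to: list = field(default_factory=list)

@dataclass
class Viewpoint:
    """A viewpoint keeps only the components relevant to one concern."""
    name: str
    layers: set

    def select(self, components):
        return [c for c in components if c.layer in self.layers]

components = [
    Component("RTU-1", "field devices", ["PLC-1"]),
    Component("PLC-1", "control", ["SCADA server"]),
    Component("SCADA server", "supervisory", ["historian", "HMI"]),
    Component("historian", "enterprise"),
]

# An analyst's viewpoint focused on the control and supervisory layers.
security_view = Viewpoint("control/supervisory security", {"control", "supervisory"})
print([c.name for c in security_view.select(components)])   # ['PLC-1', 'SCADA server']
```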

    Representing Things and Properties in Conceptual Modeling: Understanding the Impact of Task Type

    The representation of things and properties is a fundamental issue in conceptual modeling. The proponents of different modeling approaches, for example entity-relationship modeling and object-role modeling, offer very different advice about the distinction between things and properties and their representation. We use ontological theory to provide guidelines about how things and properties should be represented. Previous experimental work has provided evidence to support the use of ontologically sound representations of things and properties in conceptual modeling. However, the results also indicate that the type of task undertaken (for example comprehension, problem solving, discrepancy checking, and decomposition) may also impact the use of conceptual models. In this paper, a research project is proposed to examine the sorts of tasks that are best supported by distinguishing between things and properties in conceptual modeling.

    Design and Development of a J2EE-Based ERD Generator Application with ORM Notation from Oracle Database Scripts (Perancangan dan Pembuatan Aplikasi ERD Generator Notasi ORM dari Skrip Basis Data Oracle Berbasis J2EE)

    Database processing is needed by many institutions and companies. A database not only provides faster access to information, it also expands the services offered to customers. For companies, this advantage can increase competitiveness. For this reason, many companies that rely on manual processing are turning to databases. Consequently, the database reverse engineering process has become a necessity for database developers to understand the structure of any database. Commonly, this structure is modeled in one of the notations for Entity Relationship Diagrams (ERD). The graphical visualization of a database structure as an ERD can use many notations, so it is easy to understand. One approach that is easy to understand is the Object Role Modeling (ORM) diagram. By reverse engineering the mapping process from a database's relational schema, an ERD can be generated from an Oracle data definition language script. For more flexibility, the application is built as a web application with Servlet technology provided by the Java 2 SDK.
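    As a rough illustration of the reverse-engineering step, the sketch below extracts entities and foreign-key relationships from an Oracle-style DDL script, the raw material from which an ERD/ORM diagram would be drawn. It is an assumed Python sketch rather than the paper's J2EE/Servlet implementation, and the regular expressions only cover this simple example; real DDL needs a proper SQL parser.

```python
# Minimal sketch (assumed): pull table names and REFERENCES clauses out of an
# Oracle-style DDL script as candidate ERD entities and relationships.
import re

ddl = """
CREATE TABLE customer (
    customer_id NUMBER PRIMARY KEY,
    name        VARCHAR2(100)
);
CREATE TABLE orders (
    order_id    NUMBER PRIMARY KEY,
    customer_id NUMBER REFERENCES customer(customer_id)
);
"""

# Entities: every CREATE TABLE statement contributes one entity.
entities = re.findall(r"CREATE TABLE (\w+)", ddl, re.IGNORECASE)

# Relationships: (referencing table, referenced table) pairs from inline foreign keys.
relationships = [
    (table, ref)
    for table, body in re.findall(r"CREATE TABLE (\w+)\s*\((.*?)\);", ddl,
                                  re.IGNORECASE | re.DOTALL)
    for ref in re.findall(r"REFERENCES\s+(\w+)", body, re.IGNORECASE)
]

print("Entities:", entities)            # ['customer', 'orders']
print("Relationships:", relationships)  # [('orders', 'customer')]
```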

    A Conceptual Schema Based XML Schema with Integrity Constraints Checking

    As XML grows in popularity for exchanging and representing information on the Web, flat XML data and intelligent editors become more important. For data exchange, XML data accompanied by an XML Schema and integrity constraints is preferred. We employ Object-Role Modeling (ORM) to enrich the XML Schema constraints and provide better validation of the XML data. An XML conceptual schema is presented using the ORM conceptual model. Editor Meta Tables are generated from the conceptual schema diagram and are populated. A User XML Schema based on the information in the Editor Meta Tables is generated. However, the W3C XML Schema language does not support all of the ORM constraints. Therefore, we propose an Editor XML Schema and an Editor XML Data to cover the unsupported ORM constraints. We propose algorithms for defining constraints in the User XML Schema and extending validity constraint checking. Finally, XQuery is used for the extended validity checking.
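    The sketch below illustrates the two-stage checking idea: validate XML data against an XML Schema first, then separately check an ORM-style constraint that plain W3C XML Schema does not capture (here a subset constraint, checked with XPath for brevity, whereas the paper uses XQuery for its extended checks). The schema, data and constraint are illustrative assumptions, not the paper's Editor XML Schema.

```python
# Minimal sketch (assumed, not the paper's editor): schema validation with lxml,
# followed by an extra ORM-style subset check ("every reviewer is an employee").
from lxml import etree

xsd = b"""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="company">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="employee" type="xs:string" maxOccurs="unbounded"/>
        <xs:element name="reviewer" type="xs:string" maxOccurs="unbounded"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

xml = b"""<company>
  <employee>Alice</employee>
  <employee>Bob</employee>
  <reviewer>Alice</reviewer>
  <reviewer>Carol</reviewer>
</company>"""

schema = etree.XMLSchema(etree.fromstring(xsd))
doc = etree.fromstring(xml)
print("schema valid:", schema.validate(doc))   # True: the structure conforms

# ORM subset constraint: the set of reviewers must be a subset of the employees.
employees = set(doc.xpath("//employee/text()"))
reviewers = set(doc.xpath("//reviewer/text()"))
print("subset constraint holds:", reviewers <= employees)  # False: Carol is not an employee
```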

    Exploratory study to explore the role of ICT in the process of knowledge management in an Indian business environment

    In the 21st century, with the emergence of a digital economy, knowledge and the knowledge-based economy are rapidly growing. Being able to understand the processes involved in creating, managing and sharing knowledge in the business environment is critical to the success of an organization. This study builds on the authors' previous research on the enablers of knowledge management by identifying the relationship between those enablers and the role played by information and communication technologies (ICT) and ICT infrastructure in a business setting. This paper provides the findings of a survey collected from four major Indian cities (Chennai, Coimbatore, Madurai and Villupuram) regarding views and opinions about the enablers of knowledge management in a business setting. A total of 80 organizations participated in the study, with 100 participants in each city. The results show that ICT and ICT infrastructure can play a critical role in the creating, managing and sharing of knowledge in an Indian business environment.