
    Entity-relationship modeling

    A database is a collection of interconnected data that forms a common data foundation for all applications of an information system. The architecture of a database consists of three layers: the physical level, the global logical level and the local logical level. The advantages of using a database are redundancy control, system integrity, shared use of common data, data protection, data standardization and optimization of the system as a whole. Database models include the relational, network, hierarchical and object models. The main stages of database design are requirements gathering and analysis, conceptual modeling, logical modeling and physical modeling. The most widely used conceptual model is the entity-relationship model. In his 1977 article "The Entity-Relationship Model: Toward a Unified View of Data", Chen introduced a model that depicts the real world with diagrams. The E-R model takes a more natural view of the world, consisting of entities and relationships. An entity is a real or abstract phenomenon that can be identified. Relationships are associations between entities. Attributes are properties that describe an entity. A key uniquely identifies an entity. The steps in identifying entities, relationships and attributes are: 1. detect nouns (entities and attributes) and verbs (relationships); 2. determine which attributes describe which entity; 3. draw the diagram; and 4. write the accompanying text. Based on a warehouse receipt, we present several E-R notations: Batini, Ceri & Navathe; Chen; Elmasri & Navathe; Korth & Silberschatz; Martin; McFadden & Hoffer; and Oracle's CASE method (Barker). The extended entity-relationship model has been developed since the late 1970s to support more demanding and precise database designs. It introduces new concepts such as subclass, superclass, specialization, generalization and inheritance.
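
    As a minimal illustration of the steps above, here is a hedged Python sketch of a hypothetical warehouse-receipt fragment: Warehouse and Item are entities with attributes and keys, ReceiptLine records the relationship between them, and RefrigeratedWarehouse shows the extended model's subclass/superclass idea via inheritance. All entity and attribute names are illustrative assumptions, not taken from the thesis.

        from dataclasses import dataclass

        # Entities: identifiable real or abstract phenomena, described by attributes.
        # All names here are assumed for illustration only.

        @dataclass
        class Warehouse:
            warehouse_id: int      # key: uniquely identifies the entity
            name: str
            location: str

        @dataclass
        class Item:
            item_id: int           # key
            description: str
            unit: str

        # Relationship: an association between entities (a Warehouse receives an Item),
        # with attributes of its own (quantity, receipt date).
        @dataclass
        class ReceiptLine:
            warehouse_id: int      # refers to a Warehouse
            item_id: int           # refers to an Item
            quantity: float
            receipt_date: str

        # Extended E-R: subclass/superclass via inheritance; RefrigeratedWarehouse
        # is a specialization of Warehouse and inherits its attributes.
        @dataclass
        class RefrigeratedWarehouse(Warehouse):
            min_temperature_c: float = -18.0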

    On the Foundations of the Brussels Operational-Realistic Approach to Cognition

    The scientific community is becoming more and more interested in research that applies the mathematical formalism of quantum theory to model human decision-making. In this paper, we provide the theoretical foundations of the quantum approach to cognition that we developed in Brussels. These foundations rest on the results of two decades of studies on the axiomatic and operational-realistic approaches to the foundations of quantum physics. The deep analogies between the foundations of physics and cognition lead us to investigate the validity of quantum theory as a general and unitary framework for cognitive processes, and the empirical success of the Hilbert space models derived from this investigation provides strong theoretical confirmation of this validity. However, two situations in the cognitive realm, 'question order effects' and 'response replicability', indicate that even the Hilbert space framework could be insufficient to reproduce the collected data. This does not mean that the mentioned operational-realistic approach would be incorrect, but simply that a larger class of measurements would be in force in human cognition, so that an extended quantum formalism may be needed to deal with all of them. As we will explain, the recently derived 'extended Bloch representation' of quantum theory (and the associated 'general tension-reduction' model) provides precisely such an extended formalism, while remaining within the same unitary interpretative framework.
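
    As a hedged aside (standard quantum notation, not quoted from the paper), the tension between these two situations can be seen in the usual projective picture of measurements, where a question A is represented by an orthogonal projector P_A:

        % Idempotence gives response replicability under immediate repetition:
        P_A^2 = P_A \quad\Rightarrow\quad p(\text{yes to } A,\ \text{then yes again}) = \lVert P_A \psi \rVert^2
        % Question order effects require non-commuting projectors:
        p(A \text{ then } B) = \lVert P_B P_A \psi \rVert^2 \;\neq\; \lVert P_A P_B \psi \rVert^2 = p(B \text{ then } A), \qquad [P_A, P_B] \neq 0
        % Accommodating both patterns at once is what motivates a formalism
        % beyond purely projective Hilbert space measurements.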

    Modeling views in the layered view model for XML using UML

    In data engineering, view formalisms are used to provide flexibility to users and user applications by allowing them to extract and elaborate data from the stored data sources. Meanwhile, since its introduction, Extensible Markup Language (XML) has rapidly emerged as the dominant standard for storing, describing and interchanging data among various web and heterogeneous data sources. In combination with XML Schema, XML provides rich facilities for defining and constraining user-defined data semantics and properties, a feature that is unique to XML. In this context, it is interesting to investigate traditional database features, such as view models and view design techniques, for XML. However, traditional view formalisms are strongly coupled to the data language and its syntax, so supporting views over semi-structured data models proves to be a difficult task. Therefore, in this paper we propose a Layered View Model (LVM) for XML with conceptual and schemata extensions. Our work is threefold: first, we propose an approach that separates the implementation and conceptual aspects of views, providing a clear separation of concerns and allowing the analysis and design of views to be decoupled from their implementation. Second, we define representations to express and construct these views at the conceptual level. Third, we define a view transformation methodology for XML views in the LVM, which carries out automated transformation to a view schema and a view query expression in an appropriate query language. To validate and apply the LVM concepts, methods and transformations developed, we also propose a view-driven application development framework with the flexibility to develop web and database applications for XML at varying levels of abstraction.
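
    The following is a minimal sketch, not the paper's methodology, of the underlying idea of a view over an XML source: a conceptual-level specification lists which elements the view exposes, and a transformation materialises the view using only the Python standard library. The element names (orders, customer, total) are assumptions made for the example.

        import xml.etree.ElementTree as ET

        # Toy view over an XML source: the specification says which elements the
        # view keeps; materialise_view() projects the source onto that view.
        SOURCE = """<orders>
          <order id="1"><customer>Ann</customer><total>120.50</total><internal>x</internal></order>
          <order id="2"><customer>Bob</customer><total>75.00</total><internal>y</internal></order>
        </orders>"""

        VIEW_SPEC = {"root": "order_summary", "keep": ["customer", "total"]}

        def materialise_view(source_xml, spec):
            src = ET.fromstring(source_xml)
            view_root = ET.Element(spec["root"])
            for order in src.findall("order"):
                row = ET.SubElement(view_root, "order", id=order.get("id"))
                for tag in spec["keep"]:      # project only the elements the view exposes
                    child = order.find(tag)
                    if child is not None:
                        ET.SubElement(row, tag).text = child.text
            return view_root

        print(ET.tostring(materialise_view(SOURCE, VIEW_SPEC), encoding="unicode"))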

    Challenges in Bridging Social Semantics and Formal Semantics on the Web

    This paper describes several results of Wimmics, a research lab whose name stands for web-instrumented man-machine interactions, communities, and semantics. The approaches introduced here rely on graph-oriented knowledge representation, reasoning and operationalization to model and support actors, actions and interactions in web-based epistemic communities. The research results are applied to support and foster interactions in online communities and to manage their resources.

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis applies a temporal theory to describe and model the patient journey in a hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare processes. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD) and business process modelling notation (BPMN), are intuitive but imprecise: they cannot fully capture the complexity of the activities involved or the full extent of the temporal constraints to a degree where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards do not offer any formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to specify temporal constraints between the starts and/or ends of processes, e.g. that the beginning of a process A precedes the start (or end) of a process B; however, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and in modelling it in a consistent way, which is not possible with the existing modelling techniques. This thesis addresses the above issues by proposing a framework that provides a knowledge base for accurately modelling patient flows, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components; this also assists in generalising the key terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The resulting catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, resolution theorem proving is used to show the structural features of the theory (model) and to establish that it is sound and complete. Once the theory is established as sound and complete, the next step is to provide its instantiation, achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows and enables analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*) based on the semantics given by the axiomatic system. A real-life case, the trauma patient pathway of the King's College Hospital accident and emergency (A&E) department, is used to validate the framework. It is divided into three patient flows that depict the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure and subsequently being discharged. The department's staff relied upon UML AD and BPMN to model these patient flows, and an evaluation of their representations is presented to show the shortfalls of these modelling standards for modelling patient flows. The last step is to model the patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
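
    As a toy sketch of the point-based reasoning described above (under assumed semantics; the thesis's PITL axioms and PG* transformation are far richer), process endpoints can be treated as points, each precedence constraint as a directed edge, and consistency checked by detecting cycles:

        from collections import defaultdict

        # Toy point-graph consistency check: an edge (p, q) asserts "point p precedes
        # point q"; a set of precedence constraints is inconsistent if it is cyclic.
        def consistent(constraints):
            graph = defaultdict(list)
            for before, after in constraints:
                graph[before].append(after)
            WHITE, GREY, BLACK = 0, 1, 2
            colour = defaultdict(int)

            def has_cycle(node):
                colour[node] = GREY
                for nxt in graph[node]:
                    if colour[nxt] == GREY or (colour[nxt] == WHITE and has_cycle(nxt)):
                        return True
                colour[node] = BLACK
                return False

            return not any(colour[p] == WHITE and has_cycle(p) for p in list(graph))

        # Hypothetical A&E fragment: triage starts before treatment, which ends before discharge.
        flow = [("start(triage)", "start(treatment)"),
                ("start(treatment)", "end(treatment)"),
                ("end(treatment)", "start(discharge)")]
        print(consistent(flow))                                             # True
        print(consistent(flow + [("start(discharge)", "start(triage)")]))   # False: cyclic precedence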

    A formal theory of conceptual modeling universals

    Conceptual Modeling is a discipline of great relevance to several areas in Computer Science. In a series of papers [1,2,3] we have been using the General Ontological Language (GOL) and its underlying upper-level ontology, proposed in [4,5], to evaluate the ontological correctness of conceptual models and to develop guidelines for how the constructs of a modeling language (UML) should be used in conceptual modeling. In this paper, we focus on the modeling metaconcepts of classifiers and objects from an ontological point of view. We use a philosophically and psychologically well-founded theory of universals to propose a UML profile for Ontology Representation and Conceptual Modeling. The formal semantics of the proposed modeling elements is presented in a language of modal logics with quantification restricted to Sortal universals.
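
    As a hedged illustration of the kind of modal constraint involved (standard notation, not quoted from the paper), a rigid universal applies to its instances necessarily, whereas an anti-rigid universal, such as a role or phase, does not:

        % Rigidity of a universal U: whatever instantiates U does so necessarily.
        \Box \forall x \,\big(U(x) \rightarrow \Box\, U(x)\big)
        % Anti-rigidity (typical of roles and phases such as Student):
        \Box \forall x \,\big(U(x) \rightarrow \Diamond \neg U(x)\big)
        % Sortal universals additionally carry an identity criterion for their instances.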

    Big Data and Changing Concepts of the Human

    Big Data has the potential to enable unprecedentedly rigorous quantitative modeling of complex human social relationships and social structures. When such models are extended to nonhuman domains, they can undermine anthropocentric assumptions about the extent to which these relationships and structures are specifically human. Discoveries of relevant commonalities with nonhumans may not make us less human, but they promise to challenge fundamental views of what it is to be human.