
    Traditional Accounting with Decentralised Ledger Technology

    Distributed ledger technology is believed by some to be the accounting system of the future, replacing the centuries-old double-entry accounting paradigm, as it has desirable characteristics such as tamper resistance. However, it might suffer from technology lock-in, as double-entry bookkeeping, due to its long-standing history, has provided the conceptual foundations for many laws, regulations and business practices. While some of these laws, regulations and practices might become obsolete as a result of distributed ledger technology, others might still prove valuable in a new technological context. While aiming to unlock the potential of distributed ledger technology in an accounting context, we also want to preserve the wisdom of accounting craftsmen. For this reason, the aim of this paper is to offer a bi-directional mapping between traditional double-entry bookkeeping and innovative paradigms that have proven their value in decentralised systems, of which distributed ledger technology is an exponent. This paper offers such a mapping for the Resource-Event-Agent paradigm.

    Node-attribute graph layout for small-world networks

    Small-world networks are a very commonly occurring type of graph in the real world, exhibiting a clustered structure that is not well represented by current graph layout algorithms. In many cases we also have information about the nodes in such graphs, typically depicted as node colour, shape or size. Here we demonstrate that these attributes can instead be used to lay out the graph in a high-dimensional data space. Then, using a dimension reduction technique, targeted projection pursuit, the graph layout can be optimised for displaying clustering. The technique outperforms force-directed layout methods in cluster separation when applied to a sample, artificially generated, small-world network.
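
The idea above can be sketched in a few lines: place each node at its attribute vector in high-dimensional space, then project to 2-D. The abstract uses targeted projection pursuit; the sketch below substitutes a plain PCA projection (via SVD) as a simpler stand-in, so the projection method and the toy data are assumptions, not the authors' implementation.

```python
# Sketch: lay out a graph from node attributes rather than edge forces.
# PCA stands in for targeted projection pursuit (an assumption).
import numpy as np

def attribute_layout(attributes):
    """Project an (n_nodes x n_attributes) matrix to 2-D coordinates."""
    X = np.asarray(attributes, dtype=float)
    X = X - X.mean(axis=0)               # centre each attribute column
    # PCA via SVD: the top two right-singular vectors define the plane
    # of maximum variance, onto which the nodes are projected.
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:2].T                  # n_nodes x 2 positions

# Toy example: two clusters of nodes with distinct attribute profiles.
attrs = [[1, 0, 0], [1, 0.1, 0], [0, 1, 1], [0, 0.9, 1.1]]
coords = attribute_layout(attrs)
print(coords.shape)  # (4, 2)
```

Nodes with similar attribute profiles land close together, so attribute-defined clusters separate in the layout without any force simulation.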

    Visualising computational intelligence through converting data into formal concepts


    A formal concept of culture in the classification of Alfred L. Kroeber and Clyde Kluckhohn

    The objective of this article is to analyse definitions of culture gathered by Alfred L. Kroeber and Clyde Kluckhohn and published in Culture. A Critical Review of Concepts and Definitions in 1952. This article emphasizes the possibility of re-analysing the material collected by these researchers (Kroeber–Kluckhohn Culture Classification, hereinafter referred to as KKCC). The article shows that the KKCC material constitutes a coherent conceptual and theoretical paradigm. This paradigm was subject to contextual, frequential and conceptual (Formal Concept Analysis, hereinafter referred to as FCA) analyses. The obtained research results enabled the author to develop a formal concept of culture of KKCC, which could be used as a model for further analysis. The final conclusions are as follows: (1) the notion of "culture" is definable only within the frameworks of a conceptually coherent paradigm; (2) determination of a paradigm requires a material repository (resp. text corpus); (3) contextual and frequential analyses enable one to index that kind of repository in order to determine general categories which will be used to develop a formal concept; (4) the formal concept of culture of KKCC constitutes the framework of all possible theoretical analyses concerning the meaning of the notion of "culture" in anthropology; (5) KKCC constitutes a representation of one theory of culture.

    In-Close, a fast algorithm for computing formal concepts

    This paper presents an algorithm, called In-Close, that uses incremental closure and matrix searching to quickly compute all formal concepts in a formal context. In-Close is based, conceptually, on a well-known algorithm called Close-By-One. The serial version of a recently published algorithm (Krajca, 2008) was shown to be in the order of 100 times faster than several well-known algorithms, and timings of other algorithms in reviews suggest that none of them are faster than Krajca. This paper compares In-Close to Krajca, discussing computational methods, data requirements and memory considerations. From experiments using several public data sets and random data, this paper shows that In-Close is in the order of 20 times faster than Krajca. In-Close is small, straightforward, requires no matrix pre-processing and is simple to implement.
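
To make "all formal concepts in a formal context" concrete, here is a brute-force enumeration over a toy context. This is deliberately not In-Close (which gains its speed from incremental closure and matrix searching); it only illustrates what the algorithms above compute. The context and object/attribute names are invented for illustration.

```python
# Sketch: enumerate all formal concepts of a small formal context by
# brute-force closure of every attribute subset. Exponential, but it
# shows what In-Close and Close-By-One compute far more efficiently.
from itertools import combinations

# Toy context: incidence relation as object -> set of attributes.
context = {
    "frog":  {"aquatic", "small"},
    "dog":   {"land", "small"},
    "whale": {"aquatic", "large"},
}
attributes = sorted({m for ms in context.values() for m in ms})

def extent(intent_set):
    """Objects having every attribute in `intent_set`."""
    return frozenset(g for g, ms in context.items() if intent_set <= ms)

def intent(extent_set):
    """Attributes shared by every object in `extent_set`."""
    shared = set(attributes)
    for g in extent_set:
        shared &= context[g]
    return frozenset(shared)

# A concept is a pair (extent, intent) closed under the two maps.
concepts = set()
for r in range(len(attributes) + 1):
    for combo in combinations(attributes, r):
        e = extent(set(combo))
        concepts.add((e, intent(e)))

for e, i in sorted(concepts, key=lambda c: (-len(c[0]), sorted(c[1]))):
    print(sorted(e), sorted(i))
```

For this 3-object, 4-attribute context the enumeration yields seven concepts, from the top concept (all objects, no shared attributes) down to the bottom concept (no objects, all attributes).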

    Terrorist threat assessment with formal concept analysis.

    The National Police Service Agency of the Netherlands developed a model to classify (potential) jihadists into four sequential phases of radicalism. The goal of the model is to identify a potential jihadist as early as possible, to prevent him or her from entering the next phase. Until now, this model has never been used to actively find new subjects. In this paper, we use Formal Concept Analysis to extract and visualize potential jihadists in the different phases of radicalism from a large set of reports describing police observations. We employ Temporal Concept Analysis to visualize how a possible jihadist radicalizes over time. The combination of these instruments allows for easy decision-making on where and when to act. Keywords: formal concept analysis; temporal concept analysis; contextual attribute logic; text mining; terrorist threat assessment.

    On the isomorphism problem of concept algebras

    Weakly dicomplemented lattices are bounded lattices equipped with two unary operations to encode a negation on concepts. They were introduced to capture the equational theory of concept algebras [Wi00]. They generalize Boolean algebras. Concept algebras are concept lattices, thus complete lattices, with a weak negation and a weak opposition. A special case of the representation problem for weakly dicomplemented lattices, posed in [Kw04], is whether complete weakly dicomplemented lattices are isomorphic to concept algebras. In this contribution we give a negative answer to this question. We also provide a new proof of a well-known result due to M. H. Stone [St36], saying that each Boolean algebra is a field of sets. Before these, we prove that the boundedness condition in the initial definition of weakly dicomplemented lattices is superfluous [Kw09].
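
For readers unfamiliar with the two operations, a minimal sketch of the standard definitions on a concept lattice (the notation is the usual FCA derivation operator, assumed here rather than taken from this abstract): for a formal context $(G, M, I)$ and a concept $(A, B)$, the weak negation and weak opposition are

```latex
% Weak negation: close the complement of the extent.
(A,B)^{\triangle} = \bigl( (G \setminus A)'' ,\; (G \setminus A)' \bigr)

% Weak opposition: close the complement of the intent.
(A,B)^{\triangledown} = \bigl( (M \setminus B)' ,\; (M \setminus B)'' \bigr)
```

On a Boolean algebra viewed as a concept algebra, both operations coincide with ordinary complementation, which is the sense in which weakly dicomplemented lattices generalize Boolean algebras.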

    Knowledge discovery through creating formal contexts

    Knowledge discovery is important for systems with computational intelligence, helping them learn and adapt to changing environments. By representing, in a formal way, the context in which an intelligent system operates, it is possible to discover knowledge through an emerging data technology called formal concept analysis (FCA). This paper describes a tool called FcaBedrock that converts data into formal contexts for FCA. Through a process of guided automation, data preparation techniques such as attribute exclusion and value restriction allow the data to be interpreted to meet the requirements of the analysis. Examples are given of how formal contexts can be created using FcaBedrock and then analysed for knowledge discovery, using real datasets. Creating formal contexts using FcaBedrock is shown to be straightforward and versatile, and large datasets are easily converted into a standard FCA format.
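
The core conversion step such a tool performs can be illustrated with nominal scaling: each (column, value) pair of a data table becomes one binary attribute of the formal context. The sketch below is a simplified illustration of that idea, not FcaBedrock's actual implementation, and the records are invented.

```python
# Sketch: turn tabular records into a formal context by nominal
# scaling, i.e. one binary attribute per (column, value) pair.
records = [
    {"species": "frog",  "habitat": "water"},
    {"species": "dog",   "habitat": "land"},
    {"species": "whale", "habitat": "water"},
]

def nominal_scale(records):
    """Return (attributes, incidence rows) as a binary formal context."""
    attrs = sorted({f"{col}={val}"
                    for rec in records for col, val in rec.items()})
    rows = []
    for rec in records:
        present = {f"{col}={val}" for col, val in rec.items()}
        rows.append([1 if a in present else 0 for a in attrs])
    return attrs, rows

attrs, rows = nominal_scale(records)
print(attrs)
for row in rows:
    print(row)
```

Real tools additionally support value restriction (keeping only some values) and attribute exclusion (dropping columns), which in this sketch would amount to filtering `attrs` before building the rows.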

    A case of using formal concept analysis in combination with emergent self organizing maps for detecting domestic violence.

    In this paper, we propose a framework for iterative knowledge discovery from unstructured text using Formal Concept Analysis and Emergent Self Organizing Maps. We apply the framework to a real-life case study using data from the Amsterdam-Amstelland police. The case zooms in on the problem of distilling concepts for domestic violence from the unstructured text in police reports. Our human-centered framework facilitates the exploration of the data and allows for an efficient incorporation of prior expert knowledge to steer the discovery process. This exploration resulted in the discovery of faulty case labellings, common classification errors made by police officers, confusing situations, missing values in police reports, etc. The framework was also used for iteratively expanding a domain-specific thesaurus. Furthermore, we showed how the presented method was used to develop a highly accurate and comprehensible classification model that automatically assigns a domestic or non-domestic violence label to police reports. Keywords: formal concept analysis; emergent self organizing map; text mining; actionable knowledge discovery; domestic violence.