
    Peirce's sign theory as an open-source R package.

    Throughout Peirce’s writing, we witness his developing vision of a machine that scientists will eventually be able to create. Nadin (2010) raised the question: Why do computer scientists continue to ignore Peirce’s sign theory? A review of the literature on Peirce’s theory and the semiotic machine reveals that many authors discussed the machine; however, they do not differentiate between a physical computer machine and its software. This paper discusses the problematic issues involved in converting Peirce’s theory into a programming language, machine and software application. We demonstrate this challenge by introducing Peirce’s sign theory as a software application that runs under an open-source R environment.
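    To make the conversion challenge concrete, here is a minimal Python sketch of Peirce's triadic sign (representamen, object, interpretant) as a data structure; the class and function names are illustrative assumptions, not the paper's actual R implementation.

        # Hypothetical sketch of Peirce's triadic sign; names are illustrative,
        # not taken from the paper's R package.
        from dataclasses import dataclass

        @dataclass
        class Sign:
            representamen: str  # the form the sign takes
            obj: str            # what the sign stands for
            interpretant: str   # the sense made of the sign

        def interpret(sign: Sign) -> Sign:
            # Semiosis: the interpretant becomes the representamen of a new
            # sign, modelling Peirce's open-ended chain of interpretation.
            return Sign(sign.interpretant, sign.obj,
                        "interpretation of '" + sign.interpretant + "'")

        smoke = Sign("smoke", "fire", "evidence of fire")
        print(interpret(smoke))  # a further sign in the chain of semiosis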

    The Oz programming model

    The Oz Programming Model (OPM) is a concurrent programming model subsuming higher-order functional and object-oriented programming as facets of a general model. This is particularly interesting for concurrent object-oriented programming, for which no comprehensive formal model existed until now. The model can be extended so that it can express encapsulated problem solvers, generalizing the problem-solving capabilities of constraint logic programming. OPM has been developed together with a concomitant programming language, Oz, which is designed for applications that require complex symbolic computations, organization into multiple agents, and soft real-time control. An efficient, robust, and interactive implementation of Oz is freely available.
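    The dataflow synchronization at the heart of such concurrent models can be illustrated outside Oz itself. The following Python sketch emulates an Oz-style single-assignment dataflow variable, where a reading thread blocks until another thread binds the value; it is an assumption-laden analogy, not the Oz implementation, in which the language provides this behaviour natively.

        # Emulating an Oz-style single-assignment dataflow variable in Python.
        import threading

        class DataflowVar:
            def __init__(self):
                self._bound = threading.Event()
                self._value = None

            def bind(self, value):
                if self._bound.is_set():
                    raise ValueError("dataflow variables are single-assignment")
                self._value = value
                self._bound.set()

            def get(self):
                self._bound.wait()  # block until some thread binds the variable
                return self._value

        x = DataflowVar()
        t = threading.Thread(target=lambda: print("consumer saw", x.get()))
        t.start()   # the consumer blocks on the unbound variable
        x.bind(42)  # binding the variable wakes the consumer
        t.join()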

    Semantic Web: Who is who in the field – A bibliometric analysis

    The Semantic Web (SW) is one of the main efforts aiming to enhance human and machine interaction by representing data in a machine-understandable way, so that machines can mediate data and services. It is a fast-moving and multidisciplinary field. This study conducts a thorough bibliometric analysis of the field by collecting data from Web of Science (WOS) and Scopus for the period 1960-2009. It utilizes a total of 44,157 papers with 651,673 citations from Scopus, and 22,951 papers with 571,911 citations from WOS. Based on these papers and citations, it evaluates the research performance of the SW by identifying the most productive players, major scholarly communication media, highly cited authors, influential papers and emerging stars.
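    The kind of aggregation behind such an analysis can be sketched briefly; the records and field names below are invented for illustration and do not reflect the WOS or Scopus export formats.

        # Hedged sketch of bibliometric aggregation over paper records.
        # The data and field names are illustrative assumptions.
        from collections import Counter

        papers = [
            {"authors": ["A. Smith", "B. Lee"], "citations": 120},
            {"authors": ["B. Lee"],             "citations": 45},
            {"authors": ["C. Wu", "A. Smith"],  "citations": 80},
        ]

        productivity = Counter(a for p in papers for a in p["authors"])
        citations = Counter()
        for p in papers:
            for a in p["authors"]:
                citations[a] += p["citations"]

        print(productivity.most_common(2))  # most productive authors
        print(citations.most_common(2))     # most highly cited authors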

    Apperceptive patterning: Artefaction, extensional beliefs and cognitive scaffolding

    In “Psychopower and Ordinary Madness” my ambition, as it relates to Bernard Stiegler’s recent literature, was twofold: 1) critiquing Stiegler’s work on exosomatization and artefactual posthumanism—or, more specifically, nonhumanism—to problematize approaches to media archaeology that rely upon technical exteriorization; 2) challenging how Stiegler engages with Giuseppe Longo and Francis Bailly’s conception of negative entropy. These efforts were directed by a prevalent techno-cultural qualifier: the rise of Synthetic Intelligence (including neural nets, deep learning, predictive processing and Bayesian models of cognition). This paper continues this project but first directs a critical analytic lens at the Derridean practice of the ontologization of grammatization from which Stiegler emerges, while also distinguishing how metalanguages operate in relation to object-oriented environmental interaction by way of inferentialism. Stalking continental (Kapp, Simondon, Leroi-Gourhan, etc.) and analytic traditions (e.g., Carnap, Chalmers, Clark, Sutton, Novaes, etc.), we move from artefacts to AI and Predictive Processing so as to link theories related to technicity with philosophy of mind. Simultaneously drawing forth Robert Brandom’s conceptualization of the roles that commitments play in retrospectively reconstructing the social experiences that lead to our endorsement(s) of norms, we complement this account with Reza Negarestani’s deprivatized account of intelligence while analyzing the equipollent role between language and media (both digital and analog).

    Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality

    We survey diverse approaches to the notion of information, from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts published in the same volume. Part II is dedicated to the relation between logic and information systems, within the scope of Kolmogorov algorithmic information theory. We present a recent application of Kolmogorov complexity: classification using compression, an idea with a provocative implementation by authors such as Bennett, Vitanyi and Cilibrasi. This stresses how Kolmogorov complexity, besides being a foundation of randomness, is also related to classification. Another approach to classification is also considered: the so-called "Google classification". It uses another original and attractive idea which is connected to classification using compression and to Kolmogorov complexity from a conceptual point of view. We present and unify these different approaches to classification in terms of Bottom-Up versus Top-Down operational modes, for which we point out the fundamental principles and the underlying duality. We look at the way these two dual modes are used in different approaches to information systems, particularly the relational model for databases introduced by Codd in the 1970s. This allows us to point out diverse forms of a fundamental duality. These operational modes are also reinterpreted in the context of the comprehension schema of axiomatic set theory ZF. This leads us to develop how Kolmogorov complexity is linked to intensionality, abstraction, classification and information systems.
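    Classification by compression admits a compact illustration. The normalized compression distance of Cilibrasi and Vitanyi approximates the (uncomputable) Kolmogorov complexity C with a real compressor: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)). A minimal Python sketch using zlib, with made-up example strings:

        # Normalized compression distance, with zlib standing in for
        # Kolmogorov complexity, after Cilibrasi and Vitanyi.
        import zlib

        def C(data: bytes) -> int:
            return len(zlib.compress(data, 9))  # compressed size approximates K

        def ncd(x: bytes, y: bytes) -> float:
            cx, cy = C(x), C(y)
            return (C(x + y) - min(cx, cy)) / max(cx, cy)

        a = b"the quick brown fox jumps over the lazy dog " * 20
        b = b"the quick brown fox leaps over the lazy dog " * 20
        c = b"colorless green ideas sleep furiously tonight " * 20
        print(ncd(a, b))  # small: similar strings compress well together
        print(ncd(a, c))  # larger: dissimilar strings share less structure

    Objects whose pairwise NCD is small are grouped together, which is how compression yields a classification without any domain-specific features.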

    A contribution for data processing and interoperability in Industry 4.0

    Master's dissertation in Systems Engineering. Industry 4.0 is expected to drive a significant change in companies' growth. The idea is to cluster important information from all of a company's supply chain, enabling valuable decision-making while permitting interactions between machines and humans in real time. Autonomous systems powered by Information Technologies, such as the Internet of Things (IoT), Cyber-Physical Systems (CPS), and Big Data and analytics, are enablers of Industry 4.0. IoT gathers information from every piece of the big puzzle that is the manufacturing process. Cloud Computing stores all that information in one place. People share information across the company, its supply chain and its hierarchical levels through the integration of systems. Finally, Big Data and analytics provide the intelligence that will improve Industry 4.0. Methods and tools in Industry 4.0 are designed to increase interoperability across industrial stakeholders. To make the complete process possible, standardisation must be implemented across the company. Two reference models for Industry 4.0 were studied: RAMI 4.0 and IIRA. RAMI 4.0, a German initiative, focuses on industrial digitalization, while IIRA, an American initiative, focuses on the "Internet of Things" world, such as energy, healthcare and transportation. The two initiatives aim to obtain intelligence data from processes while enabling interoperability among systems. Representatives of the two reference models are working together on the technological interface standards that could be used by companies joining this new era. This study focuses on the interoperability between systems. Even though there must be a model to guide a company into Industry 4.0, that model ought to be mutable and flexible enough to handle differences in manufacturing processes; for example, automotive Industry 4.0 will not take the same approach as aviation Industry 4.0.
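    As a concrete illustration of the interoperability problem the dissertation targets, the sketch below normalizes vendor-specific machine messages into one shared schema; all field names and units are assumptions, not definitions from RAMI 4.0 or IIRA.

        # Illustrative adapter layer: two vendor formats mapped onto a
        # common schema so downstream analytics see uniform records.
        def from_vendor_a(msg: dict) -> dict:
            return {"machine_id": msg["id"], "temp_c": msg["temperature"]}

        def from_vendor_b(msg: dict) -> dict:
            return {"machine_id": msg["machine"],
                    "temp_c": (msg["temp_f"] - 32) * 5 / 9}  # Fahrenheit to Celsius

        readings = [from_vendor_a({"id": "press-1", "temperature": 74.0}),
                    from_vendor_b({"machine": "lathe-2", "temp_f": 165.2})]
        for r in readings:
            print(r)  # both records now share one schema and one unit system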