
    Towards a Collaborative Process Platform: Publishing Processes according to the Linked Data Principles

    Research in the area of process modeling and analysis has a long-established tradition. Process modeling is used, among other domains, in medicine to define an ideal workflow that ensures efficient treatment of patients. These processes are often defined and maintained by multiple people, and other people want to compare the published processes with their own processes in order to identify improvements. Current solutions provide tools to model processes locally and export them in standard formats for exchange. There are also collaboration tools for modeling processes jointly and seeing changes dynamically. However, these solutions do not publish the data according to the Linked Data principles. Enriching processes with semantic information is useful for enhanced analysis, yet different users can each contribute only part of the meta-information on the same process steps. To address these problems we 1) developed an intuitive, open-source extension for Semantic MediaWiki that supports the graphical modeling of processes and stores the information in a structured way; 2) enable the processes to be enriched with semantics from ontologies and knowledge graphs, with references to external data sources; and 3) provide adapted views on meta-information so that users are not overwhelmed with unnecessary information.
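
    As a rough, hedged illustration of the Linked Data idea this abstract describes, the sketch below publishes a single process step as RDF with rdflib. The namespace, property names, and external concept URI are assumptions made for illustration only, not the vocabulary actually used by the Semantic MediaWiki extension.

```python
# Minimal sketch: one clinical process step published as Linked Data.
# All URIs and property names below are illustrative assumptions.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/process/")  # hypothetical namespace

g = Graph()
step = EX.AdmitPatient
g.add((step, RDF.type, EX.ProcessStep))
g.add((step, RDFS.label, Literal("Admit patient")))
g.add((step, EX.nextStep, EX.TriagePatient))  # workflow order
# Reference to an external knowledge graph for semantic enrichment.
g.add((step, RDFS.seeAlso, URIRef("http://example.org/external-kg/PatientAdmission")))

print(g.serialize(format="turtle"))
```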

    Analysis and assessment of a knowledge based smart city architecture providing service APIs

    The main technical issues regarding smart city solutions relate to data gathering, aggregation, reasoning, data analytics, access, and service delivery via Smart City APIs (Application Program Interfaces). Different kinds of Smart City APIs enable smart city services and applications, and their effectiveness depends on the architectural solutions used to pass from data to services for city users and operators, exploiting data analytics and presenting services via APIs. There is therefore strong activity on defining smart city architectures that cope with this complexity and put in place a significant range of services and processes. This paper presents the work performed in the context of the Sii-Mobility smart city project on defining a smart city architecture that addresses a wide range of processes and data. To this end, state-of-the-art smart city architectures for data aggregation and for Smart City APIs are compared, highlighting the use of semantic ontologies and knowledge bases in data aggregation and in the production of smart services. The proposed solution aggregates and reconciles data (open and private, static and real-time) using reasoning/smart algorithms to enable sophisticated service delivery via the Smart City API. The work has been developed in the context of the Sii-Mobility national smart city project on mobility and transport integrated with smart city services, with the aim of achieving more sustainable mobility and transport systems. Sii-Mobility is grounded on the Km4City ontology and tools for smart city data aggregation, analytics support, and service production exploiting the Smart City API. To this end, the Sii-Mobility/Km4City APIs have been compared with state-of-the-art solutions. Moreover, the proposed architecture has been assessed in terms of performance and of computational and network costs, using measures that can easily be taken on an on-premises private cloud. The computational costs and workloads of the data ingestion and data analytics processes have been assessed to identify suitable measures for estimating the required resources. Finally, recent API consumption data are presented.
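
    As a hedged sketch of how a service might consume such an aggregated knowledge base via SPARQL, the example below queries a hypothetical endpoint. The endpoint URL and predicate names are placeholders and are not the actual Km4City/Sii-Mobility vocabulary or Smart City API.

```python
# Hedged sketch: querying an aggregated smart-city knowledge base via SPARQL.
# Endpoint URL and predicates are invented for illustration.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/smartcity/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX ex: <http://example.org/smartcity#>
    SELECT ?sensor ?value WHERE {
        ?sensor a ex:ParkingSensor ;
                ex:lastObservation ?value .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["sensor"]["value"], row["value"]["value"])
```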

    Taxonomy, Semantic Data Schema, and Schema Alignment for Open Data in Urban Building Energy Modeling

    Urban Building Energy Modeling (UBEM) is a critical tool for quantitative analysis of building decarbonization, sustainability, building-to-grid integration, and renewable energy applications at city, regional, and national scales. Researchers usually use open data as inputs to build and calibrate UBEM. However, open data come from thousands of sources covering weather, building characteristics, and other perspectives, and the lack of semantic features in open data further increases the engineering effort needed to process the information into inputs that UBEM can use directly. In this paper, we first reviewed the open data types used for UBEM and developed a taxonomy to categorize open data. Based on that, we further developed a semantic data schema for each open data category to maintain data consistency and improve model automation for UBEM. In a case study, we use three popular open datasets to show how they can be automatically processed against the proposed semantic data schema using large language models. The accurate results generated by the large language models indicate the machine-readability and human-interpretability of the developed semantic data schema.
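
    To make the idea of a per-category semantic data schema concrete, the sketch below defines one possible schema for building characteristics and maps a raw open-data record onto it. The field names and units are assumptions for illustration, not the schema actually proposed in the paper.

```python
# Illustrative sketch of a semantic data schema for one UBEM input category
# (building characteristics). Field names and units are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BuildingRecord:
    building_id: str
    footprint_area_m2: float                    # gross footprint area, square metres
    year_built: Optional[int] = None
    building_type: Optional[str] = None         # e.g. "office", "residential"
    window_wall_ratio: Optional[float] = None   # 0..1

def from_open_data(raw: dict) -> BuildingRecord:
    """Map a raw open-data record (arbitrary source keys) onto the schema."""
    return BuildingRecord(
        building_id=str(raw.get("id")),
        footprint_area_m2=float(raw.get("area", 0.0)),
        year_built=raw.get("year_built"),
        building_type=raw.get("use_type"),
        window_wall_ratio=raw.get("wwr"),
    )

print(from_open_data({"id": "B-001", "area": "850", "use_type": "office"}))
```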

    Proceedings of the 12th European Workshop on Natural Language Generation (ENLG 2009)


    Knowledge Components and Methods for Policy Propagation in Data Flows

    Data-oriented systems and applications are at the centre of current developments of the World Wide Web (WWW). On the Web of Data (WoD), information sources can be accessed and processed for many purposes. Users need to be aware of any licences or terms of use associated with the data sources they want to use. Conversely, publishers need support in assigning the appropriate policies alongside the data they distribute. In this work, we tackle the problem of policy propagation in data flows - an expression that refers to the way data is consumed, manipulated and produced within processes. We pose the question of what kinds of components are required, and how they can be acquired, managed, and deployed, to support users in deciding which policies propagate to the output of a data-intensive system from those associated with its input. We observe three scenarios: applications of the Semantic Web, workflow reuse in Open Science, and the exploitation of urban data in City Data Hubs. Starting from the analysis of Semantic Web applications, we propose a data-centric approach to semantically describe processes as data flows: the Datanode ontology, which comprises a hierarchy of the possible relations between data objects. By means of Policy Propagation Rules, it is possible to link data flow steps and the policies derivable from semantic descriptions of data licences. We show how these components can be designed, how they can be effectively managed, and how to reason efficiently with them. In a second phase, the developed components are verified using a Smart City Data Hub as a case study, where we developed an end-to-end solution for policy propagation. Finally, we evaluate our approach and report on a user study aimed at assessing both the quality and the value of the proposed solution.
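
    As a toy illustration of policy propagation over a data flow, the sketch below encodes a few rules keyed on the relation between input and output data, echoing the Datanode idea of typed relations between data objects. The specific relations, policy labels, and rules are illustrative assumptions, not the Policy Propagation Rules developed in the thesis.

```python
# Toy sketch of policy propagation over one data-flow step. Relation names,
# policy labels and rules are invented for illustration.
PROPAGATION_RULES = {
    # (relation from input to output, policy on the input) -> does it propagate?
    ("isCopyOf", "require-attribution"): True,
    ("isCopyOf", "no-commercial-use"): True,
    ("isStatisticalSummaryOf", "require-attribution"): True,
    ("isStatisticalSummaryOf", "no-commercial-use"): False,  # summary treated as exempt here
}

def propagate(relation: str, input_policies: set) -> set:
    """Return the policies that carry over to the output of this data-flow step."""
    return {p for p in input_policies if PROPAGATION_RULES.get((relation, p), False)}

print(propagate("isStatisticalSummaryOf", {"require-attribution", "no-commercial-use"}))
# -> {'require-attribution'}
```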

    Query Interface for Smart City Internet of Things Data Marketplaces: A Case Study

    Cities are increasingly being augmented with sensors through public, private, and academic sector initiatives. Most of the time, these sensors are deployed with a primary purpose (objective) in mind (e.g., deploying sensors to understand noise pollution) by a sensor owner (i.e., the organization that invests in the sensing hardware, for example, a city council). Over the last few years, communities undertaking smart city development projects have understood the importance of making sensor data available to a wider community, beyond its primary usage. Different business models have been proposed to achieve this, including the creation of data marketplaces. The vision is to encourage new start-ups and small and medium-scale businesses to create novel products and services using sensor data, generating additional economic value. Currently, data are sold as pre-defined independent datasets (e.g., noise level and parking status data may be sold separately). This approach creates several challenges, such as (i) difficulties in pricing, which lead to higher per-dataset prices, (ii) higher network communication and bandwidth requirements, and (iii) information overload for data consumers (i.e., those who purchase data). We investigate the benefit of semantic representation and its reasoning capabilities for creating a business model that offers data on demand within smart city Internet of Things (IoT) data marketplaces. The objective is to help data consumers (i.e., small and medium enterprises (SMEs)) acquire the most relevant data they need. We demonstrate the utility of our approach by integrating it into a real-world IoT data marketplace (developed by the synchronicity-iot.eu project). We discuss design decisions and their consequences (i.e., trade-offs) for the choice and selection of datasets. Subsequently, we present a series of data modeling principles and recommendations for implementing IoT data marketplaces.
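
    The sketch below illustrates the on-demand selection idea in the simplest possible terms: instead of buying whole pre-defined datasets, a consumer states a need and only matching streams are returned. The catalogue entries and tag vocabulary are invented for illustration and do not reflect the marketplace's actual data model or API.

```python
# Minimal sketch of on-demand dataset selection in an IoT data marketplace.
# Catalogue entries and theme tags are invented for illustration.
CATALOGUE = [
    {"stream": "noise-sensor-12", "themes": {"noise", "environment"}, "area": "city-centre"},
    {"stream": "parking-bay-77",  "themes": {"parking", "mobility"},  "area": "harbour"},
    {"stream": "air-quality-03",  "themes": {"air", "environment"},   "area": "city-centre"},
]

def select_streams(needed_themes: set, area: str) -> list:
    """Return only the streams relevant to the consumer's stated purpose and area."""
    return [
        entry["stream"]
        for entry in CATALOGUE
        if entry["area"] == area and entry["themes"] & needed_themes
    ]

# An SME building a quiet-route app for the city centre:
print(select_streams({"noise", "air"}, "city-centre"))
```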

    Understanding the intended and enacted National Certificate Vocational English curriculum

    A thesis submitted in fulfilment of the requirements of the degree of Doctor of Philosophy to the Faculty of Humanities, University of the Witwatersrand, Johannesburg, 2017. This thesis is premised on the notion that the perceived lack of quality of curriculum delivery in the vocational education sector in South Africa is probably due in part to weaknesses in the content knowledge selected for inclusion in the curriculum of the various programmes offered in the sector. The thesis examines the nature of knowledge specified in the English subject offered in Technical and Vocational Education and Training (TVET) Colleges. Drawing on Basil Bernstein’s notion of the pedagogic device, the study follows the English curriculum from the production field, where new ideas are created and modified, to the recontextualization field, where curriculum designers and textbook writers produce written curriculum documents, to the reproduction field, where the students are taught and examined. The study further examines English lecturers’ perceptions and understanding of the curriculum they teach from. My findings indicate that the English curriculum follows an outcomes-based design structure and displays a lack of conceptual integration, knowledge sequencing and progression. The approaches to the teaching of English which inform the construction of the intended curriculum display characteristics of a generic, horizontal nature. The intended curriculum does not incorporate features that encourage a mastery of the technical terms appropriate to the different occupational fields followed by the TVET College students. The design structure of the curriculum fails to guide lecturers in unpacking approaches to the teaching of English and how to use them in their teaching, and does not clarify the progression process or ways of aligning lesson planning to the occupational needs of the students. An analysis of this curriculum identifies strengths and weaknesses, highlights accomplishments, and focuses on realistic policy alternatives for the TVET sector, curriculum design, and pedagogical and assessment practices.

    After Scotland: Irvine Welsh and the Ethic of Emergence

    In “After Scotland: Irvine Welsh and the Ethic of Emergence,” the author’s objective is to mirror what he argues is the Scottish writer Irvine Welsh’s objective: to chart out a future Scotland guided by a generative life ethic. In order to achieve this objective, the author lays open and reengages Scotland’s past, discovers and commits to neglected or submerged materials and energies in its past, demonstrates how Welsh’s work is faithful to those and newly produced materials and energies, and suggests that Welsh’s use of those materials and energies enables readers to envision a new Scotland that will be integral to an alternative postmodern world that countervails one ruled by late capital. Each chapter builds toward a Marxist ethic of emergence, which is composed of four virtues uncovered in Scotland’s historical-material fabric: congregation, integration, emergence, and forgiveness. To bring these virtues to the surface, the author historically grounds Welsh’s novels and short stories—Trainspotting, Glue, Porno, Filth, “The Granton Star Cause,” “The Two Philosophers,” and Marabou Stork Nightmares. Through this historiographical process, each virtue is uncovered and analyzed in the context of a particular historical period: medieval, Reformation, Enlightenment, and postmodern. Each context presents a unique set of materials and energies; each also presents an epistemological and ethical focus. The author brings the first three contexts and virtues together to formulate the ethic of emergence within the postmodern context. Throughout, the author stresses how this ethic and each of its virtues are embedded in Welsh’s work and in Scotland’s historical-material fabric. The author then suggests what he and Welsh hope will emerge from that fabric according to such an ethic. Because Welsh is a contemporary writer who has gained relatively little attention from literary scholars, another aim of this study is to situate Welsh’s work by connecting it with literature produced inside and outside of the Scottish and postmodern contexts: e.g., Gaelic prehistoric and epic literature, Chaucer, morality plays, Robert Burns, and the modern mystery genre. The author concludes the study with an afterword, relating his project to recent events in Scottish politics.

    The (un)becoming-Scot: Irvine Welsh, Gilles Deleuze and the minor literature of Scotland after Scotland

    This thesis examines the works of Scottish novelist Irvine Welsh alongside the philosophical works of French poststructuralists Gilles Deleuze and Félix Guattari as a case study for minor literature. By utilising Deleuze and Guattari’s aesthetic philosophies in a deep reading of Welsh’s novels, the thesis hopes to highlight the post-national potentials within both minor literature theory and the literary philosophy of Irvine Welsh. The first half of the thesis consists of three chapters that highlight the three categorical elements of minor literature: minor use of a major language, anti-establishmentarian politics, and a collective value for audiences. In Chapters One, Two and Three, I not only describe these factors but also examine the linguistic, political, and communitarian elements of unbecoming-Scottish throughout Welsh’s novels. The second half of the thesis focuses more specifically on the ways in which becoming and unbecoming can alter Welsh’s view of Scottish cultural and national identity, which, for him, begins in a critique of masculinity, violence, colonial histories, religious identity and the problems of family. Therefore, Chapters Four, Five and Six respond to the three elements of Welsh’s critiques of majoritarian national identity: a becoming-woman, an unbecoming-man and a new becoming-pack, modes of existential transformation that challenge both patriarchy and the institution of family. Throughout the thesis, I hope to illustrate how the minor voices of Welsh’s works reflect the minor voices of other postnational, post-industrial writers and artists. In reading Welsh with Deleuze as a minor artist, we might find some radical value in the transgressive, cruel and brutal aesthetics of such an ‘unbecoming Scot’, Irvine Welsh. Like his characters who must face the terror of Scotland after Scotland, industry and country obliterated by failed attempts at independence and the growth of global neoliberal capitalism, this thesis faces the major, molar and dominant facets of national, linguistic, cultural, gendered or racial identity construction in Welsh’s novels, and thus seeks to establish a universal response to poverty and violence: to choose life.

    Information handling: Concepts which emerged in practical situations and are analysed cybernetically

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.