
    Knowledge Graph Building Blocks: An easy-to-use Framework for developing FAIREr Knowledge Graphs

    Knowledge graphs and ontologies provide promising technical solutions for implementing the FAIR Principles for Findable, Accessible, Interoperable, and Reusable data and metadata. However, they also come with their own challenges. Nine such challenges are discussed and associated with the criterion of cognitive interoperability and specific FAIREr principles (FAIR + Explorability raised) that they fail to meet. We introduce an easy-to-use, open-source knowledge graph framework that is based on knowledge graph building blocks (KGBBs). KGBBs are small information modules for knowledge processing, each based on a specific type of semantic unit. By interrelating several KGBBs, one can specify a KGBB-driven FAIREr knowledge graph. Besides implementing semantic units, the KGBB Framework clearly distinguishes and decouples an internal in-memory data model from data storage, data display, and data access/export models. We argue that this decoupling is essential for solving many problems of knowledge management systems. We discuss the architecture of the KGBB Framework as we envision it, comprising (i) an openly accessible KGBB-Repository for different types of KGBBs; (ii) a KGBB-Engine for managing and operating FAIREr knowledge graphs (including automatic provenance tracking, editing changelog, and versioning of semantic units); (iii) a repository for KGBB-Functions; and (iv) a low-code KGBB-Editor with which domain experts can create new KGBBs and specify their own FAIREr knowledge graph without having to think about semantic modelling. We conclude by discussing the nine challenges and how the KGBB Framework provides solutions for the issues they raise. While most of what we discuss here is entirely conceptual, we can point to two prototypes that demonstrate the feasibility in principle of using semantic units and KGBBs to manage and structure knowledge graphs.
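
    The abstract does not specify an API, but the core idea of a KGBB can be illustrated with a small, hypothetical sketch: each building block is tied to one type of semantic unit and bundles display and export behaviour for it, while the in-memory data (plain triples) stays decoupled from storage, display, and export. All names and fields below are illustrative assumptions, not the framework's actual interface.

        # Hypothetical sketch only; names and structure are assumptions, not the KGBB Framework's API.
        from dataclasses import dataclass, field
        from typing import Callable, Dict, List, Tuple

        Triple = Tuple[str, str, str]  # (subject, predicate, object) -- the in-memory data model

        @dataclass
        class SemanticUnit:
            unit_type: str                              # e.g. "weight-measurement" (illustrative)
            triples: List[Triple] = field(default_factory=list)

        @dataclass
        class KGBB:
            unit_type: str                              # which semantic-unit type this block handles
            display: Callable[[SemanticUnit], str]      # data-display model
            export: Callable[[SemanticUnit], Dict]      # data-access/export model

        def render(units: List[SemanticUnit], blocks: Dict[str, KGBB]) -> List[str]:
            """Route each semantic unit to the KGBB registered for its type."""
            return [blocks[u.unit_type].display(u) for u in units if u.unit_type in blocks]

        # Usage: a single illustrative weight-measurement unit and its KGBB.
        weight_kgbb = KGBB(
            unit_type="weight-measurement",
            display=lambda u: "; ".join(f"{s} {p} {o}" for s, p, o in u.triples),
            export=lambda u: {"type": u.unit_type, "triples": u.triples},
        )
        unit = SemanticUnit("weight-measurement", [("specimen:1", "weighs", "2.3 kg")])
        print(render([unit], {"weight-measurement": weight_kgbb}))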

    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when properly developed, including the fields of technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline of the Metaverse and lack a holistic view of the entire process. To this end, a more holistic, multi-disciplinary, in-depth, and academic- and industry-oriented review is required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications, and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. Also, for each of these components, we examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies to support decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and find their opportunities and potential for contribution.
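
    As a rough illustration of the layered structure described above, the sketch below encodes the three pipeline layers as a small data structure that every enabling domain cuts across; the component lists are abridged examples chosen for illustration, not the survey's full taxonomy.

        # Illustrative only: abridged component lists, not the survey's complete taxonomy.
        from typing import Dict, List

        PIPELINE: Dict[str, List[str]] = {
            "infrastructure": ["computing", "networking", "communications", "hardware"],
            "environment digitization": ["scene capture", "3D modelling", "rendering"],        # assumed examples
            "user interactions": ["avatars", "immersive VR interfaces", "social engagement"],  # assumed examples
        }

        ENABLERS: List[str] = ["Artificial Intelligence", "Security & Privacy", "Blockchain",
                               "Business", "Ethics", "Social"]

        # Every layer's components are examined against every enabling domain.
        for layer, components in PIPELINE.items():
            for component in components:
                pairs = [(component, enabler) for enabler in ENABLERS]
                print(f"{layer} / {component}: {len(pairs)} enabler impacts to review")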

    Corporate Social Responsibility: the institutionalization of ESG

    Understanding the impact of Corporate Social Responsibility (CSR) on firm performance as it relates to industries reliant on technological innovation is a complex and perpetually evolving challenge. To thoroughly investigate this topic, this dissertation adopts an economics-based structure to address three primary hypotheses. This structure allows each hypothesis to stand essentially as a standalone empirical paper, unified by an overall analysis of the nature of the impact that ESG has on firm performance. The first hypothesis explores how the evolution of CSR into the modern quantified iteration of ESG has led to the institutionalization and standardization of the CSR concept. The second hypothesis fills gaps in the existing literature testing the relationship between firm performance and ESG by finding that the relationship is significantly positive in long-term, strategic metrics (ROA and ROIC) and that there is no correlation in short-term metrics (ROE and ROS). Finally, the third hypothesis states that if a firm has a long-term strategic ESG plan, as proxied by the publication of CSR reports, then it is more resilient to damage from controversies. This is supported by the finding that pro-ESG firms consistently fared better than their counterparts in both financial and ESG performance, even in the event of a controversy. However, firms with consistent reporting are also held to a higher standard than their non-reporting peers, suggesting a higher-risk, higher-reward dynamic. These findings support the theory of good management, in that long-term strategic planning is both immediately economically beneficial and serves as a means of risk management and social impact mitigation. Overall, this work contributes to the literature by filling gaps in understanding the nature of the impact that ESG has on firm performance, particularly from a management perspective.
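
    A hedged sketch of the kind of test implied by the second hypothesis is shown below: firm performance metrics are regressed on an ESG score, separately for long-term (ROA, ROIC) and short-term (ROE, ROS) measures. The data file, column names, and control variables are hypothetical placeholders; the dissertation's actual data, controls, and estimator may differ.

        # Hypothetical sketch: file name, columns, and controls are placeholders, not the dissertation's data.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("firm_panel.csv")  # assumed firm-year panel with ESG scores and performance metrics

        for metric in ["roa", "roic", "roe", "ros"]:  # long-term vs short-term performance measures
            model = smf.ols(f"{metric} ~ esg_score + firm_size + leverage", data=df).fit()
            print(f"{metric}: beta(esg_score) = {model.params['esg_score']:.4f}, "
                  f"p = {model.pvalues['esg_score']:.4f}")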

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting. This could benefit complex sectors that have only scarce data with which to predict business viability. To begin the execution of the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks that form a risk taxonomy. Labour was the most commonly reported top challenge. Therefore, research was conducted to explore lean principles to improve productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk for VF projects. This enabled flexible computation without precise production or financial data to improve economic estimation accuracy. The model assessed two VPF cases (one in the UK and another in Japan), demonstrating the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and the viability of the UK and Japan cases. The environmental impact assessment model was developed, allowing VPF operators to evaluate their carbon footprint compared to traditional agriculture using life-cycle assessment. I explore strategies for net-zero carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is considered. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
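
    The probabilistic evaluation described above can be sketched, under stated assumptions, as a simple Monte Carlo simulation in which uncertain production and cost variables are drawn from ranges and an annual-profit distribution is computed. All figures below are invented placeholders, and the method is a plain simulation rather than the thesis's calibrated imprecise-data model.

        # Illustrative Monte Carlo sketch; all ranges are invented placeholders, not thesis data.
        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # number of simulated scenarios

        yield_kg   = rng.uniform(30_000, 60_000, N)    # annual crop yield (kg)
        price      = rng.uniform(6.0, 10.0, N)         # selling price (currency/kg)
        energy     = rng.uniform(150_000, 300_000, N)  # annual energy cost
        labour     = rng.uniform(200_000, 350_000, N)  # annual labour cost (the most-reported challenge)
        other_opex = rng.uniform(80_000, 150_000, N)   # packaging, rent, maintenance, etc.

        profit = yield_kg * price - (energy + labour + other_opex)

        print(f"median annual profit: {np.median(profit):,.0f}")
        print(f"probability of loss:  {np.mean(profit < 0):.1%}")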

    The determinants of value addition: a critical analysis of the global software engineering industry in Sri Lanka

    It was evident from the literature that the perceived value delivery of the global software engineering industry is low due to various factors. Therefore, this research concerns global software product companies in Sri Lanka and explores the software engineering methods and practices for increasing value addition. The overall aim of the study is to identify the key determinants of value addition in the global software engineering industry and to critically evaluate their impact on software product companies, helping to maximise value addition and ultimately assure the sustainability of the industry. An exploratory research approach was used initially, since findings would emerge as the study unfolded. A mixed-methods approach was employed, as the literature itself was inadequate to investigate the problem effectively and formulate the research framework. Twenty-three face-to-face online interviews were conducted with subject matter experts covering all the disciplines from the targeted organisations, and these were combined with the literature findings as well as the outcomes of market research conducted by both government and non-government institutes. Data from the interviews were analysed using NVivo 12. The findings of the existing literature were verified through the exploratory study, and the outcomes were used to formulate the questionnaire for the public survey. After cleansing the total responses received, 371 responses were considered for the data analysis, which was carried out in SPSS 21 with an alpha level of 0.05. An internal consistency test was performed before the descriptive analysis. After assuring the reliability of the dataset, correlation tests, multiple regression, and analysis of variance (ANOVA) were carried out to meet the research objectives. Five determinants of value addition were identified, along with the key themes for each area: staffing, delivery process, use of tools, governance, and technology infrastructure. Cross-functional, self-organised teams built around value streams, a properly interconnected software delivery process with the right governance in the delivery pipelines, appropriate selection of tools, and the right infrastructure all increase value delivery. Moreover, the constraints on value addition are poor interconnection of internal processes, rigid functional hierarchies, inaccurate selection and use of tools, inflexible team arrangements, and inadequate focus on technology infrastructure. The findings add to the existing body of knowledge on increasing value addition through effective processes, practices, and tools, and on the impacts of their inaccurate application in the global software engineering industry.
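
    The analysis pipeline described above (run in SPSS 21 in the study) can be approximated in code as shown below: an internal-consistency check, then correlation, multiple regression, and ANOVA at an alpha level of 0.05. The survey file, column names, and grouping variable are hypothetical placeholders, not the study's actual dataset.

        # Rough analogue of the SPSS workflow; file and column names are assumed placeholders.
        import pandas as pd
        import statsmodels.formula.api as smf
        from scipy import stats

        df = pd.read_csv("survey_371.csv")  # assumed cleaned responses (n = 371)
        determinants = ["staffing", "delivery_process", "tools", "governance", "infrastructure"]

        def cronbach_alpha(items: pd.DataFrame) -> float:
            """Internal-consistency estimate for a set of scale items."""
            k = items.shape[1]
            return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

        print("Cronbach's alpha:", round(cronbach_alpha(df[determinants]), 3))
        print(df[determinants + ["value_addition"]].corr())                  # correlation test

        ols = smf.ols("value_addition ~ " + " + ".join(determinants), data=df).fit()
        print(ols.summary())                                                 # multiple regression

        groups = [g["value_addition"].values for _, g in df.groupby("org_role")]
        print(stats.f_oneway(*groups))                                       # one-way ANOVA across an assumed grouping factor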

    Annals [...].

    Pedometrics: innovation in the tropics; Legacy data: how to make it useful?; Advances in soil sensing; Pedometric guidelines for systematic soil surveys. Online event. Coordinated by: Waldir de Carvalho Junior, Helena Saraiva Koenow Pinheiro, Ricardo Simão Diniz Dalmolin.

    Investigating and mitigating the role of neutralisation techniques on information security policies violation in healthcare organisations

    Healthcare organisations today rely heavily on Electronic Medical Records systems (EMRs), which have become highly crucial IT assets that require significant security efforts to safeguard patients’ information. Individuals who have legitimate access to an organisation’s assets to perform their day-to-day duties but intentionally or unintentionally violate information security policies can jeopardise their organisation’s information security efforts and cause significant legal and financial losses. In the information security (InfoSec) literature, several studies emphasised the necessity to understand why employees behave in ways that contradict information security requirements but have offered widely different solutions. In an effort to respond to this situation, this thesis addressed the gap in the information security academic research by providing a deep understanding of the problem of medical practitioners’ behavioural justifications to violate information security policies and then determining proper solutions to reduce this undesirable behaviour. Neutralisation theory was used as the theoretical basis for the research. This thesis adopted a mixed-method research approach that comprises four consecutive phases, and each phase represents a research study that was conducted in light of the results from the preceding phase. The first phase of the thesis started by investigating the relationship between medical practitioners’ neutralisation techniques and their intention to violate information security policies that protect a patient’s privacy. A quantitative study was conducted to extend the work of Siponen and Vance [1] through a study of the Saudi Arabia healthcare industry. The data was collected via an online questionnaire from 66 Medical Interns (MIs) working in four academic hospitals. The study found that six neutralisation techniques—(1) appeal to higher loyalties, (2) defence of necessity, (3) the metaphor of ledger, (4) denial of responsibility, (5) denial of injury, and (6) condemnation of condemners—significantly contribute to the justifications of the MIs in hypothetically violating information security policies. The second phase of this research used a series of semi-structured interviews with IT security professionals in one of the largest academic hospitals in Saudi Arabia to explore the environmental factors that motivated the medical practitioners to evoke various neutralisation techniques. The results revealed that social, organisational, and emotional factors all stimulated the behavioural justifications to breach information security policies. During these interviews, it became clear that the IT department needed to ensure that security policies fit the daily tasks of the medical practitioners by providing alternative solutions to ensure the effectiveness of those policies. Based on these interviews, the objective of the following two phases was to improve the effectiveness of InfoSec policies against the use of behavioural justification by engaging the end users in the modification of existing policies via a collaborative writing process. Those two phases were conducted in the UK and Saudi Arabia to determine whether the collaborative writing process could produce a more effective security policy that balanced the security requirements with daily business needs, thus leading to a reduction in the use of neutralisation techniques to violate security policies. 
    The overall result confirmed that involving the end users via a collaborative writing process positively improved the effectiveness of the security policy in mitigating individual behavioural justifications, showing that the process is a promising way to enhance security compliance.

    Freelance subtitlers in a subtitle production network in the OTT industry in Thailand: a longitudinal study

    The present study sets out to investigate a subtitle production network in the over-the-top (OTT) industry in Thailand from the perspective of freelance subtitlers. A qualitative longitudinal research design was adopted to gain insights into (1) the way the work practices of freelance subtitlers are influenced by both human and non-human actors in the network, (2) the evolution of the network, and (3) how the freelance subtitlers’ perception of quality is influenced by changes occurring in the network. Eleven subtitlers were interviewed every six months over a period of two years, yielding over 60 hours of interview data. The data analysis was informed by selected concepts from Actor-Network Theory (ANT) (Law 1992, 2009; Latour 1996, 2005; Mol 2010) and complemented by the three-dimensional quality model proposed by Abdallah (2016, 2017). Reflexive thematic analysis (Braun and Clarke 2019a, 2020b) was used to generate themes and sub-themes which address the research questions and tell compelling stories about the actor-network. It was found that from July 2017 to September 2019, the subtitle production network, which was sustained by complex interrelationships between actors, underwent a number of changes. The changes affected the work practices of freelance subtitlers in a more negative than positive way, demonstrating their precarious position in an industry that has widely adopted the vendor model (Moorkens 2017). Moreover, as perceived by the research participants, under increasingly undesirable working conditions it became more challenging to maintain a quality process and to produce quality subtitles. Finally, translation technology and tools, including machine translation, were found to be key non-human actors that catalyse the changes in the network under study.