12,766 research outputs found

    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    Full text link
    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when properly developed, including the fields of technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline of the Metaverse and lack a holistic view of the entire process. A more holistic, multi-disciplinary, in-depth, academic and industry-oriented review is therefore required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. For each of these components, we also examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and to find their opportunities and potential for contribution

    An exploration of the language within Ofsted reports and their influence on primary school performance in mathematics: a mixed methods critical discourse analysis

    Get PDF
    This thesis contributes to the understanding of the language of Ofsted reports, their similarity to one another, and associations between terms used within ‘areas for improvement’ sections and subsequent outcomes for pupils. The research responds to concerns from serving headteachers that Ofsted reports are overly similar, do not capture the unique story of their school, and are unhelpful for improvement. In seeking to answer ‘how similar are Ofsted reports?’, the study uses two tools, plagiarism detection software (Turnitin) and a discourse analysis tool (NVivo), to identify trends within and across a large corpus of reports. The approach is based on critical discourse analysis (Van Dijk, 2009; Fairclough, 1989) but shaped in the form of practitioner enquiry, seeking power in the form of impact on pupils and practitioners rather than a more traditional, sociological application of the method. The research found that in 2017, primary school section 5 Ofsted reports had more than half of their content exactly duplicated within other primary school inspection reports published that same year. Discourse analysis showed that the quality assurance process overrode variables such as inspector designation, gender, or team size, leading to three distinct patterns of duplication: block duplication, self-referencing, and template writing. The most unique part of a report was found to be the ‘area for improvement’ section, which was tracked to externally verified outcomes for pupils using terms linked to ‘mathematics’. Schools required to improve mathematics in their areas for improvement improved progress and attainment in mathematics significantly more than national rates. 
These findings indicate that there was a positive correlation between the inspection reporting process and a beneficial impact on pupil outcomes in mathematics, and that the significant similarity of one report to another had no bearing on the usefulness of the report for school improvement purposes within this corpus
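The duplication measurement described above can be approximated in a few lines. The sketch below estimates how much of one report's text is exactly duplicated in other reports by comparing overlapping word n-grams; Turnitin's actual matching algorithm is proprietary, so the function names, the 5-word window, and the sample data are illustrative assumptions only, not the thesis's method.

```python
# Illustrative sketch, not Turnitin's algorithm: estimate exact-duplication
# overlap between one report and a corpus via word n-gram "shingles".

def shingles(text, n=5):
    """Return the set of n-word sequences occurring in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplication_rate(report, corpus_reports, n=5):
    """Fraction of a report's n-grams that also appear in other reports."""
    own = shingles(report, n)
    if not own:
        return 0.0
    others = set().union(*(shingles(r, n) for r in corpus_reports))
    return len(own & others) / len(own)
```

Applied pairwise across a year's reports, a rate above 0.5 would correspond to the "more than half of their content exactly duplicated" finding reported above.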

    Collaborative Economy (Economia colaborativa)

    Get PDF
    The importance of analysing the main legal challenges posed by the collaborative economy – given the implications raised by the paradigm shifts in business models and in the actors involved – is indisputable, and reflects the need to foster legal certainty for these practices, which promote economic growth and social well-being. The Research Centre for Justice and Governance (JusGov) assembled a multidisciplinary team that, in addition to legal scholars, includes researchers from other fields, such as economics and management, drawn from the various JusGov groups – with particular participation from researchers in the E-TEC group (State, Enterprise and Technology) – and from other prestigious national and international institutions, to develop a project in this domain. Its aim is to identify the legal problems raised by the collaborative economy and to assess whether solutions for them already exist, while also reflecting on whether changes should be introduced or whether new regulation is in fact necessary. The results of this research are presented in this volume, which is intended to encourage continued debate on this topic. This work is financed by national funds through the FCT – Fundação para a Ciência e a Tecnologia, I.P., under Financing UID/05749/202

    HR Analytics: Concept, Application, and Impact on Talent Management, Branding, and Challenges

    Get PDF
    Purpose: Making wiser decisions about employees to improve performance at the individual and/or organizational level is the process of HR analytics. HR analytics is a method for determining the correlation between HR practices and organizational performance outcomes such as sales volume or customer satisfaction. Human resource analytics was established in 1978 by Jac Fitz-Enz, the pioneer of human capital strategic analysis and performance benchmarking. In this paper, the researcher discusses the concept of HR analytics, its application, its impact on talent management and branding, and the challenges in its application.
    Design/methodology/approach: The researcher examines secondary data and conducts a thorough literature review to understand the concept and its application across industries and nations, as well as to identify any challenges encountered during deployment and any benefits perceived by various industry professionals.
    Findings: The study's findings indicate that using HR analytics can help businesses build their brand and gain a competitive edge in today's fiercely competitive business environment while also enhancing workforce and employee productivity.
    Originality/value: This study has significant implications for both the literature and HR analytics practice. Researchers will learn more about the factors that contribute to, and the mechanisms by which, HR analytics improves organisational performance. The author further claims that having access to HR technology both facilitates and precedes HR analytics. Finally, concrete data from the literature demonstrate its influence on branding and organisational success.
    Keywords: Human resource (HR) analytics, People analytics, Branding, Talent management, Organizational performance. Paper type: Research paper. JEL Code: M12, M15 & M51. DOI: 10.7176/EJBM/15-8-06. Publication date: April 30th 202

    Lift EVERY Voice and Sing: An Intersectional Qualitative Study Examining the Experiences of Lesbian, Gay, Bisexual, and Queer Faculty and Administrators at Historically Black Colleges and Universities

    Get PDF
    While there is minimal literature that addresses the experiences of lesbian, gay, bisexual, and trans* identified students at Historically Black Colleges and Universities (HBCUs), the experiences of Black, queer faculty and administrators at HBCUs have not been studied. This intersectional qualitative research study focused on the experiences of lesbian, gay, bisexual, and queer (LGBQ) identified faculty and administrators who work at HBCUs. By investigating the intersections of religion, race, gender, and sexuality within a predominantly Black institution, this study aims to enhance diversity, equity, and inclusion efforts at HBCUs by sharing the experiences of LGBQ faculty and administrators who previously or currently worked at an HBCU as full-time employees. The research questions that guided this study were: 1) How have LGBQ faculty and staff negotiated/navigated their careers at HBCUs? and 2) How do LGBQ faculty and staff at HBCUs influence cultural change (relating to LGBQ inclusion) at the organizational level? The main theoretical framework used was intersectionality, and it shaped the chosen methodology and methods. The politics of respectability was the second theoretical framework, used to describe the intra-racial tensions within the Black/African American community. The study included 60-120 minute interviews with 12 participants. Using intersectionality as a guide, the data were coded and subjected to thematic analysis; the findings were then presented as an ethnodramatic performance to engage readers. The goals of this study were to encourage policy changes, promote inclusivity for LGBQ employees at HBCUs, and expand the body of literature pertaining to the experiences of LGBQ faculty and administrators in higher education

    Towards a sociology of conspiracy theories: An investigation into conspiratorial thinking on Dönmes

    Get PDF
    This thesis investigates the social and political significance of conspiracy theories, an academically neglected topic despite its historical relevance. The academic literature treats the methodology, social significance, and political impacts of these theories in isolation and lacks empirical analyses. In response, this research provides a comprehensive theoretical framework for conspiracy theories by considering their methodology, political impacts, and social significance in the light of empirical data. Theoretically, the thesis uses Adorno's semi-erudition theory along with a Girardian approach. It proposes that conspiracy theories are methodologically semi-erudite narratives, i.e. they are biased in favour of a belief and use reason only to prove it. It suggests that conspiracy theories appear in times of power vacuum and provide semi-erudite cognitive maps that relieve the alienation and ontological insecurities of people and groups. In so doing, they enforce social control over their audience through their essentialist, closed-to-interpretation narratives. To test the theory, the study empirically analyses the social and political significance of conspiracy theories about the Dönme community in Turkey. The analysis comprises interviews with conspiracy theorists, conspiracy theory readers, and political parties, alongside a frame analysis of popular conspiracy theory books on Dönmes. These confirm the theoretical framework by showing that the conspiracy theories are fed by the ontological insecurities of Turkish society. Hence, conspiracy theorists, most readers, and some political parties respond to their own ontological insecurities and political frustrations by scapegoating Dönmes. Consequently, this work shows that conspiracy theories are important symptoms of society which, while relieving ontological insecurities, do not provide politically productive narratives

    Data-to-text generation with neural planning

    Get PDF
    In this thesis, we consider the task of data-to-text generation, which takes non-linguistic structures as input and produces textual output. The inputs can take the form of database tables, spreadsheets, charts, and so on. The main application of data-to-text generation is to present information in a textual format, making it accessible to a layperson who may otherwise find it difficult to understand numerical figures. The task can also automate routine document generation jobs, thus improving human efficiency. We focus on generating long-form text, i.e., documents with multiple paragraphs. Recent approaches to data-to-text generation have adopted the very successful encoder-decoder architecture or its variants. These models generate fluent (but often imprecise) text and perform quite poorly at selecting appropriate content and ordering it coherently. This thesis focuses on overcoming these issues by integrating content planning with neural models. We hypothesize that data-to-text generation will benefit from explicit planning, which manifests itself in (a) micro planning, (b) latent entity planning, and (c) macro planning. Throughout this thesis, we assume the inputs to our generator are tables (with records) in the sports domain, and the outputs are summaries describing what happened in the game (e.g., who won/lost, ..., scored, etc.). We first describe our work on integrating fine-grained or micro plans with data-to-text generation. As part of this, we generate a micro plan highlighting which records should be mentioned and in which order, and then generate the document while taking the micro plan into account. We then show how data-to-text generation can benefit from higher-level latent entity planning. Here, we make use of entity-specific representations which are dynamically updated. The text is generated conditioned on entity representations and the records corresponding to the entities by using hierarchical attention at each time step. 
We then combine planning with the high-level organization of entities, events, and their interactions. Such coarse-grained macro plans are learnt from data and given as input to the generator. Finally, we present work on making macro plans latent while incrementally generating a document paragraph by paragraph. We infer latent plans sequentially with a structured variational model while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Overall, our results show that planning makes data-to-text generation more interpretable, improves the factuality and coherence of the generated documents, and reduces redundancy in the output document
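The two-stage micro-planning idea described above, first deciding which records to mention and in what order, then realizing them as text, can be illustrated with a toy example. In the thesis both stages are learned neural models; the salience threshold, the template, and the sample records below are invented stand-ins for those learned components.

```python
# Toy illustration of plan-then-generate for sports summaries. The
# threshold rule and the sentence template are hypothetical stand-ins
# for the thesis's learned content-selection and decoding models.

def micro_plan(records, threshold=20):
    """Select salient records and order them (here: by points, descending)."""
    salient = [r for r in records if r["points"] >= threshold]
    return sorted(salient, key=lambda r: r["points"], reverse=True)

def realize(plan):
    """Generate one templated sentence per planned record."""
    return " ".join(f'{r["player"]} scored {r["points"]} points.' for r in plan)

records = [
    {"player": "Smith", "points": 31},
    {"player": "Jones", "points": 12},
    {"player": "Brown", "points": 24},
]
print(realize(micro_plan(records)))
# prints "Smith scored 31 points. Brown scored 24 points."
```

The separation matters because the generator only ever conditions on the plan: content-selection errors (mentioning Jones, omitting Brown) become visible and fixable in the plan rather than buried in the decoder.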

    Epilepsy Mortality: Leading Causes of Death, Co-morbidities, Cardiovascular Risk and Prevention

    Get PDF
    …a reuptake inhibitor selectively prevents seizure-induced sudden death in the DBA/1 mouse model of sudden unexpected…
    …Bilateral lesions of the fastigial nucleus prevent the recovery of blood pressure following hypotension induced by…