
    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when it is accurately developed, including technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline of the Metaverse and lack a holistic view of the entire process. A more holistic, multi-disciplinary, in-depth, academic- and industry-oriented review is therefore required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications, and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. For each of these components, we also examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experience, interaction, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and to identify their opportunities for contribution.
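    The layered organisation the abstract describes lends itself to a simple structural representation. The sketch below is purely illustrative and not the survey's own taxonomy: the layer names follow the abstract, while the component list for the digitization layer and the enabling-domain mapping are hypothetical placeholders.

```python
# Hypothetical sketch of the three-layer pipeline ecosystem as plain data.
# Layer names follow the abstract; several component names are placeholders.
from dataclasses import dataclass, field


@dataclass
class Layer:
    name: str
    components: list[str] = field(default_factory=list)
    enabling_domains: list[str] = field(default_factory=list)


metaverse_pipeline = [
    Layer("Infrastructure",
          components=["computing", "networking", "communications", "hardware"],
          enabling_domains=["Artificial Intelligence", "Security & Privacy", "Blockchain"]),
    Layer("Environment digitization",
          components=["scene capture", "3D reconstruction"],   # illustrative only
          enabling_domains=["Artificial Intelligence"]),
    Layer("User interactions",
          components=["immersion", "interoperability", "monetization"],
          enabling_domains=["Business", "Ethics", "Social"]),
]

for layer in metaverse_pipeline:
    print(f"{layer.name}: {', '.join(layer.components)}")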

    Self-Supervised Learning to Prove Equivalence Between Straight-Line Programs via Rewrite Rules

    We target the problem of automatically synthesizing proofs of semantic equivalence between two programs made of sequences of statements. We represent programs using abstract syntax trees (ASTs), where a given set of semantics-preserving rewrite rules can be applied to a specific AST pattern to generate a transformed and semantically equivalent program. In our system, two programs are equivalent if there exists a sequence of applications of these rewrite rules that leads to rewriting one program into the other. We propose a neural network architecture based on a transformer model to generate proofs of equivalence between program pairs. The system outputs a sequence of rewrites, and the validity of the sequence is checked simply by verifying that it can be applied. If no valid sequence is produced by the neural network, the system reports the programs as non-equivalent, ensuring by design that no programs are incorrectly reported as equivalent. Our system is fully implemented for a given grammar that can represent straight-line programs with function calls and multiple types. To efficiently train the system to generate such sequences, we develop an original incremental training technique, named self-supervised sample selection. We extensively study the effectiveness of this novel training approach on proofs of increasing complexity and length. Our system, S4Eq, achieves 97% proof success on a curated dataset of 10,000 pairs of equivalent programs.
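    To make the verification step concrete, the toy sketch below (not the S4Eq implementation) checks a proposed rewrite sequence by simply applying it: if any rule fails to apply, or the rewritten program does not match the target, the proof is rejected. The expression encoding and rule names are hypothetical, and rules here are applied only at the root rather than at arbitrary AST patterns.

```python
# Toy verifier for a sequence of rewrite rules; rule names and encoding are made up.
from typing import Callable, Optional

Expr = tuple  # e.g. ("add", ("var", "x"), ("const", 0))


def add_zero(e: Expr) -> Optional[Expr]:
    """x + 0  ->  x"""
    if e[0] == "add" and e[2] == ("const", 0):
        return e[1]
    return None


def commute_add(e: Expr) -> Optional[Expr]:
    """a + b  ->  b + a"""
    if e[0] == "add":
        return ("add", e[2], e[1])
    return None


RULES: dict[str, Callable[[Expr], Optional[Expr]]] = {
    "add_zero": add_zero,
    "commute_add": commute_add,
}


def check_equivalence(src: Expr, dst: Expr, proof: list[str]) -> bool:
    """Apply the proposed rewrite sequence at the root; reject on any failure."""
    cur = src
    for rule_name in proof:
        nxt = RULES[rule_name](cur)
        if nxt is None:          # rule does not apply, so the proof is invalid
            return False
        cur = nxt
    return cur == dst            # equivalent only if we reached the target program


# "0 + x" rewritten to "x" via commutativity followed by the add-zero rule.
print(check_equivalence(("add", ("const", 0), ("var", "x")),
                        ("var", "x"),
                        ["commute_add", "add_zero"]))   # True
```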

    Desarrollo de una herramienta integral de gestión de gases de efecto invernadero para la toma de decisión contra el cambio climático a nivel regional y local en la Comunitat Valenciana

    Thesis by compendium. Currently, regional and local decision-makers lack tools to produce greenhouse gas (GHG) emissions inventories with sufficient scientific and technical rigour, accuracy, and completeness to prioritise and invest the available resources efficiently in the measures needed to fight climate change. This thesis therefore presents the development of a territorial and sectoral information system (SITE) for monitoring GHG emissions, intended as a local and regional climate-governance tool. SITE combines the advantages of the top-down and bottom-up methodological approaches to achieve an innovative hybrid approach for accounting for and managing GHG emissions efficiently. The thesis defines the methodological developments, both general and specific to key Intergovernmental Panel on Climate Change (IPCC) sectors (building, transport, forestry, etc.), describes the software development of the system's server-side back-end, and reports seven implementations as representative case studies at different scales and applied to different sectors. These implementations demonstrate the potential of the system as a decision-support tool against climate change at the regional and local level. The representative pilot cases, at the regional level in the Valencian Community and at the local level in a large municipality (València) and in medium-sized municipalities (Quart de Poblet and Llíria), show the system's capacity for territorial and sectoral adaptation. The methodologies developed for the specific sectors of road traffic, building, and forestry provide quantifications at a spatial resolution well suited to optimising local and regional policies. The tool therefore has strong potential for scalability and continuous improvement through the inclusion of new methodological approaches, adaptation of the methodologies to data availability, specific methodologies for key sectors, and updates to the best methodologies arising from the research activities of the scientific community.
    Lorenzo Sáez, E. (2022). Desarrollo de una herramienta integral de gestión de gases de efecto invernadero para la toma de decisión contra el cambio climático a nivel regional y local en la Comunitat Valenciana [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/181662
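    As an illustration of the hybrid top-down/bottom-up idea described above (not code from the thesis), the sketch below downscales a regional emissions total to municipalities using a proxy share and then overrides the downscaled figure wherever a bottom-up, locally measured estimate exists. All figures, shares, and names are invented.

```python
# Hypothetical hybrid GHG inventory: top-down downscaling with bottom-up overrides.
regional_total_ktco2e = 1000.0                        # top-down sectoral total (made up)
population_share = {"ValenciaCity": 0.40, "Lliria": 0.03, "QuartDePoblet": 0.03}
bottom_up_ktco2e = {"Lliria": 22.0}                   # locally measured inventory (made up)

hybrid_inventory = {}
for municipality, share in population_share.items():
    top_down_estimate = regional_total_ktco2e * share            # downscaling step
    # Prefer the bottom-up figure when available; otherwise keep the downscaled one.
    hybrid_inventory[municipality] = bottom_up_ktco2e.get(municipality, top_down_estimate)

print(hybrid_inventory)
# {'ValenciaCity': 400.0, 'Lliria': 22.0, 'QuartDePoblet': 30.0}
```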

    Exploring Topics in Bibliometric Research Through Citation Networks and Semantic Analysis.

    This article surveys topic distributions in the academic literature that employs the terms bibliometrics, scientometrics, and informetrics. This exploration sheds light on the adoption of these terms and on the publication patterns of authors who acknowledge their work as part of bibliometric research. We retrieved 20,268 articles related to bibliometrics and applied methodologies that exploit various features of the dataset to surface different topic representations. Across them, we observe major trends, including discussions on theory, regional publication patterns, databases, and tools. There has been a marked increase in the application of bibliometrics as a science-mapping and decision-making tool in management, public health, sustainability, and medical fields. We also observe that the term bibliometrics has reached an overall generality, while the terms scientometrics and informetrics may be more accurate in representing the core of bibliometric research as understood by the information and library science field. This article contributes by providing multiple snapshots of a field that has grown too quickly beyond the confines of library science.
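    As a purely illustrative sketch (not the methodology used in the article), the snippet below shows one common way to surface topic representations from a corpus of abstracts: TF-IDF weighting followed by non-negative matrix factorisation with scikit-learn. The toy corpus and the number of topics are invented.

```python
# Minimal topic-representation sketch over a made-up corpus of abstracts.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

abstracts = [
    "citation networks reveal collaboration patterns in scientometrics",
    "bibliometric indicators support research evaluation and policy",
    "topic models of abstracts expose semantic structure in informetrics",
    "science mapping tools visualise citation networks for decision making",
]

tfidf = TfidfVectorizer(stop_words="english")
doc_term = tfidf.fit_transform(abstracts)          # documents x terms matrix

nmf = NMF(n_components=2, random_state=0)          # two illustrative topics
doc_topic = nmf.fit_transform(doc_term)

terms = tfidf.get_feature_names_out()
for k, weights in enumerate(nmf.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {k}: {', '.join(top_terms)}")
```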

    Design principles and architecture of a second language learning chatbot

    The purpose of this article is to set out the design principles and architecture of a second language (L2) learning voice chatbot. Building on L2 acquisition theories and chatbot research, we report on a South Korean government-funded longitudinal project in which we designed and developed a chatbot called “Ellie”. Ellie has three chat modes: “General Chat,” “Task Chat,” and “Skills”. In the General Chat mode, L2 users can have short conversations about personal information, whereas in the Task Chat mode they can engage in a wide range of problem-solving L2 tasks, achieving task goals by exchanging meaning with Ellie. The Skills mode offers form-focused language practice. Ellie was piloted with 137 Korean high school students, who used the chatbot individually or in groups for seven weeks in their English classes. The quality of the chatbot was investigated in terms of the appropriateness of language level, continuity of conversation, and success in task performance. Based on the results of the pilot, Ellie appears to have considerable potential to become an effective language learning companion for L2 learners, and the findings have implications for the design and development of future L2 chatbots.
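    The sketch below is a schematic illustration of the three-mode design described above; it is not Ellie's implementation, and the handler responses are placeholders. It shows how a single dispatcher could route a learner turn to General Chat, Task Chat, or Skills behaviour.

```python
# Hypothetical three-mode dispatcher; mode names follow the article, responses are placeholders.
from enum import Enum


class ChatMode(Enum):
    GENERAL = "General Chat"   # short talk about personal information
    TASK = "Task Chat"         # goal-oriented problem-solving tasks
    SKILLS = "Skills"          # form-focused language practice


def respond(mode: ChatMode, learner_turn: str) -> str:
    """Route a learner turn to the behaviour of the selected chat mode."""
    if mode is ChatMode.GENERAL:
        return "Nice! Tell me more about yourself."
    if mode is ChatMode.TASK:
        return "Good idea. What should we do next to finish the task?"
    return "Let's practise that pattern again: repeat after me."


print(respond(ChatMode.TASK, "Maybe we could book the earlier train?"))
```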

    The crisis of cultural authority in museums : contesting human remains in the collections of Britain

    Museums in Britain have displayed and researched human remains since the eighteenth century. However, in the last two decades human remains in collections have become subject to claims and controversies. Firstly, human remains acquired during the colonial period have become increasingly difficult to retain and have been transferred to culturally affiliated overseas indigenous groups. Secondly, a group of British Pagans has formed to make claims on ancient human remains in collections. Thirdly, human remains of all ages that are not requested by any community group have become the focus of concerns expressed by members of the profession about their treatment. A discourse arguing for 'respect' has emerged, holding that all human remains should be treated with new care. The claims made on human remains have been vigorously but differentially contested by members of the sector, who consider human remains to be unique research objects. This thesis charts the influences at play in the contestation over human remains and examines its construction. The academic literature tends to understand changes to museums as the result of external factors. This thesis argues that the problem is shaped by a crisis of legitimacy and establishes that there are strong internal influences. Through a weak social constructionist approach, I demonstrate that the issue has been promoted by influential members of the sector as part of a broader attempt to distance themselves from their foundational role, as a consequence of a crisis of cultural authority stimulated by external and internal factors. The symbolic character of human remains in locating this problem is informed by the unique properties of dead bodies and is influenced by the significance of the body as a scientific object, its association with identity work, and its role as a site of political struggle in the high modern period.

    Graphical scaffolding for the learning of data wrangling APIs

    In order for students across the sciences to avail themselves of modern data streams, they must first know how to wrangle data: how to reshape ill-organised tabular data into another format, and how to do this programmatically in languages such as Python and R. Despite the cross-departmental demand and the ubiquity of data wrangling in analytical workflows, research on how to optimise its instruction has been minimal. Although data wrangling as a programming domain presents distinctive challenges - characterised by on-the-fly syntax lookup and code example integration - it also presents opportunities. One such opportunity is that tabular data structures are easily visualised. To leverage the inherent visualisability of data wrangling, this dissertation evaluates three types of graphics that could be employed as scaffolding for novices: subgoal graphics, thumbnail graphics, and parameter graphics. Using a specially built e-learning platform, this dissertation documents a multi-institutional, randomised, and controlled experiment that investigates their pedagogical effects. Our results indicate that the graphics are well received, that subgoal graphics boost the completion rate, and that thumbnail graphics improve navigability within a command menu. We also obtained several non-significant results, as well as indications that parameter graphics are counter-productive. We discuss these findings in the context of general scaffolding dilemmas and of how they fit into a wider research programme on data wrangling instruction.
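    For readers unfamiliar with the domain, the snippet below is a minimal example of the kind of reshaping the abstract calls data wrangling: converting a wide, ill-organised table into tidy long format with pandas. It is not taken from the dissertation or its e-learning platform, and the column names are invented.

```python
# Minimal wide-to-long reshape with pandas; data and column names are made up.
import pandas as pd

wide = pd.DataFrame({
    "student": ["A", "B"],
    "score_week1": [54, 61],
    "score_week2": [58, 65],
})

# Wide -> long: one row per (student, week) observation.
long = wide.melt(id_vars="student", var_name="week", value_name="score")
long["week"] = long["week"].str.replace("score_", "", regex=False)

print(long)
#   student   week  score
# 0       A  week1     54
# 1       B  week1     61
# 2       A  week2     58
# 3       B  week2     65
```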

    Developing automated meta-research approaches in the preclinical Alzheimer's disease literature

    Alzheimer’s disease is a devastating neurodegenerative disorder for which there is no cure. A crucial part of the drug development pipeline involves testing therapeutic interventions in animal disease models. However, promising findings in preclinical experiments have not translated into clinical trial success. Reproducibility has often been cited as a major issue affecting biomedical research, where experimental results obtained in one laboratory cannot be replicated in another. By using meta-research (research-on-research) approaches such as systematic reviews, researchers aim to identify and summarise all available evidence relating to a specific research question. By conducting a meta-analysis, researchers can also combine the results from different experiments statistically to understand the overall effect of an intervention and to explore reasons for the variation seen across different publications. Systematic reviews of the preclinical Alzheimer’s disease literature could inform decision making, encourage research improvement, and identify gaps in the literature to guide future research. However, due to the vast amount of potentially useful evidence from animal models of Alzheimer’s disease, it remains difficult to make sense of and utilise these data effectively. Systematic reviews are common practice within evidence-based medicine, yet their application to preclinical research is often limited by the time and resources required. In this thesis, I develop, build upon, and implement automated meta-research approaches to collect, curate, and evaluate the preclinical Alzheimer’s disease literature. I searched several biomedical databases to obtain all research relevant to Alzheimer’s disease. I developed a novel deduplication tool to automatically identify and remove duplicate publications found across different databases with minimal human effort. I trained a crowd of reviewers to annotate a subset of the publications identified and used these data to train a machine learning algorithm to screen the remaining publications for relevance. I developed text-mining tools to extract model, intervention, and treatment information from publications, and I improved existing automated tools for extracting reported measures to reduce the risk of bias. Using these tools, I created a categorised database of research in transgenic Alzheimer’s disease animal models and a visual summary of this dataset on an interactive, openly accessible online platform. Using the techniques described, I also identified relevant publications within the categorised dataset to perform systematic reviews of two key outcomes of interest in transgenic Alzheimer’s disease models: (1) synaptic plasticity and transmission in hippocampal slices and (2) motor activity in the open field test. Over 400,000 publications were identified across biomedical research databases, of which 230,203 were unique. In a performance evaluation across different preclinical datasets, the automated deduplication tool I developed identified over 97% of duplicate citations and had an error rate similar to that of human performance. When evaluated on a test set of publications, the machine learning classifier trained to identify relevant research in transgenic models was highly sensitive (capturing 96.5% of relevant publications) and excluded 87.8% of irrelevant publications. Tools to identify the model(s) and outcome measure(s) within the full text of publications may reduce the burden on reviewers and were found to be more sensitive than searching only the title and abstract of citations. Automated tools to assess the reporting of risk-of-bias measures were highly sensitive and have the potential to monitor research improvement over time. The final dataset of categorised Alzheimer’s disease research contained 22,375 publications, which were then visualised in the interactive web application. Within the application, users can see how many publications report measures to reduce the risk of bias and how many have been classified as using each transgenic model, testing each intervention, and measuring each outcome. Users can also filter to obtain curated lists of relevant research, allowing them to perform systematic reviews at an accelerated pace, with reduced effort required to search across databases and a reduced number of publications to screen for relevance. Both systematic reviews and meta-analyses highlighted failures to report key methodological information within publications. Poor transparency of reporting limited the statistical power available to understand the sources of between-study variation; however, some variables were found to explain a significant proportion of the heterogeneity. The transgenic animal model used had a significant impact on results in both reviews. For certain open field test outcomes, the wall colour of the open field arena and the reporting of measures to reduce the risk of bias were found to influence results. For in vitro electrophysiology experiments measuring synaptic plasticity, several electrophysiology parameters, including the magnesium concentration of the recording solution, were found to explain a significant proportion of the heterogeneity. Automated meta-research approaches and curated web platforms summarising preclinical research have the potential to accelerate the conduct of systematic reviews and to maximise the potential of existing evidence to inform translation.
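    The snippet below sketches the spirit of the automated deduplication step described above; it is not the thesis's tool. Records are keyed by DOI when present and otherwise by a normalised title, and only the first copy of each key is kept. The field names and example records are invented.

```python
# Simplified citation deduplication by DOI or normalised title; records are made up.
import re


def normalise(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical titles collide."""
    return re.sub(r"[^a-z0-9]", "", title.lower())


def deduplicate(records: list[dict]) -> list[dict]:
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalise(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique


citations = [
    {"title": "Tau pathology in a transgenic model", "doi": "10.1000/x1"},
    {"title": "Tau Pathology in a Transgenic Model.", "doi": "10.1000/x1"},  # cross-database duplicate
    {"title": "Open field activity in APP/PS1 mice", "doi": None},
]
print(len(deduplicate(citations)))   # 2
```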

    The geographies of care and training in the development of assistance dog partnerships

    Human-assistance-dog partnerships are a significant phenomenon that has been overlooked in both animal geographies and disability geographies. By focusing on one Assistance Dogs UK (ADUK) charity, Dog A.I.D., which helps physically disabled and chronically ill people to train their own pets as assistance dogs, I detail the intimate, entangled lifeworlds that humans and dogs occupy. In doing so, I also bring the sub-disciplinary fields of animal geographies and disability geographies into dialogue by exploring two broad thematic areas: embodiment and care. As such, this thesis examines the geographies of assistance dog partnership, the care and training practices involved, the benefits and challenges of sharing a lifeworld with a different species, and the changing relationship from a human-pet bond to a human-assistance-dog partnership. Drawing on lived experience and representations of assistance dog partnerships gathered through qualitative (and quantitative) research methods, including a survey, semi-structured interviews (face-to-face, online, and telephone), video ethnography, and magazine analysis, I contribute to research on assistance dog partnerships and to growing debates around the more-than-human nature of care. The ethnomethodological approach to exploring how training occurs between disabled human and assistance dog is also noteworthy, as it centres the lively experiences of practice at work between species. The thesis is organised around interconnected themes: the intimate worlds of assistance dog partnerships, working bodies, and caring relations. These themes allow for a geographical interpretation of the governance, spatial organisation, and representations of assistance dog partnerships. I also explore the training cultures of Dog A.I.D., spotlighting the lived experiences of training through the early stages of 'socialisation', 'familiarisation', and 'life skills training', through to 'task work'. Finally, the thesis focuses on the practices of care that characterise the assistance dog partnership, showing how care is provided and received by both human and nonhuman. I pay attention to the complex potentiality of the partnership, illustrating how dogs are trained to assist, but also how dogs appear to embody lively, agentic moments of care. The thesis contributes original work which speaks to animal and disability geographies and attends to the multiple geographies of care-full cross-species lives.

    Analytics in the Business School: Insights from the Literature

    The demand for business and data analysts is growing, and the business school is well positioned to offer programs that meet this need. This paper presents the findings from a review of the existing literature on data analytics job roles and the skills required for those roles, together with feedback from industry experts on those findings. Three different types of articles are included in the design: faculty writing about their personal experiences and observations (faculty voice), data gathered from expert practitioners and other academics (nonresident expertise), and empirical data from online job service platforms (content analysis). The narrative review method is used to integrate these disparate sources of information and deliver cohesive observations. This knowledge can be used to build better analytics programs in business schools.