992 research outputs found

    Mining Temporal Patterns of Technical Term Usages in Bibliographical Data


    Empowering open science with reflexive and spatialised indicators

    Bibliometrics have become commonplace, widely used by authors and journals to monitor, evaluate and identify their readership in an ever-growing world of scientific publishing. This contribution introduces a multi-method corpus analysis tool conceived specifically for scientific corpora with spatialised content. We propose a dedicated interactive application that integrates three strategies for building semantic networks, using keywords (self-declared themes), citations (areas of research using the papers) and full texts (themes derived from the words used in writing). The networks can be studied with respect to their temporal evolution as well as their spatial expression, by considering the countries studied in the papers under inquiry. The tool is applied as a proof of concept to the papers published in the online open-access geography journal Cybergeo since its creation in 1996. Finally, we compare the three methods and conclude that their complementarity can help go beyond simple statistics to better understand the epistemological evolution of a scientific community and the readership of the journal. Our tool can be applied by any journal to its own corpus, thus fostering open science and reflexivity.
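    The keyword-based strategy described in this abstract amounts to building a co-occurrence network: keywords are nodes, and an edge is weighted by how many papers declare both keywords together. A minimal sketch in Python follows; the paper records and field names are invented for illustration and are not the journal's actual schema.

```python
from itertools import combinations
from collections import Counter

# Hypothetical records: each paper carries its self-declared keywords
# (titles, keywords, and field names are illustrative assumptions).
papers = [
    {"title": "Paper A", "keywords": ["GIS", "urban growth", "simulation"]},
    {"title": "Paper B", "keywords": ["GIS", "simulation", "networks"]},
    {"title": "Paper C", "keywords": ["urban growth", "networks"]},
]

# Edge weights of the semantic network: each unordered keyword pair is
# counted once per paper in which both keywords are declared together.
edges = Counter()
for paper in papers:
    for pair in combinations(sorted(set(paper["keywords"])), 2):
        edges[pair] += 1

print(edges[("GIS", "simulation")])  # co-occur in Papers A and B -> 2
```

    The same edge-counting loop works unchanged for the citation- and full-text-based strategies; only the source of the per-paper term lists differs.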

    8. Book Reviews

    Reviews of Pearce, Pragmatism’s Evolution. Organism and Environment in American Philosophy, U. of Chicago Press 2020; Fal’kner, Le Papier-monnaie dans la Révolution française. Une analyse en termes d’économie d’émission, Classiques Garnier 2021; Imbruglia, Utopia. Una storia politica da Savonarola a Babeuf, Carocci 2021; Giuli, L’opulenza del Brasile coloniale. Storia di un trattato di economia e del gesuita Antonil, Carocci 2021.

    Global Forest Decimal Classification (GFDC)

    The English and German sections are provided as two separate files.

    A Survey of the First 20 Years of Research on Semantic Web and Linked Data

    This paper is a survey of research topics in the field of the Semantic Web, Linked Data and the Web of Data. The study looks at the contributions of this research community over its first twenty years of existence. Compiling several bibliographical sources and bibliometric indicators, we identify the main research trends and reference some of their major publications to provide an overview of that initial period. We conclude with a discussion of trends and perspectives for future research challenges.

    Deep Learning Software Repositories

    Bridging the abstraction gap between artifacts and concepts is the essence of software engineering (SE) research problems. SE researchers regularly use machine learning to bridge this gap, but there are three fundamental issues with traditional applications of machine learning in SE research: they are too reliant on labeled data, they are too reliant on human intuition, and they are not capable of learning expressive yet efficient internal representations. Ultimately, SE research needs approaches that can automatically learn representations of massive, heterogeneous datasets in situ, apply the learned features to a particular task, and possibly transfer knowledge from task to task. Improvements in both computational power and the amount of memory in modern computer architectures have enabled new approaches to canonical machine learning tasks. Specifically, these architectural advances have enabled machines capable of learning deep, compositional representations of massive data depots. The rise of deep learning has ushered in tremendous advances in several fields. Given the complexity of software repositories, we presume deep learning has the potential to usher in new analytical frameworks and methodologies for SE research and the practical applications it reaches. This dissertation examines and enables deep learning algorithms in different SE contexts. We demonstrate that deep learners significantly outperform state-of-the-practice software language models at code suggestion on a Java corpus. Further, these deep learners for code suggestion automatically learn how to represent lexical elements. We use these representations to transmute source code into structures for detecting similar code fragments at different levels of granularity, without declaring features for how the source code is to be represented. Then we use our learning-based framework for encoding fragments to intelligently select and adapt statements in a codebase for automated program repair. In our work on code suggestion, code clone detection, and automated program repair, everything for representing lexical elements and code fragments is mined from the source code repository. Indeed, our work aims to move SE research from the art of feature engineering to the science of automated discovery.

    Partners in Practice: Contemporary Irish Literature, World Literature and Digital Humanities

    This dissertation examines the opportunities and implications afforded Irish literary studies by developments in the newly emergent disciplines of world literature and the digital humanities. Employing the world literature theories of Wai Chee Dimock, David Damrosch, Franco Moretti and Pascale Casanova in the critical analysis of works of contemporary Irish literature and Irish literary criticism produced in the period 1998-2010, it investigates how these theoretical approaches can generate new perspectives on Irish literature and argues that the real “problem” of world literature as it relates to Irish literary studies lies in establishing an interpretive method that enables consideration of the national within a global framework. This problem serves as the entry point to the engagement with the digital humanities presented throughout the dissertation. Situated within debates surrounding modes of “close” and “distant” reading (Moretti 2000) as they play out in both world literature and digital literary studies, this work proposes an alternative digital humanities approach to the study of world literature to the modes of “distant reading” endorsed by the literary critic Franco Moretti and by digital humanists such as Alan Liu (Liu 2012). Through a series of interdisciplinary case studies combining national and international, close and distant, and old and new modes of literary scholarship, it argues that, rather than being opposed to a nationally oriented form of literary criticism, the digital humanities have the tools and methodologies necessary to bring Irish literary scholarship into a productive dialogue with perspectives from elsewhere and thus to engender a form of Irish literary scholarship that transcends, while not denying, the significance of the nation state.
    By illustrating how the digital humanities can be employed to enhance and extend traditional approaches in Irish literary studies, this project demonstrates that Irish studies and the digital humanities can be “practicing partners” in a way that advances work in both the fields of world literature and digital literary studies.

    Data Analytics for Crisis Management: A Case Study of Sharing Economy Services in the COVID-19 Pandemic

    This dissertation aims to analyze the role of data-driven decision-making in the sharing economy during the COVID-19 pandemic as a crisis management tool. In the twenty-first century, when analytical tools have become an essential component of business decision-making, including crisis management operations, data analytics is an emerging field. Data-driven decision-making is seen as a crucial component of carrying out corporate strategies. Data analytics can be applied to benefit-cost evaluations, strategy planning, client engagement, and service quality, and data forecasting can be used to monitor business operations and foresee potential risks. Risk management and planning are essential for allocating the necessary resources with minimal cost and time and for being ready for a crisis. Hidden market trends and customer preferences can help companies make informed business decisions during crises and recessions. With appropriate data management tools, each company should manage operations and response during emergencies, chart a path to recovery, and prepare for similar future events. The sharing economy is a form of social commerce that brings together individuals who have underused assets and those who want to rent those assets short-term. COVID-19 has emphasized the need for digital transformation. Since the pandemic began, the sharing economy has faced challenges as market demand dropped significantly. Shelter-in-place and stay-at-home orders changed the way such sharing services are offered. Stricter safety procedures and a strong balance sheet are the key takeaways for surviving this difficult health crisis. Predictive analytics and peer-reviewed articles are used to assess the pandemic's effects.
    The approaches chosen to address the research objectives and research questions are predictive financial-performance analysis of Uber and Airbnb, and bibliographic coupling and keyword occurrence analyses of peer-reviewed works on the influence of data analytics on the sharing economy. The VOSviewer bibliometric software is used for computing bibliometric analysis, RapidMiner for predictive data analytics, and LucidChart for data visualization.
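    Bibliographic coupling, one of the analyses named above, links two documents when their reference lists overlap; the coupling strength is simply the number of references they share. A minimal sketch, using invented paper identifiers and reference sets rather than the dissertation's actual data:

```python
from itertools import combinations

# Hypothetical reference lists keyed by paper id (invented data).
references = {
    "P1": {"R1", "R2", "R3"},
    "P2": {"R2", "R3", "R4"},
    "P3": {"R5"},
}

# Two papers are bibliographically coupled when they cite at least one
# common reference; the strength is the size of the intersection.
coupling = {}
for (a, refs_a), (b, refs_b) in combinations(sorted(references.items()), 2):
    shared = len(refs_a & refs_b)
    if shared:
        coupling[(a, b)] = shared

print(coupling)  # P1 and P2 share references R2 and R3 -> {('P1', 'P2'): 2}
```

    Tools such as VOSviewer compute this same pairwise measure at corpus scale and then cluster and map the resulting network.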