    Modeling citation concentration through a mixture of Leimkuhler curves

    When the cumulative percentage of total citations to articles, ordered from most cited to least cited, is plotted against the cumulative percentage of articles, we obtain a Leimkuhler curve. In this study, we noticed that standard Leimkuhler functions may not be sufficient to provide accurate fits to various empirical informetric data. We therefore introduce a new approach to Leimkuhler curves by fitting a known probability density function to the initial Leimkuhler curve, taking into account the presence of a heterogeneity factor. As a significant contribution to the existing literature, we introduce a pair of mixture distributions (called PG and PIG) to bibliometrics. In addition, we present closed-form expressions for Leimkuhler curves. Some measures of citation concentration are examined empirically for the basic models (based on the Power and Pareto distributions) and the mixed models derived from these. An application to two sources of informetric data was conducted to show how the mixture models outperform the standard basic models. The different models were fitted using non-linear least squares estimation. Comment: 21 pages, 2 figures, 2 tables.
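
    The non-linear least squares estimation mentioned above is easy to sketch for the classical one-parameter Leimkuhler function L(x) = ln(1 + bx)/ln(1 + b); the PG and PIG mixture models introduced in the paper are not reproduced here, and the citation counts below are hypothetical.

```python
# A minimal sketch: fitting the classical one-parameter Leimkuhler function
# by non-linear least squares, as described in the abstract. The PG/PIG
# mixture models are not reproduced; the citation data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def leimkuhler(x, b):
    """Classical Leimkuhler curve: cumulative share of citations held by
    the top fraction x of articles, with a single parameter b > 0."""
    return np.log(1.0 + b * x) / np.log(1.0 + b)

rng = np.random.default_rng(0)
citations = np.sort(rng.zipf(2.0, size=500))[::-1]     # most cited first
x = np.arange(1, citations.size + 1) / citations.size  # cumulative share of articles
y = np.cumsum(citations) / citations.sum()             # cumulative share of citations

(b_hat,), _ = curve_fit(leimkuhler, x, y, p0=[10.0], bounds=(1e-6, np.inf))
print(f"fitted b = {b_hat:.2f}")
```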

    Microscopic Aspects of Stretched Exponential Relaxation (SER) in Homogeneous Molecular and Network Glasses and Polymers

    Because the theory of SER is still a work in progress, the phenomenon itself can be said to be the oldest unsolved problem in science, as it started with Kohlrausch in 1847. Many electrical and optical phenomena exhibit SER with probe relaxation I(t) ~ exp[-(t/τ)^β], with 0 < β < 1. Here τ is a material-sensitive parameter, useful for discussing chemical trends. The "shape" parameter β is dimensionless and plays the role of a non-equilibrium scaling exponent; its value, especially in glasses, is both practically useful and theoretically significant. The mathematical complexity of SER is such that rigorous derivations of this peculiar function were not achieved until the 1970s. The focus of much of the 1970s pioneering work was spatial relaxation of electronic charge, but SER is a universal phenomenon, and today atomic and molecular relaxation of glasses and deeply supercooled liquids provide the most reliable data. As the data base grew, the need for a quantitative theory increased; this need was finally met by the diffusion-to-traps topological model, which yields a remarkably simple expression for the shape parameter, β = d*/(d* + 2). At first sight this expression appears to be identical to d/(d + 2), where d is the actual spatial dimensionality, as originally derived. The original model, however, failed to explain much of the data base. Here the theme of earlier reviews, based on the observation that in the presence of short-range forces only, d* = d = 3 is the actual spatial dimensionality, while for mixed short- and long-range forces d* = fd = d/2, is applied to four new spectacular examples, where it turns out that SER is useful not only for purposes of quality control, but also for defining what is meant by a glass in novel contexts. (Please see the full abstract in the main text.)
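
    The stretched exponential itself is straightforward to handle numerically. Below is a minimal sketch, on synthetic data, of recovering τ and β by non-linear least squares; it illustrates the functional form only, not the diffusion-to-traps derivation. For reference, the topological prediction β = d*/(d* + 2) gives β = 3/5 for d* = 3 (short-range forces) and β = 3/7 for d* = 3/2 (mixed forces).

```python
# A minimal sketch: fitting the Kohlrausch stretched exponential
# I(t) = exp[-(t/tau)^beta] to synthetic relaxation data. The tau and
# beta values used to generate the data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def ser(t, tau, beta):
    """Kohlrausch stretched exponential relaxation."""
    return np.exp(-(t / tau) ** beta)

rng = np.random.default_rng(0)
t = np.linspace(0.01, 10.0, 200)
data = ser(t, 1.0, 0.6) + rng.normal(0, 0.01, t.size)  # noisy synthetic probe

(tau_hat, beta_hat), _ = curve_fit(ser, t, data, p0=[1.0, 0.5])
print(f"tau = {tau_hat:.3f}, beta = {beta_hat:.3f}")
```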

    Theories of Informetrics and Scholarly Communication

    Scientometrics has become an essential element in the practice and evaluation of science and research, including both the evaluation of individuals and national assessment exercises. Yet researchers and practitioners in this field have lacked clear theories to guide their work. As early as 1981, then doctoral student Blaise Cronin published "The need for a theory of citing", a call to arms for the fledgling scientometric community to produce foundational theories upon which the work of the field could be based. More than three decades later, the time has come to reach out to the field again and ask how it has responded to this call. This book compiles the foundational theories that guide informetrics and scholarly communication research. It is a much-needed compilation by leading scholars in the field that gathers together the theories that guide our understanding of authorship, citing, and impact.

    Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms that create robustness. However, this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.

    More "normal" than normal: scaling distributions and complex systems

    One feature of many naturally occurring or engineered complex systems is tremendous variability in event sizes. To account for this, the behavior of these systems is often described using power law relationships or scaling distributions, which tend to be viewed as "exotic" because of their unusual properties (e.g., infinite moments). An alternate view is based on mathematical, statistical, and data-analytic arguments and suggests that scaling distributions should be viewed as "more normal than normal". In support of this latter view, which Mandelbrot has advocated for the last 40 years, we review in this paper some relevant results from probability theory and illustrate a powerful statistical approach for deciding whether the variability associated with observed event sizes is consistent with an underlying Gaussian-type (finite-variance) or scaling-type (infinite-variance) distribution. We contrast this approach with traditional model-fitting techniques and discuss its implications for the future modeling of complex systems.
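
    One common data-analytic diagnostic in this spirit, though not necessarily the paper's exact procedure, is to plot the empirical complementary CDF of event sizes on doubly logarithmic axes: a scaling (infinite-variance) tail is approximately linear there, while a Gaussian-type (finite-variance) tail bends down sharply. A minimal sketch with synthetic data:

```python
# A minimal sketch of one diagnostic in this spirit (not necessarily the
# paper's exact procedure): the empirical complementary CDF on log-log
# axes. A scaling tail is roughly linear; a Gaussian-type tail falls off
# sharply. All data here are synthetic.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
scaling = rng.pareto(1.5, 10_000) + 1.0      # infinite variance (tail index 1.5)
gaussian = np.abs(rng.normal(0, 1, 10_000))  # finite-variance comparison

def ccdf(x):
    """Empirical complementary CDF: P(X > x_i) for each sorted x_i."""
    xs = np.sort(x)
    return xs, 1.0 - np.arange(1, xs.size + 1) / xs.size

for data, label in [(scaling, "scaling (Pareto)"), (gaussian, "Gaussian-type")]:
    xs, p = ccdf(data)
    plt.loglog(xs, p, label=label)
plt.xlabel("event size")
plt.ylabel("P(X > x)")
plt.legend()
plt.show()
```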

    The skewness of computer science

    Computer science is a relatively young discipline combining science, engineering, and mathematics. The main flavors of computer science research involve the theoretical development of conceptual models for the different aspects of computing and the more applied building of software artifacts and assessment of their properties. In the computer science publication culture, conferences are an important vehicle for quickly disseminating ideas, and journals often publish deeper versions of papers already presented at conferences. These peculiarities of the discipline make computer science an original research field within the sciences, and, therefore, the assessment of classical bibliometric laws is particularly important for this field. In this paper, we study the skewness of the distribution of citations to papers published in computer science publication venues (journals and conferences). We find that the skewness in the distribution of mean citedness of different venues combines with the asymmetry in citedness of articles in each venue, resulting in a highly asymmetric citation distribution with a power law tail. Furthermore, the skewness of conference publications is more pronounced than that of journal papers. Finally, the impact of journal papers, as measured with bibliometric indicators, largely dominates that of proceedings papers. Comment: I applied the goodness-of-fit methodology proposed in A. Clauset, C. R. Shalizi, and M. E. J. Newman, "Power-law distributions in empirical data", SIAM Review 51, 661-703 (2009).
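
    The Clauset-Shalizi-Newman methodology cited in the comment is implemented by the powerlaw Python package; a minimal sketch of that style of tail fit, on hypothetical citation counts rather than the paper's data:

```python
# A minimal sketch of the Clauset-Shalizi-Newman fitting methodology cited
# above, via the powerlaw package (pip install powerlaw). The citation
# counts are hypothetical, not the paper's data.
import numpy as np
import powerlaw

rng = np.random.default_rng(0)
citations = rng.zipf(2.2, size=5_000)        # hypothetical citation counts

fit = powerlaw.Fit(citations, discrete=True)  # MLE fit with estimated x_min
print(f"alpha = {fit.power_law.alpha:.2f}, x_min = {fit.xmin}")

# Likelihood-ratio comparison against a lognormal alternative, as in CSN 2009.
R, p = fit.distribution_compare("power_law", "lognormal")
print(f"log-likelihood ratio R = {R:.2f}, p = {p:.3f}")
```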