141 research outputs found
The Symbiotic Relationship Between Information Retrieval and Informetrics
Informetrics and information retrieval (IR) represent fundamental areas of study within information science. Historically, researchers have not fully capitalized on the potential research synergies that exist between these two areas. Data sources used in traditional informetrics studies have their analogues in IR, with similar types of empirical regularities found in IR system content and use. Methods for data collection and analysis used in informetrics can help to inform IR system development and evaluation. Areas of application have included automatic indexing, index term weighting, and understanding user query and session patterns through the quantitative analysis of user transaction logs. Similarly, developments in database technology have made the study of informetric phenomena less cumbersome, and recent innovations used in IR research, such as language models and ranking algorithms, provide new tools that may be applied to research problems of interest to informetricians. Building on the author's previous work (Wolfram 2003), this paper reviews a sample of relevant literature published primarily since 2000 to highlight how each area of study may help to inform and benefit the other.
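Index term weighting, one of the application areas mentioned in this abstract, can be illustrated with a minimal tf-idf sketch. The corpus, terms, and function below are hypothetical illustrations, not taken from the paper itself:

```python
import math

def tf_idf(term, doc, corpus):
    """Classic tf-idf weight: term frequency in the document,
    scaled by the inverse document frequency across the corpus."""
    tf = doc.count(term) / len(doc)
    df = sum(1 for d in corpus if term in d)
    return tf * math.log(len(corpus) / df) if df else 0.0

# Hypothetical three-document corpus (tokenized titles)
corpus = [
    ["citation", "analysis", "of", "journals"],
    ["retrieval", "models", "for", "web", "search"],
    ["citation", "networks", "and", "retrieval"],
]
weight = tf_idf("citation", corpus[0], corpus)
```

Terms that appear in fewer documents of the corpus receive a higher idf factor and thus a larger weight, which is the informetric regularity the weighting exploits.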
Microscopic Aspects of Stretched Exponential Relaxation (SER) in Homogeneous Molecular and Network Glasses and Polymers
Because the theory of SER is still a work in progress, the phenomenon itself
can be said to be the oldest unsolved problem in science, as it started with
Kohlrausch in 1847. Many electrical and optical phenomena exhibit SER with
probe relaxation I(t) ~ exp[-(t/τ)^β], with 0 < β < 1. Here
τ is a material-sensitive parameter, useful for discussing chemical
trends. The "shape" parameter β is dimensionless and plays the role of a
non-equilibrium scaling exponent; its value, especially in glasses, is both
practically useful and theoretically significant. The mathematical complexity
of SER is such that rigorous derivations of this peculiar function were not
achieved until the 1970s. The focus of much of the 1970s pioneering work was
spatial relaxation of electronic charge, but SER is a universal phenomenon, and
today atomic and molecular relaxation of glasses and deeply supercooled liquids
provide the most reliable data. As the database grew, the need for a
quantitative theory increased; this need was finally met by the
diffusion-to-traps topological model, which yields a remarkably simple
expression for the shape parameter β, given by β = d*/(d* + 2). At first
sight this expression appears to be identical to d/(d + 2), where d is the
actual spatial dimensionality, as originally derived. The original model,
however, failed to explain much of the database. Here the theme of earlier
reviews, based on the observation that for short-range forces alone d* = d = 3,
the actual spatial dimensionality, while for mixed short- and long-range
forces d* = fd = d/2, is applied to four new spectacular
examples, where it turns out that SER is useful not only for purposes of
quality control, but also for defining what is meant by a glass in novel
contexts. (Please see the full abstract in the main text.)
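The diffusion-to-traps prediction for the shape parameter can be checked numerically: for short-range forces alone (d* = d = 3) it gives β = 3/5, while the mixed-force case (d* = d/2 = 3/2) gives β = 3/7. The sketch below only evaluates these formulas; the function names are illustrative:

```python
from math import exp

def ser(t, tau, beta):
    """Stretched-exponential relaxation I(t) = exp[-(t/tau)**beta]."""
    return exp(-(t / tau) ** beta)

def shape_parameter(d_star):
    """Diffusion-to-traps prediction: beta = d*/(d* + 2)."""
    return d_star / (d_star + 2)

beta_short = shape_parameter(3.0)   # short-range forces: d* = d = 3
beta_mixed = shape_parameter(1.5)   # mixed forces: d* = d/2 = 3/2
```

At t = τ the relaxation function has decayed to exp(-1) regardless of β; the value of β controls how sharply the decay departs from a simple exponential away from that point.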
Information Metrics (iMetrics): A Research Specialty with a Socio-Cognitive Identity?
"Bibliometrics", "scientometrics", "informetrics", and "webometrics" can all
be considered as manifestations of a single research area with similar
objectives and methods, which we call "information metrics" or iMetrics. This
study explores the cognitive and social distinctness of iMetrics with respect
to general information science (IS), focusing on a core of researchers,
shared vocabulary and literature/knowledge base. Our analysis investigates the
similarities and differences between four document sets. The document sets are
drawn from three core journals for iMetrics research (Scientometrics, Journal
of the American Society for Information Science and Technology, and Journal of
Informetrics). We split JASIST into document sets containing iMetrics and
general IS articles. The volume of publications in this representation of the
specialty has increased rapidly during the last decade. A core of researchers
that predominantly focus on iMetrics topics can thus be identified. This core
group has developed a shared vocabulary, as exhibited in the high similarity of
their title words, and draws on a shared knowledge base. The research front of this
field moves faster than the research front of information science in general,
bringing it closer to Price's dream.
Comment: Accepted for publication in Scientometrics
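The shared-vocabulary claim rests on comparing title-word distributions between document sets, which in practice reduces to a similarity measure over term-frequency profiles. A minimal sketch of such a comparison, with invented word counts, is:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency profiles."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented title-word counts for two document sets
imetrics = Counter({"citation": 40, "impact": 25, "journal": 20, "user": 5})
general_is = Counter({"user": 30, "information": 35, "retrieval": 20, "citation": 5})
sim = cosine(imetrics, general_is)
```

A high within-set similarity combined with a low cross-set similarity is the kind of signal that supports treating iMetrics as a cognitively distinct specialty.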
Theories of Informetrics and Scholarly Communication
Scientometrics have become an essential element in the practice and evaluation of science and research, including both the evaluation of individuals and national assessment exercises. Yet, researchers and practitioners in this field have lacked clear theories to guide their work. As early as 1981, then doctoral student Blaise Cronin published "The need for a theory of citing", a call to arms for the fledgling scientometric community to produce foundational theories upon which the work of the field could be based. More than three decades later, the time has come to reach out to the field again and ask how it has responded to this call. This book compiles the foundational theories that guide informetrics and scholarly communication research. It is a much-needed compilation by leading scholars in the field that gathers together the theories that guide our understanding of authorship, citing, and impact.
Recent Development in Information Science: Implications for Information Systems Research
Over the past several decades, the management information systems (MIS) community has adopted theories, methodologies, philosophical bases, and assumptions from sister disciplines. This paper reports on the changing nature of information science (IS) towards multi-disciplinarity and its development over the past decade. It also examines the contribution of informetrics to MIS research in delineating the intellectual structure of information systems, comparing cumulative research traditions, demonstrating theoretical differences between competing approaches, and tracing a paradigm shift. Development in IS provides MIS researchers with ample opportunities for cross-disciplinary research, new research tools, new theories to understand information systems phenomena, and more.
Science Models as Value-Added Services for Scholarly Information Systems
The paper introduces scholarly Information Retrieval (IR) as a further
dimension that should be considered in the science modeling debate. The IR use
case is seen as a validation model of the adequacy of science models in
representing and predicting structure and dynamics in science. Particular
conceptualizations of scholarly activity and structures in science are used as
value-added search services to improve retrieval quality: a co-word model
depicting the cognitive structure of a field (used for query expansion), the
Bradford law of information concentration, and a model of co-authorship
networks (both used for re-ranking search results). An evaluation of
retrieval quality when science-model-driven services are used showed that the
proposed models indeed improve retrieval quality.
From an IR perspective, the models studied are therefore verified as expressive
conceptualizations of central phenomena in science. Thus, the IR perspective
can significantly contribute to a better understanding of scholarly structures
and activities.
Comment: 26 pages, to appear in Scientometrics
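One of the re-ranking services described above, boosting results by author prominence in a co-authorship network, can be sketched roughly as follows. The blending scheme, weights, and data here are illustrative assumptions, not the paper's actual method:

```python
def rerank(results, centrality, weight=0.5):
    """Re-rank retrieval results by blending the original retrieval
    score with the best author centrality in a co-authorship network."""
    def blended(hit):
        author_bonus = max((centrality.get(a, 0.0) for a in hit["authors"]),
                           default=0.0)
        return (1 - weight) * hit["score"] + weight * author_bonus
    return sorted(results, key=blended, reverse=True)

# Illustrative data: author centralities and two retrieved documents
centrality = {"A": 0.9, "B": 0.1}
results = [
    {"doc": "d1", "score": 0.60, "authors": ["B"]},
    {"doc": "d2", "score": 0.55, "authors": ["A"]},
]
ranked = rerank(results, centrality)
```

Here d2 overtakes d1 despite its lower retrieval score, because its author is more central in the network; the `weight` parameter controls how strongly the science model overrides the baseline ranking.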
On the correction of “old” omitted citations by bibliometric databases
Omitted citations – i.e., missing links between a cited paper and the corresponding citing papers – are the main consequence of several bibliometric-database errors. To reduce these errors, databases may undertake two actions: (i) improving the control of the (new) papers to be indexed, i.e., limiting the introduction of “new” dirty data, and (ii) detecting and correcting errors in the papers already indexed by the database, i.e., cleaning “old” dirty data. The latter action is probably more complicated, as it requires the application of suitable error-detection procedures to a huge amount of data.
Based on an extensive sample of scientific papers in the Engineering-Manufacturing field, this study focuses on old dirty data in the Scopus and WoS databases. To this end, a recent automated algorithm for estimating the omitted-citation rate of databases is applied to the same sample of papers in three sessions at different times. A database's ability to clean old dirty data is evaluated by considering the variations in the omitted-citation rate from session to session. The major outcomes of this study are that (i) both databases correct old omitted citations only slowly, and (ii) a small portion of initially corrected citations can, surprisingly, disappear from the databases over time.
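The session-to-session comparison reduces to tracking the omitted-citation rate, i.e. the fraction of true cited-to-citing links missing from the database at each session. A toy version of that bookkeeping, with invented link sets, is:

```python
def omitted_citation_rate(true_links, indexed_links):
    """Fraction of true cited->citing links the database fails to index."""
    omitted = true_links - indexed_links
    return len(omitted) / len(true_links)

# Invented (cited, citing) link sets: the ground truth and three
# sessions of the same paper sample at different times
true_links = {("p1", "c1"), ("p1", "c2"), ("p2", "c3"), ("p2", "c4")}
sessions = [
    {("p1", "c1"), ("p2", "c3")},                 # session 1
    {("p1", "c1"), ("p1", "c2"), ("p2", "c3")},   # session 2: one correction
    {("p1", "c1"), ("p1", "c2")},                 # session 3: a link came off
]
rates = [omitted_citation_rate(true_links, s) for s in sessions]
```

A falling rate between sessions indicates the database is cleaning old dirty data; a rise, as between sessions 2 and 3 here, models the surprising case of a previously indexed citation disappearing.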