Why We Read Wikipedia
Wikipedia is one of the most popular sites on the Web, with millions of users
relying on it to satisfy a broad range of information needs every day. Although
it is crucial to understand what exactly these needs are in order to be able to
meet them, little is currently known about why users visit Wikipedia. The goal
of this paper is to fill this gap by combining a survey of Wikipedia readers
with a log-based analysis of user activity. Based on an initial series of user
surveys, we build a taxonomy of Wikipedia use cases along several dimensions,
capturing users' motivations to visit Wikipedia, the depth of knowledge they
are seeking, and their knowledge of the topic of interest prior to visiting
Wikipedia. Then, we quantify the prevalence of these use cases via a
large-scale user survey conducted on live Wikipedia with almost 30,000
responses. Our analyses highlight the variety of factors driving users to
Wikipedia, such as current events, media coverage of a topic, personal
curiosity, work or school assignments, or boredom. Finally, we match survey
responses to the respondents' digital traces in Wikipedia's server logs,
enabling the discovery of behavioral patterns associated with specific use
cases. For instance, we observe long and fast-paced page sequences across
topics for users who are bored or exploring randomly, whereas those using
Wikipedia for work or school spend more time on individual articles focused on
topics such as science. Our findings advance our understanding of reader
motivations and behavior on Wikipedia and can have implications for developers
aiming to improve Wikipedia's user experience, editors striving to cater to
their readers' needs, third-party services (such as search engines) providing
access to Wikipedia content, and researchers aiming to build tools such as
recommendation engines.
Comment: Published in WWW'17; v2 fixes caption of Table
A new technique for elucidating β-decay schemes which involve daughter nuclei with very low energy excited states
A new technique of elucidating β-decay schemes of isotopes with a large
density of states at low excitation energies has been developed, in which a
Broad Energy Germanium (BEGe) detector is used in conjunction with coaxial
hyper-pure germanium detectors. The power of this technique has been
demonstrated on the example of 183Hg decay. Mass-separated samples of 183Hg
were produced by a deposition of the low-energy radioactive-ion beam delivered
by the ISOLDE facility at CERN. The excellent energy resolution of the BEGe
detector allowed γ-ray energies to be determined with a precision of a
few tens of electronvolts, which was sufficient for the analysis of the
Rydberg-Ritz combinations in the level scheme. The timestamped structure of the
data was used for the unambiguous separation of γ rays arising from the β
decay of 183Hg from those due to the daughter decays.
Diagnosis and assessment of dilated cardiomyopathy: a guideline protocol from the British Society of Echocardiography.
Heart failure (HF) is a debilitating and life-threatening condition, with a 5-year survival rate lower than that of breast or prostate cancer. It is the leading cause of hospital admission in people over 65, and these admissions are projected to rise by more than 50% over the next 25 years. Transthoracic echocardiography (TTE) is the first-line diagnostic step in acute and chronic HF and provides immediate information on chamber volumes, ventricular systolic and diastolic function, wall thickness, valve function and the presence of pericardial effusion, while also contributing information on aetiology. Dilated cardiomyopathy (DCM) is the third most common cause of HF and is the most common cardiomyopathy. It is defined by the presence of left ventricular dilatation and left ventricular systolic dysfunction in the absence of abnormal loading conditions (hypertension and valve disease) or coronary artery disease sufficient to cause global systolic impairment. This document provides a practical approach to the diagnosis and assessment of dilated cardiomyopathy that is aimed at the practising sonographer.
Manipulation and removal of defects in spontaneous optical patterns
Defects play an important role in a number of fields dealing with ordered
structures. They are often described in terms of their topology, mutual
interaction and their statistical characteristics. We demonstrate theoretically
and experimentally the possibility of an active manipulation and removal of
defects. We focus on the spontaneous formation of two-dimensional spatial
structures in a nonlinear optical system, a liquid crystal light valve under
single optical feedback. With increasing distance from threshold, the
spontaneously formed hexagonal pattern becomes disordered and contains several
defects. A scheme based on Fourier filtering allows us to remove defects and to
restore spatial order. Starting without control, the controlled area is
progressively expanded, such that defects are swept out of the active area.
Comment: 4 pages, 4 figures
Design and Analysis of Cognitive Interviews for Comparative Multinational Testing
This article summarizes the work of the Comparative Cognitive Testing Workgroup, an international coalition of survey methodologists interested in developing an evidence-based methodology for examining the comparability of survey questions within cross-cultural or multinational contexts. To meet this objective, it was necessary to ensure that the cognitive interviewing (CI) method itself did not introduce method bias. Therefore, the workgroup first identified specific characteristics inherent in CI methodology that could undermine the comparability of CI evidence. The group then developed and implemented a protocol addressing those issues. In total, 135 cognitive interviews were conducted by participating countries. Through the process, the group identified various interpretive patterns resulting from sociocultural and language-related differences among countries as well as other patterns of error that would impede comparability of survey data
Controlling pattern formation and spatio-temporal disorder in nonlinear optics
We present a feedback control method for the stabilization of unstable patterns and for the control of spatio-temporal disorder. The control takes the form of a spatial modulation to the input pump, which is obtained via filtering in Fourier space of the output electric field. The control is powerful, flexible and non-invasive: the feedback vanishes once control is achieved. We demonstrate, by means of computer simulation, the effect of the control in two different optical systems.
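The Fourier-space filtering that both of these pattern-control abstracts describe can be sketched numerically. The following is a minimal illustration only, not the authors' implementation: the mode-selection radius and feedback strength are assumed parameters, and the "output field" is any 2D array.

```python
import numpy as np

def fourier_feedback(field, keep_radius, strength=0.1):
    """Compute a non-invasive feedback modulation from an output field.

    The field is filtered in Fourier space so that only spatial modes
    inside `keep_radius` (the target pattern's wavevectors) survive.
    The correction opposes the filtered-out component, so the feedback
    vanishes once the field contains only the selected modes.
    """
    F = np.fft.fft2(field)
    kx = np.fft.fftfreq(field.shape[0])
    ky = np.fft.fftfreq(field.shape[1])
    KX, KY = np.meshgrid(kx, ky, indexing="ij")
    mask = np.sqrt(KX**2 + KY**2) <= keep_radius
    filtered = np.fft.ifft2(F * mask).real
    # Drive the system against the unwanted (rejected) modes only.
    return -strength * (field - filtered)
```

A field made purely of modes inside the pass radius produces zero feedback, which is the "non-invasive" property claimed in the abstract: the control signal disappears once the desired pattern is established.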
Turbulent Mixing in the Interstellar Medium -- an application for Lagrangian Tracer Particles
We use 3-dimensional numerical simulations of self-gravitating compressible
turbulent gas in combination with Lagrangian tracer particles to investigate
the mixing process of molecular hydrogen (H2) in interstellar clouds. Tracer
particles are used to represent shock-compressed dense gas, which is associated
with H2. We deposit tracer particles in regions of density contrast in excess
of ten times the mean density. Following their trajectories and using
probability distribution functions, we find an upper limit for the mixing
timescale of H2, which is of order 0.3 Myr. This is significantly smaller than
the lifetime of molecular clouds, which demonstrates the importance of the
turbulent mixing of H2 as a preliminary stage to star formation.
Comment: 10 pages, 5 figures, conference proceedings "Turbulent Mixing and Beyond 2007"
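The tracer-deposition criterion in the abstract (mark cells above ten times the mean density, then follow the tracers' density distribution) can be sketched as below. This is a hypothetical illustration of the selection step only; the function names and the PDF binning are assumptions, not the authors' code.

```python
import numpy as np

def deposit_tracers(density, contrast=10.0):
    """Indices of cells whose density exceeds `contrast` times the mean,
    i.e. the shock-compressed gas that tracer particles are meant to tag."""
    return np.argwhere(density > contrast * density.mean())

def density_pdf(samples, bins=32):
    """Probability distribution function of log10-density for a set of
    tracer samples; its evolution over time indicates how quickly the
    initially dense material mixes into the ambient medium."""
    hist, edges = np.histogram(np.log10(samples), bins=bins, density=True)
    return hist, edges
```

Comparing the PDF of tracer densities at successive snapshots gives the kind of mixing-timescale estimate (order 0.3 Myr in the abstract) that the paper derives.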
Methodological approaches to determining the marine radiocarbon reservoir effect
The marine radiocarbon reservoir effect is an offset in 14C age between contemporaneous organisms from the terrestrial environment and organisms that derive their carbon from the marine environment. Quantification of this effect is of crucial importance for correct calibration of the 14C ages of marine-influenced samples to the calendrical timescale. This is fundamental to the construction of archaeological and palaeoenvironmental chronologies when such samples are employed in 14C analysis. Quantitative measurements of temporal variations in regional marine reservoir ages also have the potential to be used as a measure of process changes within Earth surface systems, due to their link with climatic and oceanic changes. The various approaches to quantification of the marine radiocarbon reservoir effect are assessed, focusing particularly on the North Atlantic Ocean. Currently, the global average marine reservoir age of surface waters, R(t), is c. 400 radiocarbon years; however, regional values deviate from this as a function of climate and oceanic circulation systems. These local deviations from R(t) are expressed as ΔR values. Hence, polar waters exhibit greater reservoir ages (ΔR = c. +400 to +800 14C y) than equatorial waters (ΔR = c. 0 14C y). Observed temporal variations in ΔR appear to reflect climatic and oceanographic changes. We assess three approaches to quantification of marine reservoir effects using known age samples (from museum collections), tephra isochrones (present onshore/offshore) and paired marine/terrestrial samples (from the same context in, for example, archaeological sites). The strengths and limitations of these approaches are evaluated using examples from the North Atlantic region.
It is proposed that, with a suitable protocol, accelerator mass spectrometry (AMS) measurements on paired, short-lived, single-entity marine and terrestrial samples from archaeological deposits are the most promising approach to constraining such changes over at least the last 5 ky BP.
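The reservoir correction the abstract describes is simple arithmetic: subtract the global surface-water reservoir age R(t) plus the regional deviation ΔR from a measured marine 14C age before calibration. A minimal sketch, using the illustrative c. 400 yr global value quoted above (the function name and defaults are assumptions):

```python
def reservoir_corrected_age(marine_14c_age, r_global=400, delta_r=0):
    """Apply the marine reservoir correction before calendar calibration.

    Subtracts the global surface-water reservoir age R(t) (c. 400 14C yr,
    per the abstract) plus the regional deviation Delta-R from a measured
    marine 14C age. All values in radiocarbon years.
    """
    return marine_14c_age - (r_global + delta_r)
```

For example, a polar-water shell dated to 2400 14C BP with ΔR = +600 yr would correct to 1400 14C BP, whereas an equatorial sample (ΔR = 0) would correct to 2000 14C BP. Uncertainties on R(t) and ΔR would propagate in quadrature in a full treatment.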
Sun, Moon, Stars, Rain, Vol. 7 No. 11
Official publication of the Sigma Tau Delta English Honor Society, Alpha Zet Chapter, Stephen F. Austin State University.
Published once a year in the Fall Semester, in cooperation with the English Department of Stephen F. Austin State University.
https://scholarworks.sfasu.edu/smsr/1000/thumbnail.jp