Macro-micro feedback links of water management in South Africa : CGE analyses of selected policy regimes
The pressure on an already stressed water situation in South Africa is predicted to increase significantly under climate change, plans for large industrial expansion, observed rapid urbanization, and government programs to provide access to water to millions of previously excluded people. The present study employed a general equilibrium approach to examine the economy-wide impacts of selected macro and water related policy reforms on water use and allocation, rural livelihoods, and the economy at large. The analyses reveal that implicit crop-level water quotas reduce the amount of irrigated land allocated to higher-value horticultural crops and create higher shadow rents for production of lower-value, water-intensive field crops, such as sugarcane and fodder. Accordingly, liberalizing local water allocation in irrigation agriculture is found to work in favor of higher-value crops, and expand agricultural production and exports and farm employment. Allowing for water trade between irrigation and non-agricultural uses fueled by higher competition for water from industrial expansion and urbanization leads to greater water shadow prices for irrigation water with reduced income and employment benefits to rural households and higher gains for non-agricultural households. The analyses show difficult tradeoffs between general economic gains and higher water prices, making irrigation subsidies difficult to justify.
Keywords: Water Supply and Sanitation Governance and Institutions, Town Water Supply and Sanitation, Water Supply and Systems, Water and Industry, Water Conservation
Sequential processing deficits of reading disabled persons is independent of inter-stimulus interval
Developmental dyslexia is a language-based learning disability with frequently associated non-linguistic sensory deficits that have been the basis of various perception-based theories. It remains an open question whether the underlying deficit in dyslexia is a low-level impairment that causes speech and orthographic perception deficits that in turn impede higher phonological and reading processes, or a high-level impairment that affects both perceptual and reading-related skills. We investigated by means of contrast detection thresholds two low-level theories of developmental dyslexia, the magnocellular and the fast temporal processing hypotheses, as well as a more recent suggestion that dyslexics have difficulties in sequential comparison tasks that can be attributed to a higher-order deficit. We found that dyslexics had significantly higher thresholds only on a sequential, but not a spatial, detection task, and that this impairment was independent of the inter-stimulus interval. We also found that the poor performance of dyslexics on the temporal task depended on the size of the required memory trace of the image rather than on the number of images. Our findings do not support the magnocellular theory and challenge the fast temporal deficit hypothesis. We suggest that dyslexics may have a higher-order, dual-mechanism impairment. We also discuss the clinical implications of our findings.
Detecting Sarcasm in Multimodal Social Platforms
Sarcasm is a peculiar form of sentiment expression, where the surface
sentiment differs from the implied sentiment. The detection of sarcasm in
social media platforms has been applied in the past mainly to textual
utterances where lexical indicators (such as interjections and intensifiers),
linguistic markers, and contextual information (such as user profiles, or past
conversations) were used to detect the sarcastic tone. However, modern social
media platforms allow users to create multimodal messages in which audiovisual
content is integrated with the text, making the analysis of any single mode in
isolation incomplete. In our work, we first study the relationship between the textual and
visual aspects in multimodal posts from three major social media platforms,
i.e., Instagram, Tumblr and Twitter, and we run a crowdsourcing task to
quantify the extent to which images are perceived as necessary by human
annotators. Moreover, we propose two different computational frameworks to
detect sarcasm that integrate the textual and visual modalities. The first
approach exploits visual semantics trained on an external dataset, and
concatenates the semantics features with state-of-the-art textual features. The
second method adapts a visual neural network initialized with parameters
trained on ImageNet to multimodal sarcastic posts. Results show the positive
effect of combining modalities for the detection of sarcasm across platforms
and methods.
Comment: 10 pages, 3 figures, final version published in the Proceedings of ACM Multimedia 201
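The first approach described above concatenates visual-semantics features with textual features before classification. A minimal sketch of that early-fusion step, with all array names and dimensions hypothetical (not taken from the paper):

```python
import numpy as np

# Early fusion: visual-semantics features from an external model are
# concatenated with textual features, one fused vector per post.
# All dimensions below are illustrative placeholders.
rng = np.random.default_rng(0)
n_posts = 4
text_feats = rng.random((n_posts, 300))    # e.g. textual feature vectors
visual_feats = rng.random((n_posts, 100))  # e.g. visual-semantics scores

fused = np.concatenate([text_feats, visual_feats], axis=1)
print(fused.shape)  # (4, 400): one fused feature vector per post
```

The fused matrix would then be fed to any standard classifier; the paper's actual feature extractors and classifier are not reproduced here.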
Competition and Selection Among Conventions
In many domains, a latent competition among different conventions determines
which one will come to dominate. One sees such effects in the success of
community jargon, of competing frames in political rhetoric, or of terminology
in technical contexts. These effects have become widespread in the online
domain, where the data offers the potential to study competition among
conventions at a fine-grained level.
In analyzing the dynamics of conventions over time, however, even with
detailed on-line data, one encounters two significant challenges. First, as
conventions evolve, the underlying substance of their meaning tends to change
as well; and such substantive changes confound investigations of social
effects. Second, the selection of a convention takes place through the complex
interactions of individuals within a community, and contention between the
users of competing conventions plays a key role in the convention's evolution.
Any analysis must take place in the presence of these two issues.
In this work we study a setting in which we can cleanly track the competition
among conventions. Our analysis is based on the spread of low-level authoring
conventions in the eprint arXiv over 24 years: by tracking the spread of macros
and other author-defined conventions, we are able to study conventions that
vary even as the underlying meaning remains constant. We find that the
interaction among co-authors over time plays a crucial role in convention
selection; the distinction between more and less experienced members of the
community, and the distinction between conventions with visible versus
invisible effects, are both central to the underlying processes. Through our
analysis we make predictions at the population level about the ultimate success
of different synonymous conventions over time--and at the individual level
about the outcome of "fights" between people over convention choices.
Comment: To appear in Proceedings of WWW 2017, data at https://github.com/CornellNLP/Macro
Modeling the efficient access of full-text information
The title of this paper describes a research goal set by many offices within US DOE. This paper reviews efficient full-text searching techniques being developed to better understand and meet this goal. Classical computer-human interaction (CHI) approaches provided by commercial information retrieval (IR) engines fail to contextualize information in ways that facilitate timely decision making. The use of advanced CHI techniques (e.g., visualization) in combination with deductive database technology to augment the weaknesses in the presentation capabilities of IR engines is discussed. Various techniques employed in a Web-based prototype system currently under development are presented.
Measurements of Flavour Dependent Fragmentation Functions in Z^0 -> qq(bar) Events
Fragmentation functions for charged particles in Z -> qq(bar) events have
been measured for bottom (b), charm (c) and light (uds) quarks as well as for
all flavours together. The results are based on data recorded between 1990 and
1995 using the OPAL detector at LEP. Event samples with different flavour
compositions were formed using reconstructed D* mesons and secondary vertices.
The \xi_p = ln(1/x_E) distributions and the position of their maxima \xi_max
are also presented separately for uds, c and b quark events. The fragmentation
function for b quarks is significantly softer than for uds quarks.
Comment: 29 pages, LaTeX, 5 eps figures (and colour figs) included, submitted to Eur. Phys. J.
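The abstract above defines the variable whose distribution is presented as xi_p = ln(1/x_E). As a one-line illustration of that definition:

```python
import math

def xi_p(x_E: float) -> float:
    """xi_p = ln(1/x_E), the variable whose distributions the paper presents."""
    return math.log(1.0 / x_E)

# A soft particle (small scaled energy x_E) sits at large xi_p:
print(round(xi_p(0.1), 4))  # 2.3026
```

The position of the maximum of this distribution, xi_max, is what the paper reports separately for uds, c and b events.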
Determination of alpha_s using Jet Rates at LEP with the OPAL detector
Hadronic events produced in e+e- collisions by the LEP collider and recorded
by the OPAL detector were used to form distributions based on the number of
reconstructed jets. The data were collected between 1995 and 2000 and
correspond to energies of 91 GeV, 130-136 GeV and 161-209 GeV. The jet rates
were determined using four different jet-finding algorithms (Cone, JADE, Durham
and Cambridge). The differential two-jet rate and the average jet rate with the
Durham and Cambridge algorithms were used to measure alpha_s in the LEP energy
range by fitting to the data an expression in which O(alpha_s^2) calculations
were matched to a NLLA prediction. Combining the measurements
at different centre-of-mass energies, the value of alpha_s(M_Z) was determined
to be
alpha_s(M_Z) = 0.1177 +- 0.0006 (stat.) +- 0.0012 (expt.) +- 0.0010 (had.) +- 0.0032 (theo.)
Comment: 40 pages, 17 figures, submitted to Eur. Phys. J.
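If the four quoted uncertainty components are treated as independent and combined in quadrature (a common convention; the abstract itself only quotes the separate errors), the total uncertainty works out as:

```python
import math

alpha_s = 0.1177
# The four error components quoted in the abstract:
components = {"stat.": 0.0006, "expt.": 0.0012, "had.": 0.0010, "theo.": 0.0032}

# Quadrature sum, assuming independent components.
total = math.sqrt(sum(e**2 for e in components.values()))
print(f"alpha_s(M_Z) = {alpha_s:.4f} +- {total:.4f}")  # 0.1177 +- 0.0036
```

As is typical for such measurements, the theory component dominates the combined uncertainty.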
Bose-Einstein Correlations in e+e- to W+W- at 172 and 183 GeV
Bose-Einstein correlations between like-charge pions are studied in hadronic
final states produced by e+e- annihilations at center-of-mass energies of 172
and 183 GeV. Three event samples are studied, each dominated by one of the
processes W+W- to qqlnu, W+W- to qqqq, or (Z/g)* to qq. After demonstrating the
existence of Bose-Einstein correlations in W decays, an attempt is made to
determine Bose-Einstein correlations for pions originating from the same W
boson and from different W bosons, as well as for pions from (Z/g)* to qq
events. The following results are obtained for the individual chaoticity
parameters lambda assuming a common source radius R: lambda_same = 0.63 +- 0.19
+- 0.14, lambda_diff = 0.22 +- 0.53 +- 0.14, lambda_Z = 0.47 +- 0.11 +- 0.08, R
= 0.92 +- 0.09 +- 0.09. In each case, the first error is statistical and the
second is systematic. At the current level of statistical precision it is not
established whether Bose-Einstein correlations between pions from different W
bosons exist.
Comment: 24 pages, LaTeX, including 6 eps figures, submitted to European Physical Journal
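The chaoticity parameter lambda and source radius R quoted above are typically extracted with a Goldhaber-type parametrization C(Q) = 1 + lambda * exp(-R^2 Q^2); this functional form is a common convention assumed here, as the abstract does not spell out the exact fit function. A sketch of evaluating it:

```python
import math

HBARC = 0.1973  # hbar*c in GeV*fm, to convert R from fm to GeV^-1

def bec_correlation(Q, lam, R_fm):
    """Goldhaber-type BEC correlation function C(Q) = 1 + lambda*exp(-R^2 Q^2).

    Q is the four-momentum difference in GeV; R_fm the source radius in fm.
    This parametrization is assumed for illustration, not quoted verbatim
    from the paper.
    """
    R = R_fm / HBARC  # radius in GeV^-1
    return 1.0 + lam * math.exp(-((R * Q) ** 2))

# At Q = 0 the enhancement is 1 + lambda; e.g. for the (Z/g)* sample:
print(round(bec_correlation(0.0, 0.47, 0.92), 2))  # 1.47
```

The three lambda values in the abstract then quantify how strong this enhancement is for pion pairs from the same W, from different W's, and from (Z/g)* decays, all with the common radius R = 0.92 fm.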
W+W- production and triple gauge boson couplings at LEP energies up to 183 GeV
A study of W-pair production in e+e- annihilations at Lep2 is presented,
based on 877 W+W- candidates corresponding to an integrated luminosity of 57
pb-1 at sqrt(s) = 183 GeV. Assuming that the angular distributions of the
W-pair production and decay, as well as their branching fractions, are
described by the Standard Model, the W-pair production cross-section is
measured to be 15.43 +- 0.61 (stat.) +- 0.26 (syst.) pb. Assuming lepton
universality and combining with our results from lower centre-of-mass energies,
the W branching fraction to hadrons is determined to be 67.9 +- 1.2 (stat.) +-
0.5 (syst.)%. The number of W-pair candidates and the angular distributions for
each final state (qqlnu,qqqq,lnulnu) are used to determine the triple gauge
boson couplings. After combining these values with our results from lower
centre-of-mass energies we obtain D(kappa_g)=0.11+0.52-0.37,
D(g^z_1)=0.01+0.13-0.12 and lambda=-0.10+0.13-0.12, where the errors include
both statistical and systematic uncertainties and each coupling is determined
setting the other two couplings to the Standard Model value. The fraction of W
bosons produced with a longitudinal polarisation is measured to be
0.242+-0.091(stat.)+-0.023(syst.). All these measurements are consistent with
the Standard Model expectations.
Comment: 48 pages, LaTeX, including 13 eps or ps figures, submitted to European Physical Journal
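Given the measured hadronic branching fraction, the expected shares of the three W-pair final states (qqqq, qqlnu, lnulnu) follow from treating the two W decays as independent. A simple consistency check along those lines (not a calculation performed in the paper):

```python
# Hadronic branching fraction of the W from the abstract:
B_had = 0.679

f_qqqq   = B_had**2                 # both W's decay hadronically
f_qqlnu  = 2 * B_had * (1 - B_had)  # one hadronic, one leptonic decay
f_lnulnu = (1 - B_had)**2           # both W's decay leptonically

print(f"{f_qqqq:.3f} {f_qqlnu:.3f} {f_lnulnu:.3f}")  # 0.461 0.436 0.103
```

The three fractions sum to one by construction, matching the three final-state categories whose angular distributions the paper uses in the coupling fits.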