User Donations in a Crowdsourced Video System
Crowdsourced video systems like YouTube and Twitch.tv have been a major
internet phenomenon and nowadays entertain over a billion users. In addition
to video sharing and viewing, over the years they have developed new features
to boost community engagement, and some have managed to attract users to
donate, both to the community and to other users. User donation directly
reflects and influences user engagement in the community and has a great
impact on the success of such systems. Nevertheless, user donations in
crowdsourced video systems remain trade secrets for most companies and to date
are still unexplored. In this work, we attempt to fill this gap: we obtain
and provide a publicly available dataset on user donations in a crowdsourced
video system named BiliBili. Based on information on nearly 40 thousand
donors, we examine the dynamics of user donations and their social
relationships, quantitatively reveal the factors that potentially impact user
donation, and adopt machine-learned classifiers and network representation
learning models to predict, in a timely and accurate manner, the destinations
of both the majority and individual donations.
Comment: 8 pages
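The abstract names the prediction pipeline without specifying it; as a purely hypothetical sketch of destination classification from donor features (the feature names and the toy label rule below are our assumptions, not the paper's), one might write:

```python
# Minimal sketch (not the paper's code): predicting a donation's destination
# ("community" vs. "another user") from hypothetical donor features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical features: account age, viewing activity, number of mutual
# follower ties, and past donations to other users.
X = np.column_stack([
    rng.exponential(100, n),      # account age in days
    rng.poisson(50, n),           # videos watched per month
    rng.poisson(3, n),            # mutual-follow ties
    rng.poisson(1, n),            # prior user-to-user donations
])
# Toy label rule: socially embedded donors tend to donate to other users.
y = (X[:, 2] + 2 * X[:, 3] + rng.normal(0, 1, n) > 4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```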
A Distributed Incremental Update Scheme for Probability Distribution of Wind Power Forecast Error
Due to the uncertainty of distributed wind generations (DWGs), a better
understanding of the probability distributions (PD) of their wind power
forecast errors (WPFEs) can help market participants (MPs) who own DWGs perform
better during trading. Under the premise of an accurate PD model, considering
the correlation among DWGs and absorbing the new information carried by the
latest data are two ways to maintain an accurate PD. Both require the
historical and latest wind power and forecast data of all DWGs. Each MP,
however, only has access to the data of its own DWGs and may refuse to share
these data with MPs belonging to other stakeholders. Moreover, because new
data are generated endlessly, the burden of updating the PD increases sharply.
Therefore, we adopt a distributed strategy to deal with the data collection
problem. In addition, we apply an incremental learning strategy to
reduce the updating burden. Finally, we propose a distributed incremental
update scheme to make each MP continually acquire the latest conditional PD of
its DWGs' WPFE. Specifically, we first use the Gaussian-mixture-model-based
(GMM-based) joint PD to characterize the correlation among DWGs. Then, we
propose a distributed modified incremental GMM algorithm to enable MPs to
update the parameters of the joint PD in a distributed and incremental manner.
After that, we further propose a distributed derivation algorithm to make MPs
derive their conditional PD of WPFE from the joint one in a distributed way.
Combining the two original algorithms, we finally achieve the complete
distributed incremental update scheme, by which each MP can continually obtain
its latest conditional PD of its DWGs' WPFE via neighborhood communication and
local calculation with its own data. The effectiveness, correctness, and
efficiency of the proposed scheme are verified using a dataset from NREL.
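The abstract only names the algorithm; the following is a rough, generic sketch of a single incremental GMM update step (our construction, centralized and one-dimensional for brevity, not the paper's distributed modified incremental GMM; the decay rate `rho` is an assumption):

```python
# Minimal sketch: one incremental update of a Gaussian mixture model from a
# new data batch, blending old parameters with the batch's estimates.
import numpy as np

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def incremental_gmm_update(weights, means, variances, batch, rho=0.1):
    # E-step on the new batch only: responsibilities r[n, k].
    dens = np.stack([w * gaussian_pdf(batch, m, v)
                     for w, m, v in zip(weights, means, variances)], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)
    # Batch sufficient statistics per component.
    nk = r.sum(axis=0)
    mk = (r * batch[:, None]).sum(axis=0) / nk
    vk = (r * (batch[:, None] - mk) ** 2).sum(axis=0) / nk
    # Blend old parameters with batch estimates at decay rate rho.
    new_w = (1 - rho) * weights + rho * nk / len(batch)
    new_m = (1 - rho) * means + rho * mk
    new_v = (1 - rho) * variances + rho * vk
    return new_w / new_w.sum(), new_m, new_v

rng = np.random.default_rng(1)
w, m, v = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
batch = rng.normal(2.0, 0.5, size=500)   # fresh forecast-error samples
w, m, v = incremental_gmm_update(w, m, v, batch)
print(w, m, v)
```

In the paper's setting, the analogous statistics would be exchanged among neighboring MPs rather than computed centrally.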
A congruence involving harmonic sums modulo $p^{\alpha}q^{\beta}$
In 2014, Wang and Cai established the following harmonic congruence for any
odd prime $p$ and positive integer $r$,
\begin{equation*}
Z(p^{r})\equiv-2p^{r-1}B_{p-3} \pmod{p^{r}},
\end{equation*}
where $Z(n)=\sum_{\substack{i+j+k=n\\ i,j,k\in\mathcal{P}_{n}}}\frac{1}{ijk}$
and $\mathcal{P}_{n}$ denotes the set of positive integers which are prime to $n$.
In this note, we obtain a congruence for distinct odd primes $p$, $q$ and
positive integers $\alpha$, $\beta$,
\begin{equation*}
Z(p^{\alpha}q^{\beta})\equiv
2(2-q)\left(1-\frac{1}{q^{3}}\right)p^{\alpha-1}q^{\beta-1}B_{p-3}\pmod{p^{\alpha}}
\end{equation*}
and the necessary and sufficient condition for
\begin{equation*}
Z(p^{\alpha}q^{\beta})\equiv 0\pmod{p^{\alpha}q^{\beta}}.
\end{equation*}
Finally, we raise a conjecture: for a positive integer $n$ and an odd prime
power $p^{\alpha}$ with $p^{\alpha}\,\|\,n$,
\begin{eqnarray}
\nonumber
Z(n)\equiv \prod\limits_{q\mid n\atop{q\neq p}}
\left(1-\frac{2}{q}\right)\left(1-\frac{1}{q^{3}}\right)
\left(-\frac{2n}{p}\right)B_{p-3}\pmod{p^{\alpha}}.
\end{eqnarray}
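The displayed $p^{\alpha}q^{\beta}$ congruence, as reconstructed here, can be sanity-checked numerically for small parameters; the following script (ours, not the authors') confirms it for $p=5$, $q=3$, $\alpha=\beta=1$:

```python
# Numeric sanity check of the displayed congruence (as reconstructed here),
# for p = 5, q = 3, alpha = beta = 1, i.e. n = 15. Not the authors' code.
from math import gcd
from sympy import bernoulli   # B_{p-3} as an exact rational

p, q, alpha, beta = 5, 3, 1, 1
n, mod = p**alpha * q**beta, p**alpha

# Left side: Z(n) = sum of 1/(i*j*k) over i+j+k = n with i, j, k prime to n,
# computed modulo p^alpha via modular inverses (Python 3.8+ pow).
lhs = 0
for i in range(1, n - 1):
    for j in range(1, n - i):
        k = n - i - j
        if gcd(i, n) == gcd(j, n) == gcd(k, n) == 1:
            lhs = (lhs + pow(i * j * k, -1, mod)) % mod

# Right side: 2(2-q)(1 - 1/q^3) p^{alpha-1} q^{beta-1} B_{p-3} mod p^alpha.
B = bernoulli(p - 3)                       # B_2 = 1/6 for p = 5
B_mod = B.p * pow(B.q, -1, mod) % mod      # numerator * denominator^{-1}
rhs = 2 * (2 - q) * (1 - pow(q**3, -1, mod)) * p**(alpha - 1) \
      * q**(beta - 1) * B_mod % mod

print(lhs, rhs, lhs == rhs)   # both sides equal 4 (mod 5) here
```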
Low dose CT reconstruction assisted by an image manifold prior
X-ray Computed Tomography (CT) is an important tool in medical imaging to
obtain a direct visualization of patient anatomy. However, the x-ray radiation
exposure raises concern about lifetime cancer risk. A low-dose CT scan can
reduce the radiation exposure to patients, but the image quality is usually
degraded by noise and artifacts. Numerous studies have been conducted to
regularize CT images for better image quality. Yet, exploring the underlying
manifold on which real CT images reside is still an open
problem. In this paper, we propose a fully data-driven manifold learning
approach by incorporating the emerging deep-learning technology. An
encoder-decoder convolutional neural network has been established to map a CT
image to the inherent low-dimensional manifold, as well as to restore the CT
image from its corresponding manifold representation. A novel reconstruction
algorithm assisted by the learnt manifold prior has been developed to achieve
high quality low-dose CT reconstruction. In order to demonstrate the
effectiveness of the proposed framework, network training, testing, and
comprehensive simulation study have been performed using patient abdomen CT
images. The trained encoder-decoder CNN is capable of restoring high-quality CT
images with an average error of ~20 HU. Furthermore, the proposed
manifold-prior-assisted reconstruction scheme achieves high-quality low-dose
CT reconstruction, with an average reconstruction error of <30 HU, more than
five times and two times lower than that of the filtered back projection
method and the total-variation-based iterative reconstruction method,
respectively.
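The paper's exact network is not given in the abstract; a minimal sketch of an encoder-decoder CNN of the kind described (all layer sizes and the 256x256 input are our assumptions) might be:

```python
# Minimal sketch (architecture details are assumptions, not the paper's):
# a small convolutional encoder-decoder that maps a CT image to a
# low-dimensional manifold code and restores the image from that code.
import torch
import torch.nn as nn

class ManifoldAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(          # 1x256x256 -> 32x32x32 code
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(          # code -> 1x256x256 image
            nn.ConvTranspose2d(32, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = ManifoldAutoencoder()
x = torch.randn(1, 1, 256, 256)                # stand-in for a CT slice
loss = nn.functional.mse_loss(model(x), x)     # reconstruction objective
loss.backward()
print(model(x).shape)                          # torch.Size([1, 1, 256, 256])
```

In the reconstruction scheme described, the trained decoder's manifold representation would then act as a prior term inside the iterative low-dose reconstruction.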
Discussion of Parameters Setting for A Distributed Probabilistic Modeling Algorithm
This manuscript provides additional case analyses for the parameter settings
of the distributed probabilistic modeling algorithm for the aggregated wind
power forecast error.
Experimental review of the $\Upsilon$ physics at $e^+e^-$ colliders and the LHC
The three lowest-lying $\Upsilon$ states, i.e., $\Upsilon(1S)$,
$\Upsilon(2S)$, and $\Upsilon(3S)$, composed of $b\bar{b}$ pairs and lying
below the $B\bar{B}$ threshold, provide a good platform for research on
hadronic physics and physics beyond the Standard Model. They can be produced
directly in $e^+e^-$ colliding experiments, such as CLEO, BaBar, and Belle,
with low continuum backgrounds. In these experiments, many measurements of
exclusive $\Upsilon$ decays into light hadrons, which shed light on the
"80\% rule" for the Okubo-Zweig-Iizuka suppressed decays in the bottomonium
sector, were carried out. Meanwhile, many studies of charmonium and
bottomonium production in $\Upsilon$ decays were performed to distinguish
different Quantum Chromodynamics (QCD) models. Besides, exotic states and new
physics were also extensively explored in $\Upsilon$ decays at CLEO, BaBar,
and Belle. The $\Upsilon$ states can also be produced in $pp$ collisions and
in collisions involving heavy ions. The precision measurements of their cross
sections and polarizations at the Large Hadron Collider (LHC), especially in
the CMS, ATLAS, and LHCb experiments, help to understand $\Upsilon$ production
mechanisms in hadronic collisions. The observation of sequential $\Upsilon$
suppression in heavy-ion collisions at CMS is of great importance for
verifying the quark-gluon plasma predicted by QCD. In this article, we review
the experimental results on the $\Upsilon$ states at $e^+e^-$ colliders and
the LHC, and summarize their prospects at Belle II and the LHC.
Comment: 42 pages, 40 figures; revised version, accepted by Frontiers of Physics
Electromagnetic fingerprints of the Little Bang
Measurements of thermal photons emitted from the rapidly expanding hot and
dense medium ("Little Bang") formed in ultra-relativistic heavy-ion
collisions, and their current theoretical interpretation, are reviewed.
Comment: 6 pages, 4 figures. Invited talk at Hard Probes 2013, Stellenbosch,
South Africa, Nov. 4-8, 2013. To be published in the Proceedings by Nuclear
Physics
Homeomorphic approximation of the intersection curve of two rational surfaces
We present an approach for computing the intersection curve of two rational
parametric surfaces, one of which is projectable and hence can easily be
implicitized. Plugging the parametric surface into the implicit surface
yields a plane algebraic curve. By analyzing the topology graph of this plane
curve and the singular points on the intersection curve, we associate a space
topology graph to the intersection curve; this graph is homeomorphic to the
curve and therefore leads us to an approximation of the intersection curve
within a given precision.
Comment: 18 pages, 15 figures
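As a small illustration of the substitution step (our toy example with simple surfaces, not the paper's input), plugging a parametrization into an implicitized surface with sympy yields the plane algebraic curve directly:

```python
# Minimal sketch (our illustration, not the paper's algorithm): intersecting
# a parametric surface with an implicitized one by substitution.
import sympy as sp

s, t, x, y, z = sp.symbols('s t x y z')

# Implicit surface F(x, y, z) = 0: a unit sphere (already implicitized).
F = x**2 + y**2 + z**2 - 1

# Rational parametric surface: here the plane (s, t, s + t) for simplicity.
X, Y, Z = s, t, s + t

# Plugging the parametrization into F gives a plane algebraic curve in (s, t)
# whose real points correspond to the intersection curve in space.
G = sp.expand(F.subs({x: X, y: Y, z: Z}))
print(G)                      # 2*s**2 + 2*s*t + 2*t**2 - 1
```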
Pre-equilibrium evolution effects on heavy-ion collision observables
In order to investigate the importance of pre-equilibrium dynamics on
relativistic heavy-ion collision observables, we match a highly non-equilibrium
early evolution stage, modeled by free-streaming partons generated from the
Monte Carlo Kharzeev-Levin-Nardi (MC-KLN) and Monte Carlo Glauber (MC-Glb)
models, to a locally approximately thermalized later evolution stage described
by viscous hydrodynamics, and study the dependence of final hadronic transverse
momentum distributions, in particular their underlying radial and anisotropic
flows, on the switching time between these stages. Performing a 3-parameter fit
of the measured values of the average transverse momenta $\langle p_T \rangle$
for pions, kaons, and protons as well as the elliptic and triangular flows
$v_2$ and $v_3$ of charged hadrons, with the switching time $\tau_s$, the
specific shear viscosity $\eta/s$ during the hydrodynamic stage, and the
kinetic decoupling temperature $T_{\rm dec}$ as free parameters, we find that
the preferred "thermalization" times $\tau_s$ depend strongly on the model of
the initial conditions. MC-KLN initial conditions require an earlier
transition to hydrodynamic behavior (at $\tau_s \approx 0.13$ fm/$c$),
followed by hydrodynamic evolution with a larger specific shear viscosity
$\eta/s \approx 0.2$, than MC-Glb initial conditions, which prefer switching
at a later time ($\tau_s \approx 0.6$ fm/$c$) followed by a less viscous
hydrodynamic evolution with $\eta/s \approx 0.16$. These new results
including pre-equilibrium evolution are compared to fits without a
pre-equilibrium stage in which all dynamic evolution before the onset of
hydrodynamic behavior is ignored. In each case, the quality of the dynamical
descriptions for the optimized parameter sets, as well as the observables
which show the strongest constraining power for the thermalization time, are
discussed.
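The fit itself is a standard chi-square minimization over the three parameters; the following is an entirely schematic sketch (the model function and all numbers below are placeholders standing in for actual hydrodynamic output and data):

```python
# Schematic 3-parameter chi-square fit of (tau_s, eta/s, T_dec); the real
# analysis evaluates viscous hydrodynamics, not this stand-in model.
import numpy as np
from scipy.optimize import minimize

# Placeholder "measured" observables (not real data):
# <pT> pion, <pT> kaon, <pT> proton [GeV], v2, v3 of charged hadrons.
data = np.array([0.45, 0.65, 0.95, 0.065, 0.025])
sigma = np.array([0.02, 0.03, 0.05, 0.004, 0.003])

def model(params):
    """Stand-in for the hydro model: maps (tau_s, eta/s, T_dec) to the
    five observables with made-up, smooth dependencies."""
    tau_s, eta_s, T_dec = params
    radial = 1.0 + 0.3 * tau_s - 0.002 * (T_dec - 100.0)
    damping = 1.0 / (1.0 + 5.0 * eta_s)
    return np.array([0.4 * radial, 0.6 * radial, 0.9 * radial,
                     0.09 * damping, 0.04 * damping])

def chi2(params):
    return np.sum(((model(params) - data) / sigma) ** 2)

best = minimize(chi2, x0=[0.5, 0.15, 120.0], method='Nelder-Mead')
print(best.x, best.fun)   # fitted (tau_s, eta/s, T_dec) and chi-square
```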
Cultivating Online: Question Routing in a Question and Answering Community for Agriculture
Community-based Question and Answering (CQA) platforms are nowadays
enlightening over a billion people with crowdsourced knowledge. A key design
issue in CQA platforms is how to find the potential answerers and to provide
the askers timely and suitable answers, i.e., the so-called \textit{question
routing} problem. State-of-the-art approaches often rely on extracting topics from
the question texts. In this work, we analyze the question routing problem in a
CQA system named Farm-Doctor that is exclusive for agricultural knowledge. The
major challenge is that its questions contain limited textual information.
To this end, we conduct an extensive measurement and obtain the whole
knowledge repository of Farm-Doctor that consists of over 690 thousand
questions and over 3 million answers. To remedy the text deficiency, we model
Farm-Doctor as a heterogeneous information network that incorporates rich side
information and based on network representation learning models we accurately
recommend for each question the users that are highly likely to answer it. With
an average income of less than 6 dollars a day, over 300 thousand farmers in
China seek agricultural advice online in Farm-Doctor. Our method helps these
less eloquent farmers with their cultivation and hopefully provides a way to
improve their lives.
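Farm-Doctor's model is only described at a high level here; a toy sketch of heterogeneous-network embedding for answerer ranking (the graph, node names, and hyperparameters below are all our assumptions) could look like:

```python
# Minimal sketch (our illustration, not Farm-Doctor's system): embed a tiny
# heterogeneous question-user network with random walks + skip-gram, then
# rank candidate answerers for a new question by embedding similarity.
import random
import networkx as nx
import numpy as np
from gensim.models import Word2Vec

random.seed(0)
G = nx.Graph()
# Hypothetical side information: questions link to crops and to the users
# who answered them, compensating for the short question texts.
G.add_edges_from([
    ("q1", "crop:rice"), ("q1", "user:A"),
    ("q2", "crop:rice"), ("q2", "user:B"),
    ("q3", "crop:citrus"), ("q3", "user:B"),
    ("q4", "crop:rice"),            # new question, answerer unknown
])

def random_walks(graph, num_walks=200, length=8):
    walks = []
    for _ in range(num_walks):
        for start in graph.nodes:
            walk, cur = [start], start
            for _ in range(length - 1):
                cur = random.choice(list(graph.neighbors(cur)))
                walk.append(cur)
            walks.append(walk)
    return walks

model = Word2Vec(random_walks(G), vector_size=32, window=3,
                 min_count=0, sg=1, seed=0)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Rank users for the unanswered rice question q4; user:A (who answered only
# rice questions) will typically score higher than user:B.
q = model.wv["q4"]
for user in ["user:A", "user:B"]:
    print(user, round(cosine(q, model.wv[user]), 3))
```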