Randomized Composable Core-sets for Distributed Submodular Maximization
An effective technique for solving optimization problems over massive data
sets is to partition the data into smaller pieces, solve the problem on each
piece and compute a representative solution from it, and finally obtain a
solution inside the union of the representative solutions for all pieces. This
technique can be captured via the concept of {\em composable core-sets}, and
has been recently applied to solve diversity maximization problems as well as
several clustering problems. However, for coverage and submodular maximization
problems, impossibility bounds are known for this technique \cite{IMMM14}. In
this paper, we focus on efficient construction of a randomized variant of
composable core-sets where the above idea is applied on a {\em random
clustering} of the data. We employ this technique for the coverage, monotone
and non-monotone submodular maximization problems. Our results significantly
improve upon the hardness results for non-randomized core-sets, and imply
improved results for submodular maximization in distributed and streaming
settings.
In summary, we show that a simple greedy algorithm results in a
-approximate randomized composable core-set for submodular maximization
under a cardinality constraint. This is in contrast to a known impossibility
result for (non-randomized) composable core-sets. Our result also extends to
non-monotone submodular functions, and leads to the first 2-round
MapReduce-based constant-factor approximation algorithm with total
communication complexity for either monotone or non-monotone functions.
Finally, using an improved analysis technique and a new algorithm, we present
an improved -approximation algorithm for monotone submodular maximization,
which is in turn the first MapReduce-based algorithm beating factor in a
constant number of rounds.
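The two-round structure described above can be illustrated with a small sketch, using max coverage as the submodular objective. This is a toy illustration of the randomized composable core-set idea, not the paper's implementation; all function names, the toy data, and parameter choices are my own assumptions.

```python
# Toy sketch of the randomized composable core-set pipeline for max coverage
# under a cardinality constraint k. Names and data are illustrative, not the
# paper's implementation.
import random

def greedy_max_coverage(sets, k):
    """Standard greedy: repeatedly take the set with largest marginal coverage."""
    chosen, covered = [], set()
    pool = list(sets)
    for _ in range(k):
        best = max(pool, key=lambda s: len(s - covered), default=None)
        if best is None or not (best - covered):
            break
        chosen.append(best)
        covered |= best
        pool.remove(best)
    return chosen

def randomized_coreset_coverage(sets, k, machines=4, seed=0):
    rng = random.Random(seed)
    # Round 1: randomly partition the input sets across machines; each
    # machine's local greedy output serves as its core-set.
    parts = [[] for _ in range(machines)]
    for s in sets:
        parts[rng.randrange(machines)].append(s)
    coreset = [s for p in parts if p for s in greedy_max_coverage(p, k)]
    # Round 2: run greedy once more on the union of the core-sets.
    return greedy_max_coverage(coreset, k)

sets = [frozenset({1, 2, 3}), frozenset({3, 4}), frozenset({5}),
        frozenset({1, 2}), frozenset({4, 5, 6})]
solution = randomized_coreset_coverage(sets, 2, machines=2)
```

Per the abstract, it is the random partition in round 1 (rather than any clever local algorithm) that circumvents the impossibility result for non-randomized core-sets.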
Effective diffusion constant in a two dimensional medium of charged point scatterers
We obtain exact results for the effective diffusion constant of a two
dimensional Langevin tracer particle in the force field generated by charged
point scatterers with quenched positions. We show that if the point scatterers
have a screened Coulomb (Yukawa) potential and are uniformly and independently
distributed then the effective diffusion constant obeys the
Vogel-Fulcher-Tammann law where it vanishes. Exact results are also obtained
for pure Coulomb scatterers frozen in an equilibrium configuration of the same
temperature as that of the tracer.

Comment: 9 pages, IOP LaTeX, no figures
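As a numerical illustration of the setup only (the paper's results are exact and analytical), an overdamped 2D Langevin tracer in the quenched force field of uniformly placed Yukawa scatterers can be simulated as below; all parameter names and values are assumptions for the toy.

```python
# Illustrative sketch: Euler-Maruyama integration of an overdamped 2D Langevin
# tracer among quenched Yukawa (screened Coulomb) point scatterers.
# All parameters are illustrative assumptions, not the paper's.
import math
import random

def yukawa_force(x, y, scatterers, q=1.0, kappa=1.0):
    """Force on the tracer from V(r) = q * exp(-kappa*r) / r, summed over scatterers."""
    fx = fy = 0.0
    for sx, sy in scatterers:
        dx, dy = x - sx, y - sy
        r = math.hypot(dx, dy)
        if r < 1e-9:
            continue  # skip a pathological overlap
        # -dV/dr = q * exp(-kappa*r) * (kappa*r + 1) / r^2, directed along r_hat
        mag = q * math.exp(-kappa * r) * (kappa * r + 1.0) / r ** 2
        fx += mag * dx / r
        fy += mag * dy / r
    return fx, fy

def simulate_msd(n_scatter=50, box=10.0, temperature=1.0, dt=1e-3,
                 steps=2000, seed=1):
    """Returns the squared displacement of one tracer after the given steps."""
    rng = random.Random(seed)
    scatterers = [(rng.uniform(0, box), rng.uniform(0, box))
                  for _ in range(n_scatter)]           # quenched positions
    x = y = box / 2
    x0, y0 = x, y
    noise = math.sqrt(2.0 * temperature * dt)          # unit mobility assumed
    for _ in range(steps):
        fx, fy = yukawa_force(x, y, scatterers)
        x += fx * dt + noise * rng.gauss(0.0, 1.0)
        y += fy * dt + noise * rng.gauss(0.0, 1.0)
    return (x - x0) ** 2 + (y - y0) ** 2
```

Averaging the squared displacement over many tracers and disorder realizations and dividing by 4 times the elapsed time gives a crude numerical estimate of the effective diffusion constant.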
The effects of recent mortgage refinancing
Rising home prices and generally falling interest rates in recent years, together with a desire to convert the accumulated equity in their homes into spendable funds, have prompted many homeowners to refinance their mortgages. In the spring of 1999, the Federal Reserve surveyed consumers to determine the extent of refinancing, the extent to which refinancing homeowners "cashed out" some of their equity when they refinanced, how much equity they took out, and how they spent the funds. Survey results suggest that cash-out refinancings in 1998 and early 1999 likely boosted consumption spending a bit, may have had a larger effect on home improvement spending, and may have moderated the growth of consumer credit during that period.

Keywords: Mortgages; Housing - Finance; Interest rates
Inequalities for low-energy symmetric nuclear matter
Using effective field theory we prove inequalities for the correlations of
two-nucleon operators in low-energy symmetric nuclear matter. For physical
values of operator coefficients in the effective Lagrangian, the S = 1, I = 0
channel correlations must have the lowest energy and longest correlation length
in the two-nucleon sector. This result is valid at nonzero density and
temperature.

Comment: 9 pages
Continuum Derrida Approach to Drift and Diffusivity in Random Media
By means of rather general arguments, based on an approach due to Derrida
that makes use of samples of finite size, we analyse the effective diffusivity
and drift tensors in certain types of random medium in which the motion of the
particles is controlled by molecular diffusion and a local flow field with
known statistical properties. The power of the Derrida method is that it uses
the equilibrium probability distribution, that exists for each {\em finite}
sample, to compute asymptotic behaviour at large times in the {\em infinite}
medium. In certain cases, where this equilibrium situation is associated with a
vanishing microcurrent, our results demonstrate the equality of the
renormalization processes for the effective drift and diffusivity tensors. This
establishes, for those cases, a Ward identity previously verified only to
two-loop order in perturbation theory in certain models. The technique can be
applied also to media in which the diffusivity exhibits spatial fluctuations.
We derive a simple relationship between the effective diffusivity in this case
and that for an associated gradient drift problem that provides an interesting
constraint on previously conjectured results.

Comment: 18 pages, LaTeX, DAMTP-96-8
Distance traveled by random walkers before absorption in a random medium
We consider the penetration length of random walkers diffusing in a
medium of perfect or imperfect absorbers of number density . We solve
this problem on a lattice and in the continuum in all dimensions D, by means
of a mean-field renormalization group. For a homogeneous system in , we
find that , where is the absorber density
correlation length. The cases of D=1 and D=2 are also treated. In the presence
of long-range correlations, we estimate the temporal decay of the density of
random walkers not yet absorbed. These results are illustrated by exactly
solvable toy models, and extensive numerical simulations on directed
percolation, where the absorbers are the active sites. Finally, we discuss the
implications of our results for diffusion limited aggregation (DLA), and we
propose a more effective method to measure in DLA clusters.

Comment: Final version; also considers the case of imperfect absorbers
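The lattice version of this setup is straightforward to simulate directly. The following toy sketch (1D, perfect absorbers, a single quenched disorder configuration; all names and parameter values are my own assumptions) estimates the mean distance traveled before absorption:

```python
# Toy 1D lattice sketch: perfect absorbers placed independently with a given
# density (quenched disorder); walkers start from an absorber-free center site
# and we average the distance reached when they are absorbed.
import random

def mean_absorption_distance(density=0.05, size=2000, walkers=200,
                             max_steps=200000, seed=2):
    rng = random.Random(seed)
    # One fixed (quenched) configuration of perfect absorbers on [0, 2*size].
    absorber = [rng.random() < density for _ in range(2 * size + 1)]
    absorber[size] = False                 # walkers start at the center
    distances = []
    for _ in range(walkers):
        pos = size
        for _ in range(max_steps):
            pos += rng.choice((-1, 1))     # unbiased nearest-neighbor step
            if pos <= 0 or pos >= 2 * size or absorber[pos]:
                break                      # absorbed (or left the box)
        distances.append(abs(pos - size))
    return sum(distances) / len(distances)
```

Sweeping the absorber density and repeating over disorder configurations would expose how the mean penetration depth scales with density, which is the kind of relation the abstract derives analytically.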
Ab-initio computation of neutron-rich oxygen isotopes
We compute the binding energies of neutron-rich oxygen isotopes, employing the
coupled-cluster method and chiral nucleon-nucleon interactions at
next-to-next-to-next-to-leading order with two different cutoffs. We obtain
rather well-converged results in model spaces consisting of up to 21 oscillator
shells. For interactions with a momentum cutoff of 500 MeV, we find that 28O is
stable with respect to 24O, while calculations with a momentum cutoff of 600
MeV result in a slightly unbound 28O. The theoretical error estimates due to
the omission of the three-nucleon forces and the truncation of excitations
beyond three-particle-three-hole clusters indicate that the stability of 28O
cannot be ruled out from ab-initio calculations, and that three-nucleon forces
and continuum effects play the dominant role in deciding this question.

Comment: 5 pages + eps, 3 figures
Determinants of Corporate Performance (CP) in Public Health Service Organizations (PHSO) in Eastern Province of Sri Lanka: A Use of Balanced Score Card (BSC)
Corporate performance in a public health service organization reflects how the organization looks at its patients, key disease-treatment service lines, learning & growth, and resources. Many authors have therefore used the BSC to assess organisational performance. This study aims to determine the factors affecting the performance of PHSOs, to assess the reliability and validity of the items and factors, and to build a mathematical equation model. Data were collected from both secondary and primary sources. The researcher collected 54 secondary records on corporate performance in public health service organisations covering the period 1996 to 2012. Primary data were collected using a questionnaire. Since this is a pilot study, the researcher selected only 100 hospital employees from 3 selected government hospitals in Addalaichenai Divisional Secretariat of Ampara District. The collected questionnaires were analysed by factor analysis and regression analysis. The results identified the patient, key service line, learning & growth, and resource factors as determinants of the performance of public health service organizations. Cronbach's alpha values for the items in these factors are 0.888, 0.807, 0.651 and 0.857, indicating high item reliability. KMO is used to assess the statistical validity of the factors; in this study, the KMO values for patient, key service line, learning & growth and resource are 0.687, 0.502, 0.559 and 0.818. Content validity and convergent validity are high, while discriminant validity is statistically low. The log-log model fits better than the linear models.

Keywords: Corporate performance, Public Health Service Organizations, Eastern Province, Sri Lanka, Balanced Score Card (BSC)
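For reference, Cronbach's alpha, the item-reliability statistic the study reports per factor, is computed from the item variances and the variance of the respondents' total scores. The sketch below is a minimal textbook implementation with illustrative data, not the study's code or data.

```python
# Minimal sketch of Cronbach's alpha: k/(k-1) * (1 - sum(item variances) /
# variance of total scores). Input data here would be questionnaire scores;
# everything in this example is illustrative.
def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))
```

Alpha approaches 1 when the items move together across respondents; values of roughly 0.7 and above are conventionally read as acceptable reliability, consistent with the study's reported range of 0.651 to 0.888.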
Simulation Approach to Assess the Precision of Estimates Derived from Linking Survey and Administrative Records
Probabilistic record linkage implies that there is some level of uncertainty related to the classification of pairs as links or non-links vis-à-vis their true match status. As record linkage is usually performed as a preliminary step to developing statistical estimates, the question then is how this linkage uncertainty propagates to those estimates. In this paper, we develop a re-sampling approach to estimate the impact of linkage uncertainty on derived estimates. In each iteration of the re-sampling, pairs are classified as links or non-links by Monte Carlo assignment according to model-estimated true-match probabilities. By looking at the range of estimates produced across a series of re-samples, we can estimate the distribution of derived statistics under the prevailing level of linkage uncertainty. For this analysis we use the results of linking the 2014 National Hospital Care Survey to the National Death Index, performed at the National Center for Health Statistics, and we assess the precision of hospital-level death rate estimates.
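The re-sampling idea can be sketched in a few lines: each candidate pair carries a model-estimated true-match probability, every replicate re-classifies the pairs as links by Monte Carlo draws, and the derived statistic is recomputed per replicate. The pairs, probabilities, and the toy death-rate statistic below are fabricated for illustration and are not NCHS data.

```python
# Sketch of Monte Carlo re-sampling of link classifications to propagate
# linkage uncertainty into a derived statistic. All data are fabricated.
import random

def resample_statistic(pairs, statistic, replicates=1000, seed=3):
    """pairs: list of (record, match_probability); returns replicate statistics."""
    rng = random.Random(seed)
    results = []
    for _ in range(replicates):
        # Classify each pair as a link with its estimated match probability.
        links = [rec for rec, p in pairs if rng.random() < p]
        results.append(statistic(links))
    return results

# Illustrative derived statistic: a death rate over 100 hospital records.
pairs = [("patient_%d" % i, 0.2 + 0.6 * (i % 2)) for i in range(100)]
rates = resample_statistic(pairs, lambda links: len(links) / 100.0)
lo, hi = sorted(rates)[25], sorted(rates)[975]   # crude 95% interval
```

The spread between the low and high quantiles of the replicate distribution is what quantifies the precision loss attributable to linkage uncertainty.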
Trace checking of Metric Temporal Logic with Aggregating Modalities using MapReduce
Modern complex software systems produce a large amount of execution data,
often stored in logs. These logs can be analyzed using trace checking
techniques to check whether the system complies with its requirements
specifications. Often these specifications express quantitative properties of
the system, which include timing constraints as well as higher-level
constraints on the occurrences of significant events, expressed using aggregate
operators. In this paper we present an algorithm that exploits the MapReduce
programming model to check specifications expressed in a metric temporal logic
with aggregating modalities, over large execution traces. The algorithm
exploits the structure of the formula to parallelize the evaluation, with a
significant gain in time. We report on the assessment of the implementation -
based on the Hadoop framework - of the proposed algorithm and comment on its
scalability.

Comment: 16 pages, 6 figures. Extended version of the SEFM 2014 paper
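The map/reduce decomposition the abstract describes can be sketched in miniature: the trace is split into chunks, a mapper evaluates a subcomputation on each chunk in isolation, and a reducer combines the partial results. The property checked below, an aggregate bound of at most max_count occurrences of an event over the whole trace, is an illustrative stand-in, far simpler than full metric temporal logic with aggregating modalities.

```python
# Toy map/reduce-style trace check: count occurrences of an event per chunk
# (map), sum the partial counts (reduce), compare against an aggregate bound.
# A trace is a list of sets of event names; all names are illustrative.
from functools import reduce

def count_in_chunk(chunk, event):
    """Mapper: count occurrences of the event inside one trace chunk."""
    return sum(1 for step in chunk if event in step)

def check_aggregate(trace, chunks=4, event="e", max_count=3):
    size = max(1, len(trace) // chunks)
    parts = [trace[i:i + size] for i in range(0, len(trace), size)]
    partial = (count_in_chunk(p, event) for p in parts)   # map phase
    total = reduce(lambda a, b: a + b, partial, 0)        # reduce phase
    return total <= max_count
```

In a real deployment the map phase would run on separate Hadoop workers over trace shards, which is where the parallel speedup reported by the paper comes from; here the phases run sequentially for clarity.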