Viscoelastic behaviour and fracture toughness of linear-low-density polyethylene reinforced with synthetic boehmite alumina nanoparticles
Distributed Minimum Cut Approximation
We study the problem of computing approximate minimum edge cuts by
distributed algorithms. We use a standard synchronous message passing model
where in each round, $O(\log n)$ bits can be transmitted over each edge (a.k.a.
the CONGEST model). We present a distributed algorithm that, for any weighted
graph and any $\epsilon \in (0, 1)$, with high probability finds a cut of size
at most $O(\epsilon^{-1}\lambda)$ in $O(D) + \tilde{O}(n^{1/2+\epsilon})$
rounds, where $\lambda$ is the size of the minimum cut and $D$ is the network
diameter. This algorithm is based on a simple approach for analyzing random
edge sampling, which we call the random layering technique. In addition, we
also present another distributed algorithm, which is based on a centralized
algorithm due to Matula [SODA '93], that with high probability computes a cut
of size at most $(2+\epsilon)\lambda$ in $\tilde{O}((D+\sqrt{n})/\epsilon^{5})$
rounds for any $\epsilon > 0$.
The time complexities of both of these algorithms almost match the
$\tilde{\Omega}(D+\sqrt{n})$ lower bound of Das Sarma et al. [STOC '11], thus
leading to an answer to an open question raised by Elkin [SIGACT-News '04] and
Das Sarma et al. [STOC '11].
Furthermore, we also strengthen the lower bound of Das Sarma et al. by
extending it to unweighted graphs. We show that the same lower bound also holds
for unweighted multigraphs (or equivalently for weighted graphs in which
$O(w \log n)$ bits can be transmitted in each round over an edge of weight $w$),
even if the diameter is $D = O(\log n)$. For unweighted simple graphs, we show
that even for networks of diameter $\tilde{O}(\sqrt{n/(\alpha\lambda)})$,
finding an $\alpha$-approximate minimum cut in networks of edge connectivity
$\lambda$ or computing an $\alpha$-approximation of the edge connectivity
requires $\tilde{\Omega}(D + \sqrt{n/(\alpha\lambda)})$ rounds
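The random layering idea lends itself to a compact illustration. The sketch below is a centralized, unweighted toy version reflecting our reading of the abstract, not the authors' distributed implementation: every edge is assigned a uniformly random layer, layers are revealed one at a time, and between reveals the boundary of every connected component is a candidate cut. The function name and the choice of num_layers are ours.

```python
import random
from collections import defaultdict

def random_layering_cut(n, edges, num_layers):
    """Centralized toy version of random layering. Assign each edge a
    uniform random layer; reveal layers one by one, merging the
    components they connect. Between reveals, each component's boundary
    (edges with exactly one endpoint inside) is a candidate cut.
    Returns the lightest candidate, an upper bound on the min cut."""
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    layers = defaultdict(list)
    for u, v in edges:
        layers[random.randrange(num_layers)].append((u, v))

    best = float("inf")
    for i in range(num_layers):
        # score the current components' boundaries
        boundary = defaultdict(int)
        for u, v in edges:
            ru, rv = find(u), find(v)
            if ru != rv:
                boundary[ru] += 1
                boundary[rv] += 1
        if boundary:
            best = min(best, min(boundary.values()))
        # reveal layer i: merge the components it connects
        for u, v in layers[i]:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
    return best
```

The first pass, before any merges, merely reports the minimum degree; per the abstract, the point of the layering analysis is that with a suitable number of layers some intermediate component whose boundary is $O(\epsilon^{-1}\lambda)$ appears with high probability.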
Globally Optimal Crowdsourcing Quality Management
We study crowdsourcing quality management, that is, given worker responses to
a set of tasks, our goal is to jointly estimate the true answers for the tasks,
as well as the quality of the workers. Prior work on this problem relies
primarily on applying Expectation-Maximization (EM) to the underlying maximum
likelihood problem to estimate true answers as well as worker quality.
Unfortunately, EM only provides a locally optimal solution rather than a
globally optimal one. Other solutions to the problem (that do not leverage EM)
fail to provide global optimality guarantees as well. In this paper, we focus
on filtering, where tasks require the evaluation of a yes/no predicate, and
rating, where tasks elicit integer scores from a finite domain. We design
algorithms for finding the globally optimal estimates of correct task answers and
worker quality for the underlying maximum likelihood problem, and characterize
the complexity of these algorithms. Our algorithms conceptually consider all
mappings from tasks to true answers (typically a very large number), leveraging
two key ideas to reduce, by several orders of magnitude, the number of mappings
under consideration, while preserving optimality. We also demonstrate that
these algorithms often find more accurate estimates than EM-based algorithms.
This paper makes an important contribution towards understanding the inherent
complexity of globally optimal crowdsourcing quality management
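As a point of reference for the EM baseline the abstract critiques, here is a minimal sketch of EM for the filtering setting under a deliberately simplified one-coin worker model (a single error rate per worker and a uniform answer prior; both are our assumptions for illustration, not the paper's exact model). EM alternates between inferring task answers from worker quality and re-estimating worker quality from the inferred answers, which is why it can stall in a local optimum.

```python
import numpy as np

def em_filtering(responses, n_tasks, n_workers, iters=50):
    """responses: list of (task, worker, label) with label in {0, 1}.
    Jointly estimates P(true answer = 1) per task and an error rate per
    worker by EM on a one-coin model (an assumption; the paper's
    algorithms instead search the space of answer mappings globally)."""
    p1 = np.full(n_tasks, 0.5)      # posterior that each task's answer is 1
    err = np.full(n_workers, 0.25)  # per-worker error probability
    for _ in range(iters):
        # E-step: recompute task posteriors from current error rates
        log_odds = np.zeros(n_tasks)
        for t, w, y in responses:
            agree = np.log(1 - err[w]) - np.log(err[w])
            log_odds[t] += agree if y == 1 else -agree
        p1 = 1.0 / (1.0 + np.exp(-np.clip(log_odds, -30, 30)))
        # M-step: re-estimate each worker's error rate
        num = np.zeros(n_workers)
        den = np.zeros(n_workers)
        for t, w, y in responses:
            num[w] += (1 - p1[t]) if y == 1 else p1[t]  # expected mistakes
            den[w] += 1.0
        err = np.clip(num / np.maximum(den, 1.0), 1e-3, 0.5 - 1e-3)
    return p1, err
```

Because each iteration only improves the likelihood locally, the fixed point depends on the initialization; the abstract's contribution is avoiding this by (conceptually) ranging over all task-to-answer mappings.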
On $r$-Simple $k$-Path
An $r$-simple $k$-path is a path in the graph of length $k$ that passes
through each vertex at most $r$ times. The $r$-SIMPLE $k$-PATH problem, given a
graph $G$ as input, asks whether there exists an $r$-simple $k$-path in $G$. We
first show that this problem is NP-Complete. We then show that there is a graph
$G$ that contains an $r$-simple $k$-path and no simple path of length greater
than $4\log k/\log r$. So this, in a sense, motivates this problem, especially
when one's goal is to find a short path that visits many vertices in the graph
while bounding the number of visits at each vertex.
We then give a randomized algorithm that runs in time
$\mathrm{poly}(n)\cdot 2^{O(k\cdot\log r/r)}$ that solves the $r$-SIMPLE
$k$-PATH problem on a graph with $n$ vertices with one-sided error. We also
show that a randomized algorithm with running time
$\mathrm{poly}(n)\cdot 2^{ck/r}$ with $c<1$ gives a randomized algorithm with
running time $\mathrm{poly}(n)\cdot 2^{cn}$ for the Hamiltonian path problem in
a directed graph, an outstanding open problem. So in a sense our algorithm is
optimal up to an $O(\log r)$ factor
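To make the definition concrete, here is a brute-force checker. It runs in exponential time and only illustrates the object being sought; it is not the paper's randomized $\mathrm{poly}(n)\cdot 2^{O(k\log r/r)}$ algorithm. We count a $k$-path here as a walk with $k$ edges; the paper's length convention may differ.

```python
def has_r_simple_k_path(adj, r, k):
    """Exhaustive search for an r-simple k-path in a directed graph
    given as adjacency lists: a walk with k edges in which no vertex is
    visited more than r times. Exponential-time illustration only."""
    n = len(adj)
    counts = [0] * n

    def extend(v, edges_left):
        if edges_left == 0:
            return True
        for u in adj[v]:
            if counts[u] < r:
                counts[u] += 1
                if extend(u, edges_left - 1):
                    return True
                counts[u] -= 1
        return False

    for s in range(n):
        counts[s] = 1
        found = extend(s, k)
        counts[s] = 0
        if found:
            return True
    return False

# Example: a directed triangle contains a 2-simple 4-path
# (0 -> 1 -> 2 -> 0 -> 1 visits vertices 0 and 1 twice, 2 once),
# but no 1-simple (i.e., simple) path with 3 edges.
adj = [[1], [2], [0]]
assert has_r_simple_k_path(adj, r=2, k=4)
assert not has_r_simple_k_path(adj, r=1, k=3)
```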
Almost-Tight Distributed Minimum Cut Algorithms
We study the problem of computing the minimum cut in weighted distributed
message-passing networks (the CONGEST model). Let $\lambda$ be the minimum cut,
$n$ be the number of nodes in the network, and $D$ be the network diameter. Our
algorithm can compute $\lambda$ exactly in
$O((\sqrt{n}\log^{*}n + D)\lambda^{4}\log^{2}n)$ time. To the best of our
knowledge, this is the first paper that explicitly studies computing the exact
minimum cut in the distributed setting. Previously, non-trivial sublinear time
algorithms for this problem were known only for unweighted graphs when
$\lambda \leq 3$, due to Pritchard and Thurimella's $O(D)$-time and
$\tilde{O}(D+\sqrt{n})$-time algorithms for computing $2$-edge-connected and
$3$-edge-connected components.
By using Karger's edge sampling technique, we can convert this algorithm into
a $(1+\epsilon)$-approximation $\tilde{O}((\sqrt{n}+D)\epsilon^{-5})$-time
algorithm for any $\epsilon > 0$. This improves over the previous
$(2+\epsilon)$-approximation $\tilde{O}((\sqrt{n}+D)\epsilon^{-5})$-time
algorithm and $O(\epsilon^{-1})$-approximation
$O(D)+\tilde{O}(n^{1/2+\epsilon})$-time algorithm of Ghaffari and Kuhn. Due to
the lower bound of $\tilde{\Omega}(D+\sqrt{n})$ by Das Sarma et al., which
holds for any approximation algorithm, this running time is tight up to a
$\mathrm{polylog}(n)$ factor.
To get the stated running time, we developed an approximation algorithm which
combines the ideas of Thorup's tree packing theorem and Matula's contraction
algorithm. It saves a $\Theta(\log n)$ factor as compared to applying Thorup's
tree packing theorem directly. Then, we combine Kutten and Peleg's tree
partitioning algorithm and Karger's dynamic programming to achieve an efficient
distributed algorithm that finds the minimum cut when we are given a spanning
tree that crosses the minimum cut exactly once
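The last step has a simple centralized analogue. If a spanning tree crosses the minimum cut exactly once, the cut coincides with the vertex bipartition obtained by deleting that single tree edge, so it suffices to score all $n-1$ tree edges and keep the best. The naive $O(nm)$ sketch below illustrates that reduction (the function name is ours); the distributed challenge the paper addresses is doing this scoring with Kutten and Peleg's partitioning and Karger's dynamic programming instead of a quadratic scan.

```python
from collections import defaultdict

def min_cut_crossing_tree_once(edges, tree_edges):
    """edges: (u, v, w) triples for the whole graph; tree_edges: the
    subset forming a spanning tree, assumed to cross the minimum cut
    exactly once. Deleting each tree edge splits the tree into two
    sides; the weight of the induced cut is a candidate. Naive O(n*m)."""
    tree_adj = defaultdict(list)
    for u, v, _ in tree_edges:
        tree_adj[u].append(v)
        tree_adj[v].append(u)
    best = float("inf")
    for a, b, _ in tree_edges:
        side = {a}          # a's side once tree edge (a, b) is deleted
        stack = [a]
        while stack:
            x = stack.pop()
            for y in tree_adj[x]:
                if y not in side and {x, y} != {a, b}:
                    side.add(y)
                    stack.append(y)
        # weight of all graph edges with exactly one endpoint on a's side
        cut = sum(w for u, v, w in edges if (u in side) != (v in side))
        best = min(best, cut)
    return best
```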
Quantum effect induced reverse kinetic molecular sieving in microporous materials
We report kinetic molecular sieving of hydrogen and deuterium in zeolite rho at low temperatures, using atomistic molecular dynamics simulations incorporating quantum effects via the Feynman-Hibbs approach. We find that diffusivities of confined molecules decrease when quantum effects are considered, in contrast with bulk fluids which show an increase. Indeed, at low temperatures, a reverse kinetic sieving effect is demonstrated in which the heavier isotope, deuterium, diffuses faster than hydrogen. At 65 K, the flux selectivity is as high as 46, indicating a good potential for isotope separation
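The Feynman-Hibbs approach replaces the classical pair potential $U(r)$ with a temperature- and mass-dependent effective potential, to lowest order $U_{FH}(r) = U(r) + \frac{\hbar^{2}}{24\mu k_{B}T}\left(U''(r) + \frac{2}{r}U'(r)\right)$, where $\mu$ is the reduced mass of the interacting pair. Because the correction scales as $1/\mu$, the lighter isotope sees a more swollen repulsive core, which is the mechanism behind the reverse sieving described above. The sketch below evaluates this for a Lennard-Jones model of H2 and D2 at 65 K; the LJ parameters are a common literature choice for hydrogen, not necessarily those used in the paper.

```python
import numpy as np

kB = 1.380649e-23        # J/K
hbar = 1.054571817e-34   # J*s
amu = 1.66053906660e-27  # kg

# Lennard-Jones parameters for H2 (a common literature choice; an assumption)
eps = 36.7 * kB          # well depth, J
sigma = 2.958e-10        # m

def lj(r):
    s6 = (sigma / r) ** 6
    return 4 * eps * (s6 ** 2 - s6)

def lj_d1(r):  # first derivative U'(r)
    s6 = (sigma / r) ** 6
    return 4 * eps * (-12 * s6 ** 2 + 6 * s6) / r

def lj_d2(r):  # second derivative U''(r)
    s6 = (sigma / r) ** 6
    return 4 * eps * (156 * s6 ** 2 - 42 * s6) / r ** 2

def feynman_hibbs(r, T, m_pair):
    """Quadratic Feynman-Hibbs effective pair potential:
    U + hbar^2 / (24 mu kB T) * (U'' + 2 U'/r), mu = reduced mass."""
    mu = m_pair / 2.0
    return lj(r) + hbar ** 2 / (24 * mu * kB * T) * (lj_d2(r) + 2 * lj_d1(r) / r)

T = 65.0
r = np.linspace(2.5e-10, 6.0e-10, 200)
u_h2 = feynman_hibbs(r, T, 2 * amu)  # hydrogen
u_d2 = feynman_hibbs(r, T, 4 * amu)  # deuterium
# The 1/mu factor makes the repulsive quantum correction larger for H2,
# so H2 behaves as the effectively "bigger" molecule in a narrow pore,
# consistent with D2 diffusing faster, as reported in the abstract.
```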
Tunable Electron Multibunch Production in Plasma Wakefield Accelerators
Synchronized, independently tunable and focused J-class laser pulses are
used to release multiple electron populations via photo-ionization inside an
electron-beam driven plasma wave. By varying the laser foci in the laboratory
frame and the position of the underdense photocathodes in the co-moving frame,
the delays between the produced bunches and their energies are adjusted. The
resulting multibunches have ultra-high quality and brightness, allowing for
hitherto impossible bunch configurations such as spatially overlapping bunch
populations with strictly separated energies, which opens up a new regime for
light sources such as free-electron lasers
The future of social is personal: the potential of the personal data store
This chapter argues that technical architectures that facilitate the longitudinal, decentralised and individual-centric personal collection and curation of data will be an important, but partial, response to the pressing problem of the autonomy of the data subject, and the asymmetry of power between the subject and large scale service providers/data consumers. Towards framing the scope and role of such Personal Data Stores (PDSes), the legalistic notion of personal data is examined, and it is argued that a more inclusive, intuitive notion expresses more accurately what individuals require in order to preserve their autonomy in a data-driven world of large aggregators. Six challenges towards realising the PDS vision are set out: the requirement to store data for long periods; the difficulties of managing data for individuals; the need to reconsider the regulatory basis for third-party access to data; the need to comply with international data handling standards; the need to integrate privacy-enhancing technologies; and the need to future-proof data gathering against the evolution of social norms. The open experimental PDS platform INDX is introduced and described, as a means of beginning to address at least some of these six challenges
Biogenic Nitrogen Gas Production at the Oxic–Anoxic Interface in the Cariaco Basin, Venezuela
Excess nitrogen gas (N2xs) was measured in samples collected at six locations in the eastern and western sub-basins of the Cariaco Basin, Venezuela, in September 2008 (non-upwelling conditions) and March 2009 (upwelling conditions). During both sampling periods, N2xs concentrations were below detection in surface waters, increasing to ~22 μmol N kg⁻¹ at the oxic–anoxic interface ([O2] < ~4 μmol kg⁻¹, ~250 m). Below the oxic–anoxic interface (300–400 m), the average concentration of N2xs was 24.7 ± 1.9 μmol N kg⁻¹ in September 2008 and 27.5 ± 2.0 μmol N kg⁻¹ in March 2009, i.e., N2xs concentrations within this depth interval were ~3 μmol N kg⁻¹ higher (p < 0.001) during the upwelling season compared to the non-upwelling period. These results suggest that N-loss in the Cariaco Basin may vary seasonally in response to changes in the flux of sinking particulate organic matter. We attribute the increase in N2xs concentrations, or N-loss, observed during upwelling to: (1) higher availability of fixed nitrogen derived from suspended and sinking particles at the oxic–anoxic interface and/or (2) enhanced ventilation at the oxic–anoxic interface during upwelling
Variability of Surface Pigment Concentrations in the South Atlantic Bight
A 1‐year time sequence (November 1978 through October 1979) of surface pigment images from the South Atlantic Bight (SAB) was derived from the Nimbus 7 coastal zone color scanner. This data set is augmented with in situ observations of hydrographic parameters, freshwater discharge, sea level, coastal winds, and currents for the purpose of examining the coupling between physical processes and the spatial and temporal variability of the surface pigment fields. The SAB is divided into three regions: the east Florida shelf, the Georgia‐South Carolina shelf and the Carolina Capes. Six‐month seasonal mean pigment fields and time series of mean values within subregions were generated. While the seasonal mean isopleths were closely oriented along isobaths, significant differences between seasons in each region were found to exist. These differences are explained by correlating the pigment time series with physical parameters and processes known to be important in the SAB. Specifically, summertime concentrations between Cape Romain and Cape Canaveral were greater than those in winter, but the opposite was true north of Cape Romain. It is suggested that during the abnormally high freshwater discharge in the winter‐spring of 1979, Cape Romain and Cape Fear were the major sites of cross‐shelf transport, while the cross‐shelf exchange during the fall of 1979 occurred just north of Cape Canaveral. Finally, the alongshore band of high pigment concentrations increased in width throughout the year in the vicinity of Charleston, but near Jacksonville it exhibited a minimum width in the summer and a maximum width in the fall of 1979