A New View on Worst-Case to Average-Case Reductions for NP Problems
We study the result by Bogdanov and Trevisan (FOCS, 2003), who show that
under reasonable assumptions, there is no non-adaptive worst-case to
average-case reduction that bases the average-case hardness of an NP-problem on
the worst-case complexity of an NP-complete problem. We replace the hiding and
the heavy samples protocol in [BT03] by employing the histogram verification
protocol of Haitner, Mahmoody and Xiao (CCC, 2010), which proves to be very
useful in this context. Once the histogram is verified, our hiding protocol is
directly public-coin, whereas the intuition behind the original protocol
inherently relies on private coins.
On the Complexity and Performance of Parsing with Derivatives
Current algorithms for context-free parsing inflict a trade-off between ease
of understanding, ease of implementation, theoretical complexity, and practical
performance. No algorithm achieves all of these properties simultaneously.
Might et al. (2011) introduced parsing with derivatives, which handles
arbitrary context-free grammars while being both easy to understand and simple
to implement. Despite much initial enthusiasm and a multitude of independent
implementations, its worst-case complexity has never been proven to be better
than exponential. In fact, high-level arguments claiming it is fundamentally
exponential have been advanced and even accepted as part of the folklore.
Performance ended up being sluggish in practice, and this sluggishness was
taken as informal evidence of exponentiality.
In this paper, we reexamine the performance of parsing with derivatives. We
have discovered that it is not exponential but, in fact, cubic. Moreover,
simple (though perhaps not obvious) modifications to the implementation by
Might et al. (2011) lead to an implementation that is not only easy to
understand but also highly performant in practice.
Comment: 13 pages; 12 figures; implementation at http://bitbucket.org/ucombinator/parsing-with-derivatives/ ; published in PLDI '16, Proceedings of the 37th ACM SIGPLAN Conference on Programming Language Design and Implementation, June 13-17, 2016, Santa Barbara, CA, US
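The core idea behind parsing with derivatives is easiest to see for regular expressions, where Brzozowski's derivative construction applies directly. The sketch below is an illustrative Python implementation of that regular-expression case only; the paper's contribution extends the idea to arbitrary context-free grammars (via laziness, memoization, and fixed points), which this sketch does not attempt. All names here are ours, not taken from the authors' implementation.

```python
from dataclasses import dataclass

class Re: pass

@dataclass(frozen=True)
class Empty(Re): pass          # the empty language (matches nothing)

@dataclass(frozen=True)
class Eps(Re): pass            # the language containing only ""

@dataclass(frozen=True)
class Char(Re):
    c: str                     # a single literal character

@dataclass(frozen=True)
class Alt(Re):
    l: Re
    r: Re                      # union of two languages

@dataclass(frozen=True)
class Seq(Re):
    l: Re
    r: Re                      # concatenation

@dataclass(frozen=True)
class Star(Re):
    r: Re                      # Kleene star

def nullable(r: Re) -> bool:
    """Does the language of r contain the empty string?"""
    if isinstance(r, (Empty, Char)): return False
    if isinstance(r, (Eps, Star)): return True
    if isinstance(r, Alt): return nullable(r.l) or nullable(r.r)
    if isinstance(r, Seq): return nullable(r.l) and nullable(r.r)
    raise TypeError(r)

def deriv(r: Re, c: str) -> Re:
    """Brzozowski derivative: the suffixes of words in r that begin with c."""
    if isinstance(r, (Empty, Eps)): return Empty()
    if isinstance(r, Char): return Eps() if r.c == c else Empty()
    if isinstance(r, Alt): return Alt(deriv(r.l, c), deriv(r.r, c))
    if isinstance(r, Star): return Seq(deriv(r.r, c), r)
    if isinstance(r, Seq):
        d = Seq(deriv(r.l, c), r.r)
        # if the left part can be empty, c may also start inside the right part
        return Alt(d, deriv(r.r, c)) if nullable(r.l) else d
    raise TypeError(r)

def matches(r: Re, s: str) -> bool:
    """Match by repeatedly differentiating, then testing nullability."""
    for ch in s:
        r = deriv(r, ch)
    return nullable(r)

# example pattern: (ab)*a
pat = Seq(Star(Seq(Char('a'), Char('b'))), Char('a'))
```

Without simplification or memoization the derivative terms grow on each step; the cubic bound and the practical speedups reported in the paper come precisely from taming that growth, which this minimal sketch does not do.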
Trends in office internal gains and the impact on space heating and cooling demands
Internal gains from occupants, equipment and lighting contribute a significant proportion of the heat gains in an office space. Trends in Generation-Y working patterns suggest two diverging paths for future ICT demand: one where energy demand is carefully regulated, and another where productivity enhancers such as multiple monitors and media walls cause an explosion of energy demand within the space. These internal gains scenarios were simulated on a variety of building archetypes to test their influence on space heating and cooling demand. It was demonstrated that in offices with a high-quality facade, internal gains are the dominant factor. As a case study, it was shown that natural ventilation is only possible when the ICT demand is carefully regulated.
Resource location based on precomputed partial random walks in dynamic networks
The problem of finding a resource residing in a network node (the
\emph{resource location problem}) is a challenge in complex networks due to
aspects as network size, unknown network topology, and network dynamics. The
problem is especially difficult if no requirements on the resource placement
strategy or the network structure are to be imposed, assuming of course that
keeping centralized resource information is not feasible or appropriate. Under
these conditions, random algorithms are useful to search the network. A
possible strategy for static networks, proposed in previous work, uses short
random walks precomputed at each network node as partial walks to construct
longer random walks with associated resource information. In this work, we
adapt the previous mechanisms to dynamic networks, where resource instances may
appear in, and disappear from, network nodes, and the nodes themselves may
leave and join the network, resembling realistic scenarios. We analyze the
resulting resource location mechanisms, providing expressions that accurately
predict average search lengths, which are validated using simulation
experiments. Reduction of average search lengths compared to simple random walk
searches are found to be very large, even in the face of high network
volatility. We also study the cost of the mechanisms, focusing on the overhead
implied by the periodic recomputation of partial walks to refresh the
information on resources, concluding that the proposed mechanisms behave
efficiently and robustly in dynamic networks.
Comment: 39 pages, 25 figures
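A minimal simulation sketch of the partial-walk idea, under simplifying assumptions of ours: each node precomputes a small pool of fixed-length random walks, and a search repeatedly follows a randomly chosen precomputed walk, stopping as soon as a walk segment reaches a resource-holding node. The actual mechanism additionally attaches resource information to the walks and refreshes them periodically to track network dynamics, which this sketch omits.

```python
import random

def build_partial_walks(adj, length, pool, rng):
    """Each node precomputes `pool` random walks of `length` steps over
    the adjacency map adj (node -> list of neighbours)."""
    walks = {}
    for v in adj:
        vw = []
        for _ in range(pool):
            w = [v]
            for _ in range(length):
                w.append(rng.choice(adj[w[-1]]))
            vw.append(w)
        walks[v] = vw
    return walks

def search(adj, walks, resources, start, rng, max_hops=10_000):
    """Search by concatenating randomly chosen precomputed partial walks,
    counting hops until a walk segment hits a resource-holding node."""
    if start in resources:
        return 0
    hops, v = 0, start
    while hops < max_hops:
        w = rng.choice(walks[v])     # pick one of v's precomputed walks
        for u in w[1:]:
            hops += 1
            if u in resources:
                return hops
        v = w[-1]                    # continue from the walk's endpoint
    return hops

# example: a 20-node ring with one resource instance at node 0
ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
rng = random.Random(7)
ring_walks = build_partial_walks(ring, length=5, pool=3, rng=rng)
hops = search(ring, ring_walks, resources={0}, start=10, rng=rng)
```

Measuring `hops` over many seeded runs, against a plain step-by-step random walk on the same graph, is one way to reproduce the kind of average-search-length comparison the abstract describes.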
Cost-Benefit Analysis of Climate Change: Stern Revisited
This paper explores the challenges facing orthodox economic approaches to assessing climate control as if it were the appraisal of an investment project. Serious flaws are noted in the work of economists, with especial attention to the UK Government report by Stern and colleagues. The opinions expressed in this paper are those of the authors and may not be taken to reflect the views of CSIRO or the Australian Government.
Keywords: enhanced greenhouse effect, global CBA, Stern Report
PPP-Completeness with Connections to Cryptography
Polynomial Pigeonhole Principle (PPP) is an important subclass of TFNP with
profound connections to the complexity of the fundamental cryptographic
primitives: collision-resistant hash functions and one-way permutations. In
contrast to most of the other subclasses of TFNP, no complete problem is known
for PPP. Our work identifies the first PPP-complete problem without any circuit
or Turing Machine given explicitly in the input, and thus we answer a
longstanding open question from [Papadimitriou1994]. Specifically, we show that
constrained-SIS (cSIS), a generalized version of the well-known Short Integer
Solution problem (SIS) from lattice-based cryptography, is PPP-complete.
In order to give intuition behind our reduction for constrained-SIS, we
identify another PPP-complete problem with a circuit in the input but closely
related to lattice problems. We call this problem BLICHFELDT and it is the
computational problem associated with Blichfeldt's fundamental theorem in the
theory of lattices.
Building on the inherent connection of PPP with collision-resistant hash
functions, we use our completeness result to construct the first natural hash
function family that captures the hardness of all collision-resistant hash
functions in a worst-case sense, i.e. it is natural and universal in the
worst-case. The close resemblance of our hash function family with SIS, leads
us to the first candidate collision-resistant hash function that is both
natural and universal in an average-case sense.
Finally, our results enrich our understanding of the connections between PPP,
lattice problems and other concrete cryptographic assumptions, such as the
discrete logarithm problem over general groups.
Inapproximability of Combinatorial Optimization Problems
We survey results on the hardness of approximating combinatorial optimization
problems.
Emission-aware Energy Storage Scheduling for a Greener Grid
Reducing our reliance on carbon-intensive energy sources is vital for
reducing the carbon footprint of the electric grid. Although the grid is seeing
increasing deployments of clean, renewable sources of energy, a significant
portion of the grid demand is still met using traditional carbon-intensive
energy sources. In this paper, we study the problem of using energy storage
deployed in the grid to reduce the grid's carbon emissions. While energy
storage has previously been used for grid optimizations such as peak shaving
and smoothing intermittent sources, our insight is to use distributed storage
to enable utilities to reduce their reliance on their less efficient and most
carbon-intensive power plants and thereby reduce their overall emission
footprint. We formulate the problem of emission-aware scheduling of distributed
energy storage as an optimization problem, and use a robust optimization
approach that is well-suited for handling the uncertainty in load predictions,
especially in the presence of intermittent renewables such as solar and wind.
We evaluate our approach using a state of the art neural network load
forecasting technique and real load traces from a distribution grid with 1,341
homes. Our results show a reduction of >0.5 million kg in annual carbon
emissions -- equivalent to a drop of 23.3% in our electric grid emissions.
Comment: 11 pages, 7 figures. This paper will appear in the Proceedings of the ACM International Conference on Future Energy Systems (e-Energy 20), June 2020, Australia
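A toy illustration of emission-aware storage scheduling, using a naive threshold policy of our own devising rather than the paper's robust-optimization formulation: charge the battery in hours with below-median carbon intensity, discharge in above-median hours, and account emissions at each hour's intensity. All function and parameter names here are assumptions for illustration.

```python
from statistics import median

def greedy_schedule(intensity, capacity, rate):
    """Return a per-hour battery power plan (+charge, -discharge) under a
    simple threshold policy: charge when carbon intensity (e.g. g CO2/kWh)
    is below the median, discharge when above, within the energy capacity
    and power-rate limits."""
    thr = median(intensity)
    soc, plan = 0.0, []          # soc: state of charge (energy stored)
    for ci in intensity:
        if ci < thr:             # clean hour: charge as much as fits
            p = min(rate, capacity - soc)
        elif ci > thr:           # dirty hour: discharge what is stored
            p = -min(rate, soc)
        else:
            p = 0.0
        soc += p
        plan.append(p)
    return plan

def emissions_delta(intensity, plan):
    """Net emissions change: charging adds load, discharging displaces
    grid generation at that hour's intensity (conversion losses ignored).
    Negative means a net reduction."""
    return sum(ci * p for ci, p in zip(intensity, plan))

# example: alternating clean (100) and dirty (400) hours
plan = greedy_schedule([100, 400] * 4, capacity=2.0, rate=1.0)
```

Because this policy looks only at the realized intensity series, it has none of the robustness to load-forecast uncertainty that motivates the paper's formulation; it merely shows how shifting storage cycles toward clean hours yields a negative emissions delta.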