A multivariate complexity analysis of the material consumption scheduling problem
The NP-hard problem Material Consumption Scheduling and related problems have been thoroughly studied since the 1980s. Roughly speaking, the problem deals with scheduling jobs that consume non-renewable resources—each job has individual resource demands. The goal is to minimize the makespan. We focus on the single-machine case without preemption: from time to time, the resources of the machine are (partially) replenished, thus meeting a necessary precondition for processing further jobs. We initiate a systematic exploration of the parameterized computational complexity landscape of Material Consumption Scheduling, providing parameterized tractability as well as intractability results. In doing so, we mainly investigate how parameters related to the resource supplies influence the problem's computational complexity. This leads to a deepened understanding of this fundamental scheduling problem.
What Is a Macrostate? Subjective Observations and Objective Dynamics
We consider the question of whether thermodynamic macrostates are objective
consequences of dynamics, or subjective reflections of our ignorance of a
physical system. We argue that they are both; more specifically, that the set
of macrostates forms the unique maximal partition of phase space which 1) is
consistent with our observations (a subjective fact about our ability to
observe the system) and 2) obeys a Markov process (an objective fact about the
system's dynamics). We review the ideas of computational mechanics, an
information-theoretic method for finding optimal causal models of stochastic
processes, and argue that macrostates coincide with the "causal states" of
computational mechanics. Defining a set of macrostates thus consists of an
inductive process where we start with a given set of observables, and then
refine our partition of phase space until we reach a set of states which
predict their own future, i.e. which are Markovian. Macrostates arrived at in
this way are provably optimal statistical predictors of the future values of
our observables.
Computation- and Space-Efficient Implementation of SSA
The computational complexity of the different steps of basic SSA is
discussed. It is shown that using general-purpose "black box" routines
(e.g., those found in packages like LAPACK) wastes considerable computation
time, since the special Hankel structure of the trajectory matrix is not
taken into account. We outline several state-of-the-art algorithms (for
example, Lanczos-based truncated SVD) which can be modified to exploit the
structure of the trajectory matrix. The key components here are Hankel
matrix-vector multiplication and the hankelization operator. We show that
both can be computed efficiently by means of the Fast Fourier Transform.
The use of these methods reduces the worst-case computational complexity
from O(N^3) to O(k N log(N)), where N is the series length and k is the
number of eigentriples desired.
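The FFT-based Hankel matrix-vector product this abstract refers to can be sketched as follows. This is an illustrative NumPy implementation of the general technique, not code from the paper or from any SSA package; the function name and array layout are assumptions. It exploits that a Hankel matrix H with entries H[i, j] = h[i + j] acts on a vector as a cross-correlation, which a length-(N + K - 1) FFT evaluates in O(N log N) instead of the O(N^2) of a dense matrix-vector product:

```python
import numpy as np

def hankel_matvec(h, v):
    """Multiply the L x K Hankel matrix H[i, j] = h[i + j] by v,
    where h has length N = L + K - 1, in O(N log N) via the FFT."""
    N, K = len(h), len(v)
    L = N - K + 1
    # (Hv)_i = sum_j h[i + j] v[j] is a cross-correlation of h with v,
    # i.e. a linear convolution of h with the reversed v; keep the
    # "valid" slice of the full convolution.
    n = N + K - 1  # linear-convolution length
    prod = np.fft.irfft(np.fft.rfft(h, n) * np.fft.rfft(v[::-1], n), n)
    return prod[K - 1:K - 1 + L]
```

For the O(k N log N) total claimed in the abstract, a routine like this would serve as the matrix-vector kernel inside a Lanczos-based truncated SVD, which touches the trajectory matrix only through such products, once per iteration for each of the k eigentriples.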