Optimum soil water content for chickpea emergence in heavy-textured soils of north-west Bangladesh
Sowing of chickpea in the heavy-textured soils of north-west Bangladesh with minimum tillage technology aims to increase the timely planting of large areas during a relatively short sowing window before soil water deficit limits germination and emergence. However, the seedbed conditions into which chickpea is sown need to be better quantified, so that limiting factors which affect germination and emergence can be identified. Two of the soil physical characteristics of importance are soil water and aeration. Growth cabinet studies have identified the fastest germination and emergence of chickpea on representative soils for this area at gravimetric water contents of 17-18%, whilst soil water contents above and below this range delayed germination and emergence. Emergence was recorded at soil water potentials between field capacity (-10 kPa) and wilting point (-1500 kPa). Emergence was possible at lower soil water potentials in the finer textured soil, whilst in coarser textured soil, emergence was still possible at higher soil water potentials.
Multiprocessor speed scaling for jobs with arbitrary sizes and deadlines
In this paper we study energy efficient deadline scheduling on multiprocessors in which each processor consumes power at a rate of s^α when running at speed s, where α ≥ 2. The problem is to dispatch jobs to processors and determine the speed and jobs to run for each processor so as to complete all jobs by their deadlines using the minimum energy. The problem has been well studied for the single processor case. For the multiprocessor setting, constant competitive online algorithms for the special cases of unit size jobs or arbitrary size jobs with agreeable deadlines have been proposed by Albers et al. (2007). A randomized algorithm has been proposed for jobs of arbitrary sizes and arbitrary deadlines by Greiner et al. (2009). We propose a deterministic online algorithm for the general setting and show that it is O(log^α P)-competitive, where P is the ratio of the maximum and minimum job size.
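The power model behind this line of work can be illustrated with a short sketch. This is not code from the cited papers; the function names, the job numbers, and the choice α = 2 are illustrative. Because s^α is convex for α ≥ 2, running a single job at the constant speed that just meets its deadline never uses more energy than any other schedule doing the same work in the same window.

```python
# Sketch of the s^alpha power model from speed-scaling scheduling.
# Names and numbers are illustrative, not taken from the cited papers.

def energy(schedule, alpha=2.0):
    """Energy of a schedule given as (speed, duration) pairs: sum of s^alpha * dt."""
    return sum(s ** alpha * dt for s, dt in schedule)

def work(schedule):
    """Total work processed: sum of s * dt."""
    return sum(s * dt for s, dt in schedule)

# A job of size 10 with deadline 5: constant speed 2 just meets the deadline.
constant = [(2.0, 5.0)]
# Same total work within the same deadline, but a fast phase then a slow phase.
two_phase = [(4.0, 2.0), (2.0 / 3.0, 3.0)]

assert abs(work(constant) - work(two_phase)) < 1e-9  # both process 10 units
print(energy(constant))   # 20.0
print(energy(two_phase))  # ~33.3: convexity of s^alpha penalizes speed variation
```

The gap widens as α grows, which is why the competitive ratio of online algorithms in this model typically carries an α in the exponent.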
Response to Nauenberg's "Critique of Quantum Enigma: Physics Encounters Consciousness"
Nauenberg's extended critique of Quantum Enigma rests on fundamental misunderstandings. Comment: To be published in Foundations of Physics
Quantum models of classical mechanics: maximum entropy packets
In a previous paper, a project of constructing quantum models of classical properties was started. The present paper concludes the project by turning to classical mechanics. The quantum states that maximize entropy for given averages and variances of coordinates and momenta are called ME packets. They generalize the Gaussian wave packets. A non-trivial extension of the partition-function method of probability calculus to quantum mechanics is given. Non-commutativity of quantum variables limits its usefulness. Still, the general form of the state operators of ME packets is obtained with its help. The diagonal representation of the operators is found. A general way of calculating averages that can replace the partition-function method is described. Classical mechanics is reinterpreted as a statistical theory. Classical trajectories are replaced by classical ME packets. Quantum states approximate classical ones if the product of the coordinate and momentum variances is much larger than the Planck constant. Thus, ME packets with large variances follow their classical counterparts better than Gaussian wave packets. Comment: 26 pages, no figures. Introduction and the section on the classical limit are extended, new references added. Definitive version accepted by Found. Phys.
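The closing claim, that quantum states look classical when the variance product greatly exceeds the Planck constant, can be checked numerically against the Gaussian benchmark. A minimal sketch in units where ħ = 1 (the grid sizes and the function name are my own, not from the paper): a Gaussian packet saturates Δx·Δp = 1/2, the floor that ME packets with large variances sit far above.

```python
import numpy as np

# Numerical check (hbar = 1): a Gaussian wave packet saturates the
# uncertainty floor dx * dp = 1/2. Grid parameters are illustrative.

def uncertainty_product(sigma, n=4096, span=40.0):
    x = np.linspace(-span, span, n)
    h = x[1] - x[0]
    psi = np.exp(-x**2 / (4.0 * sigma**2))          # Gaussian packet, width sigma
    psi /= np.sqrt(np.trapz(np.abs(psi)**2, x))     # normalize to unit probability
    var_x = np.trapz(x**2 * np.abs(psi)**2, x)      # <x^2>, since <x> = 0
    dpsi = np.gradient(psi, h)
    var_p = np.trapz(np.abs(dpsi)**2, x)            # <p^2> = integral of |psi'|^2 (hbar = 1)
    return np.sqrt(var_x * var_p)

print(uncertainty_product(1.0))  # approximately 0.5, the minimum-uncertainty value
```

An ME packet constructed for the same averages but with a much larger variance product would sit well above this floor, which is the regime where, per the abstract, it tracks its classical counterpart.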
A quantum approach to measurement based on Zurek's triple model
In a closed form, without invoking a time-dependent Hamiltonian for the total system, a consistent approach to quantum measurement is proposed based on Zurek's triple model of quantum decoherence [W. Zurek, Phys. Rev. D 24, 1516 (1981)]. An exactly solvable model based on the intracavity system is treated in detail to demonstrate the central idea of our approach: by peeling off one collective variable of the measuring apparatus from its many degrees of freedom, as the pointer of the apparatus, the collective variable decouples from the internal environment formed by the effective internal variables, but still interacts with the measured system to form a triple entanglement among the measured system, the pointer, and the internal environment. As another mechanism causing decoherence, the uncertainty of the relative phase and its many-particle amplification can be summed up into an ideal entanglement, or a Schmidt decomposition with respect to the preferred basis. Comment: 22 pages, 3 figures
Typicality vs. probability in trajectory-based formulations of quantum mechanics
Bohmian mechanics represents the universe as a set of paths with a probability measure defined on it. The way in which a mathematical model of this kind can explain the observed phenomena of the universe is examined in general. It is shown that the explanation does not make use of the full probability measure, but rather of a suitable set function deriving from it, which defines relative typicality between single-time cylinder sets. Such a set function can also be derived directly from the standard quantum formalism, without the need of an underlying probability measure. The key concept for this derivation is the "quantum typicality rule", which can be considered a generalization of the Born rule. The result is a new formulation of quantum mechanics, in which particles follow definite trajectories, but which is based only on the standard formalism of quantum mechanics. Comment: 24 pages, no figures. To appear in Foundations of Physics
Quantum measurement as driven phase transition: An exactly solvable model
A model of quantum measurement is proposed which aims to describe statistical-mechanical aspects of this phenomenon, starting from a purely Hamiltonian formulation. The macroscopic measurement apparatus is modeled as an ideal Bose gas, the order parameter of which, that is, the amplitude of the condensate, is the pointer variable. It is shown that properties of irreversibility and ergodicity breaking, which are inherent in the model apparatus, ensure the appearance of definite results of the measurement and provide a dynamical realization of wave-function reduction or collapse. The measurement process takes place in two steps: first, the reduction of the state of the tested system occurs over a time of order ħ/(T√N), where T is the temperature of the apparatus and N is the number of its degrees of freedom. This decoherence process is governed by the apparatus-system interaction. During the second step, classical correlations are established between the apparatus and the tested system over the much longer time-scale of equilibration of the apparatus. The influence of the parameters of the model on the non-ideality of the measurement is discussed. Schrödinger kittens, EPR setups and information transfer are analyzed. Comment: 35 pages, RevTeX
Explaining the unobserved: why quantum mechanics is not only about information
A remarkable theorem by Clifton, Bub and Halvorson (2003) (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub (2004, 2005) the philosophical significance of the theorem is that quantum theory should be regarded as a "principle" theory about (quantum) information rather than a "constructive" theory about the dynamics of quantum systems. Here we criticize Bub's principle approach, arguing that if the mathematical formalism of quantum mechanics remains intact then there is no escape route from solving the measurement problem by constructive theories. We further propose a (Wigner-type) thought experiment that, we argue, demonstrates that quantum mechanics on the information-theoretic approach is incomplete. Comment: 34 pages
Decoherence and wave function collapse
The possibility of consistency between the basic principles of quantum mechanics and wave function collapse is reexamined. A specific interpretation of the environment is proposed for this aim and applied to decoherence. When the organization of a measuring apparatus is taken into account, this approach also leads to an interpretation of wave function collapse, which would result in principle from the same interactions with the environment as decoherence. This proposal is shown to be consistent with the non-separable character of quantum mechanics.
Entanglement Dynamics in Two-Qubit Open System Interacting with a Squeezed Thermal Bath via Quantum Nondemolition Interaction
We analyze the dynamics of entanglement in a two-qubit system interacting with an initially squeezed thermal environment via a quantum nondemolition system-reservoir interaction, with the system and reservoir assumed to be initially separable. We compare and contrast the decoherence of the two-qubit system in the cases where the qubits are mutually close by (`collective regime') or distant (`localized regime') with respect to the spatial variation of the environment. Sudden death of entanglement (as quantified by concurrence) is shown to occur in the localized case rather than in the collective case, where entanglement tends to `ring down'. A consequence of the QND character of the interaction is that the time-evolved fidelity of a Bell state never falls below 1/√2, a fact that is useful for quantum communication applications like a quantum repeater. Using a novel quantification of mixed-state entanglement, we show that there are noise regimes where, even though entanglement vanishes, the state is still available for applications like NMR quantum computation, because of the presence of a pseudo-pure component. Comment: 17 pages, 9 figures, REVTeX
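Concurrence, the entanglement measure used in this abstract, has a standard closed form for two qubits (Wootters): C = max(0, λ1 − λ2 − λ3 − λ4), where the λi are the decreasing square roots of the eigenvalues of ρ·ρ̃ with ρ̃ = (σy⊗σy) ρ* (σy⊗σy). A minimal sketch of that textbook formula, not code from the paper:

```python
import numpy as np

# Wootters concurrence for a two-qubit density matrix: the entanglement
# measure used to diagnose "entanglement sudden death".

def concurrence(rho):
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)                       # sigma_y tensor sigma_y
    rho_tilde = yy @ rho.conj() @ yy           # spin-flipped density matrix
    # Eigenvalues of rho @ rho_tilde are real and non-negative up to
    # numerical noise; take square roots in decreasing order.
    lam = np.sqrt(np.sort(np.abs(np.linalg.eigvals(rho @ rho_tilde)))[::-1])
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): maximally entangled, C = 1.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
bell = np.outer(phi, phi.conj())
print(round(concurrence(bell), 6))      # 1.0

# Maximally mixed state: separable, C = 0.
print(concurrence(np.eye(4) / 4.0))     # 0.0
```

Sudden death corresponds to C(ρ(t)) hitting exactly zero at a finite time and staying there, as opposed to decaying only asymptotically.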