Quantum Hypercomputation - Hype or Computation?
A recent attempt to compute a (recursion-theoretic) non-computable function using the quantum adiabatic algorithm is criticized and found wanting. Quantum algorithms may outperform classical algorithms in some cases, but so far they retain the classical (recursion-theoretic) notion of computability. A speculation is then offered as to where the putative power of quantum computers may come from.
Quantum Hamiltonian Complexity
Constraint satisfaction problems are a central pillar of modern computational
complexity theory. This survey provides an introduction to the rapidly growing
field of Quantum Hamiltonian Complexity, which includes the study of quantum
constraint satisfaction problems. Over the past decade and a half, this field
has witnessed fundamental breakthroughs, ranging from the establishment of a
"Quantum Cook-Levin Theorem" to deep insights into the structure of 1D
low-temperature quantum systems via so-called area laws. Our aim here is to
provide a computer science-oriented introduction to the subject in order to
help bridge the language barrier between computer scientists and physicists in
the field. As such, we include the following in this survey: (1) The
motivations and history of the field, (2) a glossary of condensed matter
physics terms explained in computer-science friendly language, (3) overviews of
central ideas from condensed matter physics, such as indistinguishable
particles, mean field theory, tensor networks, and area laws, and (4) brief
expositions of selected computer science-based results in the area. For
example, as part of the latter, we provide a novel information theoretic
presentation of Bravyi's polynomial time algorithm for Quantum 2-SAT.

Comment: v4: published version, 127 pages, introduction expanded to include a
brief introduction to quantum information, a brief list of some recent
developments added, minor changes throughout
Against the Tide. A Critical Review by Scientists of How Physics and Astronomy Get Done
Nobody should hold a monopoly on truth in this universe. The censorship and suppression of challenging ideas that go against the tide of mainstream research, and the blacklisting of scientists, are neither the best way to do and filter science nor to promote progress in human knowledge. The removal of good and novel ideas from the scientific stage is very detrimental to the pursuit of truth. There are instances in which a mere unqualified belief can occasionally be converted into a generally accepted scientific theory through the screening action of refereed literature and meetings planned by scientific organizing committees, and through the distribution of funds controlled by "club opinions". This leads to unitary paradigms and unitary thinking not necessarily associated with the unique truth. This is the topic of this book: to critically analyze the problems of the official (and sometimes illicit) mechanisms under which current science (physics and astronomy in particular) is administered and filtered today, along with the onerous consequences these mechanisms have on all of us.
The authors, all of them professional researchers, present a pessimistic view of the miseries of the current system, while a glimmer of hope remains in their recurring call for freedom in research and for an acceptable level of ethics in science.