In situ conservation of crop wild relatives
Poster presented at the BGCI Congress, Wuhan (China), 16-20 April 2007
Effects of radiation environment on reusable nuclear shuttle system
Parametric tradeoff analyses of a wide spectrum of alternate tank configurations, aimed at minimizing both primary and secondary, direct and scattered radiation sources emanating from the NERVA engine, are reported. The analytical approach, which uses point kernel techniques, is described, and detailed data are presented on the magnitude of neutron/gamma doses at different locations. Single-tank configurations with smaller cone angles and end cap radii were found to minimize integral radiation levels and, hence, stage shielding-weight penalties for shuttle missions. Hybrid configurations employing an upper tank with a reduced cone angle and end cap radius yield low integral payload doses, primarily because elongation of the larger-capacity upper tank increases the separation distance. Finally, a preliminary assessment is given of radiation damage to candidate reusable nuclear shuttle (RNS) materials, components, and subsystems, and of the possible effects of the radiation environment on various phases of RNS mission operations.
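For illustration, the point-kernel technique mentioned above treats the reactor as one or more point sources, attenuates the uncollided flux exponentially along the line of sight through intervening material, applies a buildup factor for scattered radiation, and converts the resulting flux to dose. The following is a minimal Python sketch of that idea; the function, the crude linear buildup factor, and all numerical values are illustrative assumptions, not values or code from the study.

import math

def point_kernel_dose_rate(source_strength, mu, thickness, distance, flux_to_dose):
    # Dose rate at `distance` (cm) from an isotropic point source behind a slab.
    # source_strength : particles emitted per second (assumed)
    # mu              : linear attenuation coefficient of the slab (1/cm, assumed)
    # thickness       : slab thickness along the line of sight (cm)
    # flux_to_dose    : flux-to-dose-rate conversion factor (assumed units)
    attenuation = math.exp(-mu * thickness)      # exponential attenuation
    buildup = 1.0 + mu * thickness               # crude linear buildup factor
    flux = source_strength * attenuation * buildup / (4.0 * math.pi * distance ** 2)
    return flux * flux_to_dose

# Longer (hybrid) stack -> larger source-to-payload separation -> lower payload dose,
# the geometric effect the abstract attributes to the elongated upper tank.
short_stack = point_kernel_dose_rate(1e18, 0.05, 200.0, 3000.0, 1e-6)
long_stack = point_kernel_dose_rate(1e18, 0.05, 200.0, 4500.0, 1e-6)
print(f"shorter stack: {short_stack:.3e}  longer stack: {long_stack:.3e}")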
A Second Step Towards Complexity-Theoretic Analogs of Rice's Theorem
Rice's Theorem states that every nontrivial language property of the
recursively enumerable sets is undecidable. Borchert and Stephan initiated the
search for complexity-theoretic analogs of Rice's Theorem. In particular, they
proved that every nontrivial counting property of circuits is UP-hard, and that
a number of closely related problems are SPP-hard.
The present paper studies whether their UP-hardness result itself can be
improved to SPP-hardness. We show that their UP-hardness result cannot be
strengthened to SPP-hardness unless unlikely complexity class containments
hold. Nonetheless, we prove that every P-constructibly bi-infinite counting
property of circuits is SPP-hard. We also raise their general lower bound from
unambiguous nondeterminism to constant-ambiguity nondeterminism.
Comment: 14 pages. To appear in Theoretical Computer Science.
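For reference, the classical theorem behind these complexity-theoretic analogs can be stated formally as follows (standard notation, not the paper's):

\[
\text{If } \emptyset \subsetneq \mathcal{C} \subsetneq \mathrm{RE}, \text{ then } \{\, e \in \mathbb{N} : L(M_e) \in \mathcal{C} \,\} \text{ is undecidable},
\]

where M_0, M_1, ... is a standard effective enumeration of Turing machines and RE is the class of recursively enumerable languages. The Borchert-Stephan line of work replaces "language property of the r.e. sets" with "counting property of Boolean circuits" and "undecidable" with hardness for classes such as UP and SPP.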
An Atypical Survey of Typical-Case Heuristic Algorithms
Heuristic approaches often do so well that they seem to pretty much always
give the right answer. How close can heuristic algorithms get to always giving
the right answer, without inducing seismic complexity-theoretic consequences?
This article first discusses how a series of results by Berman, Buhrman,
Hartmanis, Homer, Longpr\'{e}, Ogiwara, Sch\"{o}ning, and Watanabe, from the
early 1970s through the early 1990s, explicitly or implicitly limited how well
heuristic algorithms can do on NP-hard problems. In particular, many desirable
levels of heuristic success cannot be obtained unless severe, highly unlikely
complexity class collapses occur. Second, we survey work initiated by Goldreich
and Wigderson, who showed how under plausible assumptions deterministic
heuristics for randomized computation can achieve a very high frequency of
correctness. Finally, we consider formal ways in which theory can help explain
the effectiveness of heuristics that solve NP-hard problems in practice.
Comment: This article is currently scheduled to appear in the December 2012
issue of SIGACT News.
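As a purely illustrative example (not drawn from the survey) of the kind of heuristic being discussed, the sketch below implements a WalkSAT-style local search for CNF satisfiability: an incomplete heuristic for an NP-hard problem that, on many instance distributions, finds a correct answer very frequently, even though its failure to find an assignment certifies nothing.

import random

def walksat(clauses, n_vars, max_flips=10000, p=0.5, seed=0):
    # Clauses are lists of nonzero ints: positive = variable, negative = its negation.
    # Returns a satisfying assignment (dict var -> bool) or None after max_flips.
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}

    def satisfied(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)

    def score(v):
        # Number of satisfied clauses if variable v were flipped.
        assign[v] = not assign[v]
        s = sum(satisfied(c) for c in clauses)
        assign[v] = not assign[v]
        return s

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return assign                                       # all clauses satisfied
        clause = rng.choice(unsat)
        if rng.random() < p:
            var = abs(rng.choice(clause))                       # random-walk step
        else:
            var = max((abs(lit) for lit in clause), key=score)  # greedy step
        assign[var] = not assign[var]
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3) and (x1 or x3)
print(walksat([[1, 2], [-1, 3], [-2, -3], [1, 3]], n_vars=3))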