456 research outputs found
Rationales for the Lightning Flight-Commit Criteria
Since natural and artificially-initiated (or "triggered") lightning are demonstrated hazards to the launch of space vehicles, the American space program has responded by establishing a set of Lightning Flight Commit Criteria (LFCC), also known as Lightning Launch Commit Criteria (LLCC), and associated Definitions to mitigate the risk. The LLCC apply to all Federal Government ranges, and similar LFCC have been adopted by the Federal Aviation Administration for application at state-operated and private spaceports. The LLCC and Definitions have been developed, reviewed, and approved over the years of the American space program, progressing from relatively simple rules in the mid-twentieth century (which proved inadequate) to a complex suite for launch operations in the early 21st century. During this evolutionary process, a "Lightning Advisory Panel (LAP)" of top American scientists in the field of atmospheric electricity was established to guide it. Details of this process are provided in a companion document entitled "A History of the Lightning Launch Commit Criteria and the Lightning Advisory Panel for America's Space Program," which is available as NASA Special Publication 2010-216283. As new knowledge and additional operational experience have been gained, the LFCC/LLCC have been updated to preserve or increase their safety and to increase launch availability. All launches of both manned and unmanned vehicles at all Federal Government ranges now use the same rules. This simplifies their application and minimizes the cost of the weather infrastructure to support them. Vehicle operators and Range safety personnel have requested that the LAP provide a detailed written rationale for each of the LFCC so that they may better understand and appreciate the scientific and operational justifications for them. This document provides the requested rationale.
A History of the Lightning Launch Commit Criteria and the Lightning Advisory Panel for America's Space Program
The history of the Lightning Launch Commit Criteria (LLCC) used at all spaceports under the jurisdiction of the United States is provided. The formation and history of the Lightning Advisory Panel (LAP) that now advises NASA, the Air Force, and the Federal Aviation Administration on LLCC development and improvement is emphasized. The period covered extends from the early days of space flight through 2010. Extensive appendices provide significant detail about important aspects that are only summarized in the main text.
Neural Decision Boundaries for Maximal Information Transmission
We consider here how to separate multidimensional signals into two categories, such that the binary decision transmits the maximum possible information about those signals. Our motivation comes from the nervous system, where neurons process multidimensional signals into a binary sequence of responses (spikes). In a small-noise limit, we derive a general equation for the decision boundary that locally relates its curvature to the probability distribution of inputs. We show that for Gaussian inputs the optimal boundaries are planar, but for non-Gaussian inputs the curvature is nonzero. As an example, we consider exponentially distributed inputs, which are known to approximate a variety of signals from the natural environment. Comment: 5 pages, 3 figures
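A minimal numerical sketch of the underlying information-theoretic point: in the noiseless limit the binary decision is a deterministic function of the input, so the information it transmits equals the output entropy H(p), where p is the probability mass on one side of the boundary. A planar boundary through the mean of a Gaussian splits the mass in half and attains the full 1 bit, while an offset boundary transmits less. (This illustrates only the deterministic limit, not the paper's small-noise curvature equation; the projection vector and offsets below are arbitrary illustrative choices.)

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((100_000, 2))  # 2-D standard Gaussian inputs

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) output."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

w = np.array([1.0, 1.0])  # arbitrary normal vector of a planar boundary

# Plane through the mean: half the mass on each side, so H(p) is ~1 bit
p_plane = np.mean(x @ w > 0.0)

# Offset plane: unequal split, so the decision transmits less information
p_offset = np.mean(x @ w > 1.0)

print(binary_entropy(p_plane))   # close to 1 bit
print(binary_entropy(p_offset))  # strictly less
```

For non-Gaussian inputs the equal-mass constraint no longer singles out a plane, which is where the paper's curvature condition becomes relevant.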
Ten Misconceptions from the History of Analysis and Their Debunking
The widespread idea that infinitesimals were "eliminated" by the "great
triumvirate" of Cantor, Dedekind, and Weierstrass is refuted by an
uninterrupted chain of work on infinitesimal-enriched number systems. The
elimination claim is an oversimplification created by triumvirate followers,
who tend to view the history of analysis as a pre-ordained march toward the
radiant future of Weierstrassian epsilontics. In the present text, we document
distortions of the history of analysis stemming from the triumvirate ideology
of ontological minimalism, which identified the continuum with a single number
system. Such anachronistic distortions characterize the received interpretation
of Stevin, Leibniz, d'Alembert, Cauchy, and others. Comment: 46 pages, 4 figures; Foundations of Science (2012). arXiv admin note: text overlap with arXiv:1108.2885 and arXiv:1110.545
Photoswitching Mechanism of Cyanine Dyes
Photoswitchable fluorescent probes have been used in recent years to enable super-resolution fluorescence microscopy by single-molecule imaging.1-6 Among these probes are red carbocyanine dyes, which can be reversibly photoconverted between a fluorescent state and a dark state for hundreds of cycles, yielding several thousand detected photons per switching cycle, before permanent photobleaching occurs.7,8 While these properties make them excellent probes for super-resolution imaging, the mechanism by which cyanine dyes are photoconverted has yet to be determined. Such an understanding could prove useful for creating new photoswitchable probes with improved properties. The photoconversion of red cyanine dyes into their dark states occurs upon illumination by red light and is facilitated by a primary thiol in solution,7,9 whereas agents with a secondary thiol do not support photoswitching. These observations suggest that the reactiv
Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond
Many historians of the calculus deny significant continuity between
infinitesimal calculus of the 17th century and 20th century developments such
as Robinson's theory. Robinson's hyperreals, while providing a consistent
theory of infinitesimals, require the resources of modern logic; thus many
commentators are comfortable denying a historical continuity. A notable
exception is Robinson himself, whose identification with the Leibnizian
tradition inspired Lakatos, Laugwitz, and others to consider the history of the
infinitesimal in a more favorable light. In spite of his Leibnizian sympathies,
Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly
demonstrating the inconsistency of reasoning with historical infinitesimal
magnitudes. We argue that Robinson, among others, overestimates the force of
Berkeley's criticisms, by underestimating the mathematical and philosophical
resources available to Leibniz. Leibniz's infinitesimals are fictions, not
logical fictions, as Ishiguro proposed, but rather pure fictions, like
imaginaries, which are not eliminable by some syncategorematic paraphrase. We
argue that Leibniz's defense of infinitesimals is more firmly grounded than
Berkeley's criticism thereof. We show, moreover, that Leibniz's system for
differential calculus was free of logical fallacies. Our argument strengthens
the conception of modern infinitesimals as a development of Leibniz's strategy
of relating inassignable to assignable quantities by means of his
transcendental law of homogeneity. Comment: 69 pages, 3 figures
Correction: Selenium mediates exercise-induced adult neurogenesis and reverses learning deficits induced by hippocampal injury and aging
4D Super-Resolution Microscopy with Conventional Fluorophores and Single Wavelength Excitation in Optically Thick Cells and Tissues
Optical super-resolution imaging of fluorescently stained biological samples is rapidly becoming an important tool to investigate protein distribution at the molecular scale. It is therefore important to develop practical super-resolution methods that allow capturing the full three-dimensional nature of biological systems and also can visualize multiple protein species in the same sample.
Measurement of the Hadronic Photon Structure Function F_2^gamma at LEP2
The hadronic structure function of the photon F_2^gamma is measured as a
function of Bjorken x and of the factorisation scale Q^2 using data taken by
the OPAL detector at LEP. Previous OPAL measurements of the x dependence of
F_2^gamma are extended to an average Q^2 of 767 GeV^2. The Q^2 evolution of
F_2^gamma is studied for average Q^2 between 11.9 and 1051 GeV^2. As predicted
by QCD, the data show positive scaling violations in F_2^gamma. Several
parameterisations of F_2^gamma are in agreement with the measurements whereas
the quark-parton model prediction fails to describe the data. Comment: 4 pages, 2 figures, to appear in the proceedings of Photon 2001, Ascona, Switzerland
Search for Higgs Bosons in e+e- Collisions at 183 GeV
The data collected by the OPAL experiment at sqrt(s) = 183 GeV were used to
search for Higgs bosons which are predicted by the Standard Model and various
extensions, such as general models with two Higgs field doublets and the
Minimal Supersymmetric Standard Model (MSSM). The data correspond to an
integrated luminosity of approximately 54pb-1. None of the searches for neutral
and charged Higgs bosons have revealed an excess of events beyond the expected
background. This negative outcome, in combination with similar results from
searches at lower energies, leads to new limits for the Higgs boson masses and
other model parameters. In particular, the 95% confidence level lower limit for
the mass of the Standard Model Higgs boson is 88.3 GeV. Charged Higgs bosons
can be excluded for masses up to 59.5 GeV. In the MSSM, m_h > 70.5 GeV and m_A > 72.0 GeV are obtained for tan(beta) > 1, no and maximal scalar top mixing, and soft SUSY-breaking masses of 1 TeV. The range 0.8 < tan(beta) < 1.9 is excluded for minimal scalar top mixing and m_top < 175 GeV. More general scans of the MSSM parameter space are also considered. Comment: 49 pages. LaTeX, including 33 eps figures, submitted to European Physical Journal