The art of being human : a project for general philosophy of science
Throughout the medieval and modern periods, in various sacred and secular guises, the unification of all forms of knowledge under the rubric of ‘science’ has been taken as the prerogative of humanity as a species. However, as our sense of species privilege has been called increasingly into question, so too has the very salience of ‘humanity’ and ‘science’ as general categories, let alone ones that might bear some essential relationship to each other. After showing how the ascendant Stanford School in the philosophy of science has contributed to this joint demystification of ‘humanity’ and ‘science’, I proceed on a more positive note to a conceptual framework for making sense of science as the art of being human. My understanding of ‘science’ is indebted to the red thread that runs from Christian theology through the Scientific Revolution and Enlightenment to the Humboldtian revival of the university as the site for the synthesis of knowledge as the culmination of self-development. Especially salient to this idea is science’s epistemic capacity to manage modality (i.e. to determine the conditions under which possibilities can be actualised) and its political capacity to organize humanity into projects of universal concern. However, the challenge facing such an ideal in the twenty-first century is that the predicate ‘human’ may be projected in three quite distinct ways, governed by what I call ‘ecological’, ‘biomedical’ and ‘cybernetic’ interests. Which of these future humanities would claim today’s humans as proper ancestors, and whether these futures could co-habit the same world, thus become two important questions that general philosophy of science will need to address in the coming years.
Theoretical calculation of the electromagnetic response of a radially layered model moon Technical report
Determining trophic niche width: a novel approach using stable isotope analysis
1. Although conceptually robust, the ecological niche has proven difficult to measure: practical indices of niche width that are simple to obtain, yet provide an adequate descriptor of the ecological position of the population examined, have been elusive. 2. The trophic niche has proven more tractable than other niche dimensions. However, the indices used as proxies for trophic niche width often suffer from the following difficulties: they rarely lie along a single scale, making comparisons between populations or species difficult; they struggle to combine dietary prey diversity and evenness in an ecologically meaningful way; and they fail to integrate diet over ecological time-scales, usually comprising only single snapshots of niche width. 3. We propose a novel alternative method for comparing trophic niche width: the variance of tissue stable isotope ratios, especially those of nitrogen and carbon. 4. This approach is a potentially powerful way of measuring trophic niche width, particularly if combined with conventional approaches, because it provides a single measure on a continuous axis that is common to all species; it integrates information only on assimilated prey over time; the integration period changes with the choice of tissue sampled; and data production is theoretically fast and testing among populations simple. 5. Empirical studies are now required to test the benefits of using isotopic variance as a measure of niche width, and in doing so help refine this approach.
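As a minimal sketch of the proposed metric, the variance of tissue δ15N and δ13C values can stand in for trophic niche width. The data below are hypothetical, and summing the two variances into one score is an illustrative choice, not the paper's prescription:

```python
from statistics import variance

# Hypothetical delta-15N and delta-13C tissue values (per mil) for two
# populations; the spread of these ratios is the proposed niche-width proxy.
generalist = {
    "d15N": [8.1, 11.4, 9.7, 12.9, 7.5, 10.8],
    "d13C": [-21.0, -17.2, -19.5, -16.1, -22.3, -18.4],
}
specialist = {
    "d15N": [9.9, 10.2, 10.0, 10.4, 9.8, 10.1],
    "d13C": [-19.1, -18.8, -19.0, -18.7, -19.2, -18.9],
}

def isotopic_niche_width(pop):
    """One illustrative scalar score: summed sample variances of both ratios."""
    return variance(pop["d15N"]) + variance(pop["d13C"])

w_gen = isotopic_niche_width(generalist)
w_spec = isotopic_niche_width(specialist)
assert w_gen > w_spec  # broader diet shows up as larger isotopic variance
```

Because the score is a single number on a continuous axis, populations or species can be ranked directly, which is exactly the comparability that point 4 above claims for the method.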
Neutrino-Neutrino Scattering and Matter-Enhanced Neutrino Flavor Transformation in Supernovae
We examine matter-enhanced neutrino flavor transformation (ν_e ⇄ ν_τ(μ)) in the region above the neutrino sphere in Type II supernovae. Our treatment explicitly includes contributions to the neutrino-propagation Hamiltonian from neutrino-neutrino forward scattering. A proper inclusion of these contributions shows that they have a completely negligible effect on the range of ν_e ⇄ ν_τ(μ) vacuum mass-squared difference, δm², and vacuum mixing angle, θ, or equivalently sin²2θ, required for enhanced supernova shock re-heating. When neutrino background effects are included, we find that r-process nucleosynthesis from neutrino-heated supernova ejecta remains a sensitive probe of the mixing between a light ν_e and a ν_τ(μ) with a cosmologically significant mass. Neutrino-neutrino scattering contributions are found to have a generally small effect on the δm²-sin²2θ parameter region probed by r-process nucleosynthesis. We point out that the nonlinear effects of the neutrino background extend the range of sensitivity of r-process nucleosynthesis to smaller values of sin²2θ.
Comment: 38 pages, tex, DOE/ER/40561-150-INT94-00-6
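The matter enhancement at issue can be illustrated with the standard two-flavor MSW formula for the effective mixing in a background potential; this is a textbook sketch, not the paper's full nonlinear treatment of the neutrino-neutrino background. Here x = A/δm², with A = 2√2·G_F·n_e·E the matter-potential term:

```python
import math

def sin2_2theta_matter(sin2_2theta_vac, x):
    """Effective two-flavor mixing in a background potential.

    x = A / dm2, where A = 2*sqrt(2)*G_F*n_e*E is the matter-potential
    term and dm2 the vacuum mass-squared difference (same units as A).
    """
    s2 = sin2_2theta_vac
    c = math.sqrt(1.0 - s2)  # cos(2*theta) for a small vacuum angle
    return s2 / (s2 + (c - x) ** 2)

# A tiny vacuum mixing is resonantly enhanced as x approaches cos(2*theta):
vac = 1e-3
off_resonance = sin2_2theta_matter(vac, 0.0)                  # ~ vacuum value
on_resonance = sin2_2theta_matter(vac, math.sqrt(1.0 - vac))  # -> 1.0
```

The resonance at x = cos 2θ is why even a small vacuum sin²2θ can drive large flavor conversion above the neutrino sphere; the abstract's point is that ν-ν forward scattering shifts this potential without appreciably moving the parameter region that matters.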
On Characterizing the Data Access Complexity of Programs
Technology trends will cause data movement to account for the majority of
energy expenditure and execution time on emerging computers. Therefore,
computational complexity will no longer be a sufficient metric for comparing
algorithms, and a fundamental characterization of data access complexity will
be increasingly important. The problem of developing lower bounds for data
access complexity has been modeled using the formalism of Hong & Kung's
red/blue pebble game for computational directed acyclic graphs (CDAGs).
However, previously developed approaches to lower bounds analysis for the
red/blue pebble game are very limited in effectiveness when applied to CDAGs of
real programs, with computations comprised of multiple sub-computations with
differing DAG structure. We address this problem by developing an approach for
effectively composing lower bounds based on graph decomposition. We also
develop a static analysis algorithm to derive the asymptotic data-access lower
bounds of programs, as a function of the problem size and cache size.
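The gap between operation count and data-access cost can be seen in a toy fully associative LRU model; the trace generator and parameters below are illustrative, not the paper's CDAG analysis. Tiling the classic matrix-multiply loop nest reduces misses even though the computation performed is unchanged:

```python
from collections import OrderedDict

def lru_misses(trace, capacity):
    """Count misses of a fully associative LRU cache over an address trace."""
    cache, misses = OrderedDict(), 0
    for addr in trace:
        if addr in cache:
            cache.move_to_end(addr)        # refresh recency on a hit
        else:
            misses += 1
            cache[addr] = None
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return misses

def matmul_trace(n, tile=None):
    """Word-granularity access trace of C += A @ B, optionally tiled."""
    t = tile or n
    trace = []
    for ii in range(0, n, t):
        for jj in range(0, n, t):
            for kk in range(0, n, t):
                for i in range(ii, ii + t):
                    for j in range(jj, jj + t):
                        for k in range(kk, kk + t):
                            trace += [("A", i, k), ("B", k, j), ("C", i, j)]
    return trace

n, capacity = 16, 96  # 3*n*n = 768 words of data, far more than the cache
naive = lru_misses(matmul_trace(n), capacity)
tiled = lru_misses(matmul_trace(n, tile=4), capacity)
assert tiled < naive  # same computation, less data movement
```

Both schedules execute the same n³ multiply-adds, so computational complexity cannot distinguish them; only a data-access measure of the kind the abstract argues for does.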
Real-time depth sectioning: Isolating the effect of stress on structure development in pressure-driven flow
Transient structure development at a specific distance from the channel wall in a pressure-driven flow is obtained from a set of real-time measurements that integrate contributions throughout the thickness of a rectangular channel. This “depth sectioning method” retains the advantages of pressure-driven flow while revealing flow-induced structures as a function of stress. The method is illustrated by applying it to isothermal shear-induced crystallization of an isotactic polypropylene using both synchrotron x-ray scattering and optical retardance. Real-time, depth-resolved information about the development of oriented precursors reveals features that cannot be extracted from ex-situ observation of the final morphology and that are obscured in the depth-averaged in-situ measurements. For example, at 137 °C and at the highest shear stress examined (65 kPa), oriented thread-like nuclei formed rapidly, saturated within the first 7 s of flow, developed significant crystalline overgrowth during flow and did not relax after cessation of shear. At lower stresses, threads formed later and increased at a slower rate. The depth sectioning method can be applied to the flow-induced structure development in diverse complex fluids, including block copolymers, colloidal systems, and liquid-crystalline polymers.
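The method rests on the fact that, in pressure-driven slit flow, shear stress varies linearly from zero at the centerline to its maximum at the wall, so each depth corresponds to a unique stress. A minimal sketch of that mapping (function names are mine, not the authors'):

```python
def stress_at_depth(sigma_wall, y_over_h):
    """Shear stress in pressure-driven slit flow: sigma(y) = sigma_wall * y/h,
    zero at the centerline (y = 0) and maximal at the wall (y = h)."""
    assert 0.0 <= y_over_h <= 1.0
    return sigma_wall * y_over_h

def depth_for_stress(sigma, sigma_wall):
    """Invert the linear profile: fractional depth where a target stress acts."""
    return sigma / sigma_wall

# With the 65 kPa wall stress quoted above, structures seen at 32.5 kPa
# sit halfway between the centerline and the wall:
assert depth_for_stress(32.5, 65.0) == 0.5
```

Differencing measurements taken at different wall stresses then isolates the signal from a chosen depth slice, which is how the depth-averaged data become stress-resolved.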
MSW-like Enhancements without Matter
We study the effects of a scalar field, coupled only to neutrinos, on oscillations among weak-interaction current eigenstates. The effect of a real scalar field appears as effective masses for the neutrino mass eigenstates, the same for antineutrinos as for neutrinos. Under some conditions, this can lead to a vanishing of the effective mass-squared difference δm²_eff, giving rise to MSW-like effects. We discuss some examples and show that it is possible to resolve the apparent discrepancy in the spectra required by r-process nucleosynthesis in the mantles of supernovae and by solar neutrino solutions.
Comment: 9 pages, latex, 1 figure
Gaussian capacity of the quantum bosonic channel with additive correlated Gaussian noise
We present an algorithm for calculation of the Gaussian classical capacity of
a quantum bosonic memory channel with additive Gaussian noise. The algorithm,
restricted to Gaussian input states, is applicable to all channels with noise
correlations obeying certain conditions and works in the full input energy
domain, beyond previous treatments of this problem. As an illustration, we
study the optimal input states and capacity of a quantum memory channel with
Gauss-Markov noise [J. Schäfer, Phys. Rev. A 80, 062313 (2009)]. We evaluate the enhancement of the transmission rate when using these optimal entangled input states by comparison with a product coherent-state encoding, and find that such a simple coherent-state encoding achieves no less than 90% of the capacity.
Comment: 12+6 pages, 9 figures. Errors corrected, figures were made clearer, appendix improved and extended
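A classical analogue of the power allocation underlying such capacity calculations is water-filling over a correlated (hence uneven) noise spectrum; the sketch below is that classical version with hypothetical noise variances, not the paper's quantum Gaussian algorithm:

```python
import math

def water_filling(noise, power, tol=1e-12):
    """Classical water-filling: split a total input power over parallel
    channels with noise variances `noise` so as to maximize
    sum(0.5*log2(1 + p_i/n_i)). Bisects on the water level mu, with
    p_i = max(mu - n_i, 0)."""
    lo, hi = 0.0, max(noise) + power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if sum(max(mu - n, 0.0) for n in noise) > power:
            hi = mu  # water level too high: exceeds the power budget
        else:
            lo = mu
    return [max(lo - n, 0.0) for n in noise]

# Hypothetical uneven noise spectrum, of the kind correlated
# (Gauss-Markov-like) noise produces after diagonalization:
noise = [0.2, 0.5, 1.0, 2.0]
p = water_filling(noise, power=2.0)
rate = sum(0.5 * math.log2(1 + pi / ni) for pi, ni in zip(p, noise))
assert abs(sum(p) - 2.0) < 1e-6  # the budget is fully used
```

Quiet modes receive most of the power and the noisiest mode may get none; the quantum problem adds the optimization over Gaussian (possibly entangled) input states that the abstract's algorithm addresses.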