Climate Change, Risk and Grain Production in China
This paper employs the production function-based method proposed by Just and Pope (1978, 1979) to explicitly analyze production risk in the context of Chinese grain farming and climate change, and to test for potential endogeneity of climate factors in Chinese grain production. Our results indicate that China might, at least in the short run, become a net beneficiary of climate change. In particular, we find that increases in annual average temperature raise mean output at the margin and at the same time reduce production risk. Further calculations suggest that a 1 °C increase in annual average temperature would entail an economic benefit of $1.1 billion due to the increase in mean output. Furthermore, a Hausman test reveals no endogeneity of climate variables in Chinese grain production.

Keywords: agriculture, grain production, climate change, production risk, China, Crop Production/Industries, Environmental Economics and Policy, Risk and Uncertainty, Q1, Q54
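For readers unfamiliar with the Just and Pope framework referenced above, the sketch below illustrates the general idea on simulated data: mean output and output variance (production risk) are estimated as separate functions of climate covariates. It is a minimal illustration only, not the paper's estimation code; the variable names (temperature, precipitation, grain_output) and the statsmodels-based three-step FGLS procedure are assumptions of this sketch.

```python
# Minimal, illustrative sketch (not the authors' code) of a Just-Pope style
# production-risk regression: mean output and output risk are both allowed
# to depend on climate variables. All data are simulated and all names are
# hypothetical placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
temperature = rng.normal(10.0, 2.0, n)      # annual average temperature (deg C)
precipitation = rng.normal(600.0, 80.0, n)  # annual precipitation (mm)
X = sm.add_constant(np.column_stack([temperature, precipitation]))

# Simulated output: temperature raises mean output and lowers variance,
# mimicking the qualitative finding reported in the abstract.
sigma = np.exp(1.0 - 0.1 * temperature)
grain_output = 5.0 + 0.8 * temperature + 0.01 * precipitation + sigma * rng.normal(size=n)

# Step 1: OLS for the mean (output) function.
mean_fit = sm.OLS(grain_output, X).fit()

# Step 2: regress log squared residuals on the same covariates to obtain the
# risk (variance) function; a negative temperature coefficient indicates that
# warming reduces production risk.
log_sq_resid = np.log(mean_fit.resid ** 2)
risk_fit = sm.OLS(log_sq_resid, X).fit()

# Step 3: feasible GLS re-estimate of the mean function, weighting each
# observation by the inverse of its fitted variance.
weights = 1.0 / np.exp(risk_fit.fittedvalues)
fgls_fit = sm.WLS(grain_output, X, weights=weights).fit()

print(mean_fit.params, risk_fit.params, fgls_fit.params)
```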
Auto-structure of spike trains matters for testing on synchronous activity
Poster presentation: Coordinated neuronal activity across many neurons, i.e. synchronous or spatiotemporal patterns, has long been believed to be a major component of neuronal activity. However, the discussion of whether coordinated activity really exists remains heated and controversial. A major source of uncertainty is that many analysis approaches either ignore the auto-structure of the spiking activity, assume a very simplified model (Poissonian firing), or change the auto-structure by spike jittering. We studied whether a statistical inference that tests whether coordinated activity occurs beyond chance can be invalidated if one ignores or changes the real auto-structure of recorded data. To this end, we investigated the distribution of coincident spikes in mutually independent spike trains modeled as renewal processes. We considered Gamma processes with different shape parameters as well as renewal processes in which the ISI distribution is log-normal. For Gamma processes of integer order, we calculated the mean number of coincident spikes, as well as the Fano factor of the coincidences, analytically. We determined how these measures depend on the bin width and also investigated how they depend on the firing rate and on the rate difference between the neurons. We used Monte Carlo simulations to estimate the whole distribution for these parameters and also for other values of the shape parameter. Moreover, we considered the effect of dithering for both of these processes and found that while dithering does not change the average number of coincidences, it does change the shape of the coincidence distribution. Our major findings are: 1) the width of the coincidence count distribution depends very critically and in a non-trivial way on the detailed properties of the inter-spike interval distribution; 2) the dependencies of the Fano factor on the coefficient of variation of the ISI distribution are complex and mostly non-monotonic. Moreover, the Fano factor depends on the detailed properties of the individual point processes and cannot be predicted by the CV alone. Hence, given a recorded data set, the estimated CV of the ISI distribution is not sufficient to predict the Fano factor of the coincidence count distribution; and 3) spike jittering, even if it is as small as a fraction of the expected ISI, can falsify the inference on coordinated firing. In most of the tested cases, and especially for complex synchronous and spatiotemporal patterns across many neurons, spike jittering strongly increased the likelihood of false positive findings. Last, we discuss a procedure [1] that considers the complete auto-structure of each individual spike train when testing whether synchronous firing occurs by chance and therefore avoids the danger of an increased level of false positives.
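The binning and coincidence-counting procedure described above can be illustrated with a short Monte Carlo sketch. This is not the authors' code: the firing rate, bin width, duration, and Gamma shape parameters below are illustrative assumptions, and the script simply estimates the mean and Fano factor of the coincidence count for two mutually independent Gamma renewal processes.

```python
# Monte Carlo sketch: coincidence-count distribution for two independent
# spike trains generated as Gamma renewal processes. All parameter values
# are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def gamma_renewal_train(rate, shape, duration, rng):
    """Spike times of a Gamma renewal process with the given firing rate."""
    scale = 1.0 / (rate * shape)              # mean ISI = 1/rate
    n_max = int(2 * rate * duration + 50)     # generous upper bound on spike count
    isis = rng.gamma(shape, scale, size=n_max)
    times = np.cumsum(isis)
    return times[times < duration]

def coincidence_count(train_a, train_b, bin_width, duration):
    """Number of bins in which both trains fire at least once."""
    edges = np.arange(0.0, duration + bin_width, bin_width)
    counts_a, _ = np.histogram(train_a, edges)
    counts_b, _ = np.histogram(train_b, edges)
    return np.sum((counts_a > 0) & (counts_b > 0))

rate, duration, bin_width = 20.0, 10.0, 0.005   # Hz, s, s
for shape in (1.0, 2.0, 4.0):                   # shape = 1 is the Poisson case
    counts = np.array([
        coincidence_count(gamma_renewal_train(rate, shape, duration, rng),
                          gamma_renewal_train(rate, shape, duration, rng),
                          bin_width, duration)
        for _ in range(2000)
    ])
    fano = counts.var() / counts.mean()
    print(f"shape={shape:.0f}: mean coincidences={counts.mean():.2f}, Fano={fano:.2f}")
```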
Anticipated results from dust experiments on cometary missions
The major scientific objectives of a mission are: to determine the chemical nature and physical structure of comet nuclei, and to characterize the changes that occur as a function of time and orbital position; to characterize the chemical and physical nature of the atmospheres and ionospheres of comets as well as the processes that occur in them, and to characterize the development of the atmospheres and ionospheres as functions of time and orbital position; and to determine the nature of comet tails and the processes by which they are formed, and to characterize the interaction of comets with the solar wind. Since dust is a major constituent of a comet, the achievement of these goals requires the intensive study of the particulate emission from a comet.
Extended beta regression in R : shaken, stirred, mixed, and partitioned
Beta regression – an increasingly popular approach for modeling rates and proportions – is extended in various directions: (a) bias correction/reduction of the maximum likelihood estimator, (b) beta regression tree models by means of recursive partitioning, and (c) latent class beta regression by means of finite mixture models. All three extensions may be of importance for enhancing the beta regression toolbox in practice, providing more reliable inference and capturing both observed and unobserved/latent heterogeneity in the data. Using the analogy of Smithson and Verkuilen (2006), these extensions make beta regression not only “a better lemon squeezer” (compared to classical least squares regression) but a full-fledged modern juicer offering lemon-based drinks: shaken and stirred (bias correction and reduction), mixed (finite mixture model), or partitioned (tree model). All three extensions are provided in the R package betareg (version 2.4-0 or later), building on generic algorithms and implementations for bias correction/reduction, model-based recursive partitioning, and finite mixture models, respectively. Specifically, the new functions betatree() and betamix() reuse the flexible object-oriented implementations from the R packages party and flexmix, respectively.
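As a rough illustration of the underlying model that these extensions build on, the sketch below fits a plain beta regression (logit-linked mean, constant precision) by maximum likelihood. It is written in Python with numpy/scipy rather than with the R betareg package discussed above, and the simulated data and coefficient values are illustrative assumptions only.

```python
# Minimal beta-regression sketch: y ~ Beta(mu*phi, (1-mu)*phi) with a
# logit-linked mean mu and constant precision phi, fitted by maximum
# likelihood. Not the betareg package; data are simulated.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

rng = np.random.default_rng(2)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# Simulate proportions from the assumed model.
beta_true, phi_true = np.array([-0.5, 1.0]), 30.0
mu_true = expit(X @ beta_true)
y = rng.beta(mu_true * phi_true, (1.0 - mu_true) * phi_true)

def neg_loglik(params):
    """Negative log-likelihood of the beta regression model."""
    beta, log_phi = params[:-1], params[-1]
    mu, phi = expit(X @ beta), np.exp(log_phi)
    a, b = mu * phi, (1.0 - mu) * phi
    return -np.sum(gammaln(phi) - gammaln(a) - gammaln(b)
                   + (a - 1.0) * np.log(y) + (b - 1.0) * np.log1p(-y))

start = np.zeros(X.shape[1] + 1)
fit = minimize(neg_loglik, start, method="BFGS")
print("coefficients:", fit.x[:-1], "precision phi:", np.exp(fit.x[-1]))
```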
Cosmic Dust Collection Facility: Scientific objectives and programmatic relations
The science objectives are summarized for the Cosmic Dust Collection Facility (CDCF) on Space Station Freedom, and these objectives are related to ongoing science programs and mission planning within NASA. The purpose is to illustrate the potential of the CDCF project within the broad context of early solar system sciences that emphasize the study of primitive objects in state-of-the-art analytical and experimental laboratories on Earth. Current knowledge about the sources of cosmic dust and their associated orbital dynamics is examined, and the results of modern microanalytical investigations of extraterrestrial dust particles collected on Earth are reviewed. Major areas of scientific inquiry and uncertainty are identified, and it is shown how CDCF will contribute to their resolution. General facility and instrument concepts that need to be pursued are introduced, and the major development tasks needed to attain the scientific objectives of the CDCF project are identified.
Debris and micrometeorite impact measurements in the laboratory
A method was developed to simulate space debris in the laboratory. This method, which is an outgrowth of research in inertial confinement fusion (ICF), uses laser ablation to accelerate material. Using this method, single 60 micron aluminum spheres were accelerated to 15 km/sec, and larger 500 micron aluminum spheres were accelerated to 2 km/sec. In addition, many small (less than 10 micron diameter) irregularly shaped particles were accelerated to speeds of 100 km/sec.
A domain-level DNA strand displacement reaction enumerator allowing arbitrary non-pseudoknotted secondary structures
Information technologies enable programmers and engineers to design and synthesize systems of startling complexity that nonetheless behave as intended. This mastery of complexity is made possible by a hierarchy of formal abstractions that span from high-level programming languages down to low-level implementation specifications, with rigorous connections between the levels. DNA nanotechnology presents us with a new molecular information technology whose potential has not yet been fully unlocked in this way. Developing an effective hierarchy of abstractions may be critical for increasing the complexity of programmable DNA systems. Here, we build on prior practice to provide a new formalization of ‘domain-level’ representations of DNA strand displacement systems that has a natural connection to nucleic acid biophysics while still being suitable for formal analysis. Enumeration of unimolecular and bimolecular reactions provides a semantics for programmable molecular interactions, with kinetics given by an approximate biophysical model. Reaction condensation provides a tractable simplification of the detailed reactions that respects overall kinetic properties. The applicability and accuracy of the model are evaluated across a wide range of engineered DNA strand displacement systems. Thus, our work can serve as an interface between lower-level DNA models that operate at the nucleotide sequence level and high-level chemical reaction network models that operate at the level of interactions between abstract species.
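To make the notion of a ‘domain-level’ representation concrete, the toy sketch below treats strands as sequences of named domains and enumerates the simplest bimolecular step: which complementary domain pairs two single strands could form. This is only an illustrative sketch under simplifying assumptions, not the enumerator described in the paper; real domain-level enumeration must also track complexes, existing bonds, and the non-pseudoknotted structure constraint.

```python
# Toy illustration of the domain-level view: strands are sequences of named
# domains, complementarity is purely by name ('t' pairs with 't*'), and the
# simplest bimolecular step is enumerated by listing every complementary
# domain pair between two single strands. Not the paper's enumerator.
from itertools import product

def complement(domain: str) -> str:
    """Return the complementary domain name ('t' <-> 't*')."""
    return domain[:-1] if domain.endswith("*") else domain + "*"

def enumerate_binding(strand_a: tuple, strand_b: tuple):
    """All (position_a, position_b) pairs where the two strands could bind."""
    return [(i, j)
            for (i, da), (j, db) in product(enumerate(strand_a), enumerate(strand_b))
            if complement(da) == db]

# A classic toehold-mediated setup: an invader strand with toehold 't' and
# branch-migration domain 'b' meets the exposed complementary domains of a
# substrate strand.
invader = ("t", "b")
substrate_top = ("b*", "t*")
for i, j in enumerate_binding(invader, substrate_top):
    print(f"bind: invader domain {invader[i]!r} with substrate domain {substrate_top[j]!r}")
```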
