Climate Change Disaster Management: Mitigation and Adaptation in a Public Goods Framework
This paper explores the collective action problem as it relates to climate change and develops two models that capture the mitigation/adaptation trade-off. The first model presents climate change as a certain disaster, while the second models climate change as a stochastic event. A one-shot public goods experiment with students reveals a relatively low rate of mitigation under both models. The effect of vulnerability to climate change is also examined by varying the magnitude of the disaster across treatments. We find no significant difference between the high- and low-vulnerability environments. This research contributes to the literature on public goods experiments as well as the analysis of climate change policy.

Keywords: public good; climate change; mitigation; adaptation; experiment; risk
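The mitigation/adaptation trade-off described in the abstract can be illustrated with a toy payoff function: contributions to mitigation are public and shrink the disaster, while the uncontributed endowment is a private adaptation buffer that the disaster damages. This is a sketch under assumed parameter values (the endowment, damage rate, and mitigation efficiency are illustrative, not taken from the paper's experimental design).

```python
import random

def payoff(contributions, i, endowment=10.0, damage=0.6,
           mitig_eff=0.05, stochastic=False, rng=None):
    """Payoff for player i in a one-shot mitigation/adaptation game.

    Each player splits an endowment between mitigation (public, reduces
    the disaster) and adaptation (private, kept as a buffer). Parameter
    values are illustrative, not from the paper.
    """
    total_mitigation = sum(contributions)
    private = endowment - contributions[i]           # adaptation budget
    # Mitigation lowers the disaster's magnitude (certain-disaster model)
    # or its probability (stochastic model).
    reduction = min(1.0, mitig_eff * total_mitigation)
    if stochastic:
        rng = rng or random.Random(0)
        hit = rng.random() > reduction               # does the disaster occur?
        loss = damage * private if hit else 0.0
    else:
        loss = damage * (1.0 - reduction) * private  # certain disaster
    return private - loss

# Four players who each contribute 2 of 10 units to mitigation:
c = [2.0, 2.0, 2.0, 2.0]
print(round(payoff(c, 0), 2))  # → 5.12
```

Because the mitigation benefit is shared while the contribution cost is private, free-riding is individually attractive, which is consistent with the low mitigation rates the experiment reports.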
Maladaptation and the paradox of robustness in evolution
Background. Organisms use a variety of mechanisms to protect themselves
against perturbations. For example, repair mechanisms fix damage, feedback
loops keep homeostatic systems at their setpoints, and biochemical filters
distinguish signal from noise. Such buffering mechanisms are often discussed in
terms of robustness, which may be measured by reduced sensitivity of
performance to perturbations. Methodology/Principal Findings. I use a
mathematical model to analyze the evolutionary dynamics of robustness in order
to understand aspects of organismal design by natural selection. I focus on two
characters: one character performs an adaptive task; the other character
buffers the performance of the first character against perturbations. Increased
perturbations favor enhanced buffering and robustness, which in turn decreases
sensitivity and reduces the intensity of natural selection on the adaptive
character. Reduced selective pressure on the adaptive character often leads to
a less costly, lower performance trait. Conclusions/Significance. The paradox
of robustness arises from evolutionary dynamics: enhanced robustness causes an
evolutionary reduction in the adaptive performance of the target character,
leading to a degree of maladaptation compared to what could be achieved by
natural selection in the absence of robustness mechanisms. Over evolutionary
time, buffering traits may become layered on top of each other, while the
underlying adaptive traits become replaced by cheaper, lower performance
components. The paradox of robustness has widespread implications for
understanding organismal design.
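The core mechanism of the paradox can be shown with a toy fitness function in which buffering scales down how strongly fitness depends on the adaptive character, weakening selection on it. This quadratic model is an illustration only, not the paper's actual dynamical analysis.

```python
def optimal_adaptive_trait(buffering, cost=1.0):
    """Optimum of the toy fitness w(a) = (1 - buffering)*a - cost*a**2.

    Buffering damps how strongly performance (and hence fitness) depends
    on the adaptive character `a`; solving dw/da = 0 gives
    a* = (1 - buffering) / (2 * cost). Toy model, not the paper's.
    """
    return (1.0 - buffering) / (2.0 * cost)

# More buffering -> weaker selection on `a` -> cheaper, lower-performance trait
for b in (0.0, 0.5, 0.9):
    print(f"buffering={b:.1f}  optimal trait={optimal_adaptive_trait(b):.3f}")
```

As buffering rises from 0 to 0.9, the favored trait value falls from 0.5 toward 0.05, mirroring the paper's claim that robustness mechanisms let cheaper, lower-performance components replace the underlying adaptive trait.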
AGI and the Knight-Darwin Law: why idealized AGI reproduction requires collaboration
Can an AGI create a more intelligent AGI? Under idealized assumptions, for a certain theoretical type of intelligence, our answer is: "Not without outside help". This is a paper on the mathematical structure of AGI populations in which parent AGIs create child AGIs. We argue that such populations satisfy a certain biological law. Motivated by observations of sexual reproduction in seemingly-asexual species, the Knight-Darwin Law states that it is impossible for one organism to asexually produce another, which asexually produces another, and so on forever: any sequence of organisms (each one a child of the previous) must contain occasional multi-parent organisms, or must terminate. By proving that a certain measure (arguably an intelligence measure) decreases when an idealized parent AGI single-handedly creates a child AGI, we argue that a similar law holds for AGIs.
Hybridization between wild and cultivated potato species in the Peruvian Andes and biosafety implications for deployment of GM potatoes
The nature and extent of past and current hybridization between cultivated potato and wild relatives in nature is of interest to crop evolutionists, taxonomists, breeders and, recently, molecular biologists because of the possibilities of inverse gene flow in the deployment of genetically modified (GM) crops. This research demonstrates that natural hybridization occurs in areas of potato diversity in the Andes, assesses the possibilities for survival of these new hybrids, and shows a possible way forward should GM potatoes prove advantageous in such areas.
Conformally rescaled spacetimes and Hawking radiation
We study various derivations of Hawking radiation in conformally rescaled
metrics. We focus on two important properties, the location of the horizon
under a conformal transformation and its associated temperature. We find that
the production of Hawking radiation cannot be associated in all cases to the
trapping horizon because its location is not invariant under a conformal
transformation. We also find evidence that the temperature of the Hawking
radiation should transform simply under a conformal transformation, being
invariant for asymptotic observers in the limit that the conformal
transformation factor is unity at their location.

Comment: 22 pages, version submitted to journal
Particle creation rate for dynamical black holes
We present the particle creation probability rate around a general black hole as an outcome of quantum fluctuations. Using the uncertainty principle for these fluctuations, we derive a new ultraviolet frequency cutoff for the radiation spectrum of a dynamical black hole. Using this frequency cutoff, we define the probability creation rate function for such black holes. We consider a dynamical Vaidya model and calculate the probability creation rate for the case in which its horizon is in a slowly evolving phase. Our results show that one can expect the usual Hawking radiation emission process for a dynamical black hole with a slowly evolving horizon. Moreover, calculating the probability rate for a dynamical black hole gives a measure of when Hawking radiation can be killed off by an incoming flux of matter or radiation. Our result strongly suggests that the Hawking radiation expectation must be revised for primordial black holes that have grown substantially since they were created in the early universe. We also infer that this frequency cutoff can serve as a parameter tracking primordial black hole growth at the moment of emission.

Comment: 10 pages, 1 figure. The paper was rewritten in a clearer presentation and one more appendix was added.
Prediction of lethal and synthetically lethal knock-outs in regulatory networks
The complex interactions involved in the regulation of a cell's function are captured by its interaction graph. More often than not, detailed knowledge about enhancing or suppressive regulatory influences and cooperative effects is lacking, and merely the presence or absence of directed interactions is known. Here we investigate to what extent such reduced information allows one to forecast the effect of a knock-out or a combination of knock-outs. Specifically, we ask how well the lethality of eliminating nodes can be predicted from their network centrality, such as degree and betweenness, without knowing the function of the system. The function is taken as the ability to reproduce a fixed point under discrete Boolean dynamics. We investigate two types of stochastically generated networks: fully random networks and structures grown by node duplication and subsequent divergence of interactions. On all networks we find that the out-degree is a good predictor of the lethality of a single-node knock-out. For knock-outs of node pairs, the fraction of successors shared between the two knocked-out nodes (out-overlap) is a good predictor of synthetic lethality. Out-degree and out-overlap are locally defined and computationally simple centrality measures that provide predictive power close to that of the optimal predictor.

Comment: published version, 10 pages, 6 figures, 2 tables; supplement at http://www.bioinf.uni-leipzig.de/publications/supplements/11-01
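The two predictors named in the abstract, out-degree and out-overlap, are computable from the interaction graph alone. The sketch below assumes a representation mapping each node to its list of inputs, uses a Jaccard-style overlap of successor sets, and substitutes a stand-in majority update rule for the paper's randomly sampled Boolean functions; the paper's exact definitions may differ.

```python
import random

def random_network(n, k, seed=0):
    """Random directed network: each node receives k random inputs."""
    rng = random.Random(seed)
    return {v: rng.sample([u for u in range(n) if u != v], k) for v in range(n)}

def step(state, net):
    """One synchronous Boolean update (illustrative majority-of-inputs rule)."""
    return tuple(int(sum(state[u] for u in net[v]) * 2 > len(net[v]))
                 for v in range(len(state)))

def out_degree(net, v):
    """Number of nodes that take v as an input (v's successors)."""
    return sum(v in inputs for inputs in net.values())

def out_overlap(net, v, w):
    """Fraction of successors shared by v and w (Jaccard-style overlap)."""
    sv = {x for x, ins in net.items() if v in ins}
    sw = {x for x, ins in net.items() if w in ins}
    return len(sv & sw) / max(1, len(sv | sw))

net = random_network(10, 3, seed=1)
print(out_degree(net, 0), round(out_overlap(net, 0, 1), 3))
```

Both measures are local: they need only one or two nodes' successor sets, which is what makes them attractive compared with global centralities such as betweenness.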
Incidence of cancer in the area around Amsterdam Airport Schiphol in 1988–2003: a population-based ecological study
BACKGROUND: Amsterdam Airport Schiphol is a major source of complaints about aircraft noise, safety risks and concerns about long term adverse health effects, including cancer. We investigated whether residents of the area around Schiphol are at higher risk of developing cancer than the general Dutch population. METHODS: In a population-based study using the regional cancer registry, we estimated the cancer incidence during 1988–2003 in residents of the area surrounding Schiphol. We defined a study area based on aircraft noise contours and 4-digit postal code areas, since historical data on ambient air pollution were not available and recent emission data did not differ from the background urban air quality. RESULTS: In residents of the study area 13 207 cancer cases were diagnosed, which was close to the expected number, using national incidence rates as a reference (standardized incidence ratio [SIR] 1.02). We found a statistically significantly increased incidence of hematological malignancies (SIR 1.12, 95% confidence interval [CI]: 1.05, 1.19), mainly due to high rates for non-Hodgkin lymphoma (SIR 1.22, 95% CI: 1.12, 1.33) and acute lymphoblastic leukemia (SIR 1.34, 95% CI: 0.95, 1.83). The incidence of cancer of the respiratory system was statistically significantly decreased (SIR 0.94, 95% CI: 0.90, 0.99), due to the low rate in males (SIR 0.89). In the core zone of the study area, cancer incidence was slightly higher than in the remaining ring zone (rate ratio of the core zone compared to the ring zone 1.05, 95% CI 1.01, 1.10). This was caused by the higher incidence of cancer of the respiratory system, prostate and the female genital organs in the core zone in comparison to the ring zone. CONCLUSION: The overall cancer incidence in the Schiphol area was similar to the national incidence. The moderately increased risk of hematological malignancies could not be explained by higher levels of ambient air pollution in the Schiphol area. 
This observation warrants further research, for example in a study focusing on substances in urban ambient air pollution, as similar findings were observed in Greater Amsterdam.
Evaluation of rate law approximations in bottom-up kinetic models of metabolism.
Background: The mechanistic description of enzyme kinetics in a dynamic model of metabolism requires specifying the numerical values of a large number of kinetic parameters. The parameterization challenge is often addressed through the use of simplifying approximations to form reaction rate laws with reduced numbers of parameters. Whether such simplified models can reproduce the dynamic characteristics of the full system is an important question.

Results: In this work, we compared the local transient response properties of dynamic models constructed using rate laws with varying levels of approximation. These approximate rate laws were: 1) a Michaelis-Menten rate law with measured enzyme parameters, 2) a Michaelis-Menten rate law with approximated parameters, using the convenience kinetics convention, 3) a thermodynamic rate law resulting from a metabolite saturation assumption, and 4) a pure chemical reaction mass action rate law that removes the role of the enzyme from the reaction kinetics. We utilized in vivo data for the human red blood cell to compare the effect of rate law choices against the backdrop of physiological flux and concentration differences. We found that the Michaelis-Menten rate law with measured enzyme parameters yields an excellent approximation of the full system dynamics, while the other assumptions cause greater discrepancies in system dynamic behavior. However, iteratively replacing mechanistic rate laws with approximations resulted in a model that retains a high correlation with the true model behavior. Investigating this consistency, we determined that the order-of-magnitude differences among fluxes and concentrations in the network were greatly influential on the network dynamics. We further identified reaction features, such as thermodynamic reversibility, high substrate concentration, and lack of allosteric regulation, that make certain reactions more suitable for rate law approximations.

Conclusions: Overall, our work generally supports the use of approximate rate laws when building large-scale kinetic models, due to the key role that physiologically meaningful flux and concentration ranges play in determining network dynamics. However, we also showed that detailed mechanistic models show a clear benefit in prediction accuracy when data are available. The work here should help to provide guidance to future kinetic modeling efforts on the choice of rate law and parameterization approaches.
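The contrast between the most and least mechanistic rate laws the abstract compares can be seen numerically. A minimal sketch, with illustrative parameter values (`vmax`, `km`, and `k` are assumptions, not values from the red-blood-cell model, and the convenience-kinetics and thermodynamic variants are not shown):

```python
def michaelis_menten(s, vmax=1.0, km=0.5):
    """Irreversible Michaelis-Menten rate law: saturates at vmax."""
    return vmax * s / (km + s)

def mass_action(s, k=1.0):
    """Pure mass-action rate law: no enzyme, no saturation."""
    return k * s

# At low substrate the two laws agree (v ≈ (vmax/km)*s, matched by k = vmax/km);
# near saturation mass action keeps growing while Michaelis-Menten plateaus.
for s in (0.01, 0.5, 5.0):
    print(s, round(michaelis_menten(s), 3), round(mass_action(s, k=2.0), 3))
```

This is the qualitative reason the abstract's findings depend on physiological concentration ranges: for reactions operating far below saturation the approximation error is small, while near-saturating substrate concentrations expose the difference.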
Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems
A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one single functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
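The two conditions for networked buffering, versatile agents and degenerate (partially overlapping) capabilities, can be demonstrated with a minimal coverage sketch. The agent names and function labels below are invented for illustration; the paper's genome:proteome models are far richer.

```python
def covers(agents, required):
    """True if every required function is supplied by some remaining agent."""
    supplied = set()
    for caps in agents.values():
        supplied |= caps
    return required <= supplied

# Condition 1: each agent is versatile (performs more than one function).
# Condition 2: agents are degenerate (capabilities partially overlap).
agents = {"a": {"f1", "f2"}, "b": {"f2", "f3"}, "c": {"f3", "f1"}}
required = {"f1", "f2", "f3"}

# Local perturbation: knock out any single agent; the overlapping
# capabilities of the others still cover every required function.
for lost in agents:
    remaining = {k: v for k, v in agents.items() if k != lost}
    print(lost, covers(remaining, required))  # → True for each agent
```

If either condition fails (single-function agents, or disjoint capability sets), removing one agent leaves some function unsupplied, which is the distributed-robustness claim in miniature.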