The Microeconomics of Poverty Traps in Mexico
Macroeconomists, development scholars, and policy makers have long recognized poverty traps as a major cause of persistent inequality and a serious limitation to growth. A poverty trap may be defined as a threshold level below which individuals or households will not increase their well-being regardless of the conditions of the economy. While the importance of poverty traps is widely accepted, the microfoundations (the rationality) behind them are not well understood. In the Mexican setting, this paper contributes in two ways. First, we assume that income depends on the capital (both physical and human) that a household possesses. Hence, if a household is poor and unable to accumulate capital, it will remain poor (unless there is a sudden increase in the returns to its existing capital); a poverty trap is thus generated. Following Chavas (2004, 2005), we explicitly model the preferences, consumption, and physical and human capital accumulation of Mexican households. We argue that the typical dynamic model with additive utilities and a constant discount rate cannot capture poverty traps, because survival motives are involved: endogenous discounting is needed. Second, employing the same model, we test the impact of the Mexican government's most important social policy program, Progresa-Oportunidades, on alleviating poverty traps. For households with children, this program provides funds conditional on the children attending school, which in effect pushes participants to increase their human capital. A comparison between participating and non-participating households should shed some light on the effectiveness of the program and the sensitivity of persistent poverty to cash transfers.
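The capital-accumulation mechanism behind such a trap can be illustrated with a toy simulation. This is not the paper's model (which relies on endogenous discounting); it uses a hypothetical S-shaped production technology under which households starting below an unstable threshold sink toward a low steady state while richer households converge to a high one:

```python
def next_capital(k, s=0.3, delta=0.1):
    """One period of capital accumulation with depreciation and savings.

    The convex-concave (S-shaped) production function below is a made-up
    illustration; it creates three fixed points: a stable trap near zero,
    an unstable threshold (about k = 0.11 here), and a high stable state.
    """
    output = 3.0 * k**2 / (1.0 + k**2)   # hypothetical technology
    return (1 - delta) * k + s * output

def steady_state(k0, periods=500):
    """Iterate the accumulation map from initial capital k0."""
    k = k0
    for _ in range(periods):
        k = next_capital(k)
    return k

low = steady_state(0.05)   # starts below the threshold -> trapped near zero
high = steady_state(2.0)   # starts above it -> converges to the high state
```

The same initial-condition dependence, rather than the economy-wide environment, is what defines the trap in the abstract above.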
A simple model for predicting tropical cyclone minimum central pressure from intensity and size
Minimum central pressure (Pmin) is an integrated measure of the tropical cyclone wind field and is known to be a useful indicator of storm damage potential. A simple model that predicts Pmin from routinely estimated quantities, including storm size, would be of great value. Here we present a simple linear empirical model for predicting Pmin from maximum wind speed, the radius of 34-knot winds (R34kt), storm-center latitude, and the environmental pressure. An empirical model for the pressure deficit is first developed that takes as predictors specific combinations of these quantities derived directly from theory, based on gradient wind balance and a modified-Rankine-type wind profile known to capture storm structure inside of R34kt. Model coefficients are estimated using data from the southwestern North Atlantic and eastern North Pacific from 2004-2022, using aircraft-based estimates of Pmin, Extended Best Track data, and estimates of environmental pressure from Global Forecast System (GFS) analyses. The model has near-zero conditional bias even for low Pmin, explaining 94.4% of the variance. Performance is superior to a variety of other model formulations, including a standard wind-pressure model that does not account for storm size or latitude (89.4% variance explained). Model performance is also strong when applied to high-latitude data and data near coastlines. Finally, the model is shown to perform comparably well in an operations-like setting based solely on routinely estimated variables, including the pressure of the outermost closed isobar. Case study applications to five impactful historical storms are discussed. Overall, the model offers a simple and fast prediction of Pmin for practical use in operations and research.
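The overall shape of such a regression can be sketched as follows. The predictor combinations (a kinetic term in maximum wind speed and a size/rotation term involving the Coriolis parameter) are motivated by gradient wind balance, but the exact forms and any coefficients here are illustrative stand-ins, not the paper's fitted model:

```python
import numpy as np

OMEGA = 7.292e-5  # Earth's rotation rate (s^-1)

def predictors(vmax, r34_km, lat_deg):
    """Build a design matrix of theory-motivated predictor combinations.

    x1: inner-core kinetic term (Vmax^2); x2: size/rotation term
    (Vmax * f * R34). These are hypothetical choices for illustration.
    """
    f = 2.0 * OMEGA * np.sin(np.deg2rad(lat_deg))
    x1 = vmax**2
    x2 = vmax * f * (r34_km * 1000.0)
    return np.column_stack([x1, x2, np.ones_like(x1)])

def fit_pressure_deficit(vmax, r34_km, lat_deg, deficit_hpa):
    """Least-squares fit of the pressure deficit to the predictors."""
    X = predictors(vmax, r34_km, lat_deg)
    beta, *_ = np.linalg.lstsq(X, deficit_hpa, rcond=None)
    return beta

def predict_pmin(beta, vmax, r34_km, lat_deg, p_env_hpa):
    """Pmin = environmental pressure minus the predicted deficit."""
    return p_env_hpa - predictors(vmax, r34_km, lat_deg) @ beta
```

Fitting the deficit rather than Pmin directly, and subtracting it from the environmental pressure, mirrors the two-step structure described in the abstract.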
Exhaustive enumeration unveils clustering and freezing in random 3-SAT
We study geometrical properties of the complete set of solutions of the random 3-satisfiability problem. We show that even for moderate system sizes the number of clusters agrees surprisingly well with the theoretical asymptotic prediction. We also locate the freezing transition in the space of solutions, which has been conjectured to be relevant in explaining the onset of computational hardness in random constraint satisfaction problems.
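In miniature, the enumeration-and-clustering procedure looks like this brute-force sketch (the paper's analysis is far more sophisticated; this only fixes the definitions): solutions are enumerated exhaustively, grouped into Hamming-distance-1 connected components ("clusters"), and a variable is "frozen" in a cluster if it takes a single value across all of that cluster's solutions:

```python
from itertools import product

def satisfies(clauses, assign):
    # A clause is a tuple of literals: +i means x_i True, -i means x_i False.
    return all(any((lit > 0) == assign[abs(lit) - 1] for lit in clause)
               for clause in clauses)

def solutions(clauses, n):
    """Exhaustively enumerate all satisfying assignments over n variables."""
    return [a for a in product([False, True], repeat=n)
            if satisfies(clauses, a)]

def clusters(sols):
    """Connected components of solutions under single-variable flips."""
    remaining = set(sols)
    comps = []
    while remaining:
        comp, stack = set(), [remaining.pop()]
        while stack:
            s = stack.pop()
            comp.add(s)
            for i in range(len(s)):
                t = s[:i] + (not s[i],) + s[i + 1:]
                if t in remaining:
                    remaining.remove(t)
                    stack.append(t)
        comps.append(comp)
    return comps

def frozen_vars(comp):
    """Indices of variables taking one fixed value across the cluster."""
    n = len(next(iter(comp)))
    return [i for i in range(n) if len({s[i] for s in comp}) == 1]
```

The freezing transition concerns the point where typical clusters start to contain an extensive number of such frozen variables.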
On the role of information in decision making: the case of sorghum yield in Burkina Faso
This paper investigates the role of temporal uncertainty and information in economic decisions. It shows that the nature of the economic environment (e.g., the production technology) can influence the valuation of information, which in turn affects choice functions. This is illustrated by an empirical application to sorghum yield response analysis in Burkina Faso. The paper stresses the importance of technology and information valuation in risk behaviour.
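The valuation of information at issue here can be illustrated with the textbook expected value of perfect information (EVPI): the gain from choosing an action after observing the state rather than committing beforehand. The two states, two actions, and payoffs below are entirely hypothetical:

```python
import numpy as np

# Rows = actions, columns = states (e.g., low vs high rainfall).
# Action 0 is robust but low-yield; action 1 is responsive but risky.
payoff = np.array([[2.0, 3.0],
                   [0.5, 5.0]])
prior = np.array([0.5, 0.5])  # prior beliefs over the states

def value_without_info(payoff, prior):
    # Commit to the single action with the best prior-expected payoff.
    return (payoff @ prior).max()

def value_with_perfect_info(payoff, prior):
    # Observe the state first, then pick the best action in each state.
    return (payoff.max(axis=0) * prior).sum()

evpi = value_with_perfect_info(payoff, prior) - value_without_info(payoff, prior)
```

How large this gap is depends on the payoff structure, which is exactly the abstract's point that the production technology shapes the value of information.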
Tropical Cyclone Cold Wake Size and Its Applications to Power Dissipation and Ocean Heat Uptake Estimates
Mixing of the upper ocean by the wind field associated with tropical cyclones (TCs) creates observable cold wakes in sea surface temperature and may potentially influence ocean heat uptake. The relationship between cold wake size and storm size, however, has yet to be explored. Here we apply two objective methods to observed daily sea surface temperature data to quantify the size of TC-induced cold wakes. The obtained cold wake sizes agree well with the TC sizes estimated from the QuikSCAT-R wind field database, with correlation coefficients of 0.51 and 0.59, respectively. Furthermore, our new estimate of the total cooling that incorporates the variations in the cold wake size provides improved estimates of TC power dissipation and TC-induced ocean heat uptake. This study thus highlights the importance of cold wake size in evaluating the climatological effects of TCs.
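One simple objective approach, sketched here under stated assumptions and not necessarily either of the authors' two methods, is to threshold the SST anomaly field and sum cosine-weighted grid-cell areas on a regular latitude/longitude grid:

```python
import numpy as np

def cold_wake_area(sst_anom, lat, lon, threshold=-0.5):
    """Area (m^2) of grid cells whose SST anomaly is at or below threshold.

    sst_anom: 2-D anomaly field (K), shape (len(lat), len(lon)).
    lat, lon: regularly spaced grid coordinates in degrees.
    The -0.5 K threshold is a hypothetical choice for illustration.
    """
    dlat = np.deg2rad(abs(lat[1] - lat[0]))
    dlon = np.deg2rad(abs(lon[1] - lon[0]))
    R_EARTH = 6.371e6  # m
    # Cell area shrinks with cos(latitude) on a lat/lon grid.
    cell = R_EARTH**2 * dlat * dlon * np.cos(np.deg2rad(lat))[:, None]
    mask = sst_anom <= threshold
    return float((cell * mask).sum())
```

A size metric of this kind is what can then be correlated against independently estimated storm sizes.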
Survey-propagation decimation through distributed local computations
We discuss the implementation of two distributed solvers of the random K-SAT
problem, based on some development of the recently introduced
survey-propagation (SP) algorithm. The first solver, called the "SP diffusion
algorithm", diffuses as dynamical information the maximum bias over the system,
so that variable nodes can decide to freeze in a self-organized way, each
variable making its decision on the basis of purely local information. The
second solver, called the "SP reinforcement algorithm", makes use of
time-dependent external forcing messages on each variable, which let the
variables get completely polarized in the direction of a solution at the end of
a single convergence. Both methods allow us to find a solution of the random
3-SAT problem in a range of parameters comparable with the best previously
described serialized solvers. The simulated time of convergence towards a
solution (if these solvers were implemented on a distributed device) grows as
log(N).Comment: 18 pages, 10 figure
Phase Transitions and Computational Difficulty in Random Constraint Satisfaction Problems
We review the understanding of random constraint satisfaction problems, focusing on the q-coloring of large random graphs, that has been achieved using the cavity method of physicists. We also discuss the properties of the phase diagram in temperature, the connections with glass transition phenomenology in physics, and the related algorithmic issues. (Proceedings of the International Workshop on Statistical-Mechanical Informatics 2007, Kyoto, Japan, September 16-19, 2007.)
Global Projections of Intense Tropical Cyclone Activity for the Late Twenty-First Century from Dynamical Downscaling of CMIP5/RCP4.5 Scenarios
Global projections of intense tropical cyclone activity are derived from the Geophysical Fluid Dynamics Laboratory (GFDL) High Resolution Atmospheric Model (HiRAM; 50-km grid) and the GFDL hurricane model using a two-stage downscaling procedure. First, tropical cyclone genesis is simulated globally using HiRAM. Each storm is then downscaled into the GFDL hurricane model, with horizontal grid spacing near the storm of 6 km, including ocean coupling (e.g., cold wake generation). Simulations are performed using observed sea surface temperatures (SSTs) (1980-2008) for a control run with 20 repeating seasonal cycles and for a late-twenty-first-century projection using an altered SST seasonal cycle obtained from a phase 5 of CMIP (CMIP5)/representative concentration pathway 4.5 (RCP4.5) multimodel ensemble. In general agreement with most previous studies, projections with this framework indicate fewer tropical cyclones globally in a warmer late-twenty-first-century climate, but also an increase in average cyclone intensity, precipitation rates, and the number and occurrence days of very intense category 4 and 5 storms. While these changes are apparent in the globally averaged tropical cyclone statistics, they are not necessarily present in each individual basin. The interbasin variation of changes in most of the tropical cyclone metrics examined is directly correlated to the variation in magnitude of SST increases between the basins. Finally, the framework is shown to be capable of reproducing both the observed global distribution of outer storm size (albeit with a slight high bias) and its interbasin variability. Projected median size is found to remain nearly constant globally, with increases in most basins offset by decreases in the northwest Pacific.
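The step of altering the SST seasonal cycle can be sketched with the common "delta" approach: add the multimodel-mean projected change in the climatological seasonal cycle to the observed SSTs. This is an illustrative setup, not the paper's exact procedure:

```python
import numpy as np

def perturbed_sst(obs_sst, cmip_hist, cmip_future):
    """Observed SST seasonal cycle plus the ensemble-mean projected change.

    obs_sst:     (12, nlat, nlon) observed climatological seasonal cycle.
    cmip_hist:   (nmodel, 12, nlat, nlon) historical model seasonal cycles.
    cmip_future: (nmodel, 12, nlat, nlon) late-century model seasonal cycles.
    """
    delta = (cmip_future - cmip_hist).mean(axis=0)  # multimodel-mean change
    return obs_sst + delta
```

Driving the same downscaling framework with observed and perturbed SSTs is what isolates the climate-change signal in the projected storm statistics.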
On the cavity method for decimated random constraint satisfaction problems and the analysis of belief propagation guided decimation algorithms
We introduce a version of the cavity method for diluted mean-field spin models that allows the computation of thermodynamic quantities similar to the Franz-Parisi quenched potential in sparse random graph models. The method is developed for the particular case of partially decimated random constraint satisfaction problems. This allows us to develop a theoretical understanding of a class of algorithms for solving constraint satisfaction problems, in which elementary degrees of freedom are sequentially assigned according to the results of a message-passing procedure (belief propagation). We compare this theoretical analysis with the results of extensive numerical simulations.
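The decimation scheme analyzed here can be sketched in miniature. For clarity this toy replaces the belief-propagation marginal estimates with exact brute-force marginals, so it only illustrates the fix-and-simplify loop, not the message passing itself:

```python
from itertools import product

def satisfied(clauses, assign):
    # Literal +v means variable v True, -v means v False; assign maps v -> bool.
    return all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses)

def marginals(clauses, variables):
    """Exact marginals over satisfying assignments (BP's role in practice)."""
    counts = {v: 0 for v in variables}
    total = 0
    for bits in product([False, True], repeat=len(variables)):
        assign = dict(zip(variables, bits))
        if satisfied(clauses, assign):
            total += 1
            for v in variables:
                counts[v] += assign[v]
    if total == 0:
        return None, 0
    return {v: counts[v] / total for v in variables}, total

def decimate(clauses, variables):
    """Repeatedly fix the most biased variable, then simplify the formula."""
    fixed = {}
    variables = list(variables)
    while variables:
        marg, total = marginals(clauses, variables)
        if total == 0:
            return None  # no satisfying assignment remains
        v = max(variables, key=lambda u: abs(marg[u] - 0.5))
        val = marg[v] >= 0.5
        fixed[v] = val
        variables.remove(v)
        simplified = []
        for c in clauses:
            if any(abs(l) == v and (l > 0) == val for l in c):
                continue  # clause satisfied: drop it
            c2 = tuple(l for l in c if abs(l) != v)
            if not c2:
                return None  # empty clause: contradiction
            simplified.append(c2)
        clauses = simplified
    return fixed
```

The theoretical question studied in the paper is how the marginals on the partially decimated instance evolve along exactly this kind of sequential assignment.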