
    Phase Equilibria of Lattice Polymers from Histogram Reweighting Monte Carlo Simulations

    Histogram-reweighting Monte Carlo simulations were used to obtain polymer/solvent phase diagrams for lattice homopolymers of chain lengths up to r=1000 monomers. The simulation technique was based on performing a series of grand canonical Monte Carlo calculations for a small number of state points and combining the results to obtain the phase behavior of a system over a range of temperatures and densities. Critical parameters were determined from mixed-field finite-size scaling concepts by matching the order-parameter distribution near the critical point to the distribution for the three-dimensional Ising universality class. Calculations for the simple cubic lattice (coordination number z=6) and for a high-coordination-number version of the same lattice (z=26) were performed for chain lengths significantly longer than in previous simulation studies. The critical temperature was found to scale with chain length following the Flory-Huggins functional form. For the z=6 lattice, the extrapolated infinite-chain-length critical temperature is 3.70±0.01, in excellent agreement with previous calculations of the temperature at which the osmotic second virial coefficient is zero and the temperature at which the mean end-to-end distance is proportional to the number of bonds. This confirms that the three alternative definitions of the Theta-temperature are equivalent in the limit of long chains. The critical volume fraction scales with chain length with an exponent equal to 0.38±0.01, in agreement with experimental data but in disagreement with polymer solution theories. The prefactor for the width of the coexistence curve was tentatively found to scale with chain length with an exponent of 0.20±0.03 for z=6 and 0.22±0.03 for z=26. These values are near the lower range of values obtained from experimental data.
    Comment: 23 pages, including 7 figures
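The single-histogram reweighting step underlying this technique can be sketched in a few lines. This is a minimal illustration, not the authors' code; the toy histogram, state points, and function name are invented. Counts H(N, U) collected at chemical potential μ₀ and inverse temperature β₀ are reweighted to a nearby state point (β, μ) via ln P(N, U) ∝ ln H(N, U) − (β − β₀)U + (βμ − β₀μ₀)N.

```python
import math

def reweight(hist, beta0, mu0, beta, mu):
    """Reweight a grand canonical histogram hist[(N, U)] -> counts,
    collected at (beta0, mu0), to a nearby state point (beta, mu).
    Returns a normalized probability distribution over (N, U)."""
    logw = {}
    for (N, U), c in hist.items():
        # ln P(N,U; beta,mu) = ln H(N,U) - (beta-beta0)*U + (beta*mu - beta0*mu0)*N + const
        logw[(N, U)] = math.log(c) - (beta - beta0) * U + (beta * mu - beta0 * mu0) * N
    m = max(logw.values())                      # subtract max for numerical stability
    w = {k: math.exp(v - m) for k, v in logw.items()}
    Z = sum(w.values())
    return {k: v / Z for k, v in w.items()}

# invented toy histogram collected at beta0=1.0, mu0=-2.0
h = {(0, 0.0): 50, (1, -1.0): 30, (2, -1.5): 20}
p = reweight(h, beta0=1.0, mu0=-2.0, beta=1.1, mu=-1.8)
```

Combining several such histograms self-consistently (as the abstract describes) additionally requires matching their free-energy offsets, which this sketch omits.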

    Market Structure, Technology Spillovers, and Persistence in Productivity Differentials

    Using data from 11 main manufacturing industries in 17 OECD countries, this paper empirically investigates the determinants of cross-country differences in the persistence of productivity differentials. Specifically, we focus on the effects of product market structure and technology diffusion. It is found that the manufacturing industries display a wide range of convergence rates. Consistent with theories, the persistence of productivity differentials is found to be positively correlated with the price-cost margin and the intra-industry trade index - the proxies for market monopolistic behavior. The proxies for technology diffusion, however, do not exhibit consistently significant effects. Among the conditioning macro variables, productivity convergence appears to be enhanced by human capital but deterred by government spending.
    Keywords: Total factor productivity, convergence, market structure, technology diffusion
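As an illustrative aside (not a calculation from the paper), the persistence of a productivity differential is often summarized by the half-life implied by an AR(1) convergence process d_t = ρ·d_{t−1} + ε_t:

```python
import math

def half_life(rho):
    """Half-life (in periods) of an AR(1) productivity differential
    d_t = rho * d_{t-1} + eps; requires 0 < rho < 1 for convergence."""
    if not 0.0 < rho < 1.0:
        raise ValueError("rho must lie in (0, 1) for convergence")
    return math.log(0.5) / math.log(rho)

# more persistent differentials (rho closer to 1) take longer to halve
print(half_life(0.9))   # ~6.6 periods
```

A higher price-cost margin being associated with more persistence would correspond, in this toy summary, to a ρ closer to one.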

    Testing for Output Convergence: A Re-Examination

    This paper investigates output convergence for the G7 countries using multivariate time series techniques. We consider both the null hypothesis of no convergence and the null hypothesis of convergence. It is shown that inferences on output convergence depend on which of the two null hypotheses is considered. Further, the no-convergence results reported in previous studies using the time series definition may be attributed to the low power of the test procedures being used. Our results also highlight some potential problems in interpreting results from some typical multivariate unit root and stationarity tests.
    Keywords: Output convergence, multivariate test, unit root test, stationarity test

    A measurement of the τ lepton polarization at the Z resonance with the DELPHI detector at LEP

    The polarization of τ leptons produced in the reaction e⁺e⁻ → τ⁺τ⁻ near the peak of the Z⁰ resonance has been measured using a sample of 24904 τ⁺τ⁻ events, with an estimated background of 1.5%. We have selected 4562 τ → eν̄ν, 2218 τ → πν and 5133 τ → ρν candidates. The mean value obtained is P_τ = −0.176 ± 0.029. This corresponds to a ratio of the neutral-current vector to axial-vector coupling constants of the τ lepton of g_Vτ/g_Aτ = 0.088 ± 0.014. This leads to a value of the electroweak mixing angle of sin²θ_W = 0.2280 ± 0.0036. This result is in good agreement with previous measurements of the weak mixing angle from the study of the Z⁰ lineshape and the forward-backward asymmetries in the processes Z⁰ → l⁺l⁻ and Z⁰ → q̄q.
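As a quick consistency check on the quoted numbers, the tree-level relation for charged leptons, g_V/g_A = 1 − 4 sin²θ_W, reproduces the stated mixing angle (the function name is ours, for illustration):

```python
def sin2_theta_w(gv_over_ga):
    """Tree-level relation for charged leptons: g_V/g_A = 1 - 4 sin^2(theta_W)."""
    return (1.0 - gv_over_ga) / 4.0

ratio, err = 0.088, 0.014            # quoted g_V/g_A and its uncertainty
print(sin2_theta_w(ratio))           # 0.228, matching the quoted 0.2280
print(err / 4.0)                     # ~0.0035, close to the quoted +-0.0036
```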

    Empirical Exchange Rate Models of the Nineties: Are Any Fit to Survive?

    Previous assessments of nominal exchange rate determination have focused upon a narrow set of models typically of the 1970s vintage. The canonical papers in this literature are by Meese and Rogoff (1983, 1988), who examined monetary and portfolio balance models. Succeeding works by Mark (1995) and Chinn and Meese (1995) focused on similar models. In this paper we re-assess exchange rate prediction using a wider set of models that have been proposed in the last decade: interest rate parity, productivity-based models, and "behavioral equilibrium exchange rate" models. The performance of these models is compared against a benchmark model, the Dornbusch-Frankel sticky-price monetary model. The models are estimated in error correction and first-difference specifications. Rather than estimating the cointegrating vector over the entire sample and treating it as part of the ex ante information set, as is commonly done in the literature, we recursively update the cointegrating vector, thereby generating true ex ante forecasts. We examine model performance at various forecast horizons (1 quarter, 4 quarters, 20 quarters) using differing metrics (mean squared error, direction of change), as well as the "consistency" test of Cheung and Chinn (1998). No model consistently outperforms a random walk by a mean squared error measure; however, along a direction-of-change dimension, certain structural models do outperform a random walk with statistical significance. Moreover, one finds that these forecasts are cointegrated with the actual values of exchange rates, although in a large number of cases, the elasticity of the forecasts with respect to the actual values is different from unity. Overall, model/specification/currency combinations that work well in one period will not necessarily work well in another period.
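The two evaluation metrics mentioned can be sketched in a few lines. This is an illustrative toy, not the paper's implementation; the series below are invented, and the convention that a random walk forecasts "no change" means it scores zero on direction of change by construction.

```python
def mse(forecasts, actuals):
    """Mean squared forecast error."""
    return sum((f - a) ** 2 for f, a in zip(forecasts, actuals)) / len(actuals)

def direction_of_change(forecasts, actuals, previous):
    """Fraction of periods in which the forecast correctly predicts the
    sign of the change from the previous actual value."""
    hits = sum(
        1 for f, a, p in zip(forecasts, actuals, previous)
        if (f - p) * (a - p) > 0
    )
    return hits / len(actuals)

# invented toy series: a random-walk benchmark forecasts f == p ("no change")
prev   = [1.00, 1.02, 1.01]
actual = [1.02, 1.01, 1.03]
model  = [1.01, 1.00, 1.02]   # hypothetical structural-model forecasts
print(mse(model, actual), direction_of_change(model, actual, prev))
```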

    What Do We Know about Recent Exchange Rate Models? In-Sample Fit and Out-of-Sample Performance Evaluated

    Previous assessments of nominal exchange rate determination have focused upon a narrow set of models typically of the 1970s vintage, including monetary and portfolio balance models. In this paper we re-assess the in-sample fit and out-of-sample prediction of a wider set of models that have been proposed in the last decade, namely interest rate parity, productivity-based models, and "behavioral equilibrium exchange rate" models. These models are compared against a benchmark model, the Dornbusch-Frankel sticky price monetary model. First, the parameter estimates of the models are compared against the theoretically predicted values. Second, we conduct an extensive out-of-sample forecasting exercise, using the last eight years of data to determine whether our in-sample conclusions hold up. We examine model performance at various forecast horizons (1 quarter, 4 quarters, 20 quarters) using differing metrics (mean squared error, direction of change), as well as the "consistency" test of Cheung and Chinn (1998). We find that no model fits the data particularly well, nor does any model consistently out-predict a random walk, even at long horizons. There is little correspondence between how well a model conforms to theoretical priors and how well the model performs in a prediction context. However, we do confirm previous findings that out-performance of a random walk is more likely at long horizons.
    Keywords: exchange rates, monetary model, productivity, interest rate parity, behavioral equilibrium exchange rate model, forecasting performance

    Race and Health Disparities Among Seniors in Urban Areas in Brazil

    White seniors report better health than Black seniors in urban areas in Sao Paulo, Brazil. This is the case even after controlling for baseline health conditions and several demographic, socio-economic and family support characteristics. Furthermore, adjusted racial disparities in self-reported health are larger than the disparities found using alternative measures of functional health. Our empirical research in this paper suggests that the two most important factors driving racial disparities in health among seniors (in our sample) are historical differences in rural living conditions and current income. Present economic conditions are more relevant to racial disparities among poor seniors than among rich seniors. Moreover, racial differences in health not attributable to observable characteristics are more important when comparing individuals in the upper half of the income distribution.

    Online regenerator placement.

    Connections between nodes in optical networks are realized by lightpaths. Due to the decay of the signal, a regenerator has to be placed on every lightpath after at most d hops, for some given positive integer d. A regenerator can serve only one lightpath. The placement of regenerators has become an active area of research during recent years, and various optimization problems have been studied. The first such problem is the Regenerator Location Problem (RLP), where the goal is to place the regenerators so as to minimize the total number of nodes containing them. We consider two extreme cases of online RLP regarding the value of d and the number k of regenerators that can be used in any single node. (1) d is arbitrary and k is unbounded. In this case a feasible solution always exists. We show an O(log|X| · log d)-competitive randomized algorithm for any network topology, where X is the set of paths of length d. The algorithm can be made deterministic in some cases. We show a deterministic lower bound of Ω((log(|E|/d) · log d) / log(log(|E|/d) · log d)), where E is the edge set. (2) d = 2 and k = 1. In this case there is not necessarily a solution for a given input. We distinguish between feasible inputs (for which there is a solution) and infeasible ones. In the latter case, the objective is to satisfy the maximum number of lightpaths. For a path topology we show a lower bound of √(l/2) for the competitive ratio (where l is the number of internal nodes of the longest lightpath) on infeasible inputs, and a tight bound of 3 for the competitive ratio on feasible inputs.
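The hop-limit constraint itself can be illustrated with a simple greedy placement along a single lightpath. This sketches only the feasibility constraint for one path, not the paper's online algorithms; the function name and node labels are invented.

```python
def place_regenerators(path, d):
    """Given a lightpath as a list of nodes and a hop limit d, return the
    internal nodes where regenerators are placed so that the signal never
    travels more than d hops without regeneration (greedy: every d-th hop)."""
    if d < 1:
        raise ValueError("d must be a positive integer")
    placed = []
    hops = 0
    for node in path[1:-1]:          # regenerators sit at internal nodes only
        hops += 1
        if hops == d:                # signal has decayed: regenerate here
            placed.append(node)
            hops = 0
    return placed

# a 7-hop lightpath with d = 3 needs regenerators after hops 3 and 6
print(place_regenerators(list("ABCDEFGH"), 3))   # ['D', 'G']
```

The online difficulty studied in the paper comes from lightpaths arriving one by one while each regenerator can serve only one of them.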

    Revealing hidden scenes by photon-efficient occlusion-based opportunistic active imaging

    The ability to see around corners, i.e., recover details of a hidden scene from its reflections in the surrounding environment, is of considerable interest in a wide range of applications. However, the diffuse nature of light reflected from typical surfaces leads to mixing of spatial information in the collected light, precluding useful scene reconstruction. Here, we employ a computational imaging technique that opportunistically exploits the presence of occluding objects, which obstruct probe-light propagation in the hidden scene, to undo the mixing and greatly improve scene recovery. Importantly, our technique obviates the need for the ultrafast time-of-flight measurements employed by most previous approaches to hidden-scene imaging. Moreover, it does so in a photon-efficient manner based on an accurate forward model and a computational algorithm that, together, respect the physics of three-bounce light propagation and single-photon detection. Using our methodology, we demonstrate reconstruction of hidden-surface reflectivity patterns in a meter-scale environment from non-time-resolved measurements. Ultimately, our technique represents an instance of a rich and promising new imaging modality with important potential implications for imaging science.
    Comment: Related theory in arXiv:1711.0629
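The occluder's role can be caricatured with a toy linear forward model. This is an illustrative sketch, not the paper's forward model or algorithm: each measurement is a differently shadowed mixture of hidden-scene patches, and when the occluder-dependent mixing matrix is known and well conditioned, the scene follows by solving a linear system. All matrices and values below are invented.

```python
def solve(A, y):
    """Solve the small square linear system A x = y by Gauss-Jordan
    elimination with partial pivoting (toy forward-model inversion)."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# hidden scene x (patch reflectivities); rows of A are the shadow patterns
# the occluder casts for different measurement positions
A = [[1.0, 0.0, 1.0],
     [1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]
x_true = [0.2, 0.8, 0.5]
y = [sum(a * x for a, x in zip(row, x_true)) for row in A]  # measurements
print(solve(A, y))   # recovers x_true up to float noise
```

Without the occluder, the rows of A would be nearly identical (fully mixed light) and the inversion would be hopelessly ill-conditioned, which is the intuition behind the abstract's claim.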

    Racial and Ethnic Disparities in Health in Latin America and the Caribbean

    There is increasing awareness that race and ethnicity play an important role in the poverty and social marginalization of Latin American and Caribbean populations.
    Keywords: Health Care