Main Characteristics of the Hungarian Income Inequality as Shown by the Data of the Income Surveys Carried out by the CSO in the Last Half Century
The study briefly surveys the main characteristics of the income surveys carried out by the Hungarian CSO over the last half century, then examines how household incomes, and especially income inequalities, developed in this period. The changes in income inequality are tracked with several inequality measures. The emphasis is on the Theil inequality measure, because it can be unequivocally decomposed additively into parts representing, on the one hand, the differences in mean income between the various social groups and their weights, and on the other, the average within-group inequalities. The decomposition reveals how and to what extent various personal, household and regional characteristics contribute to income inequality within the population, and how the extent of this contribution changes over time and why. Based on the data of the last two income surveys, the study examines the contribution to inequality not only on the basis of per capita income but also on that of equivalent income. Finally, on the basis of this large body of empirical data, the study makes a few summary statements.
Keywords: income statistics, income inequality, index numbers
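The additive decomposition that makes the Theil index attractive here can be illustrated with a short sketch. The group labels and income figures below are hypothetical, purely for illustration; the identity T = within + between is the general property the abstract describes.

```python
import math

# Hypothetical per-capita incomes grouped by, e.g., region (invented numbers).
groups = {
    "urban": [220.0, 340.0, 410.0, 580.0],
    "rural": [130.0, 170.0, 210.0, 260.0],
}

incomes = [x for xs in groups.values() for x in xs]
n = len(incomes)
mu = sum(incomes) / n

def theil(xs):
    """Theil T index: (1/n) * sum((x/mean) * ln(x/mean))."""
    m = sum(xs) / len(xs)
    return sum((x / m) * math.log(x / m) for x in xs) / len(xs)

total = theil(incomes)

# Additive decomposition: T = sum_g s_g * T_g  +  sum_g s_g * ln(mu_g / mu),
# where s_g is group g's share of total income.
within = between = 0.0
for xs in groups.values():
    mu_g = sum(xs) / len(xs)
    s_g = sum(xs) / sum(incomes)          # income share of the group
    within += s_g * theil(xs)             # average within-group inequality
    between += s_g * math.log(mu_g / mu)  # between-group (mean-difference) part

assert abs(total - (within + between)) < 1e-9
```

The `between` term captures differences in group means and group weights, and `within` the average inequality inside groups, exactly as in the study's decomposition.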
Electronic transport in quasi-one-dimensional arrays of gold nanocrystals
We report on the fabrication and current-voltage (IV) characteristics of very narrow, strip-like arrays of metal nanoparticles. The arrays were formed from gold nanocrystals self-assembled between in-plane electrodes. Local cross-linking of the ligands by exposure to a focused electron beam, followed by removal of the unexposed regions, produced arrays as narrow as four particles wide and sixty particles long with a high degree of structural ordering. Remarkably, even for such quasi-one-dimensional strips, we find nonlinear, power-law IV characteristics similar to those of much wider two-dimensional (2D) arrays. However, in contrast to the robust behavior of 2D arrays, the shape of the IV characteristics is much more sensitive to temperature changes and temperature cycling. Furthermore, at low temperatures we observe pronounced two-level current fluctuations, indicative of discrete rearrangements in the current paths. We associate this behavior with the inherent high sensitivity of single-electron tunneling to the polarization caused by quenched offset charges in the underlying substrate.
Comment: 5 pages, 4 figures
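For context, the power-law IV behavior mentioned above is commonly written as I(V) = I0 (V/V_t - 1)^zeta above a threshold voltage V_t. The sketch below uses assumed, illustrative parameter values (`v_t`, `i0`, `zeta`), not the fits from this work.

```python
# Illustrative sketch (not the authors' fit): above a threshold voltage V_t,
# transport in such arrays is often described by a power law
# I(V) = I0 * (V/V_t - 1)**zeta, with I = 0 below threshold.

def iv_curve(v, v_t=1.0, i0=1e-9, zeta=2.25):
    """Current through the array at bias v (zero below threshold)."""
    if v <= v_t:
        return 0.0
    return i0 * (v / v_t - 1.0) ** zeta

# The curve is strongly nonlinear: doubling the overdrive (V/V_t - 1)
# multiplies the current by 2**zeta.
low, high = iv_curve(1.5), iv_curve(2.0)
```

Such a plot on log-log axes of current versus overdrive gives a straight line of slope `zeta`, which is how the exponent is usually extracted.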
Building on Success: Establishing an Information Literacy Program at Portland State University
Discovering Linear Models of Grid Workload
Despite extensive research focused on enabling QoS for grid users through economic and intelligent resource provisioning, no consensus has emerged on the most promising strategies. On top of intrinsically challenging problems, the complexity and size of the data have so far drastically limited the number of comparative experiments. An alternative to experimenting on real, large, and complex data is to look for well-founded and parsimonious representations. The goal of this paper is to answer a set of preliminary questions, which may help steer the design of such representations along feasible paths: is it possible to exhibit consistent models of the grid workload? If such models do exist, which classes of models are more appropriate, considering both simplicity and descriptive power? How can we actually discover such models? And finally, how can we assess the quality of these models on a statistically rigorous basis? Our main contributions are twofold. First, we found that grid workload models can consistently be discovered from the real data, and that limiting the range of models to piecewise linear time series models is sufficiently powerful. Second, we present a bootstrapping strategy for building more robust models from the limited samples at hand. This study is based on exhaustive information representative of a significant fraction of e-science computing activity in Europe.
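A minimal sketch of the kind of piecewise linear modelling the abstract refers to, assuming a simple two-segment model and a brute-force breakpoint search; the synthetic workload series is invented for illustration.

```python
# Minimal sketch (not the paper's method): fit a two-segment piecewise
# linear model to a workload series by scanning candidate breakpoints
# and doing ordinary least squares on each segment.

def ols(ts, ys):
    """Intercept/slope of the least-squares line, plus residual sum of squares."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    sxx = sum((t - mt) ** 2 for t in ts)
    b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / sxx if sxx else 0.0
    a = my - b * mt
    rss = sum((y - (a + b * t)) ** 2 for t, y in zip(ts, ys))
    return a, b, rss

def fit_piecewise(ts, ys, min_seg=3):
    """Choose the breakpoint that minimizes the total residual error."""
    best = None
    for k in range(min_seg, len(ts) - min_seg + 1):
        _, _, r1 = ols(ts[:k], ys[:k])
        _, _, r2 = ols(ts[k:], ys[k:])
        if best is None or r1 + r2 < best[1]:
            best = (k, r1 + r2)
    return best  # (breakpoint index, total residual sum of squares)

# Synthetic workload: load grows slowly, then jumps and ramps up at t = 10.
ts = list(range(20))
ys = [2.0 + 0.1 * t for t in ts[:10]] + [8.0 + 1.5 * (t - 10) for t in ts[10:]]
k, rss = fit_piecewise(ts, ys)
assert k == 10 and rss < 1e-9
```

Real workload models would need more segments and a penalty for model complexity; the exhaustive breakpoint scan here is just the simplest way to show the idea.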
Percolating through networks of random thresholds: Finite temperature electron tunneling in metal nanocrystal arrays
We investigate how temperature affects transport through large networks of nonlinear conductances with distributed thresholds. In monolayers of weakly coupled gold nanocrystals, quenched charge disorder produces a range of local thresholds for the onset of electron tunneling. Our measurements delineate two regimes separated by a cross-over temperature T*. Up to T*, the nonlinear zero-temperature shape of the current-voltage curves survives, but with a threshold voltage for conduction that decreases linearly with temperature. Above T*, the threshold vanishes and the low-bias conductance increases rapidly with temperature. We develop a model that accounts for these findings and predicts T*.
Comment: 5 pages including 3 figures; replaced 3/30/04: minor changes; final version
Discovering Piecewise Linear Models of Grid Workload
Despite extensive research focused on enabling QoS for grid users through economic and intelligent resource provisioning, no consensus has emerged on the most promising strategies. On top of intrinsically challenging problems, the complexity and size of the data have so far drastically limited the number of comparative experiments. An alternative to experimenting on real, large, and complex data is to look for well-founded and parsimonious representations. This study is based on exhaustive information about the gLite-monitored jobs from the EGEE grid, representative of a significant fraction of e-science computing activity in Europe. Our main contributions are twofold. First, we found that workload models for this grid can consistently be discovered from the real data, and that limiting the range of models to piecewise linear time series models is sufficiently powerful. Second, we present a bootstrapping strategy for building more robust models from the limited samples at hand.
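The bootstrapping idea can be sketched as follows. This is an illustrative resampling of a synthetic series, not the paper's procedure; the model here is a plain linear trend for brevity, and all data are invented.

```python
import random

# Minimal sketch of the bootstrapping idea (illustrative, not the paper's
# procedure): resample the observed series with replacement many times and
# refit the model each time, giving an empirical spread for each parameter.

random.seed(0)

def slope(pairs):
    """Least-squares slope of y against t."""
    n = len(pairs)
    mt = sum(t for t, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxx = sum((t - mt) ** 2 for t, _ in pairs)
    return sum((t - mt) * (y - my) for t, y in pairs) / sxx

# Synthetic series: linear trend (true slope 0.5) plus noise.
data = [(t, 1.0 + 0.5 * t + random.gauss(0, 0.2)) for t in range(30)]

estimates = []
for _ in range(1000):
    resample = [random.choice(data) for _ in data]
    estimates.append(slope(resample))

estimates.sort()
lo, hi = estimates[25], estimates[974]  # ~95% percentile interval
```

The spread `hi - lo` quantifies how much the fitted parameter depends on the particular sample, which is the robustness question the limited grid samples raise.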
A model for the onset of transport in systems with distributed thresholds for conduction
We present a model, supported by simulation, to explain the effect of temperature on the conduction threshold in disordered systems. Arrays with randomly distributed local thresholds for conduction occur in systems ranging from superconductors to metal nanocrystal arrays. Thermal fluctuations provide the energy to overcome some of the local thresholds, effectively erasing them as far as the global conduction threshold for the array is concerned. We augment this thermal-energy reasoning with percolation theory to predict the temperature at which the global threshold reaches zero. We also study the effect of capacitive nearest-neighbor interactions on the effective charging energy. Finally, we present results from Monte Carlo simulations that find the lowest-cost path across an array as a function of temperature. The main result of the paper is the linear decrease of the conduction threshold with increasing temperature, at a rate set by an effective charging energy E_c, which depends on the particle radius and interparticle distance, and by the percolation threshold p_c of the underlying lattice. The predictions of this theory compare well to experiments in one- and two-dimensional systems.
Comment: 14 pages, 10 figures, submitted to PR
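The thermal-erasure argument can be sketched numerically. The toy simulation below uses assumed ingredients (a small square lattice, uniform local thresholds, and a shortest-path search rather than the authors' Monte Carlo method) to show the global threshold, i.e. the lowest-cost left-to-right path, falling monotonically to zero as rising temperature erases local thresholds.

```python
import heapq
import random

# Illustrative sketch (assumed parameters, not the paper's simulation):
# sites on a square lattice carry random local thresholds; thermal energy kT
# effectively erases any threshold below it. The global conduction threshold
# is the minimum summed residual threshold over all left-to-right paths,
# found here with Dijkstra's algorithm.

random.seed(1)
N = 20
v = [[random.random() for _ in range(N)] for _ in range(N)]

def global_threshold(kt):
    cost = lambda r, c: max(0.0, v[r][c] - kt)   # residual local threshold
    dist = [[float("inf")] * N for _ in range(N)]
    pq = [(cost(r, 0), r, 0) for r in range(N)]  # enter anywhere on left edge
    heapq.heapify(pq)
    while pq:
        d, r, c = heapq.heappop(pq)
        if d >= dist[r][c]:
            continue
        dist[r][c] = d
        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= rr < N and 0 <= cc < N and d + cost(rr, cc) < dist[rr][cc]:
                heapq.heappush(pq, (d + cost(rr, cc), rr, cc))
    return min(dist[r][N - 1] for r in range(N))  # exit anywhere on right edge

# Raising kT erases more local thresholds; the global threshold falls
# monotonically and reaches zero once the erased sites span the lattice.
kts = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
thresholds = [global_threshold(kt) for kt in kts]
assert thresholds[0] > 0 and thresholds[-1] == 0.0
```

In this picture the threshold vanishes once the fraction of erased sites reaches the lattice's percolation threshold, which is the link to p_c in the abstract.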
Tracking human skill learning with a hierarchical Bayesian sequence model
Humans can implicitly learn complex perceptuo-motor skills over the course of large numbers of trials. This likely depends on our becoming better able to take advantage of ever richer and temporally deeper predictive relationships in the environment. Here, we offer a novel characterization of this process, fitting a non-parametric, hierarchical Bayesian sequence model to the reaction times of human participants’ responses over ten sessions, each comprising thousands of trials, in a serial reaction time task involving higher-order dependencies. The model, adapted from the domain of language, forgetfully updates trial-by-trial, and seamlessly combines predictive information from shorter and longer windows onto past events, weighing the windows proportionally to their predictive power. As the model implies a posterior over window depths, we were able to determine how, and how many, previous sequence elements influenced individual participants’ internal predictions, and how this changed with practice. Already in the first session, the model showed that participants had begun to rely on two previous elements (i.e., trigrams), thereby successfully adapting to the most prominent higher-order structure in the task. The extent to which local statistical fluctuations in trigram frequency influenced participants’ responses waned over subsequent sessions, as participants forgot the trigrams less and evidenced skilled performance. By the eighth session, a subset of participants shifted their prior further to consider a context deeper than two previous elements. Finally, participants showed resistance to interference and slow forgetting of the old sequence when it was changed in the final sessions. Model parameters for individual participants covaried appropriately with independent measures of working memory and error characteristics. 
In sum, the model offers the first principled account of the adaptive complexity and nuanced dynamics of humans’ internal sequence representations during long-term implicit skill learning.
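A drastically simplified sketch of the modelling idea, not the authors' hierarchical Bayesian model: keep forgetfully decayed counts for context windows of depth 0-2 and mix their predictions, weighting each depth by its recent predictive success. The alphabet, decay rate, and training sequence are invented for illustration.

```python
from collections import defaultdict

DECAY = 0.98                  # forgetting rate (assumed value)
ALPHABET = "ABCD"

counts = [defaultdict(lambda: defaultdict(float)) for _ in range(3)]
weights = [1.0, 1.0, 1.0]     # running estimate of each depth's predictive power

def depth_prob(depth, history, symbol):
    """Add-one-smoothed probability of `symbol` under a window of `depth` symbols."""
    table = counts[depth][history[len(history) - depth:]]
    total = sum(table.values()) + len(ALPHABET)
    return (table.get(symbol, 0.0) + 1.0) / total

def predict(history):
    """Mixture distribution over the next symbol, weighted by predictive power."""
    wsum = sum(weights)
    return {s: sum(weights[d] / wsum * depth_prob(d, history, s)
                   for d in range(3)) for s in ALPHABET}

def observe(history, symbol):
    """Score each depth on the outcome, then update its decayed counts."""
    for d in range(3):
        weights[d] = DECAY * weights[d] + (1 - DECAY) * depth_prob(d, history, symbol)
        table = counts[d][history[len(history) - d:]]
        for s in table:
            table[s] *= DECAY               # forget old observations...
        table[symbol] += 1.0                # ...then count the new one

# "ABAC" repeated: after "A" alone the next symbol is ambiguous (B or C), but
# the two-element context resolves it ("BA" -> C, "CA" -> B) - the kind of
# higher-order structure probed by the serial reaction time task.
seq = "ABAC" * 150
for i in range(2, len(seq)):
    observe(seq[:i], seq[i])

mix = predict(seq[:-1])                  # history ends in "...BA"
assert max(mix, key=mix.get) == "C"      # two-back context correctly favoured
assert weights[2] > weights[0]           # deeper window earned more weight
```

The posterior over window depths in the actual model plays roughly the role of `weights` here: it reveals how many past elements drive each participant's predictions and how that changes with practice.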
Adaptation to recent outcomes attenuates the lasting effect of initial experience on risky decisions
Both initially and recently encountered information has been shown to influence experience-based risky decision making. The primacy effect predicts that initial experience will influence later choices even if outcome probabilities change and reward is ultimately more or less sparse than initially experienced. However, it has not been investigated whether extended initial experience induces a more profound primacy effect on risky choices than brief experience does. Therefore, the present study tested in two experiments whether young adults adjusted their risk-taking behavior in the Balloon Analogue Risk Task after an unsignaled and unexpected change point. The change point separated early “good luck” or “bad luck” trials from subsequent ones. While mostly positive (more reward) or mostly negative (no reward) events characterized the early trials, subsequent trials were unbiased. In Experiment 1, the change point occurred after one-sixth or one-third of the trials (brief vs. extended experience) without interruption, whereas in Experiment 2, it occurred between separate task phases. In Experiment 1, if negative events characterized the early trials, risk-taking behavior increased after the change point as compared with the early trials. Conversely, if positive events characterized the early trials, risk-taking behavior decreased after the change point. Although the adjustment of risk-taking behavior occurred through the integration of recent experiences, the impact of initial experience was observed simultaneously. The length of initial experience did not reliably influence the adjustment of behavior. In Experiment 2, participants became more prone to take risks as the task progressed, indicating that the impact of initial experience could be overcome. Altogether, we suggest that initial beliefs about outcome probabilities can be updated by recent experiences to adapt to the continuously changing decision environment.