Ethnic Identification, Intermarriage, and Unmeasured Progress by Mexican Americans
Using Census and CPS data, we show that U.S.-born Mexican Americans who marry non-
Mexicans are substantially more educated and English proficient, on average, than are
Mexican Americans who marry co-ethnics (whether they be Mexican Americans or Mexican
immigrants). In addition, the non-Mexican spouses of intermarried Mexican Americans
possess relatively high levels of schooling and English proficiency, compared to the spouses
of endogamously married Mexican Americans. The human capital selectivity of Mexican
intermarriage generates corresponding differences in the employment and earnings of
Mexican Americans and their spouses. Moreover, the children of intermarried Mexican
Americans are much less likely to be identified as Mexican than are the children of
endogamous Mexican marriages. These forces combine to produce strong negative
correlations between the education, English proficiency, employment, and earnings of
Mexican-American parents and the chances that their children retain a Mexican ethnicity.
Such findings raise the possibility that selective ethnic "attrition" might bias observed
measures of intergenerational progress for Mexican Americans
Immigration and the U.S. labour market
Over the last several decades, two of the most significant developments in the U.S. labor
market have been: (1) rising inequality, and (2) growth in both the size and the diversity of
immigration flows. Because a large share of new immigrants arrive with very low levels of
schooling, English proficiency, and other skills that have become increasingly important
determinants of success in the U.S. labor market, an obvious concern is that such
immigrants are a poor fit for the restructured American economy. In this chapter, we
evaluate this concern by discussing evidence for the United States on two relevant topics:
the labor market integration of immigrants, and the impact of immigration on the wages and
employment opportunities of native workers. In these dimensions, the overall labor market
performance of U.S. immigrants seems quite favorable. U.S. immigrants have little trouble
finding jobs, and this is particularly true of unskilled immigrants. Most U.S. immigrants
experience substantial earnings growth as they adapt to the American labor market. For
most immigrant groups, the U.S.-born second generation has achieved socioeconomic parity
with mainstream society; for some Hispanic groups, however, this is not the case. On the
whole, immigration to the United States has not had large adverse consequences for the
labor market opportunities of native workers. Therefore, with regard to the economic
integration and labor market impacts of immigration, it is not obvious that the seemingly
haphazard nature of U.S. immigration policy has led to unfavorable outcomes
Lumley's energy cascade dissipation rate model for boundary-free turbulent shear flows
True dissipation occurs mainly at the highest wavenumbers where the eddy sizes are comparatively small. These high wavenumbers receive their energy through the spectral cascade of energy starting with the largest eddies spilling energy into the smaller eddies, passing through each wavenumber until it is dissipated at the microscopic scale. However, a small percentage of the energy does not spill continuously through the cascade but is instantly passed to the higher wavenumbers. Consequently, the smallest eddies receive a certain amount of energy almost immediately. As the spectral energy cascade continues, the highest wavenumber needs a certain time to receive all the energy which has been transferred from the largest eddies. As such, there is a time delay, of the order of tau, between the generation of energy by the largest eddies and the eventual dissipation of this energy. For equilibrium turbulence at high Reynolds numbers, there is a wide range where energy is neither produced by the large eddies nor dissipated by viscosity, but is conserved and passed from wavenumber to higher wavenumbers. The rate at which energy cascades from one wavenumber to another is proportional to the energy contained within that wavenumber. This rate is constant and has been used in the past as a dissipation rate of turbulent kinetic energy. However, this is true only in steady, equilibrium turbulence. Most dissipation models contend that the production of dissipation is proportional to the production of energy and that the destruction of dissipation is proportional to the destruction of energy. In essence, these models state that the change in the dissipation rate is proportional to the change in the kinetic energy. This assumption is obviously incorrect for the case where there is no production of turbulent energy, yet energy continues to cascade from large to small eddies. 
If the time lag between the onset of the energy cascade and the destruction of energy at the microscale can be modeled, then the dissipation process will be better represented. Development of an energy cascade time scale equation is discussed
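The lag between energy production and its eventual dissipation described above can be illustrated with a toy relaxation model in which the dissipation rate chases the spectral transfer rate over a time scale tau. The transfer-rate function, the value of tau, and the forward-Euler integrator are illustrative assumptions, not the model developed in the paper:

```python
# Toy sketch (not the paper's model): dissipation rate eps(t) relaxing
# toward the spectral transfer rate T(t) with a time lag tau, so that
# dissipation persists after production stops, as the text describes.

def relax_dissipation(T, tau, eps0, dt, nsteps):
    """Integrate d(eps)/dt = (T(t) - eps) / tau with forward Euler."""
    eps = eps0
    history = [eps]
    for n in range(nsteps):
        t = n * dt
        eps += dt * (T(t) - eps) / tau
        history.append(eps)
    return history

# Transfer from the large eddies switches off at t = 1: production of
# turbulent energy stops, but energy already in the cascade continues
# to be dissipated, decaying over a time of order tau.
T = lambda t: 1.0 if t < 1.0 else 0.0
hist = relax_dissipation(T, tau=0.5, eps0=0.0, dt=0.01, nsteps=300)
```

A model that ties dissipation directly to instantaneous production would instead drop to zero at t = 1; the lagged form keeps dissipating the energy still in transit through the cascade.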
Heuristic Refinement Method for the Derivation of Protein Solution Structures: Validation on Cytochrome B562
A method is described for determining the family of protein structures compatible with solution data obtained primarily from nuclear magnetic resonance (NMR) spectroscopy. Starting with all possible conformations, the method systematically excludes conformations until the remaining structures are only those compatible with the data. The apparent computational intractability of this approach is reduced by assembling the protein in pieces, by considering the protein at several levels of abstraction, by utilizing constraint satisfaction methods to consider only a few atoms at a time, and by utilizing artificial intelligence methods of heuristic control to decide which actions will exclude the most conformations. Example results are presented for simulated NMR data from the known crystal structure of cytochrome b562 (103 residues). For 10 sample backbones an average root-mean-square deviation from the crystal of 4.1 A was found for all alpha-carbon atoms and 2.8 A for helix alpha-carbons alone. The 10 backbones define the family of all structures compatible with the data and provide nearly correct starting structures for adjustment by any of the current structure determination methods
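The root-mean-square deviations quoted above (4.1 A over all alpha-carbons, 2.8 A over helix alpha-carbons) are instances of the standard RMSD metric, which can be sketched minimally as follows. The coordinates are invented for illustration, and a real comparison would first superpose the two structures (e.g. with the Kabsch algorithm), which this sketch omits:

```python
import math

# Minimal sketch of the RMSD metric used in the abstract: the
# root-mean-square deviation between matched alpha-carbon positions,
# assumed here to already be in a common reference frame.

def rmsd(coords_a, coords_b):
    """RMSD between two equal-length lists of (x, y, z) points."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Made-up three-residue backbone, offset uniformly by 1 A along z.
model   = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
crystal = [(0.0, 0.0, 1.0), (1.5, 0.0, 1.0), (3.0, 0.0, 1.0)]
print(rmsd(model, crystal))  # uniform 1 A offset -> RMSD = 1.0
```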
Using shared goal setting to improve access and equity: a mixed methods study of the Good Goals intervention
Background: Access and equity in children's therapy services may be improved by directing clinicians' use of resources toward specific goals that are important to patients. A practice-change intervention (titled 'Good Goals') was designed to achieve this. This study investigated uptake, adoption, and possible effects of that intervention in children's occupational therapy services.
Methods: Mixed methods case studies (n = 3 services, including 46 therapists and 558 children) were conducted. The intervention was delivered over 25 weeks through face-to-face training, team workbooks, and 'tools for change'. Data were collected before, during, and after the intervention on a range of factors using interviews, a focus group, case note analysis, routine data, document analysis, and researchers' observations.
Results: Factors related to uptake and adoption were: mode of intervention delivery, competing demands on therapists' time, and leadership by the service manager. Service managers and therapists reported that the intervention: helped therapists establish a shared rationale for clinical decisions; increased clarity in service provision; and improved interactions with families and schools. During the study period, therapists' behaviours changed: identifying goals, odds ratio 2.4 (95% CI 1.5 to 3.8); agreeing goals, 3.5 (2.4 to 5.1); evaluating progress, 2.0 (1.1 to 3.5). Children's LoT decreased by two months [95% CI −8 to +4 months] across the services. Cost per therapist trained ranged from £1,003 to £1,277, depending upon service size and therapists' salary bands.
Conclusions: Good Goals is a promising quality improvement intervention that can be delivered and adopted in practice and may have benefits. Further research is required to evaluate (i) its impact on patient outcomes, effectiveness, and cost-effectiveness, and (ii) its transferability to other clinical contexts
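The odds ratios with 95% confidence intervals reported in the Results are the standard measure for before/after changes in a binary behaviour, computed from a 2x2 table with the Woolf log-odds method. The counts below are hypothetical illustrations, not the study's data:

```python
import math

# Odds ratio from a 2x2 table with a Woolf-method confidence interval
# on the log-odds scale. Counts are hypothetical, for illustration only:
# a, b = behaviour present/absent after the intervention;
# c, d = behaviour present/absent before the intervention.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (odds ratio, CI lower bound, CI upper bound)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# e.g. goals agreed in 120 of 200 case notes after vs 60 of 200 before.
or_, lo, hi = odds_ratio_ci(120, 80, 60, 140)
print(f"OR = {or_:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```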
Quantum picturalism for topological cluster-state computing
Topological quantum computing is a way of allowing precise quantum
computations to run on noisy and imperfect hardware. One implementation uses
surface codes created by forming defects in a highly-entangled cluster state.
Such a method of computing is a leading candidate for large-scale quantum
computing. However, there has been a lack of sufficiently powerful high-level
languages to describe computing in this form without resorting to single-qubit
operations, which quickly become prohibitively complex as the system size
increases. In this paper we apply the category-theoretic work of Abramsky and
Coecke to the topological cluster-state model of quantum computing to give a
high-level graphical language that enables direct translation between quantum
processes and physical patterns of measurement in a computer - a "compiler
language". We give the equivalence between the graphical and topological
information flows, and show the applicable rewrite algebra for this computing
model. We show that this gives us a native graphical language for the design
and analysis of topological quantum algorithms, and finish by discussing the
possibilities for automating this process on a large scale.
Comment: 18 pages, 21 figures. Published in New J. Phys. special issue on topological quantum computing
A Damping of the de Haas-van Alphen Oscillations in the superconducting state
Deploying a recently developed semiclassical theory of quasiparticles in the
superconducting state we study the de Haas-van Alphen effect. We find that the
oscillations have the same frequency as in the normal state but their amplitude
is reduced. We find an analytic formula for this damping, which is due to
tunnelling between semiclassical quasiparticle orbits comprising both
particle-like and hole-like segments. The quantitative predictions of the
theory are consistent with the available data.
Comment: 7 pages, 5 figures
Quantum Picturalism
The quantum mechanical formalism doesn't support our intuition, nor does it
elucidate the key concepts that govern the behaviour of the entities that are
subject to the laws of quantum physics. The arrays of complex numbers are kin
to the arrays of 0s and 1s of the early days of computer programming practice.
In this review we present steps towards a diagrammatic `high-level' alternative
for the Hilbert space formalism, one which appeals to our intuition. It allows
for intuitive reasoning about interacting quantum systems, and trivialises many
otherwise involved and tedious computations. It clearly exposes limitations
such as the no-cloning theorem, and phenomena such as quantum teleportation. As
a logic, it supports `automation'. It allows for a wider variety of underlying
theories, and can be easily modified, having the potential to provide the
required step-stone towards a deeper conceptual understanding of quantum
theory, as well as its unification with other physical theories. Specific
applications discussed here are purely diagrammatic proofs of several quantum
computational schemes, as well as an analysis of the structural origin of
quantum non-locality. The underlying mathematical foundation of this high-level
diagrammatic formalism relies on so-called monoidal categories, a product of a
fairly recent development in mathematics. These monoidal categories do not only
provide a natural foundation for physical theories, but also for proof theory,
logic, programming languages, biology, cooking, ... The challenge is to
discover the necessary additional pieces of structure that allow us to predict
genuine quantum phenomena.
Comment: Commissioned paper for Contemporary Physics, 31 pages, 84 pictures, some in colour
Density-and trait-mediated effects of a parasite and a predator in a tri-trophic food web
1. Despite growing interest in ecological consequences of parasitism in food webs, relatively little is known about effects of parasites on long-term population dynamics of non-host species or about whether such effects are density- or trait- mediated.
2. We studied a tri-trophic food chain comprised of: (i) a bacterial basal resource (Serratia fonticola), (ii) an intermediate consumer (Paramecium caudatum), (iii) a top predator (Didinium nasutum), and (iv) a parasite of the intermediate consumer (Holospora undulata). A fully-factorial experimental manipulation of predator and parasite presence/absence was combined with analyses of population dynamics, modelling, and analyses of host (Paramecium) morphology and behavior.
3. Predation and parasitism each reduced the abundance of the intermediate consumer (Paramecium), and parasitism indirectly reduced the abundance of the basal resource (Serratia). However, in combination, predation and parasitism had non-additive effects on the abundance of the intermediate consumer, as well as on that of the basal resource. In both cases, the negative effect of parasitism seemed to be effaced by predation.
4. Infection of the intermediate consumer reduced predator abundance. Modelling and additional experimentation revealed that this was most likely due to parasite reduction of intermediate host abundance (a density-mediated effect), as opposed to changes in predator functional or numerical response.
5. Parasitism altered morphological and behavioural traits, by reducing host cell length and increasing the swimming speed of cells with moderate parasite loads. Additional tests showed no significant difference in Didinium feeding rate on infected and uninfected hosts, suggesting that the combination of these modifications does not affect host vulnerability to predation. However, estimated rates of encounter with Serratia based on these modifications were higher for infected Paramecium than for uninfected Paramecium.
6. A mixture of density-mediated and trait-mediated indirect effects of parasitism on non- host species creates rich and complex possibilities for effects of parasites in food webs that should be included in assessments of possible impacts of parasite eradication or introduction
Drawing bobbin lace graphs, or, Fundamental cycles for a subclass of periodic graphs
In this paper, we study a class of graph drawings that arise from bobbin lace
patterns. The drawings are periodic and require a combinatorial embedding with
specific properties which we outline and demonstrate can be verified in linear
time. In addition, a lace graph drawing has a topological requirement: it
contains a set of non-contractible directed cycles which must be homotopic to
(1, 0), that is, when drawn on a torus, each cycle wraps once around the minor
meridian axis and zero times around the major longitude axis. We provide an
algorithm for finding the two fundamental cycles of a canonical rectangular
schema in a supergraph that enforces this topological constraint. The polygonal
schema is then used to produce a straight-line drawing of the lace graph inside
a rectangular frame. We argue that such a polygonal schema always exists for
combinatorial embeddings satisfying the conditions of bobbin lace patterns, and
that we can therefore create a pattern, given a graph with a fixed
combinatorial embedding of genus one.
Comment: Appears in the Proceedings of the 25th International Symposium on Graph Drawing and Network Visualization (GD 2017)
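The homotopy condition in the abstract (each required cycle wraps once around the meridian and zero times around the longitude) can be checked concretely if each directed edge of the periodic drawing carries a wrap vector recording how it crosses the fundamental domain boundary. This representation and the edge data below are illustrative assumptions, not the paper's algorithm:

```python
# Sketch of the torus winding check: the homotopy class of a directed
# cycle on the torus is the componentwise sum of the wrap vectors of
# its edges, where each wrap vector (m, l) counts signed crossings of
# the meridian and longitude boundaries of the fundamental domain.
# The edge data below is made up, not a real bobbin lace pattern.

def winding(cycle_wraps):
    """Sum (meridian, longitude) wrap vectors along a directed cycle."""
    m = sum(w[0] for w in cycle_wraps)
    l = sum(w[1] for w in cycle_wraps)
    return (m, l)

# Four edges whose crossings of the longitude boundary cancel and whose
# meridian crossings sum to one: the required homotopy class (1, 0).
cycle = [(0, 0), (1, 1), (0, -1), (0, 0)]
print(winding(cycle))  # -> (1, 0)
```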