Social inequalities in early childhood competences, and the relative role of social and emotional versus cognitive skills in predicting adult outcomes
This study draws on the nationally representative British Birth Cohort Study (BCS70) to examine (1) the association between social background and early socio-emotional and cognitive competences at age 5 and (2) the relative and independent contributions of early socio-emotional and cognitive competences to educational and socio-economic attainment in adulthood. A multi-dimensional (multiple exposure, multiple outcome) approach is adopted in conceptualising social background, childhood competences and adult outcomes by age 42. Indicators of social background include parental education, social class, employment status, family income, and home ownership, enabling us to test which aspects of socio-economic risk uniquely influence the development of early competences. Indicators of childhood competences include directly assessed cognitive competences (i.e. verbal and visual motor skills), while measures of socio-emotional competences include hyperactivity, good conduct, emotional health and social skills, reported by the child's mother at age 5. Adult outcomes include highest qualifications, social class and household income by age 42. The findings suggest that multiple indicators of social background are associated with both socio-emotional and cognitive competences, although the associations with socio-emotional competences are weaker than those with cognitive competences. We find significant long-term predictive effects of early cognitive skills on adult outcomes, but also independent effects of socio-emotional competences, in particular self-regulation, over and above the role of family background. The study supports calls for early interventions aimed at reducing family socio-economic risk exposure and at supporting the development of cognitive skills and self-regulation (i.e. reducing hyperactivity and conduct problems).
The Impact of Devegetated Dune Fields on North American Climate During the Late Medieval Climate Anomaly
During the Medieval Climate Anomaly, North America experienced severe droughts and widespread mobilization of dune fields that persisted for decades. We use an atmosphere general circulation model, forced by a tropical Pacific sea surface temperature reconstruction and changes in the land surface consistent with estimates of dune mobilization (conceptualized as partial devegetation), to investigate whether the devegetation could have exacerbated the medieval droughts. The presence of devegetated dunes in the model significantly increases surface temperatures, but has little impact on precipitation or drought severity, as defined by either the Palmer Drought Severity Index or the ratio of precipitation to potential evapotranspiration. Results are similar to recent studies of the 1930s Dust Bowl drought, suggesting that bare soil associated with the dunes, in and of itself, is not sufficient to amplify droughts over North America.
Pan-Continental Droughts in North America over the Last Millennium
Regional droughts are common in North America, but pan-continental droughts extending across multiple regions, including the 2012 event, are rare relative to single-region events. Here, the tree-ring-derived North American Drought Atlas is used to investigate drought variability in four regions over the last millennium, focusing on pan-continental droughts. During the Medieval Climate Anomaly (MCA), the central plains (CP), Southwest (SW), and Southeast (SE) regions experienced drier conditions and increased occurrence of droughts, and the Northwest (NW) experienced several extended pluvials. Enhanced MCA aridity in the SW and CP manifested as multidecadal megadroughts. Notably, megadroughts in these regions differed in their timing and persistence, suggesting that they represent regional events influenced by local dynamics rather than a unified, continental-scale phenomenon. There is no trend in pan-continental drought occurrence, defined as synchronous droughts in three or more regions. SW, CP, and SE (SW+CP+SE) droughts are the most common, occurring in 12 percent of all years and peaking in prevalence during the twelfth and thirteenth centuries; other three-region patterns occur in about 8 percent of years. Positive values of the Southern Oscillation index (La Niña conditions) are linked to SW, CP, and SE (SW+CP+SE) droughts and SW, CP, and NW (SW+CP+NW) droughts, whereas CP, NW, and SE (CP+NW+SE) droughts are associated with positive values of the Pacific decadal oscillation and Atlantic multidecadal oscillation. While relatively rare, pan-continental droughts are present in the paleo record and are linked to defined modes of climate variability, implying the potential for seasonal predictability. Assuming stable drought teleconnections, these events will remain an important feature of future North American hydroclimate, possibly increasing in their severity in step with other expected hydroclimate responses to increased greenhouse gas forcing.
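As a rough illustration of the synchronous-drought counting described above, the sketch below flags years in which three or more of four regional series are in drought at once. The random indicator series are invented stand-ins for thresholded regional-mean PDSI from the North American Drought Atlas.

```python
import numpy as np

# Hypothetical drought indicator series for the four regions (True = drought
# year). Real analyses would threshold regional-mean PDSI; random data with a
# 30% drought probability is used here purely for illustration.
rng = np.random.default_rng(0)
regions = {name: rng.random(1000) < 0.3 for name in ("CP", "SW", "SE", "NW")}

# A "pan-continental" drought year: drought in three or more regions at once.
stacked = np.column_stack(list(regions.values()))
pan = stacked.sum(axis=1) >= 3
print(f"pan-continental drought years: {pan.mean():.1%}")
```

With independent regions the expected rate is only about 8 percent, which is why the persistence of such patterns in the paleo record points toward shared large-scale forcing.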
Bridging Past and Future Climate across Paleoclimatic Reconstructions, Observations, and Models: A Hydroclimate Case Study
Potential biases in tree-ring reconstructed Palmer drought severity index (PDSI) are evaluated using Thornthwaite (TH), Penman–Monteith (PM), and self-calibrating Penman–Monteith (SC) PDSI in three diverse regions of the United States and tree-ring chronologies from the North American drought atlas (NADA). Minimal differences are found between the three PDSI reconstructions, and all compare favorably to independently reconstructed Thornthwaite-based PDSI from the NADA. Reconstructions are bridged with model-derived PDSI_TH and PDSI_PM, which both closely track modeled soil moisture (near surface and full column) during the twentieth century. Differences between modeled moisture-balance metrics only emerge in twenty-first-century projections. These differences confirm the tendency of PDSI_TH to overestimate drying when temperatures exceed the range of the normalization interval; the more physical accounting of PDSI_PM compares well with modeled soil moisture in the projection interval. Remaining regional differences in the secular behavior of projected soil moisture and PDSI_PM are interpreted in terms of underlying physical processes and temporal sampling. Results demonstrate the continued utility of PDSI as a metric of surface moisture balance while additionally providing two recommendations for future work: 1) PDSI_PM (or a similar moisture-balance metric) compares well with modeled soil moisture and is an appropriate means of representing soil-moisture balance in model simulations, and 2) although PDSI_PM is more physically appropriate than PDSI_TH, the latter metric does not bias tree-ring reconstructions of past hydroclimate variability and, as such, reconstructions targeting PDSI_TH can be used with confidence in data–model comparisons. These recommendations and the collective results of this study thus provide a framework for comparing hydroclimate variability within paleoclimatic, observational, and modeled data.
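For readers unfamiliar with the Thornthwaite formulation underlying PDSI_TH, a minimal sketch of the unadjusted monthly potential-evapotranspiration (PET) calculation follows. The monthly temperatures are invented, and the usual day-length correction is omitted for brevity; the temperature-only dependence is exactly why PDSI_TH can overestimate drying under strong warming.

```python
import numpy as np

def thornthwaite_pet(monthly_temp_c):
    """Unadjusted Thornthwaite potential evapotranspiration (mm/month).

    monthly_temp_c: 12 mean monthly temperatures in deg C. Months with
    T <= 0 contribute zero PET; the day-length correction is omitted.
    """
    t = np.clip(np.asarray(monthly_temp_c, dtype=float), 0.0, None)
    heat_index = np.sum((t / 5.0) ** 1.514)          # annual heat index I
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)          # empirical exponent
    return 16.0 * (10.0 * t / heat_index) ** a

# Invented mid-latitude climatology, January through December.
pet = thornthwaite_pet([-2, 0, 4, 9, 14, 19, 22, 21, 16, 10, 4, 0])
```

Because PET here grows as a power of temperature alone, warming outside the calibration range inflates the moisture demand regardless of available energy, in contrast to the fuller Penman–Monteith accounting.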
Lattice QCD study of a five-quark hadronic molecule
We compute the ground-state energies of a heavy-light K-Lambda-like system as a function of the relative distance r of the hadrons. The heavy quarks, one in each hadron, are treated as static. The energies then give rise to an adiabatic potential Va(r), which we use to study the structure of the five-quark system. The simulation is based on an anisotropic and asymmetric lattice with Wilson fermions. Energies are extracted from spectral density functions obtained with the maximum entropy method. Our results are meant to give qualitative insight: using the resulting adiabatic potential in a Schrödinger equation produces bound-state wave functions which indicate that the ground state of the five-quark system resembles a hadronic molecule, whereas the first excited state, having a very small rms radius, is probably better described as a five-quark cluster, or a pentaquark. We hypothesize that an all-light-quark pentaquark may not exist, but in the heavy-quark sector it might, albeit only as an excited state.
Comment: 11 pages, 15 figures, 4 tables
Forced and unforced variability of twentieth century North American droughts and pluvials
Research on the forcing of drought and pluvial events over North America is dominated by general circulation model experiments that often have operational limitations (e.g., computational expense or limited ability to simulate relevant processes). We use a statistically based modeling approach to investigate sea surface temperature (SST) forcing of the twentieth century pluvial (1905-1917) and drought (1932-1939, 1948-1957, 1998-2002) events. A principal component (PC) analysis of Palmer Drought Severity Index (PDSI) from the North American Drought Atlas separates the drought variability into five leading modes accounting for 62% of the underlying variance. Over the full period spanning these events (1900-2005), the first three PCs significantly correlate with SSTs in the equatorial Pacific (PC 1), North Pacific (PC 2), and North Atlantic (PC 3), with spatial patterns (as defined by the empirical orthogonal functions) consistent with our understanding of North American drought responses to SST forcing. We use a large-ensemble statistical modeling approach to determine how successfully we can reproduce these drought/pluvial events using these three modes of variability. Using Pacific forcing only (PCs 1-2), we are able to reproduce the 1948-1957 drought and 1905-1917 pluvial above a 95% random noise threshold in over 90% of the ensemble members; the addition of Atlantic forcing (PCs 1-2-3) provides only marginal improvement. For the 1998-2002 drought, Pacific forcing reproduces the drought above noise in over 65% of the ensemble members, with the addition of Atlantic forcing increasing the number passing to over 80%. The severity of the drought, however, is underestimated in the ensemble median, suggesting that this drought's intensity can only be achieved through internal variability or other processes.
Pacific-only forcing does a poor job of reproducing the 1932-1939 drought pattern in the ensemble median, and less than one third of ensemble members exceed the noise threshold (28%). Inclusion of Atlantic forcing improves the ensemble median drought pattern and nearly doubles the number of ensemble members passing the noise threshold (52%). Even with the inclusion of Atlantic forcing, the intensity of the simulated 1932-1939 drought is muted, and the drought itself extends too far into the southwest and southern Great Plains. To an even greater extent than the 1998-2002 drought, these results suggest much of the variance in the 1932-1939 drought is dependent on processes other than SST forcing. This study highlights the importance of internal noise and non-SST processes for hydroclimatic variability over North America, complementing existing research using general circulation models.
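The decomposition step described above can be sketched as follows. A random space-time field stands in for the gridded PDSI from the North American Drought Atlas, and the PC/EOF split is the usual singular value decomposition of the anomaly matrix.

```python
import numpy as np

# Illustrative PC/EOF decomposition of a synthetic space-time drought field;
# the study uses gridded PDSI (1900-2005), which is not reproduced here.
rng = np.random.default_rng(1)
years, gridpoints = 106, 300
field = rng.standard_normal((years, gridpoints))

anom = field - field.mean(axis=0)                # remove the time mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)
pcs = u * s                                      # principal components (time series)
eofs = vt                                        # spatial patterns (EOFs)
explained = s**2 / np.sum(s**2)                  # fraction of variance per mode
print(f"variance in the first five modes: {explained[:5].sum():.1%}")
```

A large statistical ensemble like the paper's can then be built by regressing the leading PCs on SST indices and adding resampled noise, though that step is omitted here.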
N-tree approximation for the largest Lyapunov exponent of a coupled-map lattice
The N-tree approximation scheme, introduced in the context of random directed polymers, is applied here to the computation of the maximum Lyapunov exponent in a coupled map lattice. We discuss both an exact implementation for small tree depths and a numerical implementation for larger depths. We find that the phase transition predicted by the mean-field approach shifts towards larger values of the coupling parameter when the depth is increased. We conjecture that the transition eventually disappears.
Comment: RevTeX, 15 pages, 5 figures
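For context, the largest Lyapunov exponent of a coupled map lattice is commonly estimated directly by evolving a tangent vector alongside the trajectory and renormalizing it at each step. The sketch below does this for a diffusively coupled logistic-map lattice; the parameters are illustrative, not those of the paper.

```python
import numpy as np

# Largest Lyapunov exponent of a diffusively coupled logistic-map lattice,
# x_i -> (1-eps) f(x_i) + (eps/2) (f(x_{i-1}) + f(x_{i+1})), f(x) = a x (1-x),
# estimated by evolving a tangent vector v and renormalizing each step.
L, eps, a, steps = 64, 0.1, 4.0, 2000
rng = np.random.default_rng(2)
x = rng.random(L)
v = rng.standard_normal(L)

def step(x):
    fx = a * x * (1 - x)
    return (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

def jacobian_apply(x, v):
    # The coupling is linear in f, so the Jacobian couples f'(x_i) v_i.
    dfx = a * (1 - 2 * x) * v
    return (1 - eps) * dfx + 0.5 * eps * (np.roll(dfx, 1) + np.roll(dfx, -1))

lyap = 0.0
for _ in range(steps):
    v = jacobian_apply(x, v)      # evolve tangent vector at the current state
    x = step(x)
    norm = np.linalg.norm(v)
    lyap += np.log(norm)
    v /= norm
lyap /= steps
print(f"largest Lyapunov exponent ~ {lyap:.3f}")
```

The N-tree scheme of the paper is an analytical approximation to this same quantity; the brute-force estimate above is the benchmark such approximations are judged against.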
On Automated Lemma Generation for Separation Logic with Inductive Definitions
Separation Logic with inductive definitions is a well-known approach for deductive verification of programs that manipulate dynamic data structures. Deciding verification conditions in this context is usually based on user-provided lemmas relating the inductive definitions. We propose a novel approach for generating these lemmas automatically, based on simple syntactic criteria and deterministic strategies for applying them. Our approach focuses on iterative programs, although it can be applied to recursive programs as well, and on specifications that describe not only the shape of the data structures but also their content or their size. Empirically, we find that our approach is powerful enough to deal with sophisticated benchmarks, e.g., iterative procedures for searching, inserting, or deleting elements in sorted lists, binary search trees, red-black trees, and AVL trees, in a very efficient way.
A Proper Motion Survey for White Dwarfs with the Wide Field Planetary Camera 2
We have performed a search for halo white dwarfs as high proper motion objects in a second-epoch WFPC2 image of the Groth-Westphal strip. We identify 24 high proper motion objects with mu > 0.014 ''/yr. Five of these are identified as strong white dwarf candidates on the basis of their position in a reduced proper motion diagram. We create a model of the Milky Way thin disk, thick disk, and stellar halo and find that this sample of white dwarfs is clearly an excess above the < 2 detections expected from these known stellar populations. The origin of the excess signal is less clear. Possibly, the excess cannot be explained without invoking a fourth Galactic component: a white dwarf dark halo. We present a statistical separation of our sample into the four components and estimate the corresponding local white dwarf densities using only the directly observable variables V, V-I, and mu. For all Galactic models explored, our sample separates into about 3 disk white dwarfs and 2 halo white dwarfs. However, the further subdivision into the thin and thick disk and the stellar and dark halo, and the subsequent calculation of the local densities, are sensitive to the input parameters of our model for each Galactic component. Using the lowest-mean-mass model for the dark halo, we find a 7% white dwarf halo and six times the canonical value for the thin disk white dwarf density (at marginal statistical significance), but possible systematic errors due to uncertainty in the model parameters likely dominate these statistical error bars. The white dwarf halo can be reduced to around 1.5% of the halo dark matter by changing the initial mass function slightly. The local thin disk white dwarf density in our solution can be made consistent with the canonical value by assuming a larger thin disk scale height of 500 pc.
Comment: revised version, accepted by ApJ; results unchanged, discussion expanded
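The reduced proper motion used for the candidate selection above is the standard distance-free quantity H = m + 5 log10(mu) + 5, with mu in arcsec/yr, which acts as a proxy for absolute magnitude and separates nearby high-velocity white dwarfs from distant luminous stars. A minimal sketch, with an invented example object near the survey's mu cut:

```python
import math

# Reduced proper motion H = m + 5 log10(mu) + 5 (mu in arcsec/yr).
# High-tangential-velocity, low-luminosity objects such as halo white
# dwarfs land at large H for their color in an H vs. V-I diagram.
def reduced_proper_motion(apparent_mag, mu_arcsec_per_yr):
    return apparent_mag + 5.0 * math.log10(mu_arcsec_per_yr) + 5.0

# Hypothetical faint object moving just above the mu > 0.014 ''/yr cut.
h = reduced_proper_motion(24.0, 0.02)
```

Note that H mixes luminosity and tangential velocity, which is why the abstract's statistical separation into disk and halo components still requires population models for V, V-I, and mu.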
Unprecedented 21st century drought risk in the American Southwest and Central Plains
In the Southwest and Central Plains of Western North America, climate change is expected to increase drought severity in the coming decades. These regions nevertheless experienced extended Medieval-era droughts that were more persistent than any historical event, providing crucial targets in the paleoclimate record for benchmarking the severity of future drought risks. We use an empirical drought reconstruction and three soil moisture metrics from 17 state-of-the-art general circulation models to show that these models project significantly drier conditions in the latter half of the 21st century compared to the 20th century and earlier paleoclimatic intervals. This desiccation is consistent across most of the models and moisture balance variables, indicating a coherent and robust drying response to warming despite the diversity of models and metrics analyzed. Notably, future drought risk will likely exceed even the driest centuries of the Medieval Climate Anomaly (1100–1300 CE) in both moderate (RCP 4.5) and high (RCP 8.5) future emissions scenarios, implying drought conditions unprecedented in the last millennium.