1,868 research outputs found
Lexical-processing efficiency leverages novel word learning in infants and toddlers
Children who rapidly recognize and interpret familiar words typically have accelerated lexical growth, providing indirect evidence that lexical processing efficiency (LPE) is related to word-learning ability. Here we directly tested whether children with better LPE are better able to learn novel words. In Experiment 1, 17- and 30-month-olds were tested on an LPE task and on a simple word-learning task. The 17-month-olds' LPE scores predicted word learning in a regression model, and only those with relatively good LPE showed evidence of learning. The 30-month-olds learned novel words quite well regardless of LPE, but in a more difficult word-learning task (Experiment 2), their LPE predicted word-learning ability. These findings suggest that LPE supports word-learning processes, especially when learning is difficult.
Lattice-gas Monte Carlo study of adsorption in pores
A lattice gas model of adsorption inside cylindrical pores is evaluated with
Monte Carlo simulations. The model incorporates two kinds of site: a line of
"axial" sites and the surrounding "cylindrical shell" sites, in the ratio 1:7.
The adsorption isotherms are calculated in either the grand canonical or the
canonical ensemble. At low temperature, quasi-transitions occur that would be
genuine thermodynamic transitions in mean-field theory. A comparison between
the exact and mean-field results for the heat capacity and adsorption
isotherms is provided.
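As a rough illustration of the simulation method, here is a minimal grand-canonical Metropolis sketch for a lattice gas with one axial and seven shell sites per ring. The site connectivity and all numerical parameters are assumptions chosen for illustration, not the paper's model details:

```python
import math
import random

random.seed(0)

# Illustrative parameters (not the paper's): shell sites sit nearer the
# pore wall, so their binding well is assumed deeper than the axial one.
L = 50                         # rings along the pore axis (periodic)
EPS = 1.0                      # nearest-neighbour attraction
E_AXIAL, E_SHELL = -1.0, -3.0  # site binding energies
T, MU = 0.8, -4.0              # temperature and chemical potential
BETA = 1.0 / T

# occ[r][s]: ring r, site s; s == 0 is the axial site, s in 1..7 the shell
occ = [[0] * 8 for _ in range(L)]

def neighbours(r, s):
    """Assumed connectivity: the axial site touches its axial neighbours
    and its ring's shell; a shell site touches its two ring neighbours,
    the axial site, and the same shell site in adjacent rings."""
    if s == 0:
        yield (r - 1) % L, 0
        yield (r + 1) % L, 0
        for k in range(1, 8):
            yield r, k
    else:
        yield r, 1 + (s % 7)          # next site around the shell
        yield r, 1 + ((s - 2) % 7)    # previous site around the shell
        yield (r - 1) % L, s
        yield (r + 1) % L, s
        yield r, 0

def delta_e(r, s):
    """Energy change for flipping the occupancy of site (r, s)."""
    bonds = sum(occ[rr][ss] for rr, ss in neighbours(r, s))
    e_occupied = (E_AXIAL if s == 0 else E_SHELL) - EPS * bonds
    return e_occupied if occ[r][s] == 0 else -e_occupied

for _ in range(200_000):   # grand-canonical moves: insert/remove a particle
    r, s = random.randrange(L), random.randrange(8)
    dn = 1 if occ[r][s] == 0 else -1
    log_acc = -BETA * (delta_e(r, s) - MU * dn)
    if log_acc >= 0 or random.random() < math.exp(log_acc):
        occ[r][s] ^= 1

filling = sum(map(sum, occ)) / (8 * L)
print(f"mean filling fraction at mu = {MU}: {filling:.3f}")
```

Sweeping MU and recording the filling fraction traces out one isotherm; repeating at several temperatures is what exposes the low-temperature quasi-transitions.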
The link between catchment precipitation forecast skill and spread to that of downstream ensemble hydrological forecasts
Operational rainfall and flood forecasting systems across the world are increasingly using ensemble approaches.
Such systems are operated by the Flood Forecasting Centre (FFC) and Scottish Flood Forecasting Service (SFFS)
across Great Britain, producing ensemble gridded hydrological forecasts for the next 5-6 days. In order to maximise
the practical day-to-day use of these systems for decision-making and warning, duty hydro-meteorologists require
a sound understanding of both the meteorological and hydrological ensemble forecast skill. In this work, a
common verification framework is defined and used in order to understand the relative levels of skill in both
rainfall and river flow forecasting systems.
A blended 24-member ensemble precipitation forecast, produced by the Met Office, is used to drive the
operational distributed hydrological model in ensemble mode. The hydrological forecasts provide output every
15 minutes out to 6 days on a 1km grid. The blended rainfall forecast is a mixture of the 2.2 km MOGREPS-UK
ensemble up to 36h and the 32 km global MOGREPS-G ensemble at longer lead-times. The forecasts are
interpolated on to a common 2 km grid and the hydrological model used is the Grid-to-Grid model (G2G)
developed by the Centre for Ecology & Hydrology. To establish an upper bound on skill, assessments over a daily
lead-time interval are studied first, and will be the focus here. Spatial and regional variations in forecast skill are
compared between the precipitation (e.g. daily accumulations) and the river flow forecasts. Also of interest is the
impact of catchment size and how to pool and present the skill metrics in a meaningful way for end-users. For
precipitation, the impact of observation type is assessed by comparing gridded gauge-only analyses with a radar-derived (gauge-calibrated)
precipitation product, quantifying the uncertainty that comes from the observations. Of particular
interest is understanding how the spread in the precipitation forecast is modulated by the downstream hydrological
model. Is it inflated, does it remain comparable, or is it reduced? The work aims to establish the basis for a
real-time monitoring tool that can assist hydro-meteorologists in their interpretation of operational ensemble
forecasts, and facilitate the associated decision-making processes.
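A toy illustration of the kind of common verification framework described above, using ensemble spread and the standard sample-CRPS estimator as metrics. A hypothetical linear catchment response stands in for the hydrological model here; it is purely illustrative, not G2G's physics:

```python
import numpy as np

rng = np.random.default_rng(42)

def crps(ens, obs):
    """Sample CRPS for one ensemble forecast (standard two-term estimator)."""
    t1 = np.mean(np.abs(ens - obs))
    t2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return t1 - t2

# Synthetic 24-member daily rainfall ensembles, plus a "downstream" flow
# ensemble from an assumed linear catchment filter (illustration only).
n_days, n_mem = 100, 24
rain_truth = rng.gamma(2.0, 5.0, n_days)
rain_ens = np.clip(rain_truth[:, None]
                   + rng.normal(0.0, 4.0, (n_days, n_mem)), 0.0, None)
flow_ens = 0.6 * rain_ens + 2.0      # damped, shifted catchment response
flow_truth = 0.6 * rain_truth + 2.0

rain_spread = rain_ens.std(axis=1).mean()
flow_spread = flow_ens.std(axis=1).mean()
rain_crps = np.mean([crps(rain_ens[d], rain_truth[d]) for d in range(n_days)])
flow_crps = np.mean([crps(flow_ens[d], flow_truth[d]) for d in range(n_days)])
print(f"rainfall: mean spread {rain_spread:.2f}, mean CRPS {rain_crps:.2f}")
print(f"flow:     mean spread {flow_spread:.2f}, mean CRPS {flow_crps:.2f}")
```

Comparing the two spread/CRPS pairs is the essence of the question posed above: whether the hydrological model inflates, preserves, or reduces the spread it inherits from the precipitation ensemble.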
What fraction of stars formed in infrared galaxies at high redshift?
Star formation happens in two types of environment: ultraviolet-bright
starbursts (like 30 Doradus and HII galaxies at low redshift and Lyman-break
galaxies at high redshift) and infrared-bright dust-enshrouded regions (which
may be moderately star-forming like Orion in the Galaxy or extreme like the
core of Arp 220). In this work I will estimate how many of the stars in the
local Universe formed in each type of environment, using observations of
star-forming galaxies at all redshifts at different wavelengths and of the
evolution of the field galaxy population.
Comment: 7 pages, 0 figs, to appear in proceedings of "Starbursts - From 30
Doradus to Lyman break galaxies", edited by Richard de Grijs and Rosa M.
Gonzalez Delgado, published by Kluwer
Faster Approximate String Matching for Short Patterns
We study the classical approximate string matching problem: given strings P
and Q and an error threshold k, find all ending positions of substrings of Q
whose edit distance to P is at most k. Let P and Q have lengths m and n,
respectively. On a standard unit-cost word RAM with word size w we present an
algorithm whose running time improves the previously best known time bounds
for the problem when the pattern P is short. The result is achieved using a
novel implementation of the Landau-Vishkin algorithm based on tabulation and
word-level parallelism.
Comment: To appear in Theory of Computing Systems
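For orientation, the baseline dynamic programme that results of this kind accelerate can be written down directly; this is the classic O(mn) algorithm, not the paper's tabulated Landau-Vishkin implementation:

```python
def approx_ends(p, q, k):
    """All 1-based ending positions j in q such that some substring of q
    ending at j has edit distance at most k to p (classic O(m*n) DP).
    The paper's contribution is a faster algorithm via Landau-Vishkin,
    tabulation and word-level parallelism, not reproduced here."""
    m = len(p)
    prev = list(range(m + 1))      # column for the empty prefix of q
    ends = []
    for j, c in enumerate(q, 1):
        cur = [0] * (m + 1)        # row 0 stays 0: a match may start anywhere
        for i in range(1, m + 1):
            cur[i] = min(prev[i] + 1,                    # skip a text char
                         cur[i - 1] + 1,                 # skip a pattern char
                         prev[i - 1] + (p[i - 1] != c))  # match / substitute
        if cur[m] <= k:
            ends.append(j)
        prev = cur
    return ends

print(approx_ends("abc", "xabcy", 1))   # -> [3, 4, 5]
```

Keeping only one column at a time gives O(m) space; word-level techniques like the paper's pack many such cells into single machine words to beat this bound.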
Fluorescence resonance energy transfer between organic dyes adsorbed onto nano-clay and Langmuir-Blodgett (LB) films
In this communication we investigate two dyes, N,N'-dioctadecyl thiacyanine
perchlorate (NK) and octadecyl rhodamine B chloride (RhB), in Langmuir and
Langmuir-Blodgett (LB) films with or without the synthetic clay laponite.
Observed changes in the isotherms of RhB in the absence and presence of
nano-clay platelets indicate the incorporation of clay platelets into RhB-clay
hybrid films. AFM images confirm the incorporation of clay in the hybrid
films. FRET was observed in clay dispersion and in LB films with and without
clay. The efficiency of energy transfer was highest in LB films with clay.
Comment: 15 pages, 5 figures, 1 table
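The transfer efficiency being compared here follows the standard Förster relation E = 1 / (1 + (r/R0)^6); a minimal generic sketch (not tied to the actual Förster radius of the NK/RhB pair):

```python
def fret_efficiency(r, r0):
    """Forster (FRET) energy-transfer efficiency for donor-acceptor
    distance r and Forster radius r0: E = 1 / (1 + (r / r0) ** 6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# Efficiency is 50% at r = r0 and falls off steeply beyond it, which is
# why bringing donor and acceptor closer (e.g. on clay platelets in an
# LB film) raises the measured transfer efficiency.
print(fret_efficiency(1.0, 1.0))   # -> 0.5
print(round(fret_efficiency(2.0, 1.0), 4))
```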
Forecasting snowmelt flooding over Britain using the Grid-to-Grid model: a review and assessment of methods
In many regions of high annual snowfall, snowmelt modelling can prove to be a vital component of operational
flood forecasting and warning systems. Although Britain as a whole does not experience prolonged periods of
lying snow, with the exception of the Scottish Highlands, the inclusion of snowmelt modelling can still have a
significant impact on the skill of flood forecasts.
Countrywide operational flood forecasts over Britain are produced using the national Grid-to-Grid (G2G)
distributed hydrological model. For Scotland, snowmelt is included in these forecasts through a G2G snow
hydrology module involving temperature-based snowfall/rainfall partitioning and functions for temperature-excess snowmelt, snowpack storage and drainage. Over England and Wales, the contribution of snowmelt is included by pre-processing the precipitation prior to input into G2G. This removes snowfall diagnosed from weather model
outputs and adds snowmelt from an energy budget land surface scheme to form an effective liquid water gridded
input to G2G.
To review the operational options for including snowmelt modelling in G2G over Britain, a project was
commissioned by the Environment Agency through the Flood Forecasting Centre (FFC) for England and Wales
and in partnership with the Scottish Environment Protection Agency (SEPA) and Natural Resources Wales (NRW). Results from this snowmelt review project are reported here. The operational methods used by the
FFC and SEPA are compared on past snowmelt floods, alongside new alternative methods of treating snowmelt.
Both case study and longer-term analyses are considered, covering periods selected from the winters 2009-2010,
2012-2013, 2013-2014 and 2014-2015.
Over Scotland, both of the snowmelt methods used operationally by FFC and SEPA provided a clear improvement
to the river flow simulations. Over England and Wales, fewer and less significant snowfall events occurred, leading to less distinction in the results between the methods. It is noted that, for all methods considered, large uncertainties remain in flood forecasts influenced by snowmelt. Understanding and quantifying these
uncertainties should lead to more informed flood forecasts and associated guidance information.
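A minimal temperature-index sketch of the snowfall/rainfall partitioning and temperature-excess melt described above. Parameter values are illustrative assumptions, not the calibration of the operational G2G snow hydrology module:

```python
def snow_step(precip_mm, temp_c, pack_mm, t_thresh=0.0, ddf=3.0):
    """One daily step of a toy temperature-index snow module: partition
    precipitation into snow/rain at t_thresh (deg C) and melt the pack at
    ddf mm per degree-day of temperature excess above the threshold.
    Returns (liquid water input to the runoff model, new pack depth)."""
    snowfall = precip_mm if temp_c <= t_thresh else 0.0
    rainfall = precip_mm - snowfall
    pack_mm += snowfall
    melt = min(pack_mm, ddf * max(temp_c - t_thresh, 0.0))
    pack_mm -= melt
    return rainfall + melt, pack_mm

# A cold snowy day builds the pack; a later warm dry day releases melt.
liquid, pack = snow_step(10.0, -2.0, 0.0)   # -> (0.0, 10.0)
liquid, pack = snow_step(0.0, 2.0, pack)    # -> (6.0, 4.0)
```

Feeding `liquid` rather than raw precipitation into the hydrological model is the "effective liquid water input" idea used for England and Wales above, with the energy-budget melt replaced here by a simple degree-day factor.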
Children's depressive symptoms and their regulation of negative affect in response to vignette-depicted emotion-eliciting events
The present study examined the relationship between sub-clinical depressive symptoms and children's anticipated cognitive and behavioral reactions to two written vignettes depicting emotion-eliciting stressors (i.e., a fight with one's best friend and failure at a roller-blade contest). Participants (N = 244), ranging in age from 10 to 13 years, were presented with each vignette and then asked to rate their anticipated use of each of seven emotion-regulation strategies (ERs), along with the anticipated mood-enhancement effects of each strategy. In addition, ratings of participants' perceived coping efficacy to manage the stressful situation were collected. Results indicated that participants were more likely to endorse ERs in whose mood-enhancement effects they had greater confidence. Moreover, marked differences were observed between ratings for conceptually distinct cognitive ERs. Consistent with expectations, results revealed that participants displaying higher levels of depressive symptoms were more likely to endorse cognitive and behavioral ERs that are negative, passive, and/or avoidant in nature. Children's ratings of the anticipated mood-enhancement effects of several ERs were inversely related to their level of depressive symptoms, as was their perceived self-efficacy to manage the stressor. © 2007 The International Society for the Study of Behavioural Development
Statistical methods in cosmology
The advent of large data sets in cosmology has meant that in the past 10 or 20
years our knowledge and understanding of the Universe has changed not only
quantitatively but also, and most importantly, qualitatively. Cosmologists rely
on data where a host of useful information is enclosed, but is encoded in a
non-trivial way. The challenges in extracting this information must be overcome
to make the most of a large experimental effort. Even after having converged to
a standard cosmological model (the LCDM model) we should keep in mind that this
model is described by 10 or more physical parameters and if we want to study
deviations from it, the number of parameters is even larger. Dealing with such
a high-dimensional parameter space and finding parameter constraints is a
challenge in itself. Cosmologists want to be able to compare and combine
different data sets both for testing for possible disagreements (which could
indicate new physics) and for improving parameter determinations. Finally,
cosmologists in many cases want to find out, before actually doing the
experiment, how much one would be able to learn from it. For all these reasons,
sophisticated statistical techniques are being employed in cosmology, and it
has become crucial to know some statistical background to understand recent
literature in the field. I will introduce some statistical tools that any
cosmologist should know about in order to be able to understand recently
published results from the analysis of cosmological data sets. I will not
present a complete and rigorous introduction to statistics as there are several
good books which are reported in the references. The reader should refer to
those.
Comment: 31 pages, 6 figures; notes from the 2nd Trans-Regio Winter School in
Passo del Tonale. To appear in Lecture Notes in Physics, "Lectures on
cosmology: Accelerated expansion of the universe", Feb 201
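As a concrete instance of the sampling tools such notes typically introduce for exploring high-dimensional posteriors, here is a minimal Metropolis sampler for a one-parameter toy problem; the model and all numbers are assumptions for illustration, not from the text:

```python
import math
import random

random.seed(1)

# Toy setup: Gaussian data with known scatter 0.1 and unknown mean mu,
# flat prior, so the log-posterior is just the Gaussian log-likelihood.
data = [random.gauss(0.7, 0.1) for _ in range(50)]

def log_like(mu):
    return -sum((x - mu) ** 2 for x in data) / (2.0 * 0.1 ** 2)

chain, mu = [], 0.0
for _ in range(20_000):
    prop = mu + random.gauss(0.0, 0.05)     # symmetric random-walk proposal
    # Metropolis rule: accept with probability min(1, posterior ratio)
    if math.log(random.random()) < log_like(prop) - log_like(mu):
        mu = prop
    chain.append(mu)

posterior = chain[5_000:]                   # discard burn-in
mean = sum(posterior) / len(posterior)
print(f"posterior mean: {mean:.3f}")
```

The same accept/reject rule works unchanged when mu becomes a vector of 10 or more cosmological parameters, which is precisely why such samplers dominate parameter estimation for the LCDM model and its extensions.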